Abstract
In applied science and engineering, we often face ill-posed inverse problems in which the number of observations is far smaller than the signal dimension. In many practical scenarios, however, the signals of interest are structured, meaning that they have fewer degrees of freedom than the ambient dimension. These low-dimensional structures can be recovered from few observations by identifying structure-inducing convex functions and solving the corresponding convex optimization problems. In many applications, including MRI, radar, machine learning, computer vision, and recommender systems, some prior information about the signal of interest is available in addition to its inherent structure. In this thesis, we aim to incorporate this prior information into the recovery procedure in an optimal way. To this end, we first introduce weighted convex functions that promote both the inherent structure and the prior information. We then obtain the optimal weights such that the number of observations required for exact recovery is minimized. Broadly, the results of this thesis fall into two main parts: finding tight bounds on the required number of observations, and deriving the unique optimal weights in models with prior information. The low-dimensional models considered in this thesis are sparse signals, block-sparse signals, sparse signals in redundant dictionaries, gradient-sparse signals, and low-rank matrices. Analytical and simulation results show that using the optimal weights substantially reduces the number of required observations.
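To make the weighted-recovery idea concrete, the following is a minimal illustrative sketch, not the thesis's derivation, of weighted ℓ1 minimization with prior support information for the sparse-signal model, written with the cvxpy library. The problem sizes and the weight assigned to the estimated support entries are arbitrary choices for illustration; the thesis is concerned with deriving the optimal weights.

```python
import numpy as np
import cvxpy as cp

# Illustrative weighted l1 recovery with prior support information (hypothetical sizes).
np.random.seed(0)
n, m, k = 200, 60, 10                    # signal dimension, observations, sparsity
x_true = np.zeros(n)
support = np.random.choice(n, k, replace=False)
x_true[support] = np.random.randn(k)

A = np.random.randn(m, n) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_true                           # noiseless observations

# Prior information: an estimated support that is only partially correct.
est_support = np.concatenate([support[:7], np.random.choice(n, 3, replace=False)])

# Weighted l1 norm: smaller weights on indices believed to lie in the support.
w = np.ones(n)
w[est_support] = 0.3                     # arbitrary weight; the optimal value is what the thesis derives

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, x))), [A @ x == y])
prob.solve()

print("relative recovery error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```

With well-chosen weights, such a weighted program typically recovers the signal from fewer observations than the unweighted ℓ1 problem (w equal to the all-ones vector), which is the effect the optimal-weight analysis in this thesis quantifies.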