CoRR, 2011
Orthogonal Matching Pursuit (OMP) has long been considered a powerful heuristic for attacking compressive sensing problems; however, its theoretical development is, unfortunately, somewhat lacking. This paper presents an improved Restricted Isometry Property (RIP) based performance guarantee for T-sparse signal reconstruction that asymptotically approaches the conjectured lower bound given in Davenport et al. We also further extend the state of the art by deriving reconstruction error bounds for the case of general non-sparse signals subjected to measurement noise. We then generalize our results to the case of K-fold Orthogonal Matching Pursuit (KOMP). We finish by presenting an empirical analysis suggesting that OMP and KOMP outperform other compressive sensing algorithms in average-case scenarios. This is quite surprising, since RIP analysis (i.e., the worst-case scenario) suggests that these matching pursuits should perform roughly √T times worse than convex optimization, CoSaMP, and Iterative Thresholding.
2011 National Conference on Communications (NCC), 2011
Compressed Sensing (CS) provides a set of mathematical results showing that sparse signals can be exactly reconstructed from a relatively small number of random linear measurements. A particularly appealing greedy approach to signal reconstruction from CS measurements is the so-called Orthogonal Matching Pursuit (OMP). We propose two modifications to the basic OMP algorithm, which can be handy in different situations.
Information Theory, IEEE …, 2010
Orthogonal matching pursuit (OMP) is the canonical greedy algorithm for sparse approximation. In this paper we demonstrate that the restricted isometry property (RIP) can be used for a very straightforward analysis of OMP. Our main conclusion is that the RIP of order K+1 (with isometry constant δ < 1/(3√K)) is sufficient for OMP to exactly recover any K-sparse signal. The analysis relies on simple and intuitive observations about OMP and matrices which satisfy the RIP. For restricted classes of K-sparse signals (those that are highly compressible), a relaxed bound on the isometry constant is also established. A deeper understanding of OMP may benefit the analysis of greedy algorithms in general. To demonstrate this, we also briefly revisit the analysis of the regularized OMP (ROMP) algorithm.
IEEE Transactions on Signal Processing, 2015
Greed is good. However, the tighter you squeeze, the less you have. In this paper, a less greedy algorithm for sparse signal reconstruction in compressive sensing, named orthogonal matching pursuit with thresholding, is studied. Using the global 2-coherence, which provides a "bridge" between the well-known mutual coherence and the restricted isometry constant, the performance of orthogonal matching pursuit with thresholding is analyzed and more general results for sparse signal reconstruction are obtained. It is also shown that, given the same assumption on the coherence index and the restricted isometry constant as required for orthogonal matching pursuit, the thresholding variant gives exactly the same reconstruction performance with significantly less complexity.
Accepted in Signal Processing Letters on 21.8.2013
Generalized Orthogonal Matching Pursuit (gOMP) is a natural extension of the OMP algorithm where, unlike OMP, it may select $N (\geq1)$ atoms in each iteration. In this paper, we demonstrate that gOMP can successfully reconstruct a $K$-sparse signal from a compressed measurement ${\bf y}={\bf \Phi x}$ by the $K$-th iteration if the sensing matrix ${\bf \Phi}$ satisfies the restricted isometry property (RIP) of order $NK$ with $\delta_{NK} < \frac {\sqrt{N}}{\sqrt{K}+2\sqrt{N}}$. Our bound offers an improvement over the very recent result shown in \cite{wang_2012b}. Moreover, we present another bound for gOMP of order $NK+1$ with $\delta_{NK+1} < \frac {\sqrt{N}}{\sqrt{K}+\sqrt{N}}$, which exactly relates to the near-optimal bound of $\delta_{K+1} < \frac {1}{\sqrt{K}+1}$ for OMP ($N=1$) as shown in \cite{wang_2012a}.
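As described above, gOMP differs from OMP only in selecting the N most correlated atoms per iteration instead of one. A minimal NumPy sketch of this selection rule follows; the function name `gomp`, the dimensions, and the stopping tolerance are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gomp(Phi, y, K, N=2, tol=1e-10):
    """Generalized OMP sketch: pick the N columns of Phi most correlated
    with the residual each iteration, then re-fit by least squares."""
    residual = y.copy()
    support = []
    for _ in range(K):
        corr = np.abs(Phi.T @ residual)
        # N best atoms not already selected, by descending correlation
        picks = [i for i in np.argsort(corr)[::-1] if i not in support][:N]
        support.extend(picks)
        # orthogonal projection of y onto the selected columns
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

# usage: recover a 3-sparse signal from 40 Gaussian measurements (illustrative sizes)
rng = np.random.default_rng(1)
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)
x = np.zeros(100); x[[3, 30, 77]] = [2.0, -1.0, 0.7]
x_gomp = gomp(Phi, Phi @ x, K=3, N=2)
```

Because the selected support is a superset of the true one, the final least-squares fit assigns (near-)zero weight to any spurious atoms, so exact recovery is still possible even when some picks are wrong.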
IEEE Transactions on Signal Processing, 2013
Digital Signal Processing, 2012
Compressed sensing is a developing field aiming at the reconstruction of sparse signals acquired in reduced dimensions, which makes the recovery process underdetermined. The required solution is the one with minimum ℓ0 norm due to sparsity; however, it is not practical to solve the ℓ0 minimization problem. Commonly used techniques include ℓ1 minimization, such as Basis Pursuit (BP), and greedy pursuit algorithms such as Orthogonal Matching Pursuit (OMP) and Subspace Pursuit (SP). This manuscript proposes a novel semi-greedy recovery approach, namely A* Orthogonal Matching Pursuit (A*OMP). A*OMP performs an A* search for the sparsest solution on a tree whose paths grow similarly to the OMP algorithm. Paths on the tree are evaluated according to a cost function, which should compensate for different path lengths. For this purpose, three different auxiliary structures are defined, including novel dynamic ones. A*OMP also incorporates pruning techniques which enable practical applications of the algorithm. Moreover, the adjustable search parameters provide means for a complexity-accuracy trade-off. We demonstrate the reconstruction ability of the proposed scheme on both synthetically generated data and images using Gaussian and Bernoulli observation matrices, where A*OMP yields less reconstruction error and higher exact recovery frequency than BP, OMP and SP. Results also indicate that the novel dynamic cost functions provide improved results compared to a conventional choice.
Constructive Approximation, 2016
This paper is concerned with the performance of Orthogonal Matching Pursuit (OMP) algorithms applied to a dictionary D in a Hilbert space H. Given an element f ∈ H, OMP generates a sequence of approximations f_n, n = 1, 2, …, each of which is a linear combination of n dictionary elements chosen by a greedy criterion. It is studied whether the approximations f_n are in some sense comparable to the best n-term approximation from the dictionary. One important result related to this question is a theorem of Zhang [8] in the context of sparse recovery of finite-dimensional signals. This theorem shows that OMP exactly recovers an n-sparse signal whenever the dictionary D satisfies a Restricted Isometry Property (RIP) of order An for some constant A, and that the procedure is also stable in ℓ2 under measurement noise. The main contribution of the present paper is to give a structurally simpler proof of Zhang's theorem, formulated in the general context of n-term approximation from a dictionary in arbitrary Hilbert spaces H. Namely, it is shown that OMP generates near-best n-term approximations under a similar RIP condition.
IEEE Transactions on Signal Processing, 2015
Orthogonal Matching Pursuit (OMP) and Basis Pursuit (BP) are two well-known recovery algorithms in compressed sensing. To recover a d-dimensional m-sparse signal with high probability, OMP needs O(m ln d) measurements, whereas BP needs only O(m ln(d/m)) measurements. On the other hand, OMP is a practically more appealing algorithm due to its superior execution speed. In this work, we propose a scheme that brings the required number of measurements for OMP closer to that of BP. We term this scheme OMPα; it runs OMP for (m + αm) iterations instead of m iterations, for a chosen value of α ∈ [0, 1]. It is shown that OMPα guarantees high-probability signal recovery with O(m ln(d/(αm + 1))) measurements. Another limitation of OMP, unlike BP, is that it requires knowledge of m. To overcome this limitation, we extend the idea of OMPα to another recovery scheme, called OMP∞, which runs OMP until the signal residue vanishes. It is shown that OMP∞ can achieve a close-to-ℓ0-norm recovery without any knowledge of m, like BP. Theorem 1 (Theorem 1 of [2]). Let N ≥ C_1 m ln(d/m), and let Φ have N × d i.i.d. Gaussian entries. The following statement is true with probability exceeding 1 − e^{−c_1 N}.
We consider the orthogonal matching pursuit (OMP) algorithm for the recovery of a high-dimensional sparse signal based on a small number of noisy linear measurements. OMP is an iterative greedy algorithm that at each step selects the column most correlated with the current residual. In this paper, we present a fully data-driven OMP algorithm with explicit stopping rules. It is shown that, under conditions on the mutual incoherence and the minimum magnitude of the nonzero components of the signal, the support of the signal can be recovered exactly by the OMP algorithm with high probability. In addition, we also consider the problem of identifying significant components in the case where some of the nonzero components are possibly small. It is shown that in this case the OMP algorithm will still select all the significant components before possibly selecting incorrect ones. Moreover, with modified stopping rules, the OMP algorithm can ensure that no zero components are selected.
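The iterative step described above (select the column most correlated with the current residual, then re-fit on the accumulated support by least squares) can be sketched in a few lines of NumPy. The function name `omp`, the problem sizes, and the stopping tolerance below are illustrative assumptions, not taken from any of the papers listed:

```python
import numpy as np

def omp(Phi, y, n_iter, tol=1e-10):
    """Basic OMP sketch: greedily build a support set, one atom per
    iteration, re-solving the least-squares fit on that support."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        # column of Phi most correlated with the current residual
        idx = int(np.argmax(np.abs(Phi.T @ residual)))
        if idx not in support:
            support.append(idx)
        # orthogonal projection of y onto the selected columns
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x_hat[support] = coef
    return x_hat

# usage: recover a 3-sparse signal from 40 Gaussian measurements (illustrative sizes)
rng = np.random.default_rng(0)
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)
x = np.zeros(100); x[[5, 17, 42]] = [1.0, -2.0, 0.5]
x_hat = omp(Phi, Phi @ x, n_iter=3)
```

In the noisy setting the abstracts discuss, the fixed iteration count would be replaced by a data-driven stopping rule on the residual norm.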
2020
In this paper, we propose a new orthogonal matching pursuit algorithm, called the quasi-OMP (QOMP) algorithm, which greatly enhances the performance of the classical orthogonal matching pursuit (OMP) algorithm at some cost in computational complexity. We show that under some sufficient conditions on the mutual coherence of the sensing matrix, the QOMP algorithm succeeds in recovering the s-sparse signal vector x within s iterations, during which a total of 2s columns are selected, in both the noiseless and noisy settings. In addition, we show that for a Gaussian sensing matrix, the norm of the residual at each iteration goes to zero linearly at a rate that depends on the size of the matrix, with high probability. Numerical experiments demonstrate the effectiveness of the QOMP algorithm in recovering sparse solutions, where it outperforms the classical OMP and gOMP algorithms.
Computing Research Repository, 2008
2014 IEEE Workshop on Statistical Signal Processing (SSP), 2014
2012
Communications (NCC), 2012 National …, 2012
Journal of Applied Computer Science Methods, 2014
IEEE Transactions on Information Theory, 2009
Applications of Digital Signal Processing, 2011
EURASIP Journal on Advances in Signal Processing, 2012