Accepted in IEEE Signal Processing Letters on 21.8.2013
…
"Generalized Orthogonal Matching Pursuit (gOMP) is a natural extension of OMP algorithm where unlike OMP, it may select $N (\geq1)$ atoms in each iteration. In this paper, we demonstrate that gOMP can successfully reconstruct a $K$-sparse signal from a compressed measurement $ {\bf y}={\bf \Phi x}$ by $K^{th}$ iteration if the sensing matrix ${\bf \Phi}$ satisfies restricted isometry property (RIP) of order $NK$ where $\delta_{NK} < \frac {\sqrt{N}}{\sqrt{K}+2\sqrt{N}}$. Our bound offers an improvement over the very recent result shown in \cite{wang_2012b}. Moreover, we present another bound for gOMP of order $NK+1$ with $\delta_{NK+1} < \frac {\sqrt{N}}{\sqrt{K}+\sqrt{N}}$ which exactly relates to the near optimal bound of $\delta_{K+1} < \frac {1}{\sqrt{K}+1}$ for OMP (N=1) as shown in \cite{wang_2012a}."
CoRR, 2011
Orthogonal Matching Pursuit (OMP) has long been considered a powerful heuristic for attacking compressive sensing problems; however, its theoretical development is, unfortunately, somewhat lacking. This paper presents an improved Restricted Isometry Property (RIP) based performance guarantee for T-sparse signal reconstruction that asymptotically approaches the conjectured lower bound given in Davenport et al. We also further extend the state of the art by deriving reconstruction error bounds for the case of general non-sparse signals subjected to measurement noise. We then generalize our results to the case of K-fold Orthogonal Matching Pursuit (KOMP). We finish by presenting an empirical analysis suggesting that OMP and KOMP outperform other compressive sensing algorithms in average case scenarios. This turns out to be quite surprising since RIP analysis (i.e. the worst case scenario) suggests that these matching pursuits should perform roughly T^0.5 times worse than convex optimization, CoSaMP, and Iterative Thresholding.
IEEE Transactions on Information Theory, 2010
Orthogonal matching pursuit (OMP) is the canonical greedy algorithm for sparse approximation. In this paper we demonstrate that the restricted isometry property (RIP) can be used for a very straightforward analysis of OMP. Our main conclusion is that the RIP of order K+1 (with isometry constant δ_(K+1) < 1/(3√K)) is sufficient for OMP to exactly recover any K-sparse signal. The analysis relies on simple and intuitive observations about OMP and matrices which satisfy the RIP. For restricted classes of K-sparse signals (those that are highly compressible), a relaxed bound on the isometry constant is also established. A deeper understanding of OMP may benefit the analysis of greedy algorithms in general. To demonstrate this, we also briefly revisit the analysis of the regularized OMP (ROMP) algorithm.
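For quick reference, the RIP invoked here (and in the other abstracts on this page) is the standard one; stated explicitly, a matrix $\Phi$ satisfies the RIP of order $K$ with isometry constant $\delta_K \in (0,1)$ if

\[
(1-\delta_K)\,\lVert x \rVert_2^2 \;\le\; \lVert \Phi x \rVert_2^2 \;\le\; (1+\delta_K)\,\lVert x \rVert_2^2
\qquad \text{for every } K\text{-sparse } x .
\]

Under this (textbook) definition, the claim above is that $\delta_{K+1} < \frac{1}{3\sqrt{K}}$ suffices for OMP to recover every $K$-sparse signal exactly in $K$ iterations.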
Constructive Approximation, 2016
This paper is concerned with the performance of Orthogonal Matching Pursuit (OMP) algorithms applied to a dictionary D in a Hilbert space H. Given an element f ∈ H, OMP generates a sequence of approximations f_n, n = 1, 2, …, each of which is a linear combination of n dictionary elements chosen by a greedy criterion. It is studied whether the approximations f_n are in some sense comparable to best n-term approximation from the dictionary. One important result related to this question is a theorem of Zhang [8] in the context of sparse recovery of finite-dimensional signals. This theorem shows that OMP exactly recovers any n-sparse signal whenever the dictionary D satisfies a Restricted Isometry Property (RIP) of order An for some constant A, and that the procedure is also stable in ℓ2 under measurement noise. The main contribution of the present paper is to give a structurally simpler proof of Zhang's theorem, formulated in the general context of n-term approximation from a dictionary in arbitrary Hilbert spaces H. Namely, it is shown that OMP generates near-best n-term approximations under a similar RIP condition.
IEEE Transactions on Signal Processing, 2015
Orthogonal Matching Pursuit (OMP) and Basis Pursuit (BP) are two well-known recovery algorithms in compressed sensing. To recover a d-dimensional m-sparse signal with high probability, OMP needs O(m ln d) measurements, whereas BP needs only O(m ln(d/m)) measurements. On the other hand, OMP is a practically more appealing algorithm due to its superior execution speed. In this work, we propose a scheme that brings the required number of measurements for OMP closer to BP. We term this scheme OMPα; it runs OMP for (m + αm) iterations instead of m iterations, for a chosen value of α ∈ [0, 1]. It is shown that OMPα guarantees high-probability signal recovery with O(m ln(d/(αm + 1))) measurements. Another limitation of OMP, unlike BP, is that it requires knowledge of m. To overcome this limitation, we extend the idea of OMPα to another recovery scheme, OMP∞, which runs OMP until the signal residue vanishes. It is shown that OMP∞ can achieve a close to ℓ0-norm recovery without any knowledge of m, like BP. Theorem 1 (Theorem 1 of [2]). Let N ≥ C_1 m ln(d/m), and let Φ have N × d i.i.d. Gaussian entries. The following statement is true with probability exceeding 1 − e^(−c_1 N). It is possible …
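A hedged sketch of the OMPα / OMP∞ idea from this abstract: run a generic OMP loop for ⌈(1 + α)m⌉ iterations instead of m, or until the residual (numerically) vanishes. The helper omp_iterations and the two wrappers are illustrative names, not the authors' implementation.

import numpy as np

def omp_iterations(y, Phi, n_iter, tol=1e-9):
    """Generic OMP loop: one atom per iteration, stopping early if the residual vanishes."""
    residual, support = y.copy(), []
    x_hat = np.zeros(0)
    for _ in range(n_iter):
        corr = np.abs(Phi.T @ residual)
        corr[support] = 0.0                     # skip already-selected columns
        support.append(int(np.argmax(corr)))
        x_hat, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ x_hat
        if np.linalg.norm(residual) < tol:
            break
    x = np.zeros(Phi.shape[1])
    x[support] = x_hat
    return x

def omp_alpha(y, Phi, m, alpha):
    """OMP-alpha: (m + alpha*m) iterations with alpha in [0, 1]; alpha = 0 is plain OMP."""
    return omp_iterations(y, Phi, int(np.ceil((1 + alpha) * m)))

def omp_infinity(y, Phi):
    """OMP-infinity: iterate until the residual vanishes; no knowledge of m is needed."""
    return omp_iterations(y, Phi, Phi.shape[1])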
2011 National Conference on Communications (NCC), 2011
Compressed Sensing (CS) provides a set of mathematical results showing that sparse signals can be exactly reconstructed from a relatively small number of random linear measurements. A particularly appealing greedy approach to signal reconstruction from CS measurements is the so-called Orthogonal Matching Pursuit (OMP). We propose two modifications to the basic OMP algorithm, which can be handy in different situations.
2012 National Conference on Communications (NCC), 2012
Compressed Sensing deals with recovering sparse signals from a relatively small number of linear measurements. Several algorithms exist for data recovery from the compressed measurements; particularly appealing among these is the greedy approach known as Orthogonal Matching Pursuit (OMP). In this paper, we propose a modified OMP-based algorithm called Ordered Orthogonal Matching Pursuit (Ordered OMP). Ordered OMP is conceptually simpler and provides improved performance compared to OMP.
IEEE Transactions on Signal Processing, 2013
2020
In this paper, we propose a new orthogonal matching pursuit algorithm, called the quasi-OMP (QOMP) algorithm, which greatly enhances the performance of the classical orthogonal matching pursuit (OMP) algorithm at some additional computational cost. We show that, under some sufficient conditions on the mutual coherence of the sensing matrix, the QOMP algorithm succeeds in recovering the s-sparse signal vector x within s iterations, in which a total of 2s columns are selected, under both the noiseless and noisy settings. In addition, we show that for a Gaussian sensing matrix, the norm of the residual at each iteration goes to zero at a linear rate that depends on the size of the matrix, with high probability. Numerical experiments demonstrate the effectiveness of the QOMP algorithm in recovering sparse solutions, outperforming the classic OMP and GOMP algorithms.
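Read literally, the "2s columns in s iterations" behaviour described here corresponds to the gOMP sketch given earlier with N = 2; the correspondence is purely illustrative, since this abstract does not specify QOMP's actual column-selection rule.

# illustrative only: s iterations, two atoms added per iteration (2s columns in total)
x_est = gomp(y, Phi, K=s, N=2)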
2012
Orthogonal Matching Pursuit (OMP) is a simple, yet empirically competitive algorithm for sparse recovery. Recent developments have shown that OMP guarantees exact recovery of K-sparse signals with K or more than K iterations if the observation matrix satisfies the restricted isometry property (RIP) with some conditions. We develop RIP-based online guarantees for recovery of a K-sparse signal with more than K OMP iterations. Though these guarantees cannot be generalized to all sparse signals a priori, we show that they can still hold online when the state-of-the-art K-step recovery guarantees fail. In addition, we present bounds on the number of correct and false indices in the support estimate for the derived condition to be less restrictive than the K-step guarantees. Under these bounds, this condition guarantees exact recovery of a K-sparse signal within (3/2)K iterations, which is much less than the number of steps required for the state-of-the-art exact recovery guarantees with more than K steps. Moreover, we present phase transitions of OMP in comparison to basis pursuit and subspace pursuit, which are obtained after extensive recovery simulations involving different sparse signal types. Finally, we empirically analyse the number of false indices in the support estimate, which indicates that these do not violate the developed upper bound in practice.
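The more-than-K-iterations regime analysed here can be emulated with the generic OMP loop sketched earlier (an illustration only, not the authors' experimental setup):

# run OMP for ceil(1.5 * K) iterations, matching the (3/2)K guarantee discussed above
x_est = omp_iterations(y, Phi, n_iter=int(np.ceil(1.5 * K)))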
2014 IEEE Workshop on Statistical Signal Processing (SSP), 2014
In this paper we define a new coherence index of a given dictionary, named the global 2-coherence, and study its relationship with the traditional mutual coherence and the restricted isometry constant. By exploring this relationship, we obtain more general results on sparse signal reconstruction using greedy algorithms in the compressive sensing (CS) framework. In particular, we obtain an improved bound over the best known results on the restricted isometry constant for successful recovery of sparse signals using orthogonal matching pursuit (OMP).
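Since the global 2-coherence is positioned relative to the traditional mutual coherence, the latter (a standard quantity, shown here only for context; the paper's new index is not defined in this abstract) can be computed as follows:

import numpy as np

def mutual_coherence(D):
    """Classical mutual coherence: the largest absolute normalized inner product
    between two distinct columns of the dictionary D."""
    Dn = D / np.linalg.norm(D, axis=0)   # normalize columns (assumed nonzero)
    G = np.abs(Dn.T @ Dn)                # magnitudes of the Gram matrix
    np.fill_diagonal(G, 0.0)             # ignore each column's self-inner-product
    return float(G.max())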