2003
LT codes are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In the original analysis of these codes, it was assumed that for each encoding symbol, the neighbors used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes cannot afford this amount of randomness, because all random bits must be communicated to the decoding party. Instead, they use a linear congruential generator to reduce the randomness used per encoding symbol to a seed consisting of two random numbers. We show that such limited randomness LT codes perform almost as well as the fully random version. Thus even limited randomness LT codes are asymptotically optimal.
Foundations of Computer Science, 1995
An (n, c, ℓ, r)-erasure code consists of an encoding algorithm and a decoding algorithm with the following properties. The encoding algorithm produces a set of ℓ-bit packets of total length cn from an n-bit message. The decoding algorithm is able to recover the message from any set of packets whose total length is r, i.e., from any set of r/ℓ packets. We describe erasure codes where both the encoding and decoding algorithms run in linear time and where r is only slightly larger than n.
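The parameter relationships in the abstract above can be shown with a small arithmetic sketch. All numeric values below are made up for illustration; only the formulas (total length cn, packet size ℓ, decoding threshold r slightly above n) follow the abstract:

```python
# Illustrative arithmetic for the (n, c, l, r)-erasure code parameters
# described above. The concrete numbers are invented for this example.
n = 1024                 # message length in bits
c = 2                    # stretch factor: total encoded length is c*n bits
l = 8                    # bits per packet
eps = 0.05               # overhead: r is "slightly larger" than n
r = int((1 + eps) * n)   # total bits needed to decode

total_packets = c * n // l    # the encoder emits this many packets
needed_packets = -(-r // l)   # ceil(r / l): any subset of this size suffices
```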
2014
We present a linear-time probabilistic erasure code with the following properties. The encoding algorithm produces cn letters from an n-letter message. The decoding algorithm is able, with high probability, to reconstruct the message from any set of (1+ε)n letters. Alon and Luby [5, 6] simultaneously developed a deterministic version, but their running time is O(ε⁻⁴n) instead of our O(ε⁻¹ ln(1/ε)n) time. We also decrease the minimum packet size from many letters to one. In some sense this erasure code technology has been completely subsumed by the latest generation of RaptorQ codes [12, 10]. On the other hand, there are still some good ideas in this paper. One such idea is matching upper and lower bounds on the cost of a particular game involving randomly throwing balls into a hierarchy of bins. This game has applications to information theory.
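The balls-into-bins game mentioned above can at least be set up in a few lines. This generic uniform-throwing sketch only illustrates the basic occupancy setup; the hierarchical-bins game actually analyzed in the paper is more involved, and the function name and parameters here are invented:

```python
# Generic balls-into-bins occupancy simulation, as a minimal illustration
# of the kind of game referenced above (NOT the paper's hierarchical game).
import random

def throw_balls(num_balls, num_bins, rng=None):
    """Throw balls uniformly at random into bins; return per-bin counts."""
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    counts = [0] * num_bins
    for _ in range(num_balls):
        counts[rng.randrange(num_bins)] += 1
    return counts
```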
IEEE Communications Letters, 2013
The erasure floor performance of Luby Transform (LT) codes is mainly determined by the minimum variable-node degree. We therefore propose a modified encoding scheme that maximizes the minimum variable-node degree for transmission over binary erasure channels. The proposed scheme leads to an almost-regular variable-node degree distribution. The encoding process is generalized to accommodate arbitrary variable-node degree distributions for further improved performance. The asymptotic performance is investigated using density evolution and compared with a conventional LT code. The scheme is further extended to enable a higher level of unequal erasure protection.
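For background on the degree distributions being modified above, the standard ideal soliton distribution used in plain LT codes can be computed directly. The letter's almost-regular variable-node distribution is different; this sketch only shows the conventional baseline:

```python
# The ideal soliton degree distribution for LT codes with k input symbols:
# p_1 = 1/k, and p_i = 1/(i*(i-1)) for i = 2..k. The terms telescope,
# so the probabilities sum to exactly 1.
def ideal_soliton(k):
    """Return [p_1, ..., p_k] of the ideal soliton distribution."""
    return [1.0 / k] + [1.0 / (i * (i - 1)) for i in range(2, k + 1)]
```

In practice LT codes use the robust soliton variant, which adds probability mass at low and moderate degrees to make belief-propagation decoding reliable.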
2005
The paper introduces ensembles of accumulate-repeat-accumulate (ARA) codes which asymptotically achieve capacity on the binary erasure channel (BEC) with bounded complexity per information bit. It also introduces symmetry properties which play a central role in the construction of capacity-achieving ensembles for the BEC. The results here improve on the tradeoff between performance and complexity provided by the first capacity-achieving ensembles of irregular repeat-accumulate (IRA) codes with bounded complexity per information bit; these IRA ensembles were previously constructed by Pfister, Sason and Urbanke. The superiority of ARA codes with moderate to large block length is exemplified by computer simulations which compare their performance with those of previously reported capacity-achieving ensembles of LDPC and IRA codes. The ARA codes also have the advantage of being systematic.
2011
Over the Internet, bit errors within data packets translate into packet losses at the higher layers of the OSI model, yielding a packet erasure channel. Modern erasure correcting codes promise a very simple and efficient solution for data transfer over these channels, opening up other interesting applications as well, among them reliable large-scale content distribution, high-quality real-time data transfer, and distributed storage. These considerations make the study of such codes a timely and interesting topic. Most of the analyses presented in the literature focus on evaluating the performance ensured by these codes. This paper presents an evaluation of the decoding complexity of some rateless erasure codes, another relevant issue that affects the applicability of these codes. The complexities of several decoding methods are evaluated using several metrics that reflect different operations performed during decoding. All resu…
IEEE Transactions on Information Theory, 2001
We introduce a simple erasure recovery algorithm for codes derived from cascades of sparse bipartite graphs and analyze the algorithm by analyzing a corresponding discrete-time random process. As a result, we obtain a simple criterion involving the fractions of nodes of different degrees on both sides of the graph which is necessary and sufficient for the decoding process to finish successfully with high probability. By carefully designing these graphs we can construct, for any given rate R and any given real number ε, a family of linear codes of rate R which can be encoded in time proportional to ln(1/ε) times their block length n. Furthermore, a codeword can be recovered with high probability from a portion of its entries of length (1+ε)Rn or more. The recovery algorithm also runs in time proportional to n ln(1/ε). Our algorithms have been implemented and work well in practice; various implementation issues are discussed.
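The erasure recovery algorithm described above can be sketched as a peeling decoder: repeatedly find a parity check with exactly one unknown symbol, solve for it, and substitute. The sketch below is a minimal illustration with hypothetical names and data layout, not the paper's implementation:

```python
# Minimal peeling decoder for sparse-graph erasure codes. Each check is a
# parity equation (symbol_indices, xor_value); decoding solves any check
# with a single unknown and repeats until no progress is possible.

def peel_decode(checks, received):
    """
    checks:   list of (symbol_indices, xor_value) parity equations.
    received: dict {index: value} of symbols known at the decoder.
    Returns the dict of recovered symbols (partial if decoding stalls).
    """
    known = dict(received)
    progress = True
    while progress:
        progress = False
        for indices, value in checks:
            unknown = [i for i in indices if i not in known]
            if len(unknown) == 1:
                # XOR all known symbols into the check value to isolate the unknown
                v = value
                for i in indices:
                    if i in known:
                        v ^= known[i]
                known[unknown[0]] = v
                progress = True
    return known
```

The degree-distribution criterion in the abstract governs exactly when this process keeps finding a degree-one check and therefore finishes with high probability.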
The paper introduces ensembles of accumulate-repeat-accumulate (ARA) codes which asymptotically achieve capacity on the binary erasure channel (BEC) with bounded complexity, per information bit, of encoding and decoding. It also introduces symmetry properties which play a central role in the construction of capacity-achieving ensembles for the BEC with bounded complexity. The results here improve on the tradeoff between performance and complexity provided by previous constructions of capacity-achieving ensembles of codes defined on graphs. The superiority of ARA codes with moderate to large block length is exemplified by computer simulations which compare their performance with those of previously reported capacity-achieving ensembles of LDPC and IRA codes. The ARA codes also have the advantage of being systematic. Index terms – binary erasure channel (BEC), capacity, complexity, degree distribution (d.d.), density evolution (DE), iterative decoding, irregular repeat-accumulate (IRA) …