LDPC Codes

For lack of a better term, I use the word "probabilistic" for the class of error-correcting codes that are decoded, in most practical applications, by computing conditional probabilities using real numbers.

All practical probabilistic codes are low-density parity-check (LDPC) codes: probabilistic codes require very long block lengths, and if the parity-check matrix were not low-density (with few nonzero elements in it), the decoder would be too complex to implement.  LDPC codes are most effective when used with channels that make soft decisions, in other words, channels that output a probability value for each received bit.  In hard disk drives, for example, these values are often provided by the Soft Output Viterbi Algorithm (SOVA).
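As a concrete illustration of a soft decision (my own sketch, not tied to any particular drive implementation): the per-bit probability is often carried as a log-likelihood ratio (LLR), and for BPSK signaling over an additive white Gaussian noise channel the Bayes-rule expression collapses to a simple formula:

```python
def llr_awgn(y, noise_var):
    """Log-likelihood ratio log(P(bit=0 | y) / P(bit=1 | y)) for BPSK
    (bit 0 -> +1, bit 1 -> -1) over an AWGN channel with the given
    noise variance.  The general Bayes-rule expression simplifies
    to 2*y / variance."""
    return 2.0 * y / noise_var

# A received value near +1.0 gives a large positive LLR (a confident 0);
# a value near 0.0 gives an LLR near zero (an uncertain bit).
print(llr_awgn(0.9, 0.5))    # large positive: confident the bit is 0
print(llr_awgn(-0.05, 0.5))  # small negative: slightly favors bit 1
```

The sign of the LLR is the hard decision; its magnitude is the confidence that an LDPC decoder can exploit.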

Most probabilistic codes in current use are "binary" codes, which correct "bits".

Generally speaking, it is not possible to determine the minimum distance of probabilistic codes, and therefore it is not possible to guarantee that all error patterns with t or fewer errors will always be corrected.  In some cases it is possible to calculate a lower bound on the minimum distance.  There appears to be no way to determine which error patterns will be correctly decoded by a particular LDPC code and decoder, and even if they could be identified, they would differ for each code and decoder.  With soft decisions, it probably does not make sense to talk about error patterns at all. (?)
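For contrast, the minimum distance of a tiny algebraic code can be found by brute force over all codewords.  A sketch using the (7,4) Hamming code as the toy example; the same enumeration is hopeless at LDPC block lengths, which is part of why the minimum distance is generally unknown:

```python
from itertools import product

# Systematic generator matrix of the (7,4) Hamming code (a toy example).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def encode(u, G):
    """Codeword c = u*G over GF(2)."""
    return [sum(u[i] * G[i][j] for i in range(len(u))) % 2
            for j in range(len(G[0]))]

# For a linear code, the minimum distance equals the minimum weight of a
# nonzero codeword, so it suffices to enumerate all 2^4 messages.
weights = [sum(encode(list(u), G)) for u in product([0, 1], repeat=len(G))]
d_min = min(w for w in weights if w > 0)
t = (d_min - 1) // 2   # guaranteed error-correcting capability
print(d_min, t)        # prints: 3 1
```

Here d_min = 3 guarantees that t = 1 error is always corrected, precisely the kind of guarantee that is unavailable for LDPC codes.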

LDPC codes tend to have "error floors": even as the signal-to-noise ratio increases, the output error rate falls only to a certain level and then roughly stays there.  The error floor phenomenon seems to contradict reason, which says that if the signal-to-noise ratio is very high, there should be no errors.  It seems to me that error floors exist because LDPC decoders sometimes miscorrect even some very simple error patterns.

Probabilistic codes and algebraic codes have a number of things in common.  Both are linear block codes, and both use a generator matrix, G, for encoding and a parity-check matrix, H, for computing parity checks (the syndrome).
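A minimal sketch of this shared machinery, again using the toy (7,4) Hamming code; the same relationships hold for LDPC codes, just with much larger and sparser matrices:

```python
# Systematic G = [I | P] and the matching H = [P^T | I] for the
# (7,4) Hamming code, so that G * H^T = 0 over GF(2).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(u, G):
    """Codeword c = u*G over GF(2)."""
    return [sum(u[i] * G[i][j] for i in range(len(u))) % 2
            for j in range(len(G[0]))]

def syndrome(r, H):
    """s = H*r^T over GF(2); all zeros if and only if r is a codeword."""
    return [sum(h * x for h, x in zip(row, r)) % 2 for row in H]

c = encode([1, 0, 1, 1], G)
print(syndrome(c, H))   # [0, 0, 0] -- a valid codeword
c[2] ^= 1               # flip one received bit
print(syndrome(c, H))   # nonzero syndrome flags the error
```

An algebraic decoder maps the nonzero syndrome directly to an error location; a probabilistic decoder instead uses each row of H as a parity check in an iterative probability calculation.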

Understanding probabilistic codes requires knowledge of probability theory.


The subpage listed below walks through the examples presented in Sarah Johnson's paper at a very high level, without going into details about the mathematics or how computations are done in each decoding function.  I believe it is helpful to understand the ideas at a high level first; details can be filled in later.