In coding theory, generalized minimum-distance (GMD) decoding provides an efficient algorithm for decoding concatenated codes, which is based on using an errors-and-erasures decoder for the outer code.
A naive decoding algorithm for concatenated codes is not optimal, because it discards information that maximum likelihood decoding (MLD) of the inner code provides: every decoded inner block is treated the same, regardless of how close the received sub-word is to the nearest inner codeword. Intuitively, the outer decoder should place higher confidence in symbols whose inner encodings are close to the received word. In 1966, David Forney devised a better algorithm, called generalized minimum distance (GMD) decoding, that makes use of this information. It measures the confidence of each decoded symbol and erases those symbols whose confidence falls below a chosen threshold. GMD decoding was one of the first examples of soft-decision decoding. We will present three versions of the GMD decoding algorithm: the first two are randomized algorithms, while the last one is deterministic.
Setup
- Hamming distance: given two vectors $u, v \in \Sigma^n$, the Hamming distance $\Delta(u, v)$ is the number of positions in which $u$ and $v$ differ.
- Minimum distance: for a code $C \subseteq \Sigma^n$, the minimum distance is $d = \min_{c_1 \neq c_2 \in C} \Delta(c_1, c_2)$.
- Code concatenation: given an outer code $C_{out} \colon [Q]^K \to [Q]^N$ with minimum distance $D$ and an inner code $C_{in} \colon [q]^k \to [q]^n$ with minimum distance $d$, where $Q = q^k$, the concatenated code is $C_{out} \circ C_{in}(m) = (C_{in}(C_{out}(m)_1), \ldots, C_{in}(C_{out}(m)_N))$.
- Maximum likelihood decoding (MLD): $MLD_C(y) = \arg\min_{c \in C} \Delta(c, y)$, the codeword of $C$ closest to $y$ in Hamming distance.
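To make the definitions concrete, here is a minimal Python sketch of these objects, assuming a toy code is given as an explicit list of codewords; the helper names (`hamming_distance`, `minimum_distance`, `mld`, `concatenate`) are illustrative and not part of any standard library.

```python
from itertools import product


def hamming_distance(u, v):
    """Delta(u, v): number of positions in which u and v differ."""
    assert len(u) == len(v)
    return sum(a != b for a, b in zip(u, v))


def minimum_distance(code):
    """Minimum Hamming distance between distinct codewords of `code`."""
    return min(hamming_distance(c1, c2)
               for c1, c2 in product(code, repeat=2) if c1 != c2)


def mld(code, y):
    """Maximum likelihood decoding: the codeword of `code` closest to y."""
    return min(code, key=lambda c: hamming_distance(c, y))


def concatenate(outer_encode, inner_encode, message):
    """C_out o C_in: outer-encode the message, then inner-encode each symbol."""
    return [inner_encode(symbol) for symbol in outer_encode(message)]
```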
Randomized algorithm
Consider the received word $\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N$, which was corrupted by a noisy channel. The following is the algorithm for the general case.
Randomized_Decoder
Given: $\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N$.
1. For every $1 \le i \le N$, compute $y_i' = MLD_{C_{in}}(y_i)$.
2. Set $\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), \tfrac{d}{2}\right)$.
3. For every $1 \le i \le N$: with probability $\tfrac{2\omega_i}{d}$ set $y_i'' \leftarrow {?}$, otherwise set $y_i'' = y_i'$.
4. Run the errors-and-erasures algorithm for $C_{out}$ on $\mathbf{y}'' = (y_1'', \ldots, y_N'')$.
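The following is a minimal Python sketch of the randomized decoder above, assuming the inner MLD, the inner encoder, and the outer errors-and-erasures decoder are supplied as callables; the names `mld_inner`, `inner_encode`, and `errors_and_erasures_decode` are placeholders, and `None` plays the role of the erasure symbol "?".

```python
import random


def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))


def randomized_gmd_decode(y_blocks, mld_inner, inner_encode, d,
                          errors_and_erasures_decode):
    """Randomized GMD decoding of a received word split into N inner blocks.

    mld_inner(y_i) returns the outer symbol y_i' = MLD_{C_in}(y_i);
    inner_encode(y_i') returns the inner codeword C_in(y_i');
    d is the minimum distance of C_in;
    errors_and_erasures_decode decodes C_out, with None marking an erasure.
    """
    y_pp = []
    for y_i in y_blocks:
        y_i_prime = mld_inner(y_i)                                            # step 1
        omega_i = min(hamming_distance(inner_encode(y_i_prime), y_i), d / 2)  # step 2
        if random.random() < 2 * omega_i / d:                                 # step 3: erase
            y_pp.append(None)
        else:
            y_pp.append(y_i_prime)
    return errors_and_erasures_decode(y_pp)                                   # step 4
```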
Theorem 1. Let $\mathbf{y}$ be a received word such that there exists a codeword $\mathbf{c} = (c_1, \ldots, c_N) \in C_{out} \circ C_{in} \subseteq [q^n]^N$ with $\Delta(\mathbf{c}, \mathbf{y}) < \tfrac{Dd}{2}$. Then the deterministic GMD algorithm outputs $\mathbf{c}$.
Note that a naive decoding algorithm for concatenated codes can only correct up to $\tfrac{Dd}{4}$ errors.

Lemma 1. Let the assumption in Theorem 1 hold, and suppose $\mathbf{y}''$ has $e'$ errors and $s'$ erasures (when compared with $\mathbf{c}$). Then $\mathbb{E}[2e' + s'] < D$.

Remark. If $2e' + s' < D$, then the errors-and-erasures algorithm for $C_{out}$ will output $\mathbf{c}$. The lemma above says that this holds in expectation. Note that this is not enough to prove Theorem 1, but it is crucial for the deterministic version of the algorithm developed below.
Proof of Lemma 1. For every $1 \le i \le N$, define $e_i = \Delta(y_i, c_i)$. This implies that
$$\sum_{i=1}^N e_i < \frac{Dd}{2}. \qquad (1)$$
Next, for every $1 \le i \le N$, we define two indicator variables: $X_i^? = 1$ if and only if $y_i'' = {?}$, and $X_i^e = 1$ if and only if $C_{in}(y_i'') \neq c_i$ and $y_i'' \neq {?}$.
We claim that we are done if we can show that for every $1 \le i \le N$:
$$\mathbb{E}\left[2X_i^e + X_i^?\right] \le \frac{2e_i}{d}. \qquad (2)$$
Clearly, by definition, $e' = \sum_i X_i^e$ and $s' = \sum_i X_i^?$.
Further, by the linearity of expectation, we get $\mathbb{E}[2e' + s'] = \sum_i \mathbb{E}\left[2X_i^e + X_i^?\right] \le \frac{2}{d} \sum_i e_i < D$, where the last inequality follows from (1).
To prove (2) we consider two cases: the $i$-th block is correctly decoded (Case 1), or it is incorrectly decoded (Case 2).
Case 1: $c_i = C_{in}(y_i')$.
Note that if $y_i'' = {?}$ then $X_i^e = 0$. Since $\Pr[y_i'' = {?}] = \tfrac{2\omega_i}{d}$, we have $\mathbb{E}[X_i^?] = \Pr[X_i^? = 1] = \tfrac{2\omega_i}{d}$ and $\mathbb{E}[X_i^e] = \Pr[X_i^e = 1] = 0$.
Further, by definition we have $\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), \tfrac{d}{2}\right) \le \Delta(C_{in}(y_i'), y_i) = \Delta(c_i, y_i) = e_i$, so $\mathbb{E}\left[2X_i^e + X_i^?\right] = \tfrac{2\omega_i}{d} \le \tfrac{2e_i}{d}$, which proves (2) in this case.
Case 2: $c_i \neq C_{in}(y_i')$.
In this case, $\mathbb{E}[X_i^?] = \tfrac{2\omega_i}{d}$ and $\mathbb{E}[X_i^e] = \Pr[X_i^e = 1] = 1 - \tfrac{2\omega_i}{d}$.
Since $c_i \neq C_{in}(y_i')$, we have $e_i + \omega_i \ge d$. This follows from a case analysis: if $\omega_i = \Delta(C_{in}(y_i'), y_i) < \tfrac{d}{2}$, the triangle inequality gives $e_i = \Delta(c_i, y_i) \ge \Delta(c_i, C_{in}(y_i')) - \Delta(C_{in}(y_i'), y_i) \ge d - \omega_i$; otherwise $\omega_i = \tfrac{d}{2}$, and since $y_i'$ is the maximum likelihood choice, $e_i = \Delta(c_i, y_i) \ge \Delta(C_{in}(y_i'), y_i) \ge \tfrac{d}{2} = d - \omega_i$.
Finally, this implies $\mathbb{E}\left[2X_i^e + X_i^?\right] = 2\left(1 - \tfrac{2\omega_i}{d}\right) + \tfrac{2\omega_i}{d} = 2 - \tfrac{2\omega_i}{d} \le \tfrac{2e_i}{d}$, where the last inequality uses $e_i \ge d - \omega_i$. This proves (2) and hence Lemma 1.
In the following sections, we will show that the deterministic version of the algorithm above can do unique decoding of $C_{out} \circ C_{in}$ up to half its design distance.
Modified randomized algorithm
Note that, in step 3 of the previous version of the GMD algorithm, we do not really need to use fresh randomness for each $i$. Now we come up with another randomized version of the GMD algorithm that uses the same randomness for every $i$.
Modified_Randomized_Decoder
Given: $\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N$, pick $\theta \in [0, 1]$ uniformly at random. Then for every $1 \le i \le N$:
1. Set $y_i' = MLD_{C_{in}}(y_i)$.
2. Compute $\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), \tfrac{d}{2}\right)$.
3. If $\theta < \tfrac{2\omega_i}{d}$, set $y_i'' \leftarrow {?}$, otherwise set $y_i'' = y_i'$.
4. Run the errors-and-erasures algorithm for $C_{out}$ on $\mathbf{y}'' = (y_1'', \ldots, y_N'')$.
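Below is a corresponding sketch of the modified decoder, under the same assumptions as before (placeholder callables `mld_inner`, `inner_encode`, `errors_and_erasures_decode`, and `None` as the erasure symbol); the only change is that a single threshold `theta` is drawn once and reused for every block.

```python
import random


def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))


def modified_randomized_gmd_decode(y_blocks, mld_inner, inner_encode, d,
                                   errors_and_erasures_decode):
    """Modified randomized GMD decoding: one threshold theta shared by all blocks."""
    theta = random.random()          # theta picked uniformly at random from [0, 1]
    y_pp = []
    for y_i in y_blocks:
        y_i_prime = mld_inner(y_i)
        omega_i = min(hamming_distance(inner_encode(y_i_prime), y_i), d / 2)
        # Erase exactly when theta < 2*omega_i/d, so Pr[erasure] = 2*omega_i/d.
        y_pp.append(None if theta < 2 * omega_i / d else y_i_prime)
    return errors_and_erasures_decode(y_pp)
```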
For the proof of Lemma 1, we only use the randomness to show that $\Pr[y_i'' = {?}] = \tfrac{2\omega_i}{d}$.
In this version of the GMD algorithm, we note that $\Pr[y_i'' = {?}] = \Pr\left[\theta \in \left[0, \tfrac{2\omega_i}{d}\right]\right] = \tfrac{2\omega_i}{d}$.
The second equality above follows from the choice of $\theta$. The proof of Lemma 1 therefore also shows that $\mathbb{E}[2e' + s'] < D$ for this version of the GMD algorithm. In the next section, we will see how to get a deterministic version of the GMD algorithm by restricting $\theta$ to a polynomially sized set instead of the infinite set $[0, 1]$.
Deterministic algorithm
Let $Q = \{0, 1\} \cup \left\{\tfrac{2\omega_1}{d}, \ldots, \tfrac{2\omega_N}{d}\right\}$. Since $\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), \tfrac{d}{2}\right)$ for each $i$, we have
$$Q = \{0, 1\} \cup \{q_1, \ldots, q_m\},$$
where $q_1 < q_2 < \cdots < q_m$ for some $m \le \left\lfloor \tfrac{d}{2} \right\rfloor$. Note that for every $\theta \in [q_i, q_{i+1})$, step 3 of the modified randomized algorithm outputs the same $\mathbf{y}''$. Thus, it suffices to consider all values of $\theta \in Q$. This gives the deterministic algorithm below.
Deterministic_Decoder
Given: $\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N$. For every $\theta \in Q$, repeat steps 1 to 4.
1. Compute $y_i' = MLD_{C_{in}}(y_i)$ for $1 \le i \le N$.
2. Set $\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), \tfrac{d}{2}\right)$ for every $1 \le i \le N$.
3. If $\theta < \tfrac{2\omega_i}{d}$, set $y_i'' \leftarrow {?}$, otherwise set $y_i'' = y_i'$.
4. Run the errors-and-erasures algorithm for $C_{out}$ on $\mathbf{y}'' = (y_1'', \ldots, y_N'')$. Let $c_\theta$ be the codeword in $C_{out} \circ C_{in}$ corresponding to the output of the algorithm, if any.
5. Among all the $c_\theta$ obtained in step 4, output the one closest to $\mathbf{y}$.
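A sketch of the deterministic decoder, under the same assumptions as the earlier sketches, is shown below; the additional placeholder `concat_encode` re-encodes the outer decoder's output into a codeword of $C_{out} \circ C_{in}$ so that candidates can be compared against the received word. Since $y_i'$ and $\omega_i$ do not depend on $\theta$, they are computed once outside the loop over $Q$.

```python
def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))


def deterministic_gmd_decode(y_blocks, mld_inner, inner_encode, d,
                             errors_and_erasures_decode, concat_encode):
    """Deterministic GMD decoding: try every threshold theta in Q and keep
    the candidate codeword closest to the received word."""
    # y_i' and omega_i do not depend on theta, so compute them once.
    decoded = [mld_inner(y_i) for y_i in y_blocks]
    omegas = [min(hamming_distance(inner_encode(y_p), y_i), d / 2)
              for y_p, y_i in zip(decoded, y_blocks)]

    thresholds = {0.0, 1.0} | {2 * w / d for w in omegas}    # the set Q
    received = [s for block in y_blocks for s in block]      # flatten y

    best_msg, best_dist = None, None
    for theta in thresholds:
        y_pp = [None if theta < 2 * w / d else y_p
                for y_p, w in zip(decoded, omegas)]
        msg = errors_and_erasures_decode(y_pp)               # may fail -> None
        if msg is None:
            continue
        candidate = [s for block in concat_encode(msg) for s in block]
        dist = hamming_distance(candidate, received)         # distance to y
        if best_dist is None or dist < best_dist:
            best_msg, best_dist = msg, dist
    return best_msg
```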
Since each iteration of steps 1 to 4 runs in polynomial time and $|Q| = O(N)$, the algorithm above also runs in polynomial time. Specifically, each call to the errors-and-erasures decoder, correcting fewer than $\tfrac{dD}{2}$ errors, takes $O(d)$ time. Finally, the running time of the algorithm above is $O(NQn^{O(1)} + NT_{out})$, where $T_{out}$ is the running time of the outer errors-and-erasures decoder.
