MLSE in a single path channel

MLSE - Maximum Likelihood Sequence Estimation

The optimal detector is the one which selects, from all possible transmitted bit sequences, the one with the highest probability. In an AWGN channel the sequence probability can be calculated as a product of the individual symbol probabilities:

    p(y_1, y_2, ..., y_N | x_1, x_2, ..., x_N) = p(y_1 | x_1) · p(y_2 | x_2) · ... · p(y_N | x_N)    (AWGN, single path)

MLSE in a multipath channel

A multipath channel mixes the symbols, and therefore the probabilities of the symbols cannot be calculated independently:

    p(y_1, y_2, ..., y_N | x_1, x_2, ..., x_N) = Π_{k=1..N} p(y_k | x_k, x_{k-1}, ..., x_{k-L+1})    (AWGN, L channel paths)

For applying MLSE we have to know how to
- create a state-based model for the channel
- calculate the conditional probability for the observed symbols

State model for a multipath channel

The observed value y is modelled as the convolution of the channel impulse response h_ch with the input symbols x, observed in noise n:

    y = h_ch * x + n

In an L-tap channel only the L most recent symbols x_k, x_{k-1}, ..., x_{k-L+1} contribute to the value of y_k. We therefore define a state as the combination of the L-1 previous symbols:

    s_k = [x_{k-1}, x_{k-2}, ..., x_{k-L+1}]

For noise with mean 0, the mean value of the sample y_k is given by f(x_k, s_k, h_ch), calculated as

    f(x_k, s_k, h_ch) = h_0 x_k + h_1 x_{k-1} + ... + h_{L-1} x_{k-L+1}

State model for a multipath channel...

The symbols x_{k-1}, ..., x_{k-L+1} defining the state s_k take values from the possible symbol set; for example, for BPSK modulation x ∈ {+1, -1}. Since each symbol can take two possible values, we have in total 2^(L-1) possible states, so at each stage the trellis has 2^(L-1) states.

The new symbol x_k together with the state s_k and h_ch defines the new state s_{k+1}: the new state is the combination of the new bit and the L-2 youngest bits of the state s_k,

    s_{k+1} = [x_k, x_{k-1}, ..., x_{k-(L-2)}] = [x_k, s_k excluding x_{k-L+1}]
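The state model above can be sketched in a few lines of Python (an illustrative sketch, not part of the original slides): it enumerates the 2^(L-1) states and tabulates, for every (state, new symbol) pair, the noiseless output f(x_k, s_k, h_ch) and the next state.

```python
from itertools import product

def state_outputs(h, symbols=(+1, -1)):
    """Enumerate every (state, new symbol) pair of an L-tap channel.

    The state s_k = (x_{k-1}, ..., x_{k-L+1}) holds the L-1 previous symbols.
    Returns {(state, x): (f, next_state)} with
    f = h[0]*x_k + h[1]*x_{k-1} + ... + h[L-1]*x_{k-L+1}.
    """
    L = len(h)
    table = {}
    for state in product(symbols, repeat=L - 1):
        for x in symbols:
            # noiseless output: new symbol through tap h[0], state through the rest
            f = h[0] * x + sum(h[i] * state[i - 1] for i in range(1, L))
            # new state: shift the new symbol in, drop the oldest one
            next_state = (x,) + state[:-1]
            table[(state, x)] = (f, next_state)
    return table

# Two-tap BPSK example with taps [1, 0.3]: 2^(L-1) = 2 states,
# and the noiseless middle-trellis outputs are ±1.3 and ±0.7
for (state, x), (f, nxt) in sorted(state_outputs([1.0, 0.3]).items()):
    print(state, x, "->", round(f, 2), nxt)
```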
Transition probabilities

In a channel with only one path, the observed value y_k is a function of only one symbol x_k:

    y_k = h_0 x_k + n_k

In an AWGN channel the probability of the observed sample is

    p(y_k | x_k, h_0) = 1/(√(2π) σ_n) · exp(-(y_k - h_0 x_k)² / (2σ_n²))

In a multipath channel the mean of the observed value y_k is computed from the state s_k and the possible new bit x_k. The state transition probability is

    p(y_k | x_k, s_k, h_ch) = 1/(√(2π) σ_n) · exp(-(y_k - (h_0 x_k + h_1 x_{k-1} + ... + h_{L-1} x_{k-L+1}))² / (2σ_n²))

Probability of a symbol sequence

The probability of a possible transmitted symbol sequence, calculated from the observed symbols, is

    p(y_1, ..., y_N | x_1, ..., x_N, h_ch) = Π_{k=1..N} p(y_k | x_k, s_k, h_ch)

We can calculate it in the logarithmic domain:

    log p(y_1, ..., y_N | x_1, ..., x_N, h_ch) = Σ_{k=1..N} log p(y_k | x_k, s_k, h_ch)
                                              = Σ_{k=1..N} [ log(1/(√(2π) σ_n)) - (1/(2σ_n²)) (y_k - f(x_k, s_k, h_ch))² ]

Calculation of the transition probabilities...

- The terms log(1/(√(2π) σ_n)) are common to all the sequences, so we can drop them.
- The scaling constant 1/(2σ_n²) is the same for all the sequences: it scales all of them equally, and since we want to find the maximum over the sequences, we can drop it.
- Since we also dropped the multiplication by -1, the maximization becomes a minimization:

    max log p(y_1, ..., y_N | x_1, ..., x_N)  ⇔  min Σ_{k=1..N} (y_k - f(x_k, s_k, h_ch))²

MLSE

Maximum likelihood sequence estimation (MLSE) thus becomes the problem of finding the sequence with the minimum weight:

    max over all sequences of x:  p(y_1, ..., y_N | x_1, ..., x_N)
    ⇔  min over all sequences of x:  Σ_{k=1..N} (y_k - f(x_k, s_k, h_ch))²

where f(x_k, s_k, h_ch) = h_0 x_k + h_1 x_{k-1} + ... + h_{L-1} x_{k-(L-1)}.
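The minimum-weight formulation can be checked directly with a brute-force search over all symbol sequences (a hypothetical reference implementation; in practice the Viterbi algorithm finds the same minimum without enumerating all 2^N sequences). BPSK symbols x ∈ {+1, -1} are assumed.

```python
from itertools import product

def f_out(h, x_seq, k):
    """Noiseless channel output at time k: sum_i h[i]*x_{k-i},
    treating symbols outside the sequence as absent."""
    return sum(h[i] * x_seq[k - i]
               for i in range(len(h)) if 0 <= k - i < len(x_seq))

def mlse_brute_force(y, h, symbols=(+1, -1)):
    """Minimize sum_k (y_k - f(x_k, s_k, h))^2 over all symbol sequences.
    y has N + L - 1 samples: the convolution tail is included."""
    N = len(y) - len(h) + 1
    best_cost, best_x = float("inf"), None
    for x_seq in product(symbols, repeat=N):
        cost = sum((y[k] - f_out(h, x_seq, k)) ** 2 for k in range(len(y)))
        if cost < best_cost:
            best_cost, best_x = cost, x_seq
    return best_x, best_cost
```

For the slides' later two-tap example, `mlse_brute_force([1.3, 0.1, -1, 1.4, 0.9], [1.0, 0.3])` returns the symbol sequence (+1, -1, -1, +1) with weight 1.67.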
Viterbi algorithm

The decoder has to find the bit sequence that generates the state sequence nearest to the received sequence y.
- Each transition in the trellis depends only on the starting state and the end state.
- We know what the output from a state would be without noise.
- The noiseless state output gives the mean value for the observation y_k.
- The noise is Gaussian, so we can calculate the probability based on the state transition (x_k, s_k, h_ch) and the received symbol y_k.
- The total probability of a state sequence is the product of all the probabilities along the path of the state sequence in the Markov chain.

Viterbi algorithm...

- For all transitions in the trellis, compute the sum of the metric of the initial state and the metric of the transition:

    L(s_k, s_{k+1}) = L(s_k) + L(y_k | s_k, s_{k+1})

- At each state, select among the incoming paths the one with the minimum metric (the survivor):

    L(s_{k+1}) = min L(s_k, s_{k+1})

- At each state, remember the surviving previous state (in the previous stage) and the bit corresponding to the surviving path.

Example: Viterbi algorithm

Use a Viterbi decoder for estimating the transmitted bit sequence when the channel is h_ch = δ(k) + 0.3 δ(k-1). The received signal is

    y_rec = [1.3, 0.1, -1, 1.4, 0.9]

The input is a sequence of binary bits b_k modulated as BPSK symbols x_k = ±1 (bit 0 maps to +1, bit 1 to -1). Estimate the most likely transmitted bit sequence.

System description

A two-path channel generates two scaled copies of the transmitted symbol sequence; the receiver sees the sum of these copies plus noise:

    h_0 · [x_1 x_2 x_3 x_4]
    h_1 ·    [x_1 x_2 x_3 x_4]
    n   = [n_1 n_2 n_3 n_4 n_5]

    y = h_ch * x + n,    y = [y_1, y_2, y_3, y_4, y_5]
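The system description can be reproduced with a small simulation sketch (the helper names `bpsk` and `channel` are illustrative, and the bit mapping 0 → +1, 1 → -1 is the one assumed above):

```python
def bpsk(bits):
    """Map bits to BPSK symbols: bit 0 -> +1, bit 1 -> -1."""
    return [1 - 2 * b for b in bits]

def channel(x, h, noise=None):
    """y = h_ch * x + n: full convolution, length len(x) + len(h) - 1."""
    y = [sum(h[i] * x[k - i] for i in range(len(h)) if 0 <= k - i < len(x))
         for k in range(len(x) + len(h) - 1)]
    if noise is not None:
        y = [yi + ni for yi, ni in zip(y, noise)]
    return y

# Two-tap example channel with taps [1, 0.3]: bits 0 1 1 0 give
# four symbols but five received samples; the last sample is the
# convolution tail carried by h_1 alone.
print(channel(bpsk([0, 1, 1, 0]), [1.0, 0.3]))
```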
Construction of the Markov model

Problem visualisation: the received sample y_k is a function of L transmitted symbols. We can construct a state model where the state is defined by the L-1 previous symbols and a transition is defined by the new incoming symbol. L is the length of the channel response.

[Figure: tapped delay line with delay z^{-1} and taps h_0 = 1, h_1 = 0.3; output y_k = h_0 x_k + h_1 x_{k-1} + noise; received sequence [1.3, 0.1, -1, 1.4, 0.9]]

System state model for the trellis middle section

[Figure: two-state model with states 0 (x_{k-1} = +1) and 1 (x_{k-1} = -1); the possible noiseless outputs are 1.3, 0.7, -0.7 and -1.3]

Trellis diagram construction

We assign a cost µ_{stage, initial state, final state} to each transition.

[Figure: trellis with stages 1-5 and observations y_1 ... y_5; a two-branch beginning (µ_{1,0,0}, µ_{1,0,1}), full middle sections with transitions µ_{k,0,0}, µ_{k,0,1}, µ_{k,1,0}, µ_{k,1,1} for k = 2, 3, 4, and a merging end (µ_{5,0,0}, µ_{5,1,1})]

Transitions in different trellis sections

- At the beginning of the trellis not all the channel taps are accounted for: the last taps are not assigned any symbol, since the signal along those paths has not yet arrived. With h_ch = [1, 0.3] the noiseless outputs of the first stage are ±1.
- In the middle of the trellis (normal operational mode) all taps contribute: the noiseless outputs are ±1.3 and ±0.7.
- At the end of the trellis the signal at the first taps has already ended, and we have to account only for the signal from the last channel taps: the noiseless outputs are ±0.3.
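The three trellis sections can be tabulated with a short sketch (assuming a two-tap channel and BPSK, as in the example; `section_outputs` is an illustrative helper):

```python
def section_outputs(h0, h1):
    """Noiseless branch outputs of a two-tap channel for BPSK symbols,
    split by trellis section as on the slides."""
    # beginning: the second tap has not been fed yet
    begin = {x: h0 * x for x in (+1, -1)}
    # middle: both taps contribute; key = (previous symbol, new symbol)
    middle = {(xp, x): h0 * x + h1 * xp for xp in (+1, -1) for x in (+1, -1)}
    # end: only the tail of the last symbol remains in the channel
    end = {xp: h1 * xp for xp in (+1, -1)}
    return begin, middle, end

begin, middle, end = section_outputs(1.0, 0.3)
# beginning outputs ±1, middle outputs ±1.3 and ±0.7, end outputs ±0.3
```

The branch cost is then simply µ = (y_k - output)² for the output of the taken branch.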
Example

[Figure: trellis with the weights µ in the transitions for all five stages; observations y_1 = 1.3, y_2 = 0.1, y_3 = -1, y_4 = 1.4, y_5 = 0.9]

Decoding: Stage 1 (y_1 = 1.3)

Only the first tap contributes, so the branch outputs are ±1:

    state 0: L = (1.3 - 1)² = 0.09
    state 1: L = (1.3 + 1)² = 5.29

Decoding: Stage 2 (y_2 = 0.1)

    into state 0: from 0: 0.09 + (0.1 - 1.3)² = 1.53;  from 1: 5.29 + (0.1 - 0.7)² = 5.65  →  survivor 1.53
    into state 1: from 0: 0.09 + (0.1 + 0.7)² = 0.73;  from 1: 5.29 + (0.1 + 1.3)² = 7.25  →  survivor 0.73

Decoding: Stage 3 (y_3 = -1)

    into state 0: from 0: 1.53 + (-1 - 1.3)² = 6.82;  from 1: 0.73 + (-1 - 0.7)² = 3.62  →  survivor 3.62
    into state 1: from 0: 1.53 + (-1 + 0.7)² = 1.62;  from 1: 0.73 + (-1 + 1.3)² = 0.82  →  survivor 0.82
Decoding: Stage 4 (y_4 = 1.4)

    into state 0: from 0: 3.62 + (1.4 - 1.3)² = 3.63;  from 1: 0.82 + (1.4 - 0.7)² = 1.31  →  survivor 1.31
    into state 1: from 0: 3.62 + (1.4 + 0.7)² = 8.03;  from 1: 0.82 + (1.4 + 1.3)² = 8.11  →  survivor 8.03

Decoding: Stage 5 (y_5 = 0.9)

Only the last tap still carries signal, so the branch outputs are ±0.3:

    from state 0: 1.31 + (0.9 - 0.3)² = 1.67
    from state 1: 8.03 + (0.9 + 0.3)² = 9.47

Example: Finding the ML sequence

At the final stage 5 we have two path metrics, L = 1.67 and L = 9.47. Select the one with the minimum value and follow the remembered surviving path back along the trellis. The ML sequence is the bit sequence corresponding to this path; in our example it is 0 1 1 0 0.

Example: Finding the ML sequence...

The detected bit sequence has 5 symbols while the transmitted sequence has only 4 symbols. The one additional symbol at the end of the sequence is due to the convolution: convolving the symbol sequence with the channel prolongs the sequence by L - 1 elements. In our case L = 2, which gives 1 additional element. The additional symbols are the same as the last L - 1 symbols.
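The whole worked example can be reproduced with a compact Viterbi sketch for the two-tap case (a minimal illustration following the beginning/middle/end sections above; the bit mapping 0 → +1, 1 → -1 and the function name `viterbi_2tap` are assumptions of this sketch):

```python
def viterbi_2tap(y, h0=1.0, h1=0.3):
    """Viterbi MLSE for a two-tap channel with BPSK (bit 0 -> +1, bit 1 -> -1).
    y holds N + 1 samples for N bits: one tail sample from the convolution."""
    N = len(y) - 1
    states = (+1, -1)                        # state = previous symbol
    # Stage 1 (beginning): only tap h0 is active
    metric = {x: (y[0] - h0 * x) ** 2 for x in states}
    back = {x: [x] for x in states}          # surviving symbol path per state
    # Middle stages: add branch cost, compare, select the survivor
    for k in range(1, N):
        new_metric, new_back = {}, {}
        for x in states:                     # x = new symbol -> new state
            cands = [(metric[s] + (y[k] - (h0 * x + h1 * s)) ** 2, s)
                     for s in states]
            m, s_best = min(cands)
            new_metric[x], new_back[x] = m, back[s_best] + [x]
        metric, back = new_metric, new_back
    # End stage: only the channel tail h1 * x_N remains
    finals = {s: metric[s] + (y[N] - h1 * s) ** 2 for s in states}
    s_best = min(finals, key=finals.get)
    bits = [(1 - x) // 2 for x in back[s_best]]
    # repeat the last bit for the extra tail symbol, as on the slides
    return bits + [bits[-1]], finals[s_best]

bits, cost = viterbi_2tap([1.3, 0.1, -1, 1.4, 0.9])
print(bits, round(cost, 2))   # detected sequence 0 1 1 0 0, path metric 1.67
```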