Digital Communications I: Modulation and Coding Course
Period 3 - 2007
Catharina Logothetis
Lecture 10

Last time, we talked about:
- Channel coding
- Linear block codes
- The error detection and correction capability
- Encoding and decoding
- Hamming codes
- Cyclic codes

Today, we are going to talk about:
- Another class of linear codes, known as convolutional codes
- The structure of the encoder
- Different ways of representing the encoder

Convolutional codes
Convolutional codes offer an approach to error control coding substantially different from that of block codes. A convolutional encoder:
- encodes the entire data stream into a single codeword;
- does not need to segment the data stream into blocks of fixed size (convolutional codes are often forced into a block structure by periodic truncation);
- is a machine with memory.
This fundamental difference in approach imparts a different nature to the design and evaluation of the code:
- Block codes are based on algebraic/combinatorial techniques.
- Convolutional codes are based on construction techniques.

Convolutional codes - cont'd
A convolutional code is specified by three parameters (n, k, K), or equivalently (k/n, K), where:
- R_c = k/n is the coding rate, determining the number of data bits per coded bit. In practice, k = 1 is usually chosen, and we assume that from now on.
- K is the constraint length of the encoder: the encoder has K - 1 memory elements. (The literature uses several different definitions of constraint length.)

Block diagram of the DCS
Information source -> Rate 1/n conv. encoder -> Modulator -> Channel -> Demodulator -> Rate 1/n conv. decoder -> Information sink
- Input sequence: m = (m_1, m_2, ..., m_i, ...)
- Codeword sequence: U = G(m) = (U_1, U_2, U_3, ..., U_i, ...), where the i-th branch word U_i = (u_1i, u_2i, ..., u_ni) carries n coded bits
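The branch-word structure of the codeword sequence can be sketched in code. This is not from the lecture; `branch_words` is an illustrative helper name, and the sketch simply groups a flat sequence of coded bits into branch words of n bits each, as in the block diagram.

```python
def branch_words(coded_bits, n):
    """Group a flat sequence of coded bits into branch words U_i of n bits each."""
    assert len(coded_bits) % n == 0, "codeword length must be a multiple of n"
    return [tuple(coded_bits[i:i + n]) for i in range(0, len(coded_bits), n)]

# Rate 1/2 example: the codeword 11 10 00 10 11 used later in the lecture
U = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
print(branch_words(U, 2))  # [(1, 1), (1, 0), (0, 0), (1, 0), (1, 1)]
```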
At the receiver:
- Received sequence: Z = (Z_1, Z_2, Z_3, ..., Z_i, ...), where Z_i = (z_1i, z_2i, ..., z_ni) collects the demodulator outputs for the i-th branch word (n outputs per branch word)
- Decoded sequence: m̂ = (m̂_1, m̂_2, ..., m̂_i, ...)

A Rate 1/2 Convolutional encoder
- Convolutional encoder (rate 1/2, K = 3)
- 3 shift-register stages, where the first one takes the incoming data bit and the rest form the memory of the encoder.
- Each input data bit m produces the branch word u_1 u_2: u_1 is the first coded bit, u_2 the second.

A Rate 1/2 Convolutional encoder - cont'd
Encoding the message sequence m = (101); each step shows the register contents and the resulting branch word u_1 u_2:
- t_1: 1 0 0 -> 11
- t_2: 0 1 0 -> 10
- t_3: 1 0 1 -> 00
- t_4: 0 1 0 -> 10
- t_5: 0 0 1 -> 11
- t_6: 0 0 0 -> 00
Encoder output: U = (11 10 00 10 11)

Effective code rate
- Initialize the memory before encoding the first bit (all-zero).
- Clear out the memory after encoding the last bit (all-zero).
- Hence, a tail of zero bits is appended to the data bits.
- With L data bits and k = 1, the effective code rate is R_eff = L / (n(L + K - 1)) < R_c.

[...]

Trellis - cont'd
[Trellis diagram over times t_1 ... t_6; branches are labeled input/output: 0/00, 1/11, 0/11, 1/00, 0/10, 1/01, 0/01, 1/10.]

Trellis - cont'd
[Trellis diagram showing the path for input bits 1 0 1 0 0 (tail bits included) and output bits 11 10 00 10 11.]
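The encoding example and the effective code rate can be checked with a short simulation. This is a sketch, not code from the lecture; `conv_encode` is an assumed name, and the generator connections (111) and (101) are the ones used in the lecture's rate 1/2, K = 3 example.

```python
def conv_encode(msg, K=3, gens=((1, 1, 1), (1, 0, 1))):
    """Rate 1/n shift-register encoder (k = 1), with a zero tail to clear memory."""
    regs = [0] * K                      # shift register; regs[0] holds the newest bit
    out = []
    for bit in list(msg) + [0] * (K - 1):   # append K-1 tail zeros
        regs = [bit] + regs[:-1]            # shift the new bit in
        for g in gens:                      # one modulo-2 adder per generator
            out.append(sum(r & c for r, c in zip(regs, g)) % 2)
    return out

U = conv_encode([1, 0, 1])
print(U)  # [1, 1, 1, 0, 0, 0, 1, 0, 1, 1], i.e. U = (11 10 00 10 11)

# Effective code rate for L = 3 data bits: R_eff = L / (n(L + K - 1)) = 3/10 < 1/2
L, n, K = 3, 2, 3
print(L / (n * (L + K - 1)))  # 0.3
```

Note how the tail zeros cost rate: only 3 of the 10 coded bits carry data, so R_eff = 0.3 is strictly below R_c = 1/2.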
State diagram - cont'd
State transition table for the rate 1/2, K = 3 encoder:
- S0 = 00: input 0 -> next state S0, output 00; input 1 -> next state S2, output 11
- S1 = 01: input 0 -> next state S0, output 11; input 1 -> next state S2, output 00
- S2 = 10: input 0 -> next state S1, output 10; input 1 -> next state S3, output 01
- S3 = 11: input 0 -> next state S1, output 01; input 1 -> next state S3, output 10

Trellis - cont'd
The trellis diagram is an extension of the state diagram that shows the passage of time.
[Figure: example of a section of the trellis for the rate 1/2 code between times t_i and t_i+1, with states S0 = 00, S1 = 01, S2 = 10, S3 = 11 and branches labeled input/output.]

Encoder representation
Vector representation: [...] an element is "1" if the corresponding shift-register stage is connected to the modulo-2 adder, and "0" otherwise.
Example: g_1 = (111) and g_2 = (101), one connection vector for each of the two modulo-2 adders producing u_1 and u_2.

Encoder representation - cont'd
Impulse response representation: the response of the encoder to a single "1" bit that goes through it.
Example (input sequence 1 0 0 gives output sequence 11 10 11):
- Register contents 100 -> branch word 11
- Register contents 010 -> branch word 10
- Register contents 001 -> branch word 11
The output for m = (101) is the modulo-2 superposition of shifted impulse responses:
  Input 1:      11 10 11
  Input 0:         00 00 00
  Input 1:            11 10 11
  Modulo-2 sum: 11 10 00 10 11
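The superposition above can be sketched as code: one copy of the impulse response 11 10 11, shifted by one branch word per message position, is added modulo 2 for every "1" in the message. `encode_by_impulse` is an illustrative name, not from the slides.

```python
def encode_by_impulse(msg, impulse, n=2, K=3):
    """Encode by superposing shifted impulse responses modulo 2 (linearity)."""
    total = len(msg) + K - 1            # number of branch words, incl. tail
    out = [0] * (n * total)
    for i, bit in enumerate(msg):
        if bit:                         # each '1' contributes a shifted copy
            for j, v in enumerate(impulse):
                out[n * i + j] ^= v     # modulo-2 addition
    return out

impulse = [1, 1, 1, 0, 1, 1]            # response 11 10 11 to a single '1'
print(encode_by_impulse([1, 0, 1], impulse))
# [1, 1, 1, 0, 0, 0, 1, 0, 1, 1] -- matches the shift-register encoding
```

This works because the encoder is linear: the codeword for any message is the modulo-2 sum of the codewords of its individual bits.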
Encoder representation - cont'd
Polynomial representation: we define n generator polynomials, one for each modulo-2 adder. Each polynomial is of degree K - 1 or less and describes the connection of the shift register to the corresponding modulo-2 adder.
Example:
  g_1(X) = 1 + X + X^2
  g_2(X) = 1 + X^2
The output sequence is found as U(X) = m(X)g_1(X) interlaced with m(X)g_2(X).

Encoder representation - cont'd
In more detail, for m = (101), i.e. m(X) = 1 + X^2:
  m(X)g_1(X) = (1 + X^2)(1 + X + X^2) = 1 + X + X^3 + X^4
  m(X)g_2(X) = (1 + X^2)(1 + X^2) = 1 + X^4
  m(X)g_1(X) = 1 + X + 0·X^2 + X^3 + X^4
  m(X)g_2(X) = 1 + 0·X + 0·X^2 + 0·X^3 + X^4
  U(X) = (1,1) + (1,0)X + (0,0)X^2 + (1,0)X^3 + (1,1)X^4
  U = 11 10 00 10 11

State diagram
A finite-state machine only passes through a finite number of states. [...]

State diagram - cont'd
- A state diagram is a way to represent the encoder.
- A state diagram contains all the states and all possible transitions between them.
- Only two transitions initiate from each state.
- Only two transitions end up in each state.

State diagram - cont'd
[State diagram figure: current states S0 = 00, S1 = 01, S2 = 10, S3 = 11, with branches labeled input/output (branch word): 0/00 and 1/11 leaving S0, 0/11 and 1/00 leaving S1, 0/10 and 1/01 leaving S2, 0/01 and 1/10 leaving S3.]

Trellis - cont'd
[Trellis diagram for input bits 1 0 1 0 0 (tail bits included) and output bits 11 10 00 10 11, over times t_1 ... t_6.]
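The state diagram (and hence the trellis) can be generated from a one-step state-transition function. A minimal sketch for this encoder, assuming the state is the pair of memory bits (s1, s2); `step` is an illustrative name, not from the slides.

```python
def step(state, bit):
    """One encoder step: return (next_state, branch_word) for g1=111, g2=101."""
    s1, s2 = state
    u1 = bit ^ s1 ^ s2      # modulo-2 adder for g1 = (111)
    u2 = bit ^ s2           # modulo-2 adder for g2 = (101)
    return (bit, s1), (u1, u2)   # new bit shifts into the memory

# Enumerate the full transition table: two branches leave every state.
for s in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for b in (0, 1):
        nxt, out = step(s, b)
        print(f"state={s} input={b} -> next={nxt} output={out}")
```

The printed table reproduces the state transition table of the slides (e.g. from S0 = 00, input 1 goes to S2 = 10 with output 11), and unrolling `step` over time indices yields the trellis.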