
Modulation and Coding Course - Lecture 5


DOCUMENT INFORMATION

Basic information

Format
Pages: 21
Size: 128.4 KB

Content

Digital Communications I: Modulation and Coding Course
Period - 2007
Catharina Logothetis
Lecture 5

Last time we talked about:

- Receiver structure
- Impact of AWGN and ISI on the transmitted signal
- Optimum filter to maximize SNR
- Matched filter and correlator receiver
- Signal space used for detection
  - Orthogonal N-dimensional space
  - Signal-to-waveform transformation and vice versa

Today we are going to talk about:

- Signal detection in AWGN channels
  - Minimum distance detector
  - Maximum likelihood
- Average probability of symbol error
  - Union bound on error probability
  - Upper bound on error probability based on the minimum distance

Detection of signal in AWGN

Detection problem: given the observation vector $\mathbf{z}$, perform a mapping from $\mathbf{z}$ to an estimate $\hat{m}$ of the transmitted symbol $m_i$ such that the average probability of error in the decision is minimized.

(Block diagram: $m_i$ enters the modulator, which outputs $\mathbf{s}_i$; the AWGN channel adds the noise vector $\mathbf{n}$; the decision rule maps the received vector $\mathbf{z}$ to $\hat{m}$.)

Statistics of the observation vector

AWGN channel model: $\mathbf{z} = \mathbf{s}_i + \mathbf{n}$.

The signal vector $\mathbf{s}_i = (a_{i1}, a_{i2}, \ldots, a_{iN})$ is deterministic.

The elements of the noise vector $\mathbf{n} = (n_1, n_2, \ldots, n_N)$ are i.i.d. Gaussian random variables with zero mean and variance $N_0/2$. The noise vector pdf is

$$p_{\mathbf{n}}(\mathbf{n}) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{n}\|^2}{N_0}\right)$$

The elements of the observed vector $\mathbf{z} = (z_1, z_2, \ldots, z_N)$ are independent Gaussian random variables; its pdf is

$$p_{\mathbf{z}}(\mathbf{z} \mid \mathbf{s}_i) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{z} - \mathbf{s}_i\|^2}{N_0}\right)$$

Detection

Optimum decision rule (maximum a posteriori probability):

Set $\hat{m} = m_i$ if $\Pr(m_i \text{ sent} \mid \mathbf{z}) \ge \Pr(m_k \text{ sent} \mid \mathbf{z})$ for all $k \ne i$, where $k = 1, \ldots, M$.

Applying Bayes' rule gives: set $\hat{m} = m_i$ if

$$\frac{p_k\, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}$$

is maximum for $k = i$.

Equivalently, partition the signal space into $M$ decision regions $Z_1, \ldots, Z_M$, such that vector $\mathbf{z}$ lies inside region $Z_i$ if

$$\ln\left[\frac{p_k\, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}\right]$$

is maximum for $k = i$. That means $\hat{m} = m_i$.

Detection (ML rule)

For equally probable symbols, the optimum decision rule (maximum a posteriori probability) simplifies to: set $\hat{m} = m_i$ if $p_{\mathbf{z}}(\mathbf{z} \mid m_k)$ is maximum for $k = i$, or equivalently if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$. This is known as the maximum likelihood (ML) rule.

Restated in terms of decision regions $Z_1, \ldots, Z_M$: vector $\mathbf{z}$ lies inside region $Z_i$ if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$, which means $\hat{m} = m_i$.

Detection rule (ML), simplified

The ML rule can be simplified to: vector $\mathbf{z}$ lies inside region $Z_i$ if $\|\mathbf{z} - \mathbf{s}_k\|$ is minimum for $k = i$, or equivalently if

$$\sum_{j=1}^{N} z_j a_{kj} - \frac{1}{2} E_k$$

is maximum for $k = i$, where $E_k$ is the energy of $s_k(t)$.

Maximum likelihood detector block diagram

(Block diagram: $\mathbf{z}$ feeds a bank of $M$ branches computing $\langle \cdot, \mathbf{s}_1 \rangle - \tfrac{1}{2}E_1, \ldots, \langle \cdot, \mathbf{s}_M \rangle - \tfrac{1}{2}E_M$; a "choose the largest" block outputs $\hat{m}$.)

Schematic example of ML decision regions

(Figure: four signal vectors $\mathbf{s}_1, \ldots, \mathbf{s}_4$ in the $(\psi_1(t), \psi_2(t))$ plane with their decision regions $Z_1, \ldots, Z_4$.)
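The minimum-distance rule and its correlator form can be checked numerically. Below is a minimal Python sketch, not part of the original lecture: the constellation, the noise level N0, and the transmitted index are illustrative assumptions, and both decision metrics are computed to confirm they select the same region $Z_i$.

```python
import numpy as np

def ml_detect(z, signals):
    """Minimum-distance (ML) decision: index of the signal vector closest to z."""
    return int(np.argmin([np.linalg.norm(z - s) for s in signals]))

def correlator_detect(z, signals):
    """Equivalent correlator form: maximize <z, s_k> - E_k/2, with E_k = ||s_k||^2."""
    return int(np.argmax([np.dot(z, s) - 0.5 * np.dot(s, s) for s in signals]))

# Illustrative four-signal constellation, each vector with symbol energy Es
# (Es, N0 and the transmitted index are assumed values, not from the lecture).
Es, N0 = 1.0, 0.25
signals = [np.array([+np.sqrt(Es), 0.0]), np.array([0.0, +np.sqrt(Es)]),
           np.array([-np.sqrt(Es), 0.0]), np.array([0.0, -np.sqrt(Es)])]

rng = np.random.default_rng(0)
n = rng.normal(0.0, np.sqrt(N0 / 2), size=2)  # i.i.d. noise, variance N0/2 per dimension
z = signals[0] + n                            # AWGN channel model: z = s_i + n

# Both forms of the rule pick the same region Z_i.
assert ml_detect(z, signals) == correlator_detect(z, signals)
print("decided symbol index:", ml_detect(z, signals))
```

The equivalence follows because $\|\mathbf{z}-\mathbf{s}_k\|^2 = \|\mathbf{z}\|^2 - 2\langle \mathbf{z}, \mathbf{s}_k\rangle + E_k$, and the term $\|\mathbf{z}\|^2$ is common to all $k$.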
Average probability of symbol error

Erroneous decision: for the transmitted symbol $m_i$, or equivalently the signal vector $\mathbf{s}_i$, an error in decision occurs if the observation vector $\mathbf{z}$ does not fall inside region $Z_i$.

Probability of erroneous decision for a transmitted symbol:

$$\Pr(\hat{m} \ne m_i \text{ and } m_i \text{ sent}) = \Pr(m_i \text{ sent})\,\Pr(\mathbf{z} \text{ does not lie inside } Z_i \mid m_i \text{ sent})$$

Probability of correct decision for a transmitted symbol:

$$\Pr(\hat{m} = m_i \text{ and } m_i \text{ sent}) = \Pr(m_i \text{ sent})\,\Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent})$$

$$P_c(m_i) = \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent}) = \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i)\, d\mathbf{z}, \qquad P_e(m_i) = 1 - P_c(m_i)$$

The average probability of symbol error is

$$P_E(M) = \sum_{i=1}^{M} \Pr(\hat{m} \ne m_i \text{ and } m_i \text{ sent})$$

For equally probable symbols:

$$P_E(M) = \frac{1}{M}\sum_{i=1}^{M} P_e(m_i) = 1 - \frac{1}{M}\sum_{i=1}^{M} P_c(m_i) = 1 - \frac{1}{M}\sum_{i=1}^{M} \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i)\, d\mathbf{z}$$

Example for binary PAM

(Figure: the conditional pdfs $p_{\mathbf{z}}(\mathbf{z} \mid m_2)$ and $p_{\mathbf{z}}(\mathbf{z} \mid m_1)$, centered at $\mathbf{s}_2 = -\sqrt{E_b}$ and $\mathbf{s}_1 = +\sqrt{E_b}$ on the $\psi_1(t)$ axis.)

$$P_e(m_1) = P_e(m_2) = Q\left(\frac{\|\mathbf{s}_1 - \mathbf{s}_2\|/2}{\sqrt{N_0/2}}\right), \qquad P_B = P_E(2) = Q\left(\sqrt{\frac{2E_b}{N_0}}\right)$$

Union bound

Union bound: the probability of a finite union of events is upper-bounded by the sum of the probabilities of the individual events.

Let $A_{ki}$ denote the event that the observation vector $\mathbf{z}$ is closer to the symbol vector $\mathbf{s}_k$ than to $\mathbf{s}_i$, when $\mathbf{s}_i$ is transmitted. $\Pr(A_{ki}) = P_2(\mathbf{s}_k, \mathbf{s}_i)$ depends only on $\mathbf{s}_i$ and $\mathbf{s}_k$. Applying the union bound yields

$$P_e(m_i) \le \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i), \qquad P_E(M) \le \frac{1}{M}\sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)$$

Example of union bound

$$P_e(m_1) = \int_{Z_2 \cup Z_3 \cup Z_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1)\, d\mathbf{r} \;\le\; \sum_{k=2}^{4} P_2(\mathbf{s}_k, \mathbf{s}_1), \qquad P_2(\mathbf{s}_k, \mathbf{s}_1) = \int_{A_k} p_{\mathbf{r}}(\mathbf{r} \mid m_1)\, d\mathbf{r}, \quad k = 2, 3, 4$$

(Figure: four panels in the $(\psi_1, \psi_2)$ plane showing the exact region $Z_1$ for $\mathbf{s}_1$ and the three half-planes $A_2$, $A_3$, $A_4$ over which the pairwise error probabilities are integrated.)

Upper bound based on minimum distance

$$P_2(\mathbf{s}_k, \mathbf{s}_i) = \Pr(\mathbf{z} \text{ is closer to } \mathbf{s}_k \text{ than to } \mathbf{s}_i \text{ when } \mathbf{s}_i \text{ is sent}) = \int_{d_{ik}/2}^{\infty} \frac{1}{\sqrt{\pi N_0}} \exp\left(-\frac{u^2}{N_0}\right) du = Q\left(\frac{d_{ik}/2}{\sqrt{N_0/2}}\right)$$

where $d_{ik} = \|\mathbf{s}_i - \mathbf{s}_k\|$. With the minimum distance in the signal space, $d_{\min} = \min_{i \ne k} d_{ik}$, this gives

$$P_E(M) \le \frac{1}{M}\sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i) \le (M-1)\, Q\left(\frac{d_{\min}/2}{\sqrt{N_0/2}}\right)$$

Example of upper bound on average symbol error probability based on union bound

For the four-signal constellation above, $\|\mathbf{s}_i\|^2 = E_i = E_s$ for $i = 1, \ldots, 4$, and the adjacent distances are $d_{1,2} = d_{2,3} = d_{3,4} = d_{1,4} = \sqrt{2E_s}$, so $d_{\min} = \sqrt{2E_s}$.

(Figure: the four signals lie on the $\psi_1(t)$ and $\psi_2(t)$ axes at $\pm\sqrt{E_s}$.)

Eb/N0 figure of merit in digital communications

SNR (or S/N) is the ratio of average signal power to average noise power. SNR should be restated in terms of bit energy in digital communication systems (DCSs), because:

- Signals are transmitted within a symbol duration and hence are energy signals (zero average power).
- A figure of merit at the bit level facilitates comparison of different DCSs transmitting different numbers of bits per symbol.

$$\frac{E_b}{N_0} = \frac{S\, T_b}{N/W} = \frac{S}{N} \cdot \frac{W}{R_b}$$

where $R_b$ is the bit rate and $W$ is the bandwidth.

Example of symbol error probability for PAM signals

(Figure: the binary PAM constellation $\mathbf{s}_2 = -\sqrt{E_b}$, $\mathbf{s}_1 = +\sqrt{E_b}$, and a 4-ary PAM constellation with four equally spaced amplitudes $\mathbf{s}_4, \mathbf{s}_3, \mathbf{s}_2, \mathbf{s}_1$ on the $\psi_1(t)$ axis, together with the basis pulse $\psi_1(t)$ of duration $T$.)
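As a numeric check of the closed-form results above, the sketch below evaluates the exact binary PAM bit error probability $Q(\sqrt{2E_b/N_0})$ and the minimum-distance bound $(M-1)\,Q\big((d_{\min}/2)/\sqrt{N_0/2}\big)$ for the four-signal example with $d_{\min} = \sqrt{2E_s}$; the values of $E_b$, $E_s$ and $N_0$ are assumptions chosen for illustration.

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail function, Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2))

N0 = 1.0   # noise power spectral density (assumed value)
Eb = 4.0   # energy per bit (assumed value)

# Exact bit error probability for binary (antipodal) PAM.
PB = Q(sqrt(2 * Eb / N0))

# Minimum-distance upper bound for the four equal-energy signals,
# where d_min = sqrt(2 * Es).
M, Es = 4, 4.0
d_min = sqrt(2 * Es)
PE_bound = (M - 1) * Q((d_min / 2) / sqrt(N0 / 2))

print(f"binary PAM:        P_B     = {PB:.3e}")
print(f"union bound, M=4:  P_E(4) <= {PE_bound:.3e}")
```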

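A short Monte Carlo run is one way to confirm that the minimum-distance decision on $\mathbf{z} = \mathbf{s}_i + \mathbf{n}$ reproduces $Q(\sqrt{2E_b/N_0})$ for binary PAM; the parameters below are assumed, not taken from the lecture.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
Eb, N0, trials = 1.0, 0.5, 200_000              # assumed parameters

bits = rng.integers(0, 2, trials)
s = np.where(bits == 1, sqrt(Eb), -sqrt(Eb))    # antipodal points +/- sqrt(Eb)
z = s + rng.normal(0.0, sqrt(N0 / 2), trials)   # AWGN with variance N0/2
p_sim = np.mean((z > 0) != (bits == 1))         # minimum-distance decision at threshold 0

p_theory = 0.5 * erfc(sqrt(Eb / N0))            # equals Q(sqrt(2*Eb/N0))
print(f"simulated P_B = {p_sim:.4f}, theoretical = {p_theory:.4f}")
```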
Date posted: 27/01/2014, 08:20
