
Digital Communication I: Modulation and Coding Course - Lecture 5




DOCUMENT INFORMATION

Number of pages: 21
File size: 300.5 KB

Contents

Digital Communications I: Modulation and Coding Course
Term - 2008
Catharina Logothetis
Lecture 5

Last time we talked about:
 Receiver structure
 Impact of AWGN and ISI on the transmitted signal
 Optimum filter to maximize SNR
 Matched filter and correlator receiver
 Signal space used for detection
 Orthogonal N-dimensional space
 Signal-to-waveform transformation and vice versa

Today we are going to talk about:
 Signal detection in AWGN channels
 Minimum distance detector
 Maximum likelihood
 Average probability of symbol error
 Union bound on error probability
 Upper bound on error probability based on the minimum distance

Detection of signal in AWGN
 Detection problem: given the observation vector $\mathbf{z}$, perform a mapping from $\mathbf{z}$ to an estimate $\hat{m}$ of the transmitted symbol $m_i$, such that the average probability of error in the decision is minimized.

[Figure: block diagram — $m_i$ → Modulator → $\mathbf{s}_i$ → AWGN channel (adds $\mathbf{n}$) → $\mathbf{z}$ → Decision rule → $\hat{m}$]

Statistics of the observation vector
 AWGN channel model: $\mathbf{z} = \mathbf{s}_i + \mathbf{n}$
 The signal vector $\mathbf{s}_i = (a_{i1}, a_{i2}, \ldots, a_{iN})$ is deterministic.
 The elements of the noise vector $\mathbf{n} = (n_1, n_2, \ldots, n_N)$ are i.i.d. Gaussian random variables with zero mean and variance $N_0/2$. The noise vector pdf is
$$p_{\mathbf{n}}(\mathbf{n}) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{n}\|^2}{N_0}\right)$$
 The elements of the observed vector $\mathbf{z} = (z_1, z_2, \ldots, z_N)$ are independent Gaussian random variables; its pdf is
$$p_{\mathbf{z}}(\mathbf{z} \mid \mathbf{s}_i) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{z} - \mathbf{s}_i\|^2}{N_0}\right)$$

Detection
 Optimum decision rule (maximum a posteriori probability):
Set $\hat{m} = m_i$ if $\Pr(m_i \text{ sent} \mid \mathbf{z}) \ge \Pr(m_k \text{ sent} \mid \mathbf{z})$ for all $k \ne i$, where $k = 1, \ldots, M$.
 Applying Bayes' rule gives:
Set $\hat{m} = m_i$ if $\dfrac{p_k\, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}$ is maximum for $k = i$.

Detection …
 Partition the signal space into $M$ decision regions $Z_1, \ldots, Z_M$, such that vector $\mathbf{z}$ lies inside region $Z_i$ if
$\ln\left[\dfrac{p_k\, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}\right]$ is maximum for $k = i$. That means $\hat{m} = m_i$.

Detection (ML rule)
 For equally probable symbols, the optimum decision rule (maximum a posteriori
probability) is simplified to:
Set $\hat{m} = m_i$ if $p_{\mathbf{z}}(\mathbf{z} \mid m_k)$ is maximum for $k = i$,
or equivalently:
Set $\hat{m} = m_i$ if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$,
which is known as maximum likelihood.

Detection (ML) …
 Partition the signal space into $M$ decision regions $Z_1, \ldots, Z_M$.
 Restate the maximum likelihood decision rule as follows:
Vector $\mathbf{z}$ lies inside region $Z_i$ if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$. That means $\hat{m} = m_i$.

Detection rule (ML) …
 It can be simplified to:
Vector $\mathbf{z}$ lies inside region $Z_i$ if $\|\mathbf{z} - \mathbf{s}_k\|$ is minimum for $k = i$,
or equivalently:
Vector $\mathbf{z}$ lies inside region $Z_i$ if $\sum_{j=1}^{N} z_j a_{kj} - \tfrac{1}{2}E_k$ is maximum for $k = i$,
where $E_k$ is the energy of $s_k(t)$.

Maximum likelihood detector block diagram
[Figure: $\mathbf{z}$ feeds a bank of branches computing $\langle \cdot, \mathbf{s}_1 \rangle - \tfrac{1}{2}E_1, \ldots, \langle \cdot, \mathbf{s}_M \rangle - \tfrac{1}{2}E_M$; a "choose the largest" block outputs $\hat{m}$.]

Schematic example of the ML decision regions
[Figure: signal space with basis $\psi_1(t), \psi_2(t)$; four signal vectors $\mathbf{s}_1, \ldots, \mathbf{s}_4$ and their decision regions $Z_1, \ldots, Z_4$.]

Average probability of symbol error
 Erroneous decision: for the transmitted symbol $m_i$, or equivalently the signal vector $\mathbf{s}_i$, an error in the decision occurs if the observation vector $\mathbf{z}$ does not fall inside region $Z_i$.
 Probability of erroneous decision for a transmitted symbol:
$$\Pr(\hat{m} \ne m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ does not lie inside } Z_i \mid m_i \text{ sent})$$
 Probability of correct decision for a transmitted symbol:
$$\Pr(\hat{m} = m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent})$$
$$P_c(m_i) = \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent}) = \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i)\, d\mathbf{z}$$
$$P_e(m_i) = 1 - P_c(m_i)$$

Average probability of symbol error …
 Average probability of symbol error:
$$P_E(M) = \sum_{i=1}^{M} \Pr(\hat{m} \ne m_i)$$
 For equally probable symbols:
$$P_E(M) = \frac{1}{M}\sum_{i=1}^{M} P_e(m_i) = 1 - \frac{1}{M}\sum_{i=1}^{M} P_c(m_i) = 1 - \frac{1}{M}\sum_{i=1}^{M} \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i)\, d\mathbf{z}$$

Example for binary PAM
[Figure: conditional pdfs $p_{\mathbf{z}}(\mathbf{z} \mid m_2)$ and $p_{\mathbf{z}}(\mathbf{z} \mid m_1)$ centered at $\mathbf{s}_2 = -\sqrt{E_b}$ and $\mathbf{s}_1 = +\sqrt{E_b}$ on the $\psi_1(t)$ axis.]
$$P_e(m_1) = P_e(m_2) = Q\!\left(\frac{\|\mathbf{s}_1 - \mathbf{s}_2\|/2}{\sqrt{N_0/2}}\right)$$
$$P_B = P_E(2) = Q\!\left(\sqrt{\frac{2E_b}{N_0}}\right)$$

Union bound
 Union bound: the probability of a finite union of
events is upper bounded by the sum of the probabilities of the individual events.
 Let $A_{ki}$ denote the event that the observation vector $\mathbf{z}$ is closer to the symbol vector $\mathbf{s}_k$ than to $\mathbf{s}_i$, when $\mathbf{s}_i$ is transmitted.
 $\Pr(A_{ki}) = P_2(\mathbf{s}_k, \mathbf{s}_i)$ depends only on $\mathbf{s}_i$ and $\mathbf{s}_k$.
 Applying the union bound yields
$$P_e(m_i) \le \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)$$
$$P_E(M) \le \frac{1}{M}\sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)$$

Example of union bound
[Figure: four signal vectors $\mathbf{s}_1, \ldots, \mathbf{s}_4$ with decision regions $Z_1, \ldots, Z_4$ in the $(\psi_1, \psi_2)$ plane, and the half-planes $A_2$, $A_3$, $A_4$ used in the union bound.]
$$P_e(m_1) = \int_{Z_2 \cup Z_3 \cup Z_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1)\, d\mathbf{r}$$
Union bound:
$$P_e(m_1) \le \sum_{k=2}^{4} P_2(\mathbf{s}_k, \mathbf{s}_1), \qquad P_2(\mathbf{s}_k, \mathbf{s}_1) = \int_{A_k} p_{\mathbf{r}}(\mathbf{r} \mid m_1)\, d\mathbf{r}$$

Upper bound based on minimum distance
$$P_2(\mathbf{s}_k, \mathbf{s}_i) = \Pr(\mathbf{z} \text{ is closer to } \mathbf{s}_k \text{ than } \mathbf{s}_i \text{, when } \mathbf{s}_i \text{ is sent}) = \int_{d_{ik}/2}^{\infty} \frac{1}{\sqrt{\pi N_0}} \exp\!\left(-\frac{u^2}{N_0}\right) du = Q\!\left(\frac{d_{ik}/2}{\sqrt{N_0/2}}\right)$$
where $d_{ik} = \|\mathbf{s}_i - \mathbf{s}_k\|$.
$$P_E(M) \le \frac{1}{M}\sum_{i=1}^{M}\sum_{\substack{k=1 \\ k \ne i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i) \le (M-1)\, Q\!\left(\frac{d_{\min}/2}{\sqrt{N_0/2}}\right)$$
Minimum distance in the signal space: $d_{\min} = \min_{i,k,\; i \ne k} d_{ik}$.

Example of upper bound on average symbol error probability based on the union bound
[Figure: constellation with $\mathbf{s}_1, \ldots, \mathbf{s}_4$ at distance $\sqrt{E_s}$ from the origin on the $\psi_1(t)$ and $\psi_2(t)$ axes, with the adjacent distances $d_{1,2}, d_{2,3}, d_{3,4}, d_{1,4}$ marked.]
$$\|\mathbf{s}_i\| = \sqrt{E_i} = \sqrt{E_s}, \quad i = 1, \ldots, 4$$
$$d_{i,k} = \sqrt{2E_s} \text{ for the marked adjacent pairs}, \qquad d_{\min} = \sqrt{2E_s}$$

$E_b/N_0$ figure of merit in digital communications
 SNR or S/N is the ratio of average signal power to average noise power. SNR should be restated in terms of bit energy in a DCS (digital communication system), because:
 Signals are transmitted within a symbol duration and hence are energy signals (zero power).
 A figure of merit at the bit level facilitates comparison of different DCSs transmitting different numbers of bits per symbol.
$$\frac{E_b}{N_0} = \frac{S\,T_b}{N/W} = \frac{S}{N} \cdot \frac{W}{R_b}$$
where $R_b$ is the bit rate and $W$ is the bandwidth.

Example of symbol error probability for PAM signals
[Figure: signal constellations on the $\psi_1(t)$ axis — binary PAM with $\mathbf{s}_2$ at $-\sqrt{E_b}$ and $\mathbf{s}_1$ at $+\sqrt{E_b}$; 4-ary PAM with four equally spaced points $\mathbf{s}_4, \mathbf{s}_3, \mathbf{s}_2, \mathbf{s}_1$; and the basis function $\psi_1(t)$ over one symbol interval $T$.]
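The minimum-distance / correlation form of the ML rule above can be sketched in a few lines of Python. This is an illustrative example, not from the lecture: the function name `ml_detect`, the unit-energy four-point constellation, and the noise level are all assumptions.

```python
import numpy as np

# ML detection in AWGN: decide m_i by minimizing ||z - s_k||,
# or equivalently by maximizing <z, s_k> - E_k/2 (the correlation metric).

def ml_detect(z, signals):
    """Return the index of the ML decision for observation vector z.

    signals: array of shape (M, N), one signal vector per row.
    """
    signals = np.asarray(signals, dtype=float)
    energies = np.sum(signals**2, axis=1)        # E_k = ||s_k||^2
    metrics = signals @ np.asarray(z, float) - energies / 2.0
    return int(np.argmax(metrics))

# Constellation from the decision-region example: four points at
# distance sqrt(Es) from the origin on the psi1 and psi2 axes.
Es = 1.0
constellation = np.sqrt(Es) * np.array([[1, 0], [0, 1], [-1, 0], [0, -1]])

rng = np.random.default_rng(0)
N0 = 0.2                                         # assumed noise spectral density
sent = 2                                         # transmit s_3 (index 2)
z = constellation[sent] + rng.normal(0.0, np.sqrt(N0 / 2), size=2)

decided = ml_detect(z, constellation)
nearest = int(np.argmin(np.linalg.norm(constellation - z, axis=1)))
assert decided == nearest                        # the two metrics agree
```

The assertion at the end checks the equivalence stated in the slides: subtracting half the signal energy from the correlation gives the same decision as picking the geometrically nearest signal vector.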
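The binary PAM result $P_B = Q\big(\sqrt{2E_b/N_0}\big)$ is easy to evaluate numerically. A small sketch using only the standard library, with $Q(x)$ expressed through the complementary error function; the helper names and the 9.6 dB example value are assumptions, not from the slides:

```python
import math

def qfunc(x):
    """Gaussian Q-function: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pam2_bit_error(eb_over_n0):
    """Bit error probability of binary antipodal PAM, P_B = Q(sqrt(2 Eb/N0))."""
    return qfunc(math.sqrt(2.0 * eb_over_n0))

# Example: Eb/N0 = 9.6 dB gives a bit error probability on the order of 1e-5.
eb_n0 = 10 ** (9.6 / 10)        # convert dB to linear
pb = pam2_bit_error(eb_n0)
```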
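The union bound and the looser minimum-distance bound can be compared directly on the four-point constellation from the example. A sketch with assumed numbers ($E_s = 1$, $N_0 = 0.1$); it uses the pairwise error probability $P_2(\mathbf{s}_k, \mathbf{s}_i) = Q\big((d_{ik}/2)/\sqrt{N_0/2}\big)$ from the slides:

```python
import math

def qfunc(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pairwise_error(d_ik, n0):
    # P2(s_k, s_i) = Q((d_ik/2) / sqrt(N0/2))
    return qfunc((d_ik / 2.0) / math.sqrt(n0 / 2.0))

def union_bound(points, n0):
    """P_E(M) <= (1/M) sum_i sum_{k != i} P2(s_k, s_i)."""
    m = len(points)
    total = sum(pairwise_error(math.dist(points[i], points[k]), n0)
                for i in range(m) for k in range(m) if k != i)
    return total / m

def min_distance_bound(points, n0):
    """P_E(M) <= (M - 1) Q((d_min/2) / sqrt(N0/2))."""
    m = len(points)
    dmin = min(math.dist(points[i], points[k])
               for i in range(m) for k in range(m) if i != k)
    return (m - 1) * pairwise_error(dmin, n0)

# Four points at distance sqrt(Es) = 1 from the origin on orthogonal axes.
pts = [(1, 0), (0, 1), (-1, 0), (0, -1)]
N0 = 0.1
ub = union_bound(pts, N0)
mdb = min_distance_bound(pts, N0)
assert ub <= mdb    # replacing every d_ik by d_min can only loosen the bound
```

The final assertion reflects the chain of inequalities on the slide: the minimum-distance bound is always at least as large as the full union bound.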
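The $E_b/N_0$ relation $E_b/N_0 = (S/N)\,(W/R_b)$ is a one-line conversion. A sketch with assumed example numbers (the function name and the values S/N = 20, W = 1 MHz, $R_b$ = 2 Mbit/s are illustrative):

```python
# Eb/N0 figure of merit: Eb/N0 = (S/N) * (W / Rb), all quantities linear.

def ebn0_from_snr(snr_linear, bandwidth_hz, bit_rate_bps):
    return snr_linear * bandwidth_hz / bit_rate_bps

# S/N = 20 (13 dB), W = 1 MHz, Rb = 2 Mbit/s: two bits per hertz of
# bandwidth halves the per-bit energy relative to the SNR.
ebn0 = ebn0_from_snr(20.0, 1e6, 2e6)   # -> 10.0
```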

Posted: 23/03/2014, 10:21
