
Telecommunication Networks - Information Theory (PDF)


DOCUMENT INFORMATION

Format: PDF
Pages: 51
Size: 2.92 MB

Content

Telecommunication Networks - Information Theory
By: Vinh Dang

Outline
• Introduction to telecommunication networks
• Basic concepts
• Information
• Entropy
• Source encoding

Introduction
• What is telecommunications? The word derives from the Greek "tele", meaning "far off", and the Latin "communicare", meaning "to share". Hence telecommunication is communication at a distance.
• The true nature of telecommunications is the passing of information to one or more others in any form that may be used.

Telecommunications
• People tend to think of telecommunications in terms of telephones, computer networks, the Internet, and maybe even cable television.
• It includes the often-considered electrical, electromagnetic, and optical means already mentioned, but it also includes simple wire, radio, and even other visual forms.

Early telecommunications
• Drum and horn
• Smoke/fire signal
• Light
• Pigeons

[Images: smoke signals, a drum, a signal tower using mirrors, pigeons]

Advancing telecommunications
• Telegraph (Morse)
• Telephone (Bell)
• Wireless communication
• Satellite communication

[Images: a BTS (mobile communication), the VINASAT-1 satellite]

Telecommunication networks

[Diagram: telecommunication networks comprise mobile communication systems, wireless communication systems, data communication systems, satellite communication systems, switching systems, optical communication systems, and voice & television systems]

• A telecommunication network is a network of telecommunication links and nodes arranged so that messages may be passed from one part of the network to another over multiple links and through various nodes.

Telecommunication networks
• Telecommunication networks are subdivided into the following areas:
  - Transport: transmission facilities
  - Switching: switch or exchange or central office (CO)
  - Access: equipment for the access of subscribers (access networks, AN)
  - Customer premises equipment (CPE): subscriber terminal equipment
• Switching equipment and transmission facilities together form the core network.
• The customer premises equipment is connected to the core network via the access network.
• Transport and switching are the two basic functions for the transfer of user information.

[Diagram: access networks (AN) with subscriber terminals (CPE) connected through network terminations (NT) to an exchange over transport links]

Transport channels (inter-exchange connections):
• Trunks for transmitting user information
• Signaling links for transmitting the control information

…

Example
• For a binary source (M = 2), p(1) = α and p(0) = 1 − α = β.
• From (2), we have the binary entropy: H(X) = −α log α − (1 − α) log(1 − α)

Source coding theorem
• Information from a …

Shannon-Fano coding

L = ∑ pi li = 2.41
H(U) = −∑ pi log pi = 2.37
efficiency = H(U)/L = 2.37/2.41 ≈ 0.98

• The generated code is a prefix code, due to the equiprobable partitioning.
• The procedure does not lead to a unique prefix code; many prefix codes have the same efficiency.

Huffman coding [1][2][3]
• Procedure:
  1. List the source symbols in order of decreasing probability.
  2. Assign a 0 and a 1 to the two source symbols of lowest probability.
  3. Combine these two source symbols into a new source symbol with probability equal to the sum of the two original probabilities, and place the new probability in the list in accordance with its value.
  4. Repeat until the final probability of the combined symbols is 1.0.
• Example:

Example of Huffman coding

Ui   pi     Codeword
U1   0.34   00
U2   0.23   10
U3   0.19   11
U4   0.10   011
U5   0.07   0100
U6   0.06   01010
U7   0.01   01011

The successive merges combine 0.06 + 0.01 = 0.07, then 0.07 + 0.07 = 0.14, 0.10 + 0.14 = 0.24, 0.19 + 0.23 = 0.42, 0.24 + 0.34 = 0.58, and finally 0.42 + 0.58 = 1.0.
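The merge procedure is easy to mechanize. Below is a minimal Python sketch (not part of the original slides) that rebuilds a Huffman code for the seven-symbol source above, using the probabilities from the table (the 0.10 entry for U4 is inferred so that the probabilities sum to 1), and reports the same figures of merit as the Shannon-Fano example: average length L, entropy H(U), and efficiency H(U)/L. Tie-breaking between equal probabilities is arbitrary, so individual codewords may differ from the table, but the average length is the same for every Huffman code of a given source.

```python
import heapq
from math import log2

# Probabilities read from the table above (p4 = 0.10 inferred).
probs = {'u1': 0.34, 'u2': 0.23, 'u3': 0.19, 'u4': 0.10,
         'u5': 0.07, 'u6': 0.06, 'u7': 0.01}

def huffman(p):
    """Return {symbol: codeword} built by repeatedly merging the two
    least probable entries, as in the procedure above."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(pi, i, {s: ''}) for i, (s, pi) in enumerate(p.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # lowest probability  -> prepend '0'
        p1, _, c1 = heapq.heappop(heap)   # second lowest       -> prepend '1'
        merged = {s: '0' + w for s, w in c0.items()}
        merged.update({s: '1' + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

code = huffman(probs)
L = sum(probs[s] * len(w) for s, w in code.items())   # average codeword length
H = -sum(pi * log2(pi) for pi in probs.values())      # source entropy
print(code)
print(f"L = {L:.2f}, H(U) = {H:.2f}, efficiency = {H / L:.2f}")
# -> L = 2.45, H(U) = 2.38, efficiency = 0.97
```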
Huffman coding: disadvantages
• When the source has many symbols (outputs/messages), the code becomes bulky. Remedy: combine a Huffman code with a fixed-length code.
• Some redundancy remains, and the redundancy is large for a small set of messages. Remedy: group multiple independent messages.

Huffman coding: disadvantages
• Examples 9.8 and 9.9 ([2], pp. 437-438).
• Grouping makes the redundancy small, but the number of codewords grows exponentially: the code becomes more complex and delay is introduced.

Entropy example
• A horse race with 8 horses, with winning probabilities ½, ¼, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64. Entropy H(X) = 2 bits.
• How many bits do we need?
• (a) Index each horse: log 8 = 3 bits.
• (b) Assign shorter codes to horses with higher probability: 0, 10, 110, 1110, 111100, 111101, 111110, 111111. Average description length = 2 bits!

Entropy
• We need at least H(X) bits to represent X.
• H(X) is a lower bound on the required description length.
• Entropy = the uncertainty of a random variable.

Joint and conditional entropy
• Joint entropy: H(X,Y) = −∑x ∑y p(x,y) log p(x,y), a simple extension of entropy to two random variables.
• Conditional entropy: H(Y|X) = ∑x p(x) H(Y|X=x) = −∑x ∑y p(x,y) log p(y|x). "What is the uncertainty of Y if X is known?"
• Easy to verify:
  - If X and Y are independent, then H(Y|X) = H(Y).
  - If Y = X, then H(Y|X) = 0; in general, H(Y|X) is the extra information needed about Y when X is known.
• Fact: H(X,Y) = H(X) + H(Y|X).

Mutual information
• I(X;Y) = H(X) − H(X|Y) = the reduction in uncertainty due to knowing another variable.
• I(X;Y) = ∑x ∑y p(x,y) log [p(x,y) / (p(x)p(y))]
• "How much information about Y is contained in X?"
  - If X and Y are independent, then I(X;Y) = 0.
  - If X and Y are the same, then I(X;Y) = H(X) = H(Y).
• Mutual information is symmetric and non-negative.

Mutual information

[Figure: relationship between entropy, joint entropy, conditional entropy, and mutual information]

Mutual information
• I(X;Y) is a good measure of the similarity between X and Y, widely used in image/signal processing.
• Medical imaging example: MI-based image registration. Why MI? It is insensitive to gain and bias.
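These definitions are easy to check numerically. The sketch below (illustrative only, not from the slides; the joint distribution p_xy is invented) computes H(X,Y), H(Y|X), and I(X;Y) directly from the formulas above, verifies the chain rule H(X,Y) = H(X) + H(Y|X) and the relation I(X;Y) = H(Y) − H(Y|X), and reproduces the 2-bit entropy of the horse-race source.

```python
from math import log2

# An invented joint distribution p(x, y); any non-negative table summing to 1 works.
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.40, (1, 1): 0.10}

# Marginals p(x) and p(y).
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

def H(dist):
    """Entropy in bits of a {outcome: probability} table."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

H_xy = H(p_xy)                                                    # H(X,Y)
H_y_x = -sum(p * log2(p / p_x[x]) for (x, y), p in p_xy.items())  # H(Y|X)
I_xy = sum(p * log2(p / (p_x[x] * p_y[y]))                        # I(X;Y)
           for (x, y), p in p_xy.items())

print(f"H(X,Y) = {H_xy:.3f},  H(X) + H(Y|X) = {H(p_x) + H_y_x:.3f}")  # chain rule
print(f"I(X;Y) = {I_xy:.3f},  H(Y) - H(Y|X) = {H(p_y) - H_y_x:.3f}")

# The horse-race source: entropy is exactly 2 bits, matching the average
# length of the code 0, 10, 110, 1110, 111100, 111101, 111110, 111111.
horses = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]
print(H(dict(enumerate(horses))))   # -> 2.0
```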
Homework 1
• Calculate H(X) for a discrete memoryless source having six symbols with probabilities PA = 1/2, PB = 1/4, PC = 1/8, PD = PE = 1/20, PF = 1/40.
• Then find the amount of information contained in the messages ABABBA and FDDFDF, and compare it with the expected amount of information in a six-symbol message.

Homework 2
• A certain data source has 16 equiprobable symbols, each 1 ms long. The symbols are produced in blocks of 15, separated by 5-ms spaces. Find the source symbol rate.

Homework 3
• Obtain the Shannon-Fano code for the source in Homework 1, and calculate the efficiency.

References
[1] R. E. Ziemer & W. H. Tranter, "Information Theory and Coding", in Principles of Communications: Systems, Modulation, and Noise, 5th edition, John Wiley, pp. 667-720, 2002.
[2] A. B. Carlson, Communication Systems, McGraw-Hill, 1986, ISBN 0-07-100560-9.
[3] S. Haykin, "Fundamental Limits in Information Theory", in Communication Systems, 4th edition, John Wiley & Sons, pp. 567-625, 2001.

Posted: 12/12/2013, 14:15
