
An Introduction to Statistical Signal Processing




DOCUMENT INFORMATION

Basic information

Format
Number of pages: 460
File size: 1.72 MB

Content

An Introduction to Statistical Signal Processing

Pr(f ∈ F) = P({ω : f(ω) ∈ F}) = P(f⁻¹(F))

[Cover figure: the mapping f carries the inverse image f⁻¹(F) onto the set F.]

May 5, 2000

Robert M. Gray and Lee D. Davisson
Information Systems Laboratory, Department of Electrical Engineering, Stanford University
and
Department of Electrical Engineering and Computer Science, University of Maryland

© 1999 by the authors.

To our Families

Contents

Preface  xi
Glossary  xv

1 Introduction  1

2 Probability  11
  2.1 Introduction  11
  2.2 Spinning Pointers and Flipping Coins  15
  2.3 Probability Spaces  23
    2.3.1 Sample Spaces  28
    2.3.2 Event Spaces  31
    2.3.3 Probability Measures  42
  2.4 Discrete Probability Spaces  45
  2.5 Continuous Probability Spaces  56
  2.6 Independence  70
  2.7 Elementary Conditional Probability  71
  2.8 Problems  75

3 Random Objects  85
  3.1 Introduction  85
    3.1.1 Random Variables  85
    3.1.2 Random Vectors  89
    3.1.3 Random Processes  93
  3.2 Random Variables  95
  3.3 Distributions of Random Variables  104
    3.3.1 Distributions  104
    3.3.2 Mixture Distributions  108
    3.3.3 Derived Distributions  111
  3.4 Random Vectors and Random Processes  115
  3.5 Distributions of Random Vectors  117
    3.5.1 Multidimensional Events  118
    3.5.2 Multidimensional Probability Functions  119
    3.5.3 Consistency of Joint and Marginal Distributions  120
  3.6 Independent Random Variables  127
    3.6.1 IID Random Vectors  128
  3.7 Conditional Distributions  129
    3.7.1 Discrete Conditional Distributions  130
    3.7.2 Continuous Conditional Distributions  131
  3.8 Statistical Detection and Classification  134
  3.9 Additive Noise  137
  3.10 Binary Detection in Gaussian Noise  144
  3.11 Statistical Estimation  146
  3.12 Characteristic Functions  147
  3.13 Gaussian Random Vectors  152
  3.14 Examples: Simple Random Processes  154
  3.15 Directly Given Random Processes  157
    3.15.1 The Kolmogorov Extension Theorem  157
    3.15.2 IID Random Processes  158
    3.15.3 Gaussian Random Processes  158
  3.16 Discrete Time Markov Processes  159
    3.16.1 A Binary Markov Process  159
    3.16.2 The Binomial Counting Process  162
    3.16.3 Discrete Random Walk  165
    3.16.4 The Discrete Time Wiener Process  166
    3.16.5 Hidden Markov Models  167
  3.17 Nonelementary Conditional Probability  168
  3.18 Problems  170

4 Expectation and Averages  187
  4.1 Averages  187
  4.2 Expectation  190
    4.2.1 Examples: Expectation  192
  4.3 Functions of Several Random Variables  200
  4.4 Properties of Expectation  200
  4.5 Examples: Functions of Several Random Variables  203
    4.5.1 Correlation  203
    4.5.2 Covariance  205
    4.5.3 Covariance Matrices  206
    4.5.4 Multivariable Characteristic Functions  207
    4.5.5 Example: Differential Entropy of a Gaussian Vector  209
  4.6 Conditional Expectation  210
  4.7 Jointly Gaussian Vectors  213
  4.8 Expectation as Estimation  216
  4.9 Implications for Linear Estimation  222
  4.10 Correlation and Linear Estimation  224
  4.11 Correlation and Covariance Functions  231
  4.12 The Central Limit Theorem  235
  4.13 Sample Averages  237
  4.14 Convergence of Random Variables  239
  4.15 Weak Law of Large Numbers  244
  4.16 Strong Law of Large Numbers  246
  4.17 Stationarity  251
  4.18 Asymptotically Uncorrelated Processes  256
  4.19 Problems  259

5 Second-Order Moments  281
  5.1 Linear Filtering of Random Processes  282
  5.2 Second-Order Linear Systems I/O Relations  284
  5.3 Power Spectral Densities  289
  5.4 Linearly Filtered Uncorrelated Processes  292
  5.5 Linear Modulation  298
  5.6 White Noise  301
  5.7 Time-Averages  305
  5.8 Differentiating Random Processes  309
  5.9 Linear Estimation and Filtering  312
  5.10 Problems  326

6 A Menagerie of Processes  343
  6.1 Discrete Time Linear Models  344
  6.2 Sums of IID Random Variables  348
  6.3 Independent Stationary Increments  350
  6.4 Second-Order Moments of ISI Processes  353
  6.5 Specification of Continuous Time ISI Processes  355
  6.6 Moving-Average and Autoregressive Processes  358
  6.7 The Discrete Time Gauss-Markov Process  360
  6.8 Gaussian Random Processes  361
  6.9 The Poisson Counting Process  361
  6.10 Compound Processes  364
  6.11 Exponential Modulation  366
  6.12 Thermal Noise  371
  6.13 Ergodicity and Strong Laws of Large Numbers  373
  6.14 Problems  377

A Preliminaries  389
  A.1 Set Theory  389
  A.2 Examples of Proofs  397
  A.3 Mappings and Functions  401
  A.4 Linear Algebra  402
  A.5 Linear System Fundamentals  405
  A.6 Problems  410

B Sums and Integrals  417
  B.1 Summation  417
  B.2 Double Sums  420
  B.3 Integration  421
  B.4 The Lebesgue Integral  423

C Common Univariate Distributions  427

D Supplementary Reading  429

Bibliography  434

Index  438
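The formula reproduced on the cover page above, Pr(f ∈ F) = P({ω : f(ω) ∈ F}) = P(f⁻¹(F)), is the inverse-image rule the book uses to derive the distribution of a random object from an underlying probability space. The short sketch below is only an illustration of that rule; the die-roll sample space, the mapping f, and the event F are our own choices for the example, not taken from the text.

```python
from fractions import Fraction

# Underlying probability space: a fair six-sided die.
omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in omega}   # elementary probabilities

def f(w):
    # The random variable: 1 if the outcome is odd, 0 if it is even.
    return w % 2

F = {1}                                  # output event: "the outcome is odd"

# Pr(f in F) computed through the inverse image f^{-1}(F).
inverse_image = {w for w in omega if f(w) in F}     # {1, 3, 5}
pr_f_in_F = sum(P[w] for w in inverse_image)

print(inverse_image)   # {1, 3, 5}
print(pr_f_in_F)       # 1/2 = P(f^{-1}(F))
```

With sums replaced by integrals, the same computation is what Chapters 2 and 3 carry out for continuous probability spaces and for derived distributions of random vectors and processes.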
Excerpts from the text (preview):

[…] Much of the basic content of this course and of the fundamentals of random processes can be viewed as the analysis of statistical signal processing systems: typically one is given a probabilistic description for one random object, which can be considered as an input signal. An operation or mapping or filtering is applied to the input signal (signal processing) to produce a new random object, the output signal. […]

[…] an “input signal,” to produce a new signal, an “output signal,” is generally referred to as signal processing, a topic easily illustrated by examples.
• A time-varying voltage waveform is produced by a human speaking into a microphone or telephone. This signal can be modeled by a random process. This signal might be modulated for transmission, it might be digitized and coded for transmission on a digital link, […]

[…] between an average engineer and an outstanding engineer is the ability to derive effective models providing a good balance between complexity and accuracy. Random processes usually occur in applications in the context of environments or systems which change the processes to produce other processes. The intentional operation on a signal produced by one process, an “input signal,” to […]

[…] transmitter but random to an observer at the receiver. The theory of random processes quantifies the above notions so that one can construct mathematical models of real phenomena that are both tractable and meaningful in the sense of yielding useful predictions of future behavior. Tractability is required in order for the engineer (or anyone else) to be able to perform analyses and syntheses of random processes, […]

[…] this approach is thus to use intuitive arguments that accurately reflect the underlying mathematics and which will hold up under scrutiny if the student continues to more advanced courses. Another goal is to enable the student who might not continue to more advanced courses to be able to read and generally follow the modern literature on applications of random processes to information and communication theory, […]

[…] the first author, to the Stanford University Information Systems Laboratory Industrial Affiliates Program which supported the computer facilities used to compose this book, and to the generations of students who suffered through the ever-changing versions and provided a stream of comments and corrections. Thanks are also due to Richard Blahut and anonymous referees for their careful reading and commenting. […]

[…] been transmitted through this very poor channel.
• Signals produced by biomedical measuring devices can display specific behavior when a patient suddenly changes for the worse. Signal processing systems can look for these changes and warn medical personnel when suspicious behavior occurs.
How are these signals characterized? If the signals are random, how does one find stable behavior or structure to describe […]

[…] development and implications should be accessible, and the most common examples of random processes and classes of random processes should be familiar. In particular, the student should be well equipped to follow the gist of most arguments in the various Transactions of the IEEE dealing with random processes, including the IEEE Transactions on Signal Processing, IEEE Transactions on Image Processing, IEEE Transactions […]

[…] colleagues, readers, students, and former students. It is important when doing the problems to justify any “yes/no” answers: if an answer is “yes,” prove it is so; if the answer is “no,” provide a counterexample.

Chapter 2: Probability

2.1 Introduction

The theory of random processes is a branch of probability theory, and probability theory is a special case of the branch of mathematics known […]

[…] important to engineers tend to be neglected. In addition, too much time can be spent with the formal details, obscuring the often simple and elegant ideas behind a proof. Often little, if any, physical motivation for the topics is given. This book attempts a compromise between the two approaches by giving the basic, elementary theory and a profusion of examples in the language and […]
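The excerpts above describe the book's basic system view: a probabilistic model for an input signal, an operation or filter applied to it, and a resulting output signal whose behavior one wants to predict. The following sketch is our own minimal illustration of that view, not code from the book: an IID Gaussian input process is passed through a short moving-average filter, the kind of linear operation treated in Chapter 5, and first- and second-order properties of the output are estimated by sample averages as in Chapter 4.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input signal: an IID Gaussian random process {X_n}, mean 0, variance 1.
n = 100_000
x = rng.standard_normal(n)

# Signal processing operation: a causal 4-tap moving-average (linear) filter.
h = np.ones(4) / 4.0
y = np.convolve(x, h, mode="valid")      # output signal {Y_n}

# Sample averages estimate the output's first- and second-order moments.
mean_est = y.mean()                      # should be near 0
var_est = y.var()                        # should be near sum(h**2) = 0.25

# A relative frequency estimates a probability of the form Pr(Y in F).
pr_est = np.mean(np.abs(y) > 1.0)        # Pr(|Y| > 1), roughly 2*(1 - Phi(2)) ≈ 0.046

print(mean_est, var_est, pr_est)
```

Here the relative frequency np.mean(np.abs(y) > 1.0) plays the role of the cover formula Pr(f ∈ F): the probability of an output event, estimated from the model by a sample average.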

Posted: 16/01/2014, 16:33
