Recalling the definition of energy spectral density, Equation (2.73), we see that it is of use only for energy signals, for which the integral of G(f) over all frequencies gives total energy, a finite quantity. For power signals, it is meaningful to speak in terms of power spectral density.
Analogous to G(f), we define the power spectral density S(f) of a signal x(t) as a real, even, nonnegative function of frequency that gives total average power per ohm when integrated over all frequencies; that is,

P = \int_{-\infty}^{\infty} S(f)\, df = \langle x^2(t) \rangle    (2.139)

where

\langle x^2(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x^2(t)\, dt

denotes the time average of x^2(t). Since S(f) is a function that gives the variation of density of power with frequency, we conclude that it must consist of a series of impulses for the periodic power signals that we have so far considered.
Later, in Chapter 7, we will consider power spectra of random signals.
EXAMPLE 2.18
Considering the cosinusoidal signal
x(t) = A \cos(2\pi f_0 t + \theta)    (2.140)

we note that its average power per ohm, A^2/2, is concentrated at the single frequency f_0 hertz. However, since the power spectral density must be an even function of frequency, we split this power equally between +f_0 and -f_0 hertz. Thus, the power spectral density of x(t) is, from intuition, given by

S(f) = \frac{1}{4} A^2 \delta(f - f_0) + \frac{1}{4} A^2 \delta(f + f_0)    (2.141)

Checking this by using (2.139), we see that integration over all frequencies results in the average power per ohm of A^2/2.
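Written out, this check is just the sifting property of the unit impulse:

\int_{-\infty}^{\infty} S(f)\, df = \frac{A^2}{4} \int_{-\infty}^{\infty} \delta(f - f_0)\, df + \frac{A^2}{4} \int_{-\infty}^{\infty} \delta(f + f_0)\, df = \frac{A^2}{4} + \frac{A^2}{4} = \frac{A^2}{2}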
■
2.5.1 The Time-Average Autocorrelation Function
To introduce the time-average autocorrelation function, we return to the energy spectral density of an energy signal, (2.73). Without any apparent reason, suppose we take the inverse Fourier transform of G(f), letting the independent variable be \tau:

\phi(\tau) \triangleq \mathcal{F}^{-1}[G(f)] = \mathcal{F}^{-1}[X(f) X^*(f)] = \mathcal{F}^{-1}[X(f)] * \mathcal{F}^{-1}[X^*(f)]    (2.142)

The last step follows by application of the convolution theorem. Applying the time-reversal theorem (Item 3b in Table F.6 in Appendix F) to write \mathcal{F}^{-1}[X^*(f)] = x(-\tau), and then the convolution theorem, we obtain

\phi(\tau) = x(\tau) * x(-\tau) = \int_{-\infty}^{\infty} x(\lambda) x(\lambda + \tau)\, d\lambda = \lim_{T \to \infty} \int_{-T}^{T} x(\lambda) x(\lambda + \tau)\, d\lambda \quad \text{(energy signal)}    (2.143)

Equation (2.143) will be referred to as the time-average autocorrelation function for energy signals. We see that it gives a measure of the similarity, or coherence, between a signal and a delayed version of the signal. Note that \phi(0) = E, the signal energy. Also note the similarity of the correlation operation to convolution. The major point of (2.142) is that the autocorrelation function and energy spectral density are Fourier-transform pairs. We forgo further discussion
of the time-average autocorrelation function for energy signals in favor of analogous results for power signals.
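Before moving on, here is a rough discrete sketch of (2.142) and (2.143); the rectangular test pulse and the sampling grid are arbitrary choices, not from the text. It computes φ(τ) once by direct correlation and once through the transform domain, and the two routes agree, with φ(0) equal to the pulse energy.

import numpy as np

dt = 0.001
t = np.arange(-2.0, 2.0, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)     # assumed test signal: unit-height, 1-s-wide pulse (energy ~ 1 J)

# phi(tau) by direct correlation; phi_direct[m] approximates phi(m*dt) for m >= 0
n = len(x)
phi_direct = (np.correlate(x, x, mode="full") * dt)[n - 1:]

# Same correlation via the transform domain: ifft(|fft(x)|^2) gives the discrete autocorrelation,
# the discrete counterpart of phi(tau) = F^{-1}[G(f)] with G(f) = |X(f)|^2
N = 2 * n                                    # zero-pad so circular correlation equals linear correlation
phi_fft = np.fft.ifft(np.abs(np.fft.fft(x, N)) ** 2).real[:n] * dt

print(phi_direct[0])                         # ~ 1.0, the signal energy E = phi(0)
print(np.max(np.abs(phi_direct - phi_fft)))  # ~ 0; the two routes agree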
The time-average autocorrelation function R(\tau) of a power signal x(t) is defined as the time average

R(\tau) = \langle x(t) x(t + \tau) \rangle \triangleq \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t) x(t + \tau)\, dt \quad \text{(power signal)}    (2.144)

If x(t) is periodic with period T_0, the integrand of (2.144) is periodic, and the time average can be taken over a single period:

R(\tau) = \frac{1}{T_0} \int_{T_0} x(t) x(t + \tau)\, dt \quad [x(t) \text{ periodic}]

Just like \phi(\tau), R(\tau) gives a measure of the similarity between a power signal at time t and at time t + \tau; it is a function of the delay variable \tau, since time, t, is the variable of integration.
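Because delaying a periodic signal is just a circular shift of its samples over one period, the single-period average is easy to approximate numerically. A minimal sketch, with an arbitrarily chosen test signal that is not from the text:

import numpy as np

T0 = 0.5                                     # assumed period of the test signal
N = 1000                                     # samples per period
t = np.arange(N) * T0 / N
x = 1.0 + 3.0 * np.cos(2 * np.pi * t / T0)   # assumed test signal: DC level 1 plus a 3-V cosine

def R(m):
    """Single-period estimate of R(m*T0/N), using a circular shift for x(t + tau)."""
    return np.mean(x * np.roll(x, -m))

print(R(0))          # ~ 1 + 3**2/2 = 5.5, the total average power
print(R(N // 2))     # ~ 1 - 3**2/2 = -3.5, since the cosine term inverts at a half-period delay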
In addition to being a measure of the similarity between a signal and its time displacement, we note that the total average power of the signal is

R(0) = \langle x^2(t) \rangle = \int_{-\infty}^{\infty} S(f)\, df    (2.145)
Thus, we suspect that the time-average autocorrelation function and power spectral density of a power signal are closely related, just as they are for energy signals. This relationship is stated formally by the Wiener-Khinchine theorem, which says that the time-average autocorrelation function of a signal and its power spectral density are Fourier-transform pairs:
S(f) = \mathcal{F}[R(\tau)] = \int_{-\infty}^{\infty} R(\tau)\, e^{-j 2\pi f \tau}\, d\tau    (2.146)

and

R(\tau) = \mathcal{F}^{-1}[S(f)] = \int_{-\infty}^{\infty} S(f)\, e^{j 2\pi f \tau}\, df    (2.147)
A formal proof of the Wiener-Khinchine theorem will be given in Chapter 7. We simply take (2.146) as the definition of power spectral density at this point. We note that (2.145) follows immediately from (2.147) by setting \tau = 0.
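As a rough numerical illustration of (2.146), the sketch below uses assumed values A = 2 and f0 = 5 Hz (chosen only for illustration) and approximates the impulse weights of a line spectrum by the Fourier-series coefficients of R(τ) over one period; it recovers the weight A^2/4 at ±f0 for the sinusoid of Example 2.18.

import numpy as np

# Assumed values for illustration: A = 2, f0 = 5 Hz (the phase theta never enters R(tau))
A, f0 = 2.0, 5.0
T0, N = 1.0 / f0, 512
tau = np.arange(N) * T0 / N
R = (A ** 2 / 2) * np.cos(2 * np.pi * f0 * tau)   # autocorrelation of A*cos(2*pi*f0*t + theta)

# The impulse weights of S(f) are the Fourier-series coefficients of the periodic R(tau)
Sn = np.fft.fft(R) / N                            # Sn[n] is the weight at f = n/T0 = n*f0
print(Sn[1].real, Sn[-1].real)                    # ~ A**2/4 = 1.0 at +f0 and at -f0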
2.5.2 Properties of R(τ)
The time-average autocorrelation function has several useful properties, which are listed below:
1. R(0) = ⟨x^2(t)⟩ ≥ |R(τ)| for all τ; that is, an absolute maximum of R(τ) exists at τ = 0.
2. R(−τ) = ⟨x(t)x(t − τ)⟩ = R(τ); that is, R(τ) is even.
3. lim_{|τ| → ∞} R(τ) = ⟨x(t)⟩^2 if x(t) does not contain periodic components.
4. If x(t) is periodic in t with period T_0, then R(τ) is periodic in τ with period T_0.
5. The time-average autocorrelation function of any power signal has a Fourier transform that is nonnegative.
Property 5 results by virtue of the fact that normalized power is a nonnegative quantity.
These properties will be proved in Chapter 7.
The autocorrelation function and power spectral density are important tools for systems analysis involving random signals.
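The first two properties, along with the relation R(0) = ⟨x^2(t)⟩, are easy to spot-check numerically. The sketch below uses an arbitrarily chosen periodic test signal (not from the text) and the single-period average:

import numpy as np

T0, N = 2.0, 2048
t = np.arange(N) * T0 / N
x = 1.5 + np.cos(2 * np.pi * t / T0) - 0.5 * np.sin(6 * np.pi * t / T0)   # arbitrary periodic test signal

# R(m*T0/N) over one period, using a circular shift for the delay
R = np.array([np.mean(x * np.roll(x, -m)) for m in range(N)])

print(np.all(R[0] >= np.abs(R) - 1e-12))   # Property 1: absolute maximum at tau = 0
print(np.allclose(R[1:], R[:0:-1]))        # Property 2: evenness, via R(-tau) = R(T0 - tau)
print(np.isclose(R[0], np.mean(x ** 2)))   # R(0) equals the time-average power <x^2(t)>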
EXAMPLE 2.19
We desire the autocorrelation function and power spectral density of the signal x(t) = Re[2 + 3 exp(j10πt) + 4j exp(j10πt)], or x(t) = 2 + 3 cos(10πt) − 4 sin(10πt). The first step is to write the signal as a constant plus a single sinusoid. To do so, we note that

x(t) = \mathrm{Re}\left[ 2 + \sqrt{3^2 + 4^2}\, \exp\!\left[ j \tan^{-1}(4/3) \right] \exp(j 10\pi t) \right] = 2 + 5 \cos\!\left[ 10\pi t + \tan^{-1}(4/3) \right]

We may proceed in one of two ways. The first is to find the autocorrelation function of x(t) and Fourier-transform it to get the power spectral density. The second is to write down the power spectral density and inverse Fourier-transform it to get the autocorrelation function.
Following the first method, we find the autocorrelation function:
R(\tau) = \frac{1}{T_0} \int_{T_0} x(t) x(t + \tau)\, dt
= \frac{1}{0.2} \int_0^{0.2} \left\{ 2 + 5\cos\left[ 10\pi t + \tan^{-1}(4/3) \right] \right\} \left\{ 2 + 5\cos\left[ 10\pi(t + \tau) + \tan^{-1}(4/3) \right] \right\} dt
= 5 \int_0^{0.2} \left\{ 4 + 10\cos\left[ 10\pi t + \tan^{-1}(4/3) \right] + 10\cos\left[ 10\pi(t + \tau) + \tan^{-1}(4/3) \right] + 25\cos\left[ 10\pi t + \tan^{-1}(4/3) \right] \cos\left[ 10\pi(t + \tau) + \tan^{-1}(4/3) \right] \right\} dt
= 5 \int_0^{0.2} 4\, dt + 50 \int_0^{0.2} \cos\left[ 10\pi t + \tan^{-1}(4/3) \right] dt + 50 \int_0^{0.2} \cos\left[ 10\pi(t + \tau) + \tan^{-1}(4/3) \right] dt + \frac{125}{2} \int_0^{0.2} \cos(10\pi\tau)\, dt + \frac{125}{2} \int_0^{0.2} \cos\left[ 20\pi t + 10\pi\tau + 2\tan^{-1}(4/3) \right] dt
= 5 \int_0^{0.2} 4\, dt + 0 + 0 + \frac{125}{2} \int_0^{0.2} \cos(10\pi\tau)\, dt + \frac{125}{2} \int_0^{0.2} \cos\left[ 20\pi t + 10\pi\tau + 2\tan^{-1}(4/3) \right] dt
= 4 + \frac{25}{2} \cos(10\pi\tau)    (2.148)
where integrals involving cosines of t are zero by virtue of integrating a cosine over an integer number of periods, and the trigonometric relationship cos x cos y = (1/2) cos(x + y) + (1/2) cos(x − y) has been used.
The power spectral density is the Fourier transform of the autocorrelation function, or

S(f) = \mathcal{F}\left[ 4 + \frac{25}{2} \cos(10\pi\tau) \right] = 4\mathcal{F}[1] + \frac{25}{2} \mathcal{F}[\cos(10\pi\tau)]
= 4\delta(f) + \frac{25}{4} \delta(f - 5) + \frac{25}{4} \delta(f + 5)    (2.149)
Note that integration of this over all f gives P = 4 + 25/2 = 16.5 watts/ohm, which is the DC power plus the AC power (the latter split between 5 and −5 hertz). We could have proceeded by writing down the power spectral density first, using power arguments, and inverse Fourier-transforming it to get the autocorrelation function.
Note that all properties of the autocorrelation function are satisfied except the third, which does not apply.
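The closed form (2.148) and the impulse weights in (2.149) can be confirmed numerically; the sketch below is a minimal check, with a 1000-point grid over one period chosen arbitrarily.

import numpy as np

T0, N = 0.2, 1000
t = np.arange(N) * T0 / N
x = 2 + 5 * np.cos(10 * np.pi * t + np.arctan(4 / 3))     # one period of the signal in this example
tau = t                                                   # the delay grid coincides with the time grid

# Single-period estimate of R(tau) versus the closed form 4 + 12.5*cos(10*pi*tau)
R_num = np.array([np.mean(x * np.roll(x, -m)) for m in range(N)])
print(np.max(np.abs(R_num - (4 + 12.5 * np.cos(10 * np.pi * tau)))))   # ~ 0

# Impulse weights of S(f): Fourier-series coefficients of R(tau), located at f = n/T0 = 5n Hz
Sn = np.fft.fft(R_num) / N
print(Sn[0].real, Sn[1].real, Sn[-1].real)   # ~ 4.0 at DC and 25/4 = 6.25 at +5 and -5 Hz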
■
EXAMPLE 2.20
The sequence 1110010 is an example of a pseudonoise or m-sequence; such sequences are important in the implementation of digital communication systems and will be discussed further in Chapter 9. For now, we use this m-sequence as another illustration of computing autocorrelation functions and power spectra. Consider Figure 2.11(a), which shows the waveform equivalent of this m-sequence, obtained by replacing each 0 by −1, multiplying each sequence member by a square pulse function Π[(t − t_0)/Δ], summing, and assuming the resulting waveform is repeated forever, thereby making it periodic. To compute the autocorrelation function, we apply the single-period form of (2.144), which is
R(\tau) = \frac{1}{T_0} \int_{T_0} x(t) x(t + \tau)\, dt
since a periodic repetition of the waveform is assumed. Consider the waveform x(t) multiplied by x(t + nΔ) [shown in Figure 2.11(b) for n = 2]. The product is shown in Figure 2.11(c), where it is seen that the net area under the product x(t)x(t + nΔ) is −Δ, which gives R(2Δ) = −Δ/(7Δ) = −1/7 for this case.
In fact, this result is obtained for any τ equal to a nonzero integer multiple of Δ. For τ = 0, the net area under the product x(t)x(t + 0) is 7Δ, which gives R(0) = 7Δ/(7Δ) = 1. These correlation values are shown in Figure 2.11(d) by the open circles, where it is noted that they repeat each τ = 7Δ. For a noninteger delay value, the autocorrelation function is obtained as the linear interpolation of the autocorrelation values at the integer delays bracketing the desired delay. One can see that this is the case by considering the integral ∫_{T_0} x(t)x(t + τ) dt and noting that the area under the product x(t)x(t + τ) must be a linear function of τ between adjacent integer multiples of Δ, since x(t) is composed of square pulses. Thus, the autocorrelation function is as shown in Figure 2.11(d) by the solid line. For one period, it can be expressed as
R(\tau) = \frac{8}{7} \Lambda\!\left( \frac{\tau}{\Delta} \right) - \frac{1}{7}, \quad |\tau| \le \frac{T_0}{2}
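At delays that are integer multiples of Δ, the area computation above reduces to a circular correlation of the ±1 chip values, which is quick to check numerically (a minimal sketch):

import numpy as np

chips = np.array([1, 1, 1, -1, -1, 1, -1])              # the sequence 1110010 with each 0 replaced by -1
R_int = [np.mean(chips * np.roll(chips, -m)) for m in range(7)]
print(R_int)                                            # 1.0 at zero delay, -1/7 at the other six delays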
The power spectral density is the Fourier transform of the autocorrelation function, which can be obtained by applying (2.146). The detailed derivation is left to the problems. The result is
S(f) = \frac{8}{49} \sum_{n = -\infty}^{\infty} \mathrm{sinc}^2\!\left( \frac{n}{7} \right) \delta\!\left( f - \frac{n}{7\Delta} \right) - \frac{1}{7} \delta(f)
[Figure 2.11 panels: (a) x(t) versus t (s); (b) x(t − 2Δ) versus t (s); (c) the product x(t)x(t − 2Δ) versus t (s); (d) R(τ) versus τ (s); (e) S(f) versus f (Hz).]
Figure 2.11
Waveforms pertinent to computing the autocorrelation function and power spectrum of an m-sequence of length 7.
This spectrum is shown in Figure 2.11(e). Note that near f = 0, S(f) = \left( \frac{8}{49} - \frac{1}{7} \right) \delta(f) = \frac{1}{49} \delta(f), which says that the DC power is 1/49 = (1/7)^2 watts. The student should think about why this is the correct result. (Hint: What is the DC value of x(t), and to what power does this correspond?)
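As a numerical cross-check of this line spectrum (a sketch only; it assumes Δ = 1 s and 100 samples per chip), the squared magnitudes of the exponential Fourier-series coefficients of x(t) should match (8/49) sinc^2(n/7) for n ≠ 0 and (8/49 − 1/7) = 1/49 at n = 0:

import numpy as np

chips = np.array([1, 1, 1, -1, -1, 1, -1])      # 1110010 with each 0 replaced by -1
x = np.repeat(chips, 100)                       # one period of x(t), assuming Delta = 1 s, 100 samples/chip

Xn = np.fft.fft(x) / x.size                     # exponential Fourier-series coefficients X_n at f = n/7 Hz
weights = np.abs(Xn) ** 2                       # impulse weights of S(f) are |X_n|^2

for n in range(4):
    # np.sinc is the normalized sinc, sin(pi*z)/(pi*z), matching the text's definition
    predicted = 1 / 49 if n == 0 else (8 / 49) * np.sinc(n / 7) ** 2
    print(n, weights[n], predicted)             # the numerical and predicted weights agree closely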
■