This lecture series covers essential concepts in communication systems, focusing on random variables (RVs) and random signals. Key topics include an overview of RVs, probability density functions (PDFs), cumulative distribution functions (CDFs), moments, and functions of RVs. It delves into the correlation of random signals, including methods to determine their power spectral density. The Central Limit Theorem and its implications for binary communication in noisy environments are discussed, as well as correlation properties for stationary processes. Examples and real-world applications illustrate these concepts.
EE354: Communication Systems I
Lectures 5, 6, 7: Random variables and signals
Aliazam Abbasfar
Outline
• Random variables overview
• Random signals
• Signal correlation
• Power spectral density
Random variables (RV)
• PDF, CDF: f_X(x) = dF_X(x)/dx
• Mean, variance, moments: E[X], Var[X], E[X^n]
• Functions of RVs: Y = g(X)
• Several RVs: joint PDF/CDF, conditional probability, sums of RVs
• Independent RVs
• Correlation of two RVs: E[XY]
• Example: binary communication with noise (sketched below)
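A minimal Monte Carlo sketch of the binary-communication example. The antipodal ±A mapping, the noise level σ, and the zero-threshold detector are illustrative assumptions (the slide names the example but not its parameters); the analytic answer is the Gaussian tail probability Q(A/σ).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
A, sigma, n = 1.0, 0.5, 1_000_000         # assumed amplitude, noise std, trial count

bits = rng.integers(0, 2, n)              # equiprobable 0/1 source bits
tx = np.where(bits == 1, A, -A)           # antipodal mapping: 1 -> +A, 0 -> -A
rx = tx + rng.normal(0.0, sigma, n)       # additive Gaussian noise channel
errors = (rx > 0) != (bits == 1)          # threshold detector at 0 vs. truth

pe_sim = errors.mean()
pe_theory = norm.sf(A / sigma)            # Q(A/sigma), the standard Gaussian tail
print(f"simulated Pe = {pe_sim:.5f}, Q(A/sigma) = {pe_theory:.5f}")
```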
Binomial distribution
• X = number of successes in N independent trials
• p: success probability (q = 1 − p: failure probability)
• Sum of N binary RVs: X = Σ_i x_i
• For large N it approaches a Gaussian PDF with μ_X = Np, σ_X² = Npq
• Example: error probability in binary packets (sketched below)
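A quick numeric check of the packet example, under the assumption that a packet of N bits fails when more than t bit errors occur; N, p, and t are made-up illustrative values. It also exercises the Gaussian approximation with μ_X = Np, σ_X² = Npq.

```python
import numpy as np
from scipy.stats import binom, norm

N, p, t = 1000, 0.01, 15                  # packet length, bit-error prob, error budget
q = 1 - p

exact = binom.sf(t, N, p)                 # P(X > t) for X ~ Binomial(N, p)
mu, sd = N * p, np.sqrt(N * p * q)        # Gaussian approximation moments
approx = norm.sf((t + 0.5 - mu) / sd)     # tail with continuity correction

print(f"exact packet-error prob: {exact:.4f}, Gaussian approx: {approx:.4f}")
```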
Gaussian RVs and the CLT
• PDF N(μ_X, σ_X²) is defined by its mean and variance; the tails decrease exponentially
• CDF is expressed via the error function erf(·)
• Central Limit Theorem: X_1, …, X_n i.i.d.; let Y = Σ_i X_i and Z = (Y − μ_Y)/σ_Y
• As n → ∞, Z becomes Gaussian with μ_Z = 0, σ_Z² = 1, i.e., Z ~ N(0, 1) (demo below)
• Uncorrelated Gaussian RVs are independent
[Figure: Gaussian PDF N(μ_X, σ_X²), centered at μ_X with spread σ_X]
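A short CLT demonstration under an assumed U(0,1) distribution for the X_i (any i.i.d. choice works): standardizing the sum of n uniforms yields a variable that a Kolmogorov–Smirnov test cannot distinguish from N(0,1) even for moderate n.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(1)
n, trials = 30, 100_000
x = rng.uniform(0.0, 1.0, (trials, n))    # X_i ~ U(0,1): mean 1/2, variance 1/12
y = x.sum(axis=1)                         # Y = sum_i X_i
z = (y - n / 2) / np.sqrt(n / 12)         # Z = (Y - mu_Y) / sigma_Y

print("sample mean / variance of Z:", z.mean().round(3), z.var().round(3))
print("KS test against N(0,1):", kstest(z, "norm"))
```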
Random processes
• Ensemble of random signals (sample functions)
• Deterministic signals with embedded RVs
• Examples: voltage waveforms, message signals, thermal noise
• Samples of a random signal:
  – x(t) is a random variable: E[x(t)], Var[x(t)]
  – x(t_1), x(t_2) are jointly distributed random variables
Correlation
• Correlation = statistical similarity
• Cross-correlation of two random signals: R_XY(t_1, t_2) = E[x(t_1) y(t_2)]
• Uncorrelated / independent random signals
• Autocorrelation: R_X(t_1, t_2) = E[x(t_1) x(t_2)]
• R_X(t, t) = E[x²(t)] = Var[x(t)] + E[x(t)]²
• Average power: P = E[P_i] = E[<x_i²(t)>] = <R_X(t, t)>
• Most random signals are power signals (0 < P < ∞)
Wide sense stationary (WSS)
• A process is WSS if:
  – E[x(t)] = μ_X (constant)
  – R_X(t_1, t_2) = E[x(t_1) x(t_2)] = R_X(t_2 − t_1) = R_X(τ)
  – R_X(0) = E[x²(t)] < ∞
• Stationary in 1st and 2nd moments
• Autocorrelation properties (checked numerically below):
  – R_X(τ) = R_X(−τ)
  – |R_X(τ)| ≤ R_X(0)
  – R_X(τ) = 0: samples separated by τ are uncorrelated
• Average power: P = <E[x²(t)]> = R_X(0)
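A small numerical check of these autocorrelation properties, using a 5-tap moving average of white noise as an assumed stationary test process; the lag-domain estimator below is a plain time average over one long sample function.

```python
import numpy as np

rng = np.random.default_rng(2)
# Moving-averaged white noise: stationary, correlated over 5 samples only.
x = np.convolve(rng.normal(size=200_000), np.ones(5) / 5, mode="same")

def autocorr(seq, lag):
    """Time-average estimate of R_X(lag) for a zero-mean sequence."""
    return np.mean(seq * seq) if lag == 0 else np.mean(seq[:-lag] * seq[lag:])

R = [autocorr(x, k) for k in range(10)]
print("R(0) (average power):", round(R[0], 4))
print("|R(k)| <= R(0) for all k:", all(abs(r) <= R[0] for r in R))
print("R(k) ~ 0 beyond the filter memory:", [round(r, 4) for r in R[5:]])
```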
Ergodic process
• Time average of any sample function = ensemble average (for any i and any g): <g(x_i(t))> = E[g(x(t))]
• Ensemble averages are time-independent
• DC value: <x_i(t)> = E[x(t)] = μ_X
• Total power: <x_i²(t)> = E[x²(t)] = σ_X² + μ_X²
• Average power: P = E[<x_i²(t)>] = P_i (the power of any single sample function)
• Use one sample function to estimate signal statistics: time averages instead of ensemble averages (sketched below)
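A sketch contrasting time and ensemble averages for an assumed i.i.d. Gaussian process with μ_X = 2 and σ_X = 0.5 (illustrative values): for this ergodic process, both estimates of the DC value agree, and the time-averaged power matches σ_X² + μ_X².

```python
import numpy as np

rng = np.random.default_rng(3)
mx, sx = 2.0, 0.5
# 1000 sample functions, each 5000 "time" samples long (i.i.d. Gaussian process).
ensemble = mx + sx * rng.normal(size=(1000, 5000))

time_avg_dc = ensemble[0].mean()          # <x_i(t)>: average one sample function over time
ens_avg_dc = ensemble[:, 0].mean()        # E[x(t)]: average across the ensemble at fixed t
time_avg_pwr = (ensemble[0] ** 2).mean()  # <x_i^2(t)>: total power from one sample function

print(f"DC: time avg {time_avg_dc:.3f} vs ensemble avg {ens_avg_dc:.3f} (mu_X = {mx})")
print(f"power: {time_avg_pwr:.3f} vs sigma_X^2 + mu_X^2 = {sx**2 + mx**2:.3f}")
```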
Examples
• Sinusoid with random phase (worked out below)
• DC signal with random level
• Binary NRZ signaling
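For the first example, x(t) = A cos(ω_c t + θ) with θ uniform on [0, 2π), the known result is R_X(τ) = (A²/2) cos(ω_c τ), independent of t. The sketch below checks this by ensemble averaging; A, ω_c, t, and τ are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(4)
A, wc = 2.0, 2 * np.pi * 5.0
theta = rng.uniform(0, 2 * np.pi, 100_000)   # one random phase per sample function

t, tau = 0.3, 0.07                           # arbitrary time instant and lag
x_t = A * np.cos(wc * t + theta)
x_tt = A * np.cos(wc * (t + tau) + theta)

print("ensemble E[x(t) x(t+tau)]:", np.mean(x_t * x_tt).round(4))
print("(A^2/2) cos(wc tau):      ", (A**2 / 2 * np.cos(wc * tau)).round(4))
```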
Power spectral density
• Time-averaged autocorrelation: R_X(τ) = <R_X(t, t + τ)>
• Power spectral density (Wiener–Khinchine): G_X(f) = F[R_X(τ)]
• Average power: P = R_X(0) = ∫ G_X(f) df (checked below)
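A sanity check of the power relation using scipy.signal.welch on a zero-mean white-noise sample function (an illustrative choice): the area under the estimated one-sided G_X(f) should match the time-averaged power R_X(0).

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
x = rng.normal(size=200_000)               # unit-variance white-noise sample function

f, Gx = welch(x, fs=1000.0, nperseg=1024)  # one-sided PSD estimate G_X(f)
print("integral of G_X(f):", np.trapz(Gx, f).round(4))
print("time-averaged power R_X(0) = <x^2(t)>:", np.mean(x ** 2).round(4))
```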
Examples
• Y(t) = X(t) cos(ω_c t)
• Is Y(t) WSS?
• R_Y(τ) and G_Y(f) (derived below)
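A worked version of this example, assuming X(t) is WSS with autocorrelation R_X(τ) and that no random carrier phase is added:

```latex
R_Y(t, t+\tau) = R_X(\tau)\cos(\omega_c t)\cos\big(\omega_c(t+\tau)\big)
             = \tfrac{1}{2} R_X(\tau)\big[\cos(\omega_c \tau) + \cos(2\omega_c t + \omega_c \tau)\big]
```

Since this depends on t, Y(t) is not WSS (it is cyclostationary). Time-averaging over t removes the second term:

```latex
\bar{R}_Y(\tau) = \tfrac{1}{2} R_X(\tau)\cos(\omega_c \tau), \qquad
G_Y(f) = \tfrac{1}{4}\big[G_X(f - f_c) + G_X(f + f_c)\big], \quad f_c = \omega_c / 2\pi
```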
Correlations for LTI systems
• If the input x(t) is WSS, then x(t) and the output y(t) = h(t) ∗ x(t) are jointly WSS
• μ_Y = H(0) μ_X
• R_YX(τ) = h(τ) ∗ R_X(τ)
• R_XY(τ) = R_YX(−τ) = h(−τ) ∗ R_X(τ)
• R_Y(τ) = h(τ) ∗ h(−τ) ∗ R_X(τ)
• G_Y(f) = |H(f)|² G_X(f) (verified numerically below)
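A numerical check of the last relation: white noise through an assumed FIR filter h = [0.5, 1, 0.5], comparing the Welch-estimated output PSD against |H(f)|² times the estimated input PSD on the same frequency grid.

```python
import numpy as np
from scipy.signal import lfilter, welch, freqz

rng = np.random.default_rng(6)
h = np.array([0.5, 1.0, 0.5])                 # illustrative FIR impulse response
x = rng.normal(size=500_000)                  # white-noise input sample function
y = lfilter(h, [1.0], x)                      # y(t) = h(t) * x(t)

f, Gx = welch(x, fs=1.0, nperseg=2048)        # input PSD estimate G_X(f)
_, Gy = welch(y, fs=1.0, nperseg=2048)        # output PSD estimate G_Y(f)
_, H = freqz(h, worN=f, fs=1.0)               # H(f) evaluated on the same grid

mask = np.abs(H) ** 2 > 0.1                   # skip bins near the filter's spectral null
rel_err = np.abs(Gy[mask] - np.abs(H[mask]) ** 2 * Gx[mask]) / Gy[mask]
print("median relative error of G_Y vs |H|^2 G_X:", np.median(rel_err).round(3))
```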
Sum process
• z(t) = x(t) + y(t)
• R_Z(τ) = R_X(τ) + R_Y(τ) + R_XY(τ) + R_XY(−τ) (derivation below)
• G_Z(f) = G_X(f) + G_Y(f) + 2 Re[G_XY(f)]
• If x and y are uncorrelated:
  – R_XY(τ) = μ_X μ_Y
  – G_Z(f) = G_X(f) + G_Y(f) + 2 μ_X μ_Y δ(f)
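The first identity follows by expanding the product inside the expectation and using R_YX(τ) = R_XY(−τ) for jointly WSS signals:

```latex
R_Z(\tau) = E\big[\big(x(t)+y(t)\big)\big(x(t+\tau)+y(t+\tau)\big)\big]
          = R_X(\tau) + R_Y(\tau) + R_{XY}(\tau) + R_{XY}(-\tau)
```

In the uncorrelated case, R_XY(τ) = E[x(t)] E[y(t+τ)] = μ_X μ_Y is a constant, and the Fourier transform of the constant 2μ_X μ_Y is the impulse 2μ_X μ_Y δ(f), which gives the last bullet of the slide.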
Reading
• Carlson, Ch. 9.1–9.2
• Proakis & Salehi, Ch. 4.1–4.4