
Random variables



  1. Random variables • Random variables are extensively used in communication circuits • Let the random variable X(A) represent the functional relationship between a random event A and a real number • For simplicity the random variable will be denoted by X • The random variable can be discrete or continuous

  2. Random variables • The distribution function of the random variable X is given by FX(x) = P(X ≤ x) • Where P(X ≤ x) is the probability that the value taken by the random variable X is less than or equal to a real number x • The distribution function FX(x) has the following properties

  3. Properties of the distribution function • 1. 0 ≤ FX(x) ≤ 1 • 2. FX(x1) ≤ FX(x2) if x1 ≤ x2 (FX(x) is nondecreasing) • 3. FX(−∞) = 0 and FX(+∞) = 1
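The distribution-function properties above can be checked numerically. A minimal sketch, assuming X ~ Uniform(0, 1) as an illustrative choice (not from the slides), where the exact answer is FX(x) = x on [0, 1]:

```python
import random

random.seed(0)
# Empirical distribution function F(x) = P(X <= x) for X ~ Uniform(0, 1),
# estimated as the fraction of samples at or below x.
samples = [random.random() for _ in range(100_000)]

def F(x):
    return sum(1 for s in samples if s <= x) / len(samples)

assert abs(F(0.3) - 0.3) < 0.01          # F(x) = x for the uniform case
assert F(-1.0) == 0.0 and F(2.0) == 1.0  # limits: F(-inf) = 0, F(+inf) = 1
assert F(0.2) <= F(0.7)                  # F is nondecreasing
```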

  4. Probability density function (pdf) • Another useful function relating to the random variable X is the probability density function (pdf), defined as the derivative of the distribution function: pX(x) = dFX(x)/dx

  5. Probability density function • The name density function arises from the fact that the probability of the event x1 ≤ X ≤ x2 equals P(x1 ≤ X ≤ x2) = FX(x2) − FX(x1) = ∫ from x1 to x2 of pX(x) dx

  6. Properties of the probability density function • The probability density function has the following properties • 1. pX(x) ≥ 0 • 2. ∫ from −∞ to +∞ of pX(x) dx = 1 • The probability density function is always a nonnegative function with a total area of one
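Both pdf properties can be verified numerically. A sketch using the exponential pdf p(x) = λ·exp(−λx) for x ≥ 0 as an illustrative example (the specific pdf and λ are assumptions, not from the slides):

```python
import math

# Exponential pdf with rate lam (illustrative example)
lam = 2.0

def pdf(x):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Property 1: the pdf is nonnegative everywhere (spot-check a few points)
assert all(pdf(x) >= 0 for x in (-1.0, 0.0, 0.5, 3.0))

# Property 2: the total area under the pdf is one (midpoint rule on [0, 20];
# the tail beyond 20 is negligible for lam = 2)
dx = 1e-3
area = sum(pdf((i + 0.5) * dx) * dx for i in range(int(20 / dx)))
assert abs(area - 1.0) < 1e-4
```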

  7. Ensemble averages • The mean value mX, or expected value, of a random variable X is defined by mX = E{X} = ∫ from −∞ to +∞ of x pX(x) dx • E{·} is called the expected value operator • The nth moment of a probability distribution of a random variable X is defined by E{Xⁿ} = ∫ from −∞ to +∞ of xⁿ pX(x) dx

  8. Ensemble averages • In communication system analysis, only the 1st (n = 1) and 2nd (n = 2) moments are used • When n = 1, the mean mX of the random variable is obtained • The mean corresponds to the DC value of a random voltage or current

  9. Ensemble averages • When n = 2 we obtain the mean square value of X, as follows: E{X²} = ∫ from −∞ to +∞ of x² pX(x) dx
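The first two moments can be estimated by Monte-Carlo averaging. A sketch assuming a random voltage X ~ Uniform(0, 2) (an illustrative choice), whose exact mean is 1.0 and exact mean square value is 4/3:

```python
import random

random.seed(0)
# Sample the random voltage X ~ Uniform(0, 2)
samples = [random.uniform(0, 2) for _ in range(100_000)]

# n = 1: the mean (DC value of the random voltage)
mean = sum(samples) / len(samples)

# n = 2: the mean square value
mean_square = sum(x * x for x in samples) / len(samples)

assert abs(mean - 1.0) < 0.02        # exact mean is 1.0
assert abs(mean_square - 4 / 3) < 0.03  # exact E{X^2} is 4/3
```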

  10. Central moments • The central moments are the moments of the difference between the random variable X and its mean mX: E{(X − mX)ⁿ} = ∫ from −∞ to +∞ of (x − mX)ⁿ pX(x) dx

  11. Second central moment (variance) and standard deviation • The second central moment is called the variance of X and is defined by var(X) = E{(X − mX)²} = ∫ from −∞ to +∞ of (x − mX)² pX(x) dx • The variance of X is also denoted by σX², and its square root, σX, is called the standard deviation of X

  12. The variance effect on the pdf • The variance is a measure of the randomness of the random variable X • By specifying the variance of a random variable, we are constraining the width of its probability density function

  13. Relation between the variance and the mean square value • The variance and the mean square value are related by var(X) = E{X²} − mX² • Thus the variance is the difference between the mean square value and the square of the mean
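The relation var(X) = E{X²} − mX² can be confirmed on simulated data. A sketch assuming X ~ N(3, 2), i.e. mean 3 and standard deviation 2 (an illustrative choice), so the true variance is 4:

```python
import random

random.seed(1)
xs = [random.gauss(3.0, 2.0) for _ in range(200_000)]
n = len(xs)

mean = sum(xs) / n                                   # mX
mean_sq = sum(x * x for x in xs) / n                 # E{X^2}
var_direct = sum((x - mean) ** 2 for x in xs) / n    # second central moment
var_from_moments = mean_sq - mean ** 2               # E{X^2} - mX^2

assert abs(var_direct - var_from_moments) < 1e-8     # same quantity, two routes
assert abs(var_direct - 4.0) < 0.1                   # true variance is sigma^2 = 4
```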

  14. Random process • A random process X(A, t) is a function of two variables: an event A and time t

  15. Statistical averages of a random process • Because the value of a random process at any future time is unknown, a random process whose distribution functions are continuous can be described statistically by a probability density function • In general, the probability density function will be different for different times

  16. Statistical averages of a random process • Therefore it would be impractical to determine the pdf of a given random process empirically • However, a partial description consisting of the mean and the autocorrelation function is sufficient for the needs of communication systems • The mean of the random process X(t) is mX(tk) = E{X(tk)} = ∫ from −∞ to +∞ of x pXk(x) dx

  17. Statistical averages of a random process • Where X(tk) is the random variable obtained by observing the random process at time tk, and pXk(x) is the probability density function of X(tk) over the ensemble of events at time tk

  18. Autocorrelation definition • The autocorrelation function of a random process X(t) is defined as RX(t1, t2) = E{X(t1) X(t2)} • X(t1) and X(t2) are random variables obtained by observing X(t) at times t1 and t2 • The autocorrelation function is a measure of the degree to which two time samples of the same random process are related

  19. Stationarity • A random process X(t) is said to be stationary if none of its statistics are affected by a shift in the time origin • A random process is said to be wide-sense stationary (WSS) if its mean and its autocorrelation do not vary with a shift in the time origin • E{X(t)} = mX = a constant, and RX(t1, t2) = RX(t1 − t2)

  20. Autocorrelation of a wide-sense stationary random process • For a wide-sense stationary process, the autocorrelation function is only a function of the time difference τ = t1 − t2: RX(τ) = E{X(t) X(t + τ)} • For a zero-mean WSS process, RX(τ) indicates the extent to which the random values of the process separated by τ seconds are statistically correlated

  21. Autocorrelation of a wide-sense stationary random process • In other words, the autocorrelation gives an idea of the frequency response associated with a random process • If RX(τ) changes slowly as τ increases from zero to some value, it indicates that, on average, sample values of X(t) taken at t and t + τ are nearly the same • This means that the frequency-domain representation of X(t) is mostly low frequency

  22. Autocorrelation of a wide-sense stationary random process • On the other hand, if RX(τ) decreases rapidly as τ is increased, this means that X(t) changes rapidly with time and therefore contains higher frequencies
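The link between autocorrelation decay and frequency content can be sketched by time-averaging one long realization (assuming ergodicity, introduced below): white samples decorrelate immediately, while a low-pass (3-tap moving-average) version of the same samples, an illustrative construction, stays correlated at lag 1.

```python
import random

random.seed(2)
N = 50_000
# Zero-mean white Gaussian samples (rapidly changing process)
white = [random.gauss(0, 1) for _ in range(N)]
# 3-tap moving average of the same samples (slowly changing, low-frequency process)
smooth = [(white[i] + white[i - 1] + white[i - 2]) / 3 for i in range(2, N)]

def autocorr(x, tau):
    # Time-average estimate of R(tau) for a zero-mean sequence
    return sum(a * b for a, b in zip(x, x[tau:])) / (len(x) - tau)

# White noise: R(1) is near zero -- adjacent samples are uncorrelated
assert abs(autocorr(white, 1)) < 0.05
# Smoothed process: R(1) stays a large fraction of R(0) -- slow decay, low frequency
assert autocorr(smooth, 1) > 0.5 * autocorr(smooth, 0)
```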

  23. Properties of the autocorrelation function • 1. RX(τ) = RX(−τ): symmetrical in τ about zero • 2. RX(τ) ≤ RX(0) for all τ: the maximum value occurs at the origin • 3. RX(τ) and the PSD form a Fourier transform pair • 4. RX(0) = E{X²(t)}: the value at the origin equals the average power of the signal

  24. Time averaging and ergodicity • When a random process belongs to a special class, known as ergodic processes, its time averages equal its ensemble averages • This means that the statistical properties of the process can be determined by time averaging over a single sample function of the process

  25. Time averaging and ergodicity • This means that the mean of the random process can be rewritten as mX = lim T→∞ (1/T) ∫ from −T/2 to T/2 of x(t) dt • The autocorrelation can now be rewritten as RX(τ) = lim T→∞ (1/T) ∫ from −T/2 to T/2 of x(t) x(t + τ) dt

  26. Time averaging and ergodicity • Since time averages equal ensemble averages for ergodic processes, fundamental electrical engineering parameters, such as the DC value, rms value, and average power, can be related to the moments of an ergodic random process
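Ergodicity can be sketched with a simple illustrative model X(t) = A + n(t), where A is a fixed DC level and n(t) is zero-mean Gaussian noise (this model and the value of A are assumptions): the time average of one long sample function agrees with the ensemble average over many independent realizations observed at a single instant.

```python
import random

A = 1.5  # fixed DC level of the process (illustrative)

# Time average: one long waveform, averaged along time
time_rng = random.Random(0)
time_avg = sum(A + time_rng.gauss(0, 1) for _ in range(100_000)) / 100_000

# Ensemble average: many independent realizations, each observed once
ensemble_avg = sum(A + random.Random(seed).gauss(0, 1)
                   for seed in range(100_000)) / 100_000

assert abs(time_avg - A) < 0.02      # DC value recovered from a single waveform
assert abs(ensemble_avg - A) < 0.02  # same DC value recovered from the ensemble
```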

  27. Summary of electrical engineering properties • mX, the mean, equals the DC level of the signal • mX² equals the normalized power in the DC component

  28. Summary of electrical engineering properties • E{X²}, the second moment, equals the total average normalized power • √E{X²} equals the rms value of the voltage or current signal

  29. Summary of electrical engineering properties • σX², the variance, equals the average normalized power in the time-varying (AC) component • σX equals the rms value of the AC component

  30. Power spectral density and autocorrelation of a random process • A random process can generally be classified as a power signal having a power spectral density (PSD) GX(f) • The PSD is particularly useful in communication systems, because it describes the distribution of a signal's power in the frequency domain • The PSD enables us to evaluate the signal power that will pass through a network having known frequency characteristics

  31. Features of the PSD function • GX(f) ≥ 0 and is always real valued for X(t) real valued • The PSD and the autocorrelation form a Fourier transform pair: GX(f) ↔ RX(τ) • Relation between the normalized average power and the PSD: PX = ∫ from −∞ to +∞ of GX(f) df = RX(0)
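The power relation above has a discrete-time counterpart that can be checked exactly: taking the periodogram |X[k]|²/N as the PSD estimate (a standard choice, assumed here), the normalized average power computed in the time domain equals the area under the discrete PSD (Parseval's relation). A sketch on a short deterministic sequence:

```python
import cmath

# Short illustrative sequence
x = [1.0, -2.0, 0.5, 3.0, -1.0, 0.25]
N = len(x)

# Discrete Fourier transform computed directly from its definition
X = [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
     for k in range(N)]

psd = [abs(Xk) ** 2 / N for Xk in X]      # periodogram (discrete PSD estimate)
power_time = sum(v * v for v in x) / N    # normalized average power, time domain
power_freq = sum(psd) / N                 # area under the discrete PSD

assert abs(power_time - power_freq) < 1e-9  # Parseval: the two powers agree
```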

  32. Meaning of correlation • Correlation between two phenomena means: how closely do they correspond in behavior or appearance, how well do they match one another • An autocorrelation function of a signal describes the correspondence of the signal to itself (in the time domain) • This can be visualized by producing an exact copy of the signal, locating it in time at minus infinity, and then sliding it past the original while the two are multiplied and integrated

  33. How to perform autocorrelation graphically

  34. How to perform autocorrelation graphically (continued)

  35. Noise in communication systems • Noise refers to unwanted electrical signals that are always present in electrical systems • The presence of noise in communication systems limits the receiver's ability to make correct symbol decisions • This limits the rate of information transmission

  36. Sources of noise • Noise can be generated by either man-made or natural sources • Man-made sources include spark-plug ignition noise, switching transients, and other radiating electromagnetic signals • Natural noise includes such elements as the atmosphere, the sun, and other galactic sources

  37. Thermal noise • One noise source that is hard or impossible to eliminate, even with good engineering design, is thermal or Johnson noise • This type of noise is caused by the thermal motion of electrons in all dissipative components: resistors, wires, and so on

  38. Gaussian pdf of thermal noise • Thermal noise can be described as a zero-mean Gaussian random process with the following pdf: p(n) = (1/(σ√(2π))) exp(−n²/(2σ²)) • In general a noisy digital bit z can be expressed as the sum of the transmitted bit a and the noise n: z = a + n
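The zero-mean Gaussian pdf can be checked against simulated thermal noise by comparing a histogram bin near n = 0 with the formula. A sketch assuming σ = 1 and a bin width of 0.2 (both illustrative choices):

```python
import math
import random

random.seed(5)
sigma = 1.0

def gauss_pdf(n):
    # Zero-mean Gaussian pdf: p(n) = exp(-n^2 / (2*sigma^2)) / (sigma * sqrt(2*pi))
    return math.exp(-n * n / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Simulated thermal-noise samples
samples = [random.gauss(0, sigma) for _ in range(200_000)]

# Fraction of samples in a narrow bin around n = 0, converted to a density estimate
bin_width = 0.2
frac_near_zero = sum(1 for s in samples if abs(s) < bin_width / 2) / len(samples)
density_estimate = frac_near_zero / bin_width

assert abs(density_estimate - gauss_pdf(0.0)) < 0.02  # pdf(0) = 1/sqrt(2*pi) ~ 0.399
```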

  39. Gaussian pdf of a noisy digital bit • The probability density function of the noisy digital bit is given by p(z) = (1/(σ√(2π))) exp(−(z − a)²/(2σ²)), a Gaussian centered on the transmitted bit value a

  40. White noise • The power spectral density of thermal noise is the same for all frequencies of interest in most communication systems • This means that thermal noise emanates an equal amount of noise power per unit bandwidth at all frequencies from DC to about 1 THz • This is why this kind of noise is called white Gaussian noise

  41. PSD of white Gaussian noise • The PSD of white Gaussian noise is given by Gn(f) = N0/2 watts/hertz, for all f

  42. Autocorrelation of white Gaussian noise • The autocorrelation function of white noise is given by the inverse Fourier transform of the noise power spectral density: Rn(τ) = F⁻¹{Gn(f)} = (N0/2) δ(τ)

  43. Average power in white Gaussian noise • The average power of white Gaussian noise is infinite, since Pn = ∫ from −∞ to +∞ of (N0/2) df does not converge
