Chapter 4 Random Processes

Presentation Transcript


  1. Chapter 4 Random Processes

  2. 4.2 Random Processes: Basic Concepts  4.2.1 Description of a Random Process  Def: for each outcome ω ∈ S, we assign a time function X(t, ω) (i.e., we assign a function X(t, ω) to every outcome ω). The family {X(t, ω) : ω ∈ S} forms a stochastic process. Domain of ω: D(ω) = S, the sample space. Domain of t: D(t) = I, a set of real numbers (I ⊆ R).

  3. The meaning of X(t): 1. A family (an ensemble): both t and ω are variables. 2. A single time function: t is a variable, ω is fixed. 3. A random variable: t is fixed, ω is a variable. 4. A number: both t and ω are fixed. The set of values taken by X(t, ω) is called the state space.

  4. Classification of a R.P. depends upon three quantities: • the state space, • the index (time) parameter, • and the statistical dependences. • State space (the collection of all possible values): • Discrete-state process (chain) • Continuous-state process • Index (time) parameter: • Discrete-time process: X[n] • Continuous-time process: X(t) • Statistical dependence (e.g. SSS, WSS, independent process, etc.)

  5. Def. 4.2.1 A complete statistical description of a R.P. X(t) is known if for any integer n and any choice of (t₁, t₂, ..., tₙ), the joint p.d.f. of (X(t₁), X(t₂), ..., X(tₙ)) is given; it is denoted by f_{X(t₁),X(t₂),...,X(tₙ)}(x₁, x₂, ..., xₙ).  Def. 4.2.2 A process X(t) is described by its Mth-order statistics if for all n ≤ M and all (t₁, t₂, ..., tₙ), the joint p.d.f. of (X(t₁), X(t₂), ..., X(tₙ)) is given.  Note: M = 2 gives the second-order statistics.
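
  As a short sketch of why the M = 2 case is singled out (the symbols m_X and R_X anticipate the definitions of Section 4.2.2 below): from the second-order density one can obtain both
  m_X(t) = ∫ x f_{X(t)}(x) dx   and   R_X(t₁, t₂) = ∫∫ x₁ x₂ f_{X(t₁),X(t₂)}(x₁, x₂) dx₁ dx₂,
  so the mean and the autocorrelation function are already determined by the second-order statistics.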

  6. Example 4.2.3 A R.P. X(t) = A cos(2πf₀t + Θ), where Θ is a R.V. uniformly distributed on [0, 2π].

  7. Example 4.2.4 X(t) = X, where X is a r.v. uniformly distributed on [-1, 1].

  8. Example 4.2.5 A complete statistical description of X(t), t > 0: for any n and any (t₁, t₂, ..., tₙ), the joint distribution of (X(t₁), X(t₂), ..., X(tₙ)) is jointly Gaussian with mean 0 and a given covariance matrix C, where C_{ij} = Cov(X(t_i), X(t_j)).

  9. 4.2.2 Statistical Averages  Def: Mean of a R.P. X(t): m_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx (a deterministic function of t).

  10. Example 4.2.6 The mean of the R.P. in Example 4.2.3 is m_X(t) = 0.  Def: Autocorrelation function of a R.P. X(t): R_X(t₁, t₂) = E[X(t₁)X(t₂)].
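
  A short worked sketch of this mean, assuming Example 4.2.3 is the random-phase sinusoid X(t) = A cos(2πf₀t + Θ) as written above:
  m_X(t) = E[A cos(2πf₀t + Θ)] = ∫_{0}^{2π} A cos(2πf₀t + θ) (1/2π) dθ = 0   for every t.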

  11. Example 4.2.7 The autocorrelation function of the R.P. in Example 4.2.3.  Example 4.2.8 The autocorrelation function of X(t) = X in Ex. 4.2.4.
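
  Sketches of the two autocorrelations, under the same assumption about Example 4.2.3:
  R_X(t₁, t₂) = E[A² cos(2πf₀t₁ + Θ) cos(2πf₀t₂ + Θ)] = (A²/2) E[cos(2πf₀(t₁ − t₂)) + cos(2πf₀(t₁ + t₂) + 2Θ)] = (A²/2) cos(2πf₀(t₁ − t₂)),
  since the second cosine averages to zero over Θ. For X(t) = X with X uniform on [−1, 1]:
  R_X(t₁, t₂) = E[X²] = ∫_{−1}^{1} x² (1/2) dx = 1/3   for all t₁, t₂.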

  12. 4.2.3 Stationary Processes  Def 4.2.5: A strict-sense stationary (SSS) process is a process in which, for all n, all (t₁, t₂, ..., tₙ), and all Δ, f_{X(t₁),...,X(tₙ)}(x₁, ..., xₙ) = f_{X(t₁+Δ),...,X(tₙ+Δ)}(x₁, ..., xₙ).  Notes: 1. The process is Mth-order stationary if the above equation holds for all n ≤ M.  2. For an Mth-order stationary process and for an SSS process, the density function of X(t) is time independent.

  13. Def. 4.2.6 A process X(t) is wide-sense stationary (WSS) if: (i) m_X(t) = E[X(t)] is independent of t; (ii) R_X(t₁, t₂) depends only on the time difference τ = t₁ − t₂.  Example 4.2.10 The R.P. in Ex. 4.2.3 is WSS.  Def. 4.2.7 A random process X(t) with period T is called cyclostationary if m_X(t + T) = m_X(t) and R_X(t₁ + T, t₂ + T) = R_X(t₁, t₂).
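
  A quick check of Example 4.2.10, using the mean and autocorrelation sketched above (and hence the same assumption about Example 4.2.3): m_X(t) = 0 does not depend on t, and R_X(t₁, t₂) = (A²/2) cos(2πf₀(t₁ − t₂)) depends only on τ = t₁ − t₂, so conditions (i) and (ii) both hold.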

  14. Example 4.2.11 Let Y(t) = X(t) cos(2πf₀t), where X(t) is a stationary random process with mean m and autocorrelation R_X(τ); then Y(t) is cyclostationary.  Thm 4.2.1 Properties of R_X(τ) for a WSS R.P.: (i) R_X(0) = E[X²(t)] ≥ 0; (ii) R_X(−τ) = R_X(τ) (even symmetry); (iii) |R_X(τ)| ≤ R_X(0).
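
  A sketch supporting Example 4.2.11, assuming the process there is Y(t) = X(t) cos(2πf₀t) as written above, with T₀ = 1/f₀:
  m_Y(t) = m cos(2πf₀t)   and   R_Y(t + τ, t) = R_X(τ) cos(2πf₀(t + τ)) cos(2πf₀t);
  both are periodic in t with period T₀, so Y(t) satisfies Def. 4.2.7.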

  16. Ergodic Processes Let X(t) be an SSS R.P. and let g(x) be any function. (i) Statistical average (ensemble average) of g(X(t)): E[g(X(t))] = ∫ g(x) f_{X(t)}(x) dx. (ii) Time average: ⟨g(X)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} g(x(t; ω)) dt.

  17. Def 4.2.8 A stationary R.P. X(t) is also ergodic if, for all functions g(x) and all outcomes ω ∈ S, the time average ⟨g(X)⟩ equals the ensemble average E[g(X(t))].  Example 4.2.12 Consider the R.P. in Ex. 4.2.3. For any value of θ, the time average of the sample function x(t; θ) = A cos(2πf₀t + θ) equals the corresponding ensemble average, so the process is ergodic.
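
  A small numerical illustration of Example 4.2.12 (not from the slides; the amplitude A = 1 and frequency f₀ = 5 are arbitrary illustrative choices): it compares the time averages of several sample functions, each with its own fixed θ, against the ensemble average at a fixed time t₀. Both come out close to 0.

  ```python
  import numpy as np

  # Illustration of ergodicity for X(t) = A*cos(2*pi*f0*t + Theta),
  # Theta uniform on [0, 2*pi).  A and f0 are arbitrary illustrative values.
  A, f0 = 1.0, 5.0
  t = np.linspace(0.0, 100.0, 200_000)   # long observation window (500 periods)
  rng = np.random.default_rng(0)

  # Time average of each sample function (theta fixed per realization)
  thetas = rng.uniform(0.0, 2 * np.pi, size=10)
  time_avgs = [np.mean(A * np.cos(2 * np.pi * f0 * t + th)) for th in thetas]

  # Ensemble average at a fixed time t0, averaging over many outcomes of Theta
  t0 = 1.0
  ensemble_avg = np.mean(
      A * np.cos(2 * np.pi * f0 * t0 + rng.uniform(0.0, 2 * np.pi, size=100_000)))

  print(time_avgs)      # each value is close to 0
  print(ensemble_avg)   # also close to 0, matching E[X(t)] = 0
  ```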

  18. Example 4.2.13 X(t) = X in Ex. 4.2.4. Each sample function has a different constant value; therefore the time average for each sample function depends on ω, while the ensemble average E[X] is a single number. Hence the process is not ergodic.

  19. Power and Energy Let X(t) be a R.P. with sample functions x(t; ωᵢ). The energy and power of each sample function are defined as Eᵢ = ∫_{−∞}^{∞} x²(t; ωᵢ) dt and Pᵢ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x²(t; ωᵢ) dt. It is clear that both the energy and the power are random variables (their values depend on the outcome ω).

  20. Def: The power content P_X and the energy content E_X of the random process X(t) are defined as P_X = E[ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X²(t) dt ] and E_X = E[ ∫_{−∞}^{∞} X²(t) dt ].
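
  A short consequence, sketched for the WSS case: exchanging the expectation with the limit and the integral gives P_X = E[X²(t)] = R_X(0). For the random-phase sinusoid assumed in Example 4.2.3 this yields P_X = A²/2.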

  21. (Block diagram: X(t) → h(t) → Y(t).)

  22. Def. 4.2.10 R.P.s X(t) and Y(t) are independent if for all t₁, t₂ the R.V.s X(t₁) and Y(t₂) are independent. X(t) and Y(t) are uncorrelated if for all t₁, t₂ the R.V.s X(t₁) and Y(t₂) are uncorrelated.  Def. 4.2.11 The cross-correlation between two R.P.s X(t) and Y(t) is R_{XY}(t₁, t₂) = E[X(t₁)Y(t₂)].  Def. 4.2.12 X(t) and Y(t) are jointly WSS, or simply jointly stationary, if: (i) X(t) and Y(t) are individually stationary (WSS); (ii) R_{XY}(t₁, t₂) depends only on τ = t₁ − t₂.

  23. Example 4.2.18 X(t) and Y(t) are jointly stationary and Z(t) = X(t) + Y(t); then R_Z(τ) = R_X(τ) + R_Y(τ) + R_{XY}(τ) + R_{YX}(τ).
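
  The one-line expansion behind this result, consistent with Def. 4.2.11:
  R_Z(t + τ, t) = E[(X(t + τ) + Y(t + τ))(X(t) + Y(t))] = R_X(τ) + R_{XY}(τ) + R_{YX}(τ) + R_Y(τ).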

  24. 4.2.4 Random Processes and Linear Systems (block diagram: X(t) → h(t) → Y(t))  Thm 4.2.2 If X(t) is WSS and Y(t) = X(t) ⋆ h(t) is the output of an LTI system with impulse response h(t), then X(t) and Y(t) are jointly WSS, with m_Y = m_X ∫_{−∞}^{∞} h(t) dt, R_{XY}(τ) = R_X(τ) ⋆ h(−τ), and R_Y(τ) = R_X(τ) ⋆ h(τ) ⋆ h(−τ).
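
  A sketch of the mean relation, under the standard WSS-through-LTI setting stated above (H(f) denotes the Fourier transform of h(t)):
  m_Y = E[ ∫ h(s) X(t − s) ds ] = ∫ h(s) E[X(t − s)] ds = m_X ∫ h(s) ds = m_X H(0).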

  25. Example 4.2.19

  26. Example 4.2.20

  27. 4.3 Random Processes in the Frequency Domain • X(t): a random process • x(t; ωᵢ): a sample function • The power-spectral density of each sample function x(t; ωᵢ) is assumed to exist.

  28. The energy spectral density (for energy-type sample functions) and the power-spectral density (for power-type sample functions) are computed per sample function x(t; ωᵢ). Different outcomes ωᵢ give different sample functions and therefore different spectral densities, so at each frequency these quantities are random variables.

  29. Define the power-spectral density of the process as the ensemble average of these per-sample values. • Example 4.3.1 (Example 4.2.4) • Let X(t) = X, where X is a random variable • uniformly distributed on [-1, 1].

  30. Theorem 4.3.1 (Wiener-Khinchin) • If, for all finite τ and any interval A of length |τ|, the • autocorrelation function of X(t) satisfies |∫_A R_X(t, t + τ) dt| < ∞, • then S_X(f) = F[⟨R_X(t, t + τ)⟩], where ⟨R_X(t, t + τ)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} R_X(t, t + τ) dt.

  31. The second term makes no contribution.

  32. Corollary: If X(t) is a stationary process with autocorrelation R_X(τ), then S_X(f) = F[R_X(τ)].  Corollary: In a cyclostationary process with period T₀, if R̄_X(τ) = (1/T₀) ∫_{0}^{T₀} R_X(t + τ, t) dt, then S_X(f) = F[R̄_X(τ)].

  33. Example 4.3.2 If X(t) is stationary, then Y(t) = X(t) cos(2πf₀t) is a cyclostationary process with S_Y(f) = (1/4)[S_X(f − f₀) + S_X(f + f₀)].
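
  A sketch of where the factor 1/4 comes from, under the same assumption that the modulated process is Y(t) = X(t) cos(2πf₀t), with T₀ = 1/f₀:
  R_Y(t + τ, t) = R_X(τ) cos(2πf₀(t + τ)) cos(2πf₀t);
  averaging over one period in t gives R̄_Y(τ) = (1/2) R_X(τ) cos(2πf₀τ), and taking the Fourier transform yields S_Y(f) = (1/4)[S_X(f − f₀) + S_X(f + f₀)].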

  34. Remarks: 1. For a stationary and ergodic process … 2. …

  35. 3

  36. 4.3.2 Transmission over LTI Systems (block diagram: X(t) → h(t) → Y(t))
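
  The frequency-domain counterparts of Theorem 4.2.2, given here as a sketch (the transcript of this slide contains only the heading); they follow by Fourier-transforming the time-domain relations, with the cross-correlation convention used above and real h(t):
  S_Y(f) = S_X(f) |H(f)|²,   S_{XY}(f) = S_X(f) H*(f),   m_Y = m_X H(0).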

  37. Power-Spectral Density of a Sum Process
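
  A sketch of the standard result for this setting (the slide body is not in the transcript): for Z(t) = X(t) + Y(t) with X(t) and Y(t) jointly stationary, transforming the autocorrelation from Example 4.2.18 gives
  S_Z(f) = S_X(f) + S_Y(f) + 2 Re[S_{XY}(f)],
  since R_{YX}(τ) = R_{XY}(−τ) implies F[R_{XY} + R_{YX}] = 2 Re[S_{XY}(f)]. If X(t) and Y(t) are uncorrelated and at least one has zero mean, the cross term vanishes and S_Z(f) = S_X(f) + S_Y(f).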

  38. 4.4 Gaussian and White Processes • 4.4.1 Gaussian Processes • Def 4.4.1: X(t) is a Gaussian process if for all n and all • (t₁, t₂, ..., tₙ), the random variables X(t₁), ..., X(tₙ) • have a joint Gaussian density function. • Thm 4.4.1: For Gaussian processes, knowledge of • m_X(t) and R_X(t₁, t₂) gives a complete statistical • description of the process.

  39. Thm 4.4.2 • If a Gaussian process X(t) is the input to an LTI system, then the output Y(t) is • also a Gaussian process. (Block diagram: X(t) → LTI → Y(t).)

  40. Thm 4.4.3: For Gaussian processes, WSS and SSS • are equivalent. • Thm 4.4.4: A sufficient condition for the ergodicity • of the stationary zero-mean Gaussian process X(t) is ∫_{−∞}^{∞} |R_X(τ)| dτ < ∞. • Def 4.4.2: X(t) and Y(t) are jointly Gaussian if • for all n, m and all (t₁, ..., tₙ), (τ₁, ..., τ_m), the random • vector (X(t₁), ..., X(tₙ), Y(τ₁), ..., Y(τ_m)) • is distributed according to an (n + m)- • dimensional jointly Gaussian distribution.

  41. Thm 4.4.5 For jointly Gaussian processes, uncorrelatedness and independence are equivalent.

  42. 4.4.2 White Processes • Def 4.4.3: X(t) is called a white process if it has a flat • power-spectral density, i.e., if S_X(f) is constant for all f.
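
  Two standard consequences, sketched here with the common choice S_X(f) = N₀/2 (the name of the constant is an assumption, not from the transcript):
  R_X(τ) = F⁻¹[N₀/2] = (N₀/2) δ(τ), so samples taken at any two distinct times are uncorrelated (this is used on slide 44 below); and
  P_X = ∫_{−∞}^{∞} S_X(f) df = ∞, so a white process is an idealization with infinite power.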

  43. Remarks: 1. Quantum mechanical analysis of thermal noise shows that S_n(f) = (hf/2)/(e^{hf/kT} − 1). 2. S_n(f) is maximum at f = 0 (where it equals kT/2) and drops to 90% of that maximum only at frequencies on the order of 10¹² Hz, so thermal noise is effectively white over the frequencies of practical interest.

  44. Samples of a white process taken at any two distinct points are uncorrelated. If, in addition to being white, the random process is also Gaussian, the sampled random variables will also be independent.
