
DIGITAL SPREAD SPECTRUM SYSTEMS



  1. DIGITAL SPREAD SPECTRUM SYSTEMS ENG-737 Lecture 1 Wright State University James P. Stephens

  2. CONTACT INFORMATION • Email: james.stephens@wpafb.af.mil • Phone: (937) 904-9216 • Web Page: http://www.cs.wright.edu/~jstephen/ • Fax: (937) 656-7027

  3. COURSE SCHEDULE

  4. COURSE SCHEDULE

  5. WHAT WILL YOU LEARN?
• An overview of modern communications issues and trends
• The definition of SS and a variety of applications
• How to distinguish between various digital modulation types
• An understanding of Spread Spectrum Anti-Jam (AJ) / Low Probability-of-Intercept (LPI) waveforms
• The concept of signal dimensionality and how it relates to processing gain and jamming margin
• The properties of pseudonoise (PN) sequences, and how to generate and exploit them
• The function of each SS subsystem, i.e. modulation, demodulation, acquisition, tracking, etc.
• Optimal countermeasure techniques against SS communication systems
• The concept of bit-error rate (BER) and how to determine system performance
• New areas of digital comm. concepts such as cognitive radio, OFDM, multi-carrier CDMA, UWB, and ad hoc nets

  6. MODERN COMMUNICATIONS TRENDS (diagram) — Past: analog point-to-point links (AM/FM); Present: digital networks and worldwide links (PSK, QAM, FSK, coding); Future: global Internet with OFDM, FH, DSSS, LPI/AJ, and ad hoc networks, enabled by technologies such as software radio and cognitive systems

  7. COMMUNICATION SYSTEM BLOCK DIAGRAM: SOURCE → TRANSMITTER → CHANNEL → RECEIVER → DESTINATION, with message signal s(t), transmitted signal v(t), received signal r(t), noise n(t), and a carrier signal applied at the transmitter
• Message Signal – Information signal or baseband signal
• Transmitter – Converts and transmits the message signal: baseband conversion (filtering, encoding, and multiplexing), carrier wave modulation, power amplification
• Channel – Hard-wired or free-space medium over which the signal is transmitted (signal is corrupted with noise and distorted)
• Receiver – Demodulates, demultiplexes, and decodes the received signal
• Received Signal – Detected signal in the presence of noise; only an estimate of the message signal

  8. BLOCK DIAGRAM OF TYPICAL DIGITAL COMMUNICATION SYSTEM
Transmit chain: Information Source → Source Encoder → Encryptor → Channel Encoder → Data Modulator → Spread Spectrum Modulator → Power Amplification → Waveform Channel
Receive chain: Receiver Front End → Spread Spectrum Despreader → Data Demodulator → Channel Decoder → Decryptor → Source Decoder → Information Sink (with timing and synchronization, and a spreading code generator at each end; the encoder-to-decoder path forms a discrete, memoryless channel)
• Source rate: Rm = (1/Tm) log2 M bits per second
• Code rate: R output symbols per input bit
• The waveform channel is power limited, bandwidth limited, and corrupted by noise
Source: “Introduction to Spread Spectrum Communications” by Peterson, Ziemer, and Borth

  9. GENERIC DIGITAL COMMUNICATION SYSTEM

  10. INFORMATION RATE vs. SYMBOL RATE
• Information in its most fundamental form is measured in “bits” (binary digits)
• A signal which conveys binary information is the binary digital waveform (figure: binary waveform with symbol period T = 1 ms, symbol boundaries at T, 2T, …, 5T)
Rs = symbol (baud) rate = 1/T symbols/sec
Rb = information rate = 1/T bits/sec
Example: Rs = 1000 symbols/sec, Rb = 1000 bits/sec

  11. INFORMATION RATE vs. SYMBOL RATE (Cont.)
Binary information can be conveyed by M-ary (multilevel) digital waveforms (figure: 8-level waveform with symbol period T = 3 ms, bit period Tb, carrying the bit stream 0 0 1 0 0 0 1 0 1 1 1 1)
Rs = symbol (baud) rate = 1/T symbols/sec
Rb = information rate = (1/T) log2 M = 1/Tb
where M = number of levels in the M-ary waveform; here M = 2^k = 2^3 = 8 and k = number of bits per symbol
For the example shown: Rs = 333 symbols/sec, Rb = 1000 bits/sec
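The symbol-rate/bit-rate relationship above is easy to check with a short sketch (plain Python; the numbers are the slide's own example):

```python
import math

def rates(T_sym, M):
    """Symbol rate and information rate for an M-ary waveform.

    T_sym: symbol period in seconds; M: number of levels (M = 2**k).
    """
    Rs = 1.0 / T_sym           # symbols per second (baud)
    Rb = Rs * math.log2(M)     # bits per second: Rb = Rs * log2(M)
    return Rs, Rb

# Slide example: T = 3 ms, M = 8 levels (k = 3 bits per symbol)
Rs, Rb = rates(3e-3, 8)
print(round(Rs), round(Rb))    # → 333 1000
```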

  12. Digital Encoding (CODEC and MODEM, transmit path)
Data/information bit stream {bi} at rate RD (RB) → CODEC: encode with an (n, k) code (k information bits per n code-word bits, code rate k/n) → coded bit stream {ci} at Rc = (n/k) RD bits/sec → MODEM: modulate, grouping l bits per symbol (M = 2^l) → transmitted waveform over the AWGN channel
Symbol rate: Rs = 1/Ts = Rc / l = (n/(k·l)) RD symbols/sec
Symbol duration Ts, symbol energy Es = l·Eb
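The rate bookkeeping through the encoder and modulator can be sketched as follows (the example numbers — a rate-1/2 code and a 4-level modulation — are illustrative, not from the slides):

```python
import math

def coded_rates(R_D, n, k, M):
    """Rates for an (n, k) channel code followed by M-ary modulation.

    R_D: information bit rate in bits/sec; l = log2(M) bits per symbol.
    """
    l = math.log2(M)
    Rc = (n / k) * R_D   # coded bit rate: Rc = (n/k) * RD
    Rs = Rc / l          # symbol rate: Rs = Rc / l = n/(k*l) * RD
    return Rc, Rs

# Illustrative: RD = 1000 bits/sec, rate-1/2 code (n=2, k=1), M = 4 (l = 2)
Rc, Rs = coded_rates(1000, 2, 1, 4)
print(Rc, Rs)   # → 2000.0 1000.0
```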

  13. Digital Decoding (receive path)
Received signal r(t) from the AWGN channel → Demodulate (operating at an Eb/N0 determined by the channel) → estimated channel bits {ĉi} at rate Rc with channel-bit error probability Pc → Decode → estimated data/info bits {b̂i} at rate RD with bit error probability PB
• Channel distortions: amplitude, phase, frequency
For BPSK: P_BPSK = Q(√(2Eb/N0))
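The BPSK error-probability expression above can be evaluated with the complementary error function, using the standard identity Q(x) = ½ erfc(x/√2) (the Eb/N0 value below is just an illustrative operating point):

```python
import math

def q_func(x):
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk(EbN0_dB):
    """BPSK bit error rate: P = Q(sqrt(2 * Eb/N0))."""
    ebn0 = 10 ** (EbN0_dB / 10)        # convert dB to a linear ratio
    return q_func(math.sqrt(2 * ebn0))

# Near Eb/N0 = 9.6 dB, uncoded BPSK reaches roughly a 1e-5 bit error rate
print(ber_bpsk(9.6))
```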

  14. COMM SYSTEM PERFORMANCE PARAMETERS — Analog Systems
Performance measure: Signal-to-Noise Ratio (SNR)
Signal path (Modulator → Channel → Demod), observed at four points:
1. Analog message (tone)
2. AM modulator output
3. Received RF signal
4. Demodulated output

  15. COMM SYSTEM PERFORMANCE PARAMETERS — Digital Systems
Performance measure: Bit Error Rate (BER)
Signal path (Filter → Mod → Channel → Demod → Filter → Decision), observed at six points:
1. Digital message
2. Filter output
3. Modulator output (ASK)
4. Demodulator output (with noise)
5. Receiver filter output
6. Detector output

  16. WHY DIGITAL?
But let your communication be, Yea, yea; Nay, nay; for whatever is more than these, cometh of evil. – The Gospel According to St. Matthew (5:37)
• Demand: Increased requirement for computer-to-computer communications
• Signal Regeneration: Resistant to noise, interference, and distortion
• Digital Signal Processing: Allows error detection and correction; permits ease of encryption and use of AJ techniques
• Technology: Digital communication systems are more flexible and are better suited for future communication needs
• Economics: Components are becoming increasingly more available and less expensive. The ability to rapidly troubleshoot and repair systems reduces overall maintenance costs as well as increasing system availability

  17. PERFORMANCE CRITERIA — Digital vs. Analog Systems
• Analog communication systems reproduce waveforms (an infinite set)
• A figure of merit for analog systems is a fidelity criterion (e.g. percent distortion between transmitted and recovered waveforms)
• Digital communication systems transmit waveforms that represent digits (a finite set)
• A figure of merit for digital systems is the quantity of incorrectly detected digits

  18. FOURIER ANALYSIS
• Fourier analysis techniques are very useful for:
• Determining the distribution of signal energy in the frequency domain
• Determining the output of a linear system given its input
(figure: an input passing through a linear system to produce an output)

  19. FOURIER ANALYSIS
FOURIER SERIES (periodic signals):
s(t) = Σ (n = −∞ to ∞) Fn e^(jnω0t)
where Fn = (1/T) ∫ (over one period T) f(t) e^(−jnω0t) dt
FOURIER TRANSFORM (non-periodic signals):
F(ω) = ∫ (−∞ to ∞) f(t) e^(−jωt) dt
f(t) = (1/2π) ∫ (−∞ to ∞) F(ω) e^(jωt) dω

  20. FOURIER ANALYSIS (Cont.)
Four signal classes and their frequency-domain representations:
• Non-periodic continuous-time f(t) → Fourier Transform (continuous in ω)
• Periodic continuous-time f(t) → Fourier Series (discrete in k)
• Non-periodic discrete-time f(n) → Discrete-Time Fourier Transform (continuous in ω)
• Periodic discrete-time f(n) → Fourier Series (discrete in k)

  21. FOURIER SERIES OF A PULSE TRAIN (figure: three pulse trains x1, x2, x3 in the time domain and their Fourier series spectra X1, X2, X3 versus ω)

  22. IMPORTANT FOURIER PROPERTIES
• Scaling in time and frequency: if f(t) ↔ F(ω), then
F{f(t/α)} = ∫ (−∞ to ∞) f(t/α) e^(−jωt) dt
Changing variables by letting x = t/α:
F{f(t/α)} = α ∫ (−∞ to ∞) f(x) e^(−j(αω)x) dx = α F(αω)
We see that expanding the time scale by a factor of α compresses the frequency scale by α, and vice versa: f(t/α) ↔ α F(αω)
Ex.: If the pulse width in time is reduced from msec to μsec (a factor of 1000), the bandwidth increases from kHz to MHz (a factor of 1000).
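The scaling property can be verified numerically; the sketch below (the unit rectangular pulse and α = 2 are illustrative choices) approximates the Fourier integral by a Riemann sum and checks that F{f(t/α)}(ω) matches α·F(αω):

```python
import numpy as np

def ft(g, t, omega):
    """Crude numerical Fourier transform: F(w) = ∫ g(t) e^{-j w t} dt."""
    dt = t[1] - t[0]
    return np.array([np.sum(g * np.exp(-1j * w * t)) * dt for w in omega])

t = np.linspace(-4, 4, 8001)
f = (np.abs(t) < 0.5).astype(float)                 # f(t): pulse of width 1
alpha = 2.0
f_stretch = (np.abs(t / alpha) < 0.5).astype(float) # f(t/alpha): width 2

omega = np.linspace(-10, 10, 201)
F_stretch = ft(f_stretch, t, omega)
F_pred = alpha * ft(f, t, alpha * omega)            # alpha * F(alpha*omega)

# The two spectra agree up to discretization error at the pulse edges
err = np.max(np.abs(F_stretch - F_pred))
print(err)
```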

  23. IMPORTANT FOURIER PROPERTIES
• Frequency translation: if F(ω) is shifted to the right by ω0 radians/sec, then f(t) is multiplied by e^(jω0t)
Proof:
F(ω − ω0) = ∫ (−∞ to ∞) f(t) e^(−j(ω − ω0)t) dt = ∫ (−∞ to ∞) [f(t) e^(jω0t)] e^(−jωt) dt
or
F(ω − ω0) ↔ f(t) e^(jω0t)
F(ω + ω0) ↔ f(t) e^(−jω0t)

  24. IMPORTANT FOURIER PROPERTIES
Example of frequency translation:
g(t) = f(t) cos ω0t, where cos ω0t = (1/2)(e^(jω0t) + e^(−jω0t))
Then F{g(t)} = F{(f(t)/2) e^(jω0t)} + F{(f(t)/2) e^(−jω0t)}
Applying the frequency-translation property, the first term on the right-hand side is
F{(f(t)/2) e^(jω0t)} = (1/2) F(ω − ω0)
which is F(ω) shifted to the right on the frequency axis and scaled by a factor of 1/2
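A quick numerical illustration of this modulation property: multiplying a baseband pulse by cos(2πf0t) recenters its spectrum at ±f0. (The Gaussian pulse shape, sample rate, and f0 = 50 Hz below are illustrative choices.)

```python
import numpy as np

fs = 1000.0                                          # sample rate, Hz
t = np.arange(0, 1, 1 / fs)
f_t = np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))    # narrow baseband pulse
f0 = 50.0
g_t = f_t * np.cos(2 * np.pi * f0 * t)               # modulated signal

G = np.abs(np.fft.rfft(g_t))                         # positive-frequency spectrum
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peak_freq = freqs[np.argmax(G)]
print(peak_freq)    # spectrum peak has moved from 0 Hz out to f0 = 50 Hz
```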

  25. IMPORTANT FOURIER PROPERTIES
• The previous example is called the modulation property because it is used to translate signals from baseband to RF for transmission purposes
(figure: a baseband spectrum F(ω) occupying −B to B is translated into copies F(ω − ω0) and F(ω + ω0) centered at ±ω0, each spanning ±ω0 − B to ±ω0 + B)

  26. LINEAR SYSTEMS
• Linear systems are characterized by their:
• Impulse response, h(t) – time domain
• Frequency response, H(ω) – frequency domain (the Fourier transform of h(t))
• The concept of impulse response: an impulse δ(t) into the system T[·] produces the output h(t); an input x(t) produces the output y(t)
• The concept of frequency response:
if x(t) = e^(jωt), a complex sinusoid,
then y(t) = ∫ (−∞ to ∞) h(τ) e^(jω(t − τ)) dτ = e^(jωt) F{h(t)}
y(t) = e^(jωt) H(ω), where H(ω) is the system frequency response
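The eigenfunction relation above has an exact discrete-time analogue that is easy to check: for an FIR system h[k], the input e^{jωn} comes out as H(ω) e^{jωn} with H(ω) = Σ h[k] e^{−jωk}. (The three-tap h and test frequency below are arbitrary illustrative choices.)

```python
import numpy as np

h = np.array([0.5, 0.3, 0.2])   # example FIR impulse response
w = 0.4                         # rad/sample, arbitrary test frequency

n = np.arange(100)
x = np.exp(1j * w * n)          # complex sinusoid input
y = np.convolve(x, h)           # LTI output (start-up transient at the edges)

# Frequency response at w: H(w) = sum_k h[k] e^{-j w k}
H = np.sum(h * np.exp(-1j * w * np.arange(len(h))))

# Away from the edges, the output equals H(w) * x[n] exactly
err = np.max(np.abs(y[len(h) - 1:len(x)] - H * x[len(h) - 1:]))
print(err)   # floating-point error only
```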

  27. SIGNALS, CIRCUITS, AND SPECTRA
• Frequency spectra can be ascribed to both signals and circuits
• Passing a signal x(t) through a filtering circuit yields g(t) in the time domain, or G(ω) in the frequency domain,
where g(t) = x(t) * h(t) and G(ω) = X(ω) H(ω)
• The output bandwidth is always constrained by the smaller of the two bandwidths:
Case 1 – output bandwidth is constrained by the input signal bandwidth (input spectrum narrower than the filter transfer function)
Case 2 – output bandwidth is constrained by the filter bandwidth (filter transfer function narrower than the input spectrum)

  28. GRAPHICAL EXAMPLE OF CONVOLUTION “Joy of Convolution” http://www.jhu.edu/signals/convolve/

  29. FILTERED PULSE-SIGNAL EXAMPLES
Input pulse x(t) of amplitude Vm and width T into a low-pass RC filter; the filter frequency response (transfer function) H(f) falls from 1 at DC to 0.707 at the cutoff f = 1/(2πRC), while the input pulse spectrum has nulls at multiples of 1/T
Example 1 – Good-fidelity output: Wp << Wf (T >> 2πRC)
Example 2 – Good-recognition output: Wp ≈ Wf (T ≈ 2πRC)
Example 3 – Poor-recognition output: Wp >> Wf (T << 2πRC)
(Wp = pulse bandwidth, Wf = filter bandwidth)

  30. RANDOM SIGNALS
• Fourier analysis as previously described applies only to deterministic signals
• A random signal (random process) appears as follows: (figure: sample waveform of a random process)

  31. RANDOM SIGNALS • The two most important random signals we encounter in digital communications are: • Noise • Random binary waveforms • Noise that is completely random is called “white” noise

  32. RANDOM SIGNALS (Cont) • The noise encountered most in communications has a Gaussian amplitude distribution

  33. RANDOM SIGNALS
Random signals can only be described statistically:
Signal mean: E{x(t)} = lim (T→∞) (1/T) ∫ (−T/2 to T/2) x(t) dt = DC value of the signal
Signal mean square: E{x²(t)} = lim (T→∞) (1/T) ∫ (−T/2 to T/2) x²(t) dt = total signal power
Signal variance: σx² = E{(x − E{x})²} = signal AC power
E{x²(t)} = σx² + E²{x(t)}, i.e. total power = AC power + DC power
Signal autocorrelation: E{x(t)x(t+τ)} = lim (T→∞) (1/T) ∫ (−T/2 to T/2) x(t)x(t+τ) dt = a measure of the signal randomness
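The power decomposition (total power = AC power + DC power) can be checked on sampled data; this sketch uses an illustrative Gaussian signal with a deliberate DC offset:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random signal with DC value 2 and AC standard deviation 3
x = 2.0 + rng.normal(0.0, 3.0, size=1_000_000)

dc_power = np.mean(x) ** 2      # E{x}^2
ac_power = np.var(x)            # sigma_x^2 = E{(x - E{x})^2}
total_power = np.mean(x ** 2)   # E{x^2}

# The identity E{x^2} = sigma_x^2 + E{x}^2 holds to floating-point precision
print(abs(total_power - (ac_power + dc_power)))
```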

  34. AUTOCORRELATION FUNCTION
• Provides a measure of the similarity of a signal with a time-delayed version of itself
• The autocorrelation function of a real-valued energy signal x(t) is defined as:
Rx(τ) = ∫ (−∞ to ∞) x(t) x(t + τ) dt
• Important properties of Rx(τ):
• Symmetrical in τ about zero
• Maximum value occurs at the origin
• Autocorrelation and PSD form a Fourier transform pair
• Value at the origin is equal to the energy of the signal

  35. S(f) ↔ R() F AUTOCORRELATION AND POWER SPECTRAL DENSITY LOW BIT RATE HIGH BIT RATE x(t) = Random binary sequence x(t + ) R() = E[x(t) x(t + )] 1 - ||/T for || < T R() = { 0 for || > T ∞ -∞ S(f) = T [sin (π/T) / π/T]

  36. DISCRETE AUTOCORRELATION
• Spectral analysis of random processes differs from that of deterministic signals
• For stationary random processes, the autocorrelation function Rxx(τ) tells us something about how rapidly we can expect the random signal to change as a function of time
• The autocorrelation for a discrete time-series signal is defined for a real signal x[n] as: Rxx[k] = Σ (over n) x[n] x[n + k]
• If the autocorrelation function decays rapidly to zero, it indicates that the process can be expected to change rapidly with time
• A slowly changing process will have an autocorrelation function that decays slowly
• If the autocorrelation function has periodic components, then the underlying process will also have periodic components

  37. DISCRETE CROSSCORRELATION
• The crosscorrelation for discrete time-series signals is defined for real signals x[n] and y[n] as: Rxy[k] = Σ (over n) x[n] y[n + k]
• Unlike the autocorrelation, the crosscorrelation is not generally even; it satisfies Rxy[k] = Ryx[−k]
• Another important property: crosscorrelating x[n] with a delayed copy of itself produces a peak at the delay
• This is useful for time-of-arrival (TOA) measurements (ranging):
• Send out x[n]
• Crosscorrelate x[n] with the delayed return signal
• The peak in the crosscorrelation occurs at n0, which is the amount of time delay
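The TOA steps above can be sketched directly: crosscorrelate a transmitted ±1 sequence against a delayed, noisy return and read the delay off the correlation peak. (The sequence length, delay n0 = 137, and noise level are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.choice([-1.0, 1.0], size=1000)       # transmitted sequence

n0 = 137                                     # true delay in samples
ret = np.zeros(2000)
ret[n0:n0 + len(x)] = x                      # delayed return...
ret += rng.normal(0.0, 0.5, size=len(ret))   # ...plus receiver noise

# Crosscorrelation evaluated at each candidate delay k
corr = np.array([np.dot(x, ret[k:k + len(x)])
                 for k in range(len(ret) - len(x))])
print(int(np.argmax(corr)))                  # peak lag recovers the delay n0
```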

  38. ERROR-FREE CAPACITY
• It is useful to explore briefly the concept of the ‘capacity’ of a digital communications link
• Given the constraints of power, bandwidth, and AWGN, there exists a maximum rate at which information can be transmitted with high reliability (this rate is called the ‘error-free capacity’)
• Pioneering work by Claude Shannon in the late 1940s found:
C = W log2(1 + P/(N0W)) = W log2(1 + (Eb/N0)(R/W))
where
C = channel capacity in bits/sec
W = transmission bandwidth in Hz
P = EbR = received signal power in watts
N0 = single-sided noise power density in watts/Hz
Eb = energy per bit of the received signal
R = information rate in bits per second
• C is the maximum rate at which information can be put through the channel
• The goal is to make the information rate less than C in order to have reliable communications
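Both forms of the capacity formula are algebraically the same once P = Eb·R is substituted, which a short calculation confirms (the link-budget numbers below are illustrative):

```python
import math

W = 1e6      # bandwidth, Hz
N0 = 1e-9    # single-sided noise power density, W/Hz
Eb = 2e-9    # energy per bit, J
R = 5e5      # information rate, bits/sec
P = Eb * R   # received signal power, W

C1 = W * math.log2(1 + P / (N0 * W))             # power/bandwidth form
C2 = W * math.log2(1 + (Eb / N0) * (R / W))      # Eb/N0 form
print(C1, abs(C1 - C2))   # here C = 1e6 bits/sec, and R = 5e5 < C
```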

  39. SHANNON’S 2ND THEOREM
• Channel capacity may also be written: C = W log2(1 + S/N), where N = N0W and S = EbR
• Spread spectrum systems typically operate with S/N < 1 (i.e. more noise than signal, due to the wide bandwidth used)

  40. SHANNON’S 2ND THEOREM
• We can derive the significance of this law as follows:
• Change the logarithm base: log2(x) = loge(x) / loge(2) = 1.44 loge(x)
• Therefore, C/W = 1.44 loge(1 + S/N)
• Since S/N < 0.1 for spread spectrum, loge(1 + S/N) ≈ S/N (you can verify this on your calculator)
• C/W ≈ 1.44 (S/N), or C ≈ 1.44 W (S/N) (assuming low S/N)
• Significance: for any given SNR, we get a low error rate by increasing the bandwidth used to transfer information
• Shannon’s limit is based upon the assumption that the noise is AWGN (since noise is the only cause of errors, capacity at low S/N is directly proportional to SNR)
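The "verify this on your calculator" step is easy to automate: compare the exact C/W = 1.44 ln(1 + S/N) against the low-SNR approximation 1.44 (S/N) for a few small SNR values.

```python
import math

# Exact vs. approximate normalized capacity at low SNR
for snr in (0.1, 0.01, 0.001):
    exact = 1.44 * math.log(1 + snr)   # C/W = 1.44 * ln(1 + S/N)
    approx = 1.44 * snr                # low-SNR approximation
    print(snr, exact, approx)          # the two columns converge as S/N shrinks
```

The relative error of ln(1 + x) ≈ x is about x/2, so at S/N = 0.01 the approximation is already within roughly half a percent.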

  41. NYQUIST CHANNEL CAPACITY
• Another finding: C(Nyquist) = 2W log2 M
where C is the Nyquist channel capacity in bits per second and M is the number of discrete signal levels (log2 M bits per symbol)
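As a quick sketch of the Nyquist limit for a noiseless channel (the 3 kHz bandwidth and level counts are illustrative):

```python
import math

def nyquist_capacity(W_hz, M_levels):
    """Nyquist channel capacity C = 2 * W * log2(M), in bits/sec."""
    return 2 * W_hz * math.log2(M_levels)

# A 3 kHz channel: binary signaling vs. 8-level signaling
print(nyquist_capacity(3000, 2), nyquist_capacity(3000, 8))   # → 6000.0 18000.0
```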

  42. WHAT IS THE BANDWIDTH OF DIGITAL DATA?
s(t) = random digital sequence; its power spectral density (PSD) has the general sinc²-shaped form with first nulls at ±1/T
BANDWIDTH CRITERIA (several definitions of BW are in use):
• Half-power
• Noise equivalent
• Null-to-null
• 99% of power
• Bounded PSD (e.g. −35 dB, −50 dB)

  43. BANDWIDTH DEFINITIONS — NOISE EQUIVALENT BANDWIDTH
Weq is the width of the rectangle of height X(0) whose area equals the area under |X(ω)|:
Weq X(0) = ∫ (0 to ∞) |X(ω)| dω, i.e. Weq = (1/X(0)) ∫ (0 to ∞) |X(ω)| dω
