
RS-232 (Local Asynchronous Communication)



  1. RS-232 (Local Asynchronous Communication) Based on Chapter 5 in Computer Networks and Internets by D. Comer

  2. Human-Computer Communication • Computers are not designed to perform some particular large-scale task; instead, they are designed to perform a set of small-scale tasks. • Programming a computer consists of instructing it to perform the small-scale tasks in some particular order. • We must communicate our program to the computer. At a higher level we communicate which programs we want executed. • This level of communication is not between two computers but between a computer and its peripheral devices – and ultimately between computer and user.

  3. Categorization I: Directionality • One scheme for categorizing communication systems is based on the direction(s) in which information flows, i.e. one way or two way. • Simplex • Half-duplex • Full-duplex

  4. Simplex • Information flows in only one direction. • There are separate transmitters and receivers. • Example: radio • The radio station has the transmitter. • The listener only receives the signal – that is, your radio is a receiver.

  5. Half-duplex • Information flows in both directions, but only one direction at a time. • Each unit has a transmitter and receiver, but it is either in a transmitting or receiving mode. • Example: walkie-talkie • only one person can talk at a time

  6. Full-duplex • Information flows in two directions simultaneously. • The transmitter and receiver are both continuously operational. • Example: telephone • Each person can talk and listen all of the time.

  7. Modem: full or half • Many modems can work in either mode (half- or full-duplex); a switch determines which. • In half-duplex mode, there is a local echo: each character transmitted is also displayed locally. • In full-duplex mode, there is a remote echo: each character is displayed only after it has been transmitted and returned by the distant device (so you know it got there intact).

  8. Categorization II: Timing • Another scheme for categorizing communication systems is based on the timing between the transmission of characters or bytes. • Asynchronous • Synchronous • Isochronous

  9. “Synchronous” definitions • When we talked about satellites, we used definition #4. (Recall “geosynchronous.”) • For this lecture we will focus on definition #5 – concerning whether characters are transmitted as a group or individually. • You might use definition #1 if you were comparing a chat session (synchronous, i.e. it requires that people be logged on at the same time) to a discussion database (asynchronous, i.e. people can be logged on at different times).

  10. Asynchronous (definition #5) • One character (byte of data) at a time. • There is an unknown length of time between the sending of one character and the next; the data flow is intermittent. [Diagram: characters A, B, C transmitted with different times between character transmissions]

  11. Synchronous (definition #5) • Data is transmitted in large groups of characters (a frame). • Within the frame, character transmissions are sent at regular intervals. • So one character immediately follows the next, and so on. [Diagram: characters A through K transmitted back-to-back within one frame]

  12. Asynchronous vs. Synchronous • Asynchronous is less restrictive than synchronous • Synchronous requires more from the hardware and a greater degree of coordination between transmitter and receiver. • Asynchronous is slower/less efficient than synchronous • The percentage of the signal that is not actual data is higher in asynchronous.

  13. Asynchronous Example • The connection between the keyboard and the computer is often asynchronous. • The information is transmitted a character (keystroke) at a time. • The information arrives slowly and intermittently. • (It’s possible to buffer the keyboard input before transmitting it to the computer.)

  14. Beginning, middle and end • Synchronous and asynchronous both have • Beginning: I’m sending you data • Middle: the actual data • End: I’m done • Overhead: the percentage of information that is not actual data

  15. Overhead example • Let us assume that we are sending a byte (8 bits) of data, that there is no parity (error detection) bit, that there is one bit to indicate the start of the transmission, and that there is one bit to indicate the end of the transmission. Then the overhead is 2 bits out of every 10 bits transmitted: 2/10 = 20%.
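
A minimal sketch of that arithmetic in Python, using the same 8-data-bit, 1-start-bit, 1-stop-bit framing the slide assumes:

    # Overhead of one asynchronous frame: 1 start bit + 8 data bits + 1 stop bit.
    data_bits = 8
    start_bits = 1
    stop_bits = 1

    total_bits = start_bits + data_bits + stop_bits    # 10 bits on the wire
    overhead = (start_bits + stop_bits) / total_bits   # 2 / 10

    print(f"Overhead: {overhead:.0%}")                 # Overhead: 20%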

  16. Isochronous • Character transmissions are sent at regular intervals, but some of the slots may be empty. • So the length of time between the sending of one character and the next is some multiple of the time it takes to send one character. • More restrictive than asynchronous, less restrictive than synchronous. [Diagram: characters A, B, C sent in regularly spaced slots, with some slots left empty]

  17. Speed and accuracy • The desire is for the information to be conveyed as quickly and as accurately as possible. • These two factors compete: • Factors which improve the speed, such as representing more levels (states), can reduce the accuracy. • Factors which improve accuracy, such as adding error correcting codes, can slow down transmission (since extra bits are sent).

  18. Noise • Unpredictable energy (waves) that permeate the environment. They can never be completely eliminated. • The noise waves add to the waves carrying a signal, changing it. • Enough noise will corrupt the information being carried. • Occurs in digital and analog signals alike, but the information in a digital signal is more robust against the effect of noise.

  19. External and internal noise • Noise is sometimes divided into two categories: internal and external • Internal: generated within the communication devices • External: generated outside of the communication devices • Wired communication is less susceptible to external noise since the channel is somewhat isolated from its environment. • In wireless communication, on the other hand, the channel is the environment.

  20. Minimizing noise • One way to reduce noise is to reduce a signal’s bandwidth as much as possible. The smaller the range of frequencies making up the signal, the smaller the amount of noise.  • But reducing the bandwidth reduces the rate of information flowing. • Digitizing helps!

  21. Digital signals • [Diagram: two waveforms carrying the bit pattern 0 1 1 0 1 0 0 0] • These signals are different if they are analog signals, but they might be the same if they are digital signals.

  22. The competition • The competition between speed and accuracy can be seen in the theorems of Nyquist and Shannon-(Hartley). • Nyquist’s theorem tells one how to send information faster. • But Shannon’s theorem tells how that speed is ultimately limited by noise.

  23. A bit on logarithms • Both theorems involve a logarithm. • Recall that a bit (binary digit, a 0 or 1) is the smallest unit of digital information. • The relationship between the number of bits and the number of “states” is logarithmic.

  24. A bit on logarithms (Cont.) • One bit → Two states (0, 1) • Two bits → Four states (00, 01, 10, 11) • Three bits → Eight states (000, 001, 010, 011, 100, 101, 110, 111) • Number of States = 2^(Number of bits) (exponential) • Number of bits = log2(Number of States) (logarithmic)

  25. A bit on logarithms (Cont.) • log2(1) = 0, log2(2) = 1, log2(4) = 2, log2(8) = 3, log2(16) = 4, log2(32) = 5, log2(64) = 6, log2(128) = 7, log2(256) = 8, log2(512) = 9, log2(1024) = 10, log2(2048) = 11, log2(4096) = 12, log2(8192) = 13, log2(16384) = 14, etc.

  26. States → Bits • So if you know the number of states, you take the logarithm (base 2) to determine the number of bits required to represent those states. • The number of bits required is a measurement of the information contained therein.
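
A short Python illustration of this exponential/logarithmic relationship between bits and states:

    import math

    # Each additional bit doubles the number of representable states;
    # going the other way, log2(states) gives the number of bits needed.
    for bits in range(1, 5):
        states = 2 ** bits
        print(f"{bits} bit(s) -> {states} states; log2({states}) = {math.log2(states):.0f} bit(s)")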

  27. Nyquist’s theorem • D = 2 × B × log2(K) • Where D is the maximum data rate in bits per second (bps). • Where B is the bandwidth (in hertz). • Where K is the number of “states” (signal levels). • This maximum is a goal for communications engineers, but it is rarely achieved.
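
A hedged sketch of the formula in Python; the 3 kHz bandwidth and 4 signal levels below are illustrative numbers, not values from the slides:

    import math

    def nyquist_rate(bandwidth_hz: float, states: int) -> float:
        """Noiseless maximum data rate in bits per second: D = 2 * B * log2(K)."""
        return 2 * bandwidth_hz * math.log2(states)

    # Example: a 3 kHz channel using 4 signal levels.
    print(nyquist_rate(3000, 4))   # 12000.0 bps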

  28. The problem with higher frequencies • D = 2 × B × log2(K) • Recall that the bandwidth can be increased by working at higher frequencies. So why not just work at high frequencies to increase the rate of data transmission? • Physics says that higher frequency means higher energy. • Not only do higher frequency waves (like ultraviolet or x-rays) take more energy to generate, but they can also cause more damage (like cancer).

  29. K = 2 → two bits per cycle [Diagram: a waveform with 2 possible amplitudes; one cycle carries two bits]

  30. K = 4 → four bits per cycle [Diagram: a waveform with 4 possible amplitudes; each level encodes two bits (00, 01, 10, 11), so one cycle carries four bits]

  31. More levels → less room for error

  32. More levels → less room for error • D = 2 × B × log2(K) • Note that an error of approximately 0.2 would probably not cause one to confuse the states represented in the first case (K = 2), but it could lead to confusion of states (i.e. data corruption) in the second case (K = 4). • This is how Shannon’s theorem places a limit on Nyquist’s.
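
To make the 0.2 example concrete, here is a small Python sketch; the signal levels are normalized to the range [0, 1], which is an assumption for illustration rather than the scale used in the slides:

    # Decode a received sample to the nearest of K equally spaced levels in [0, 1].
    def decode(sample: float, k: int) -> int:
        levels = [i / (k - 1) for i in range(k)]
        return min(range(k), key=lambda i: abs(sample - levels[i]))

    noise = 0.2
    for k in (2, 4):
        sent = 0.0                  # transmit the lowest level
        received = sent + noise     # the same amount of noise in both cases
        print(k, decode(sent, k), decode(received, k))

    # Output: with K = 2 the noisy sample still decodes to level 0 (no error),
    # but with K = 4 it decodes to level 1 (data corruption).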

  33. K=2 with some noise

  34. K=4 with some noise

  35. Shannon’s theorem • C = B × log2(1 + S/N) • Where C is the channel capacity in bits per second. • Where B is the bandwidth (in hertz). • Where S is the signal power. • And where N is the noise power. • The ratio S/N is known as the signal-to-noise ratio.
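
A hedged Python sketch of the formula; the 3 kHz bandwidth and the signal-to-noise ratio of 1000 below are illustrative numbers, not values taken from the slides:

    import math

    def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
        """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
        return bandwidth_hz * math.log2(1 + snr)

    # Example: a 3 kHz channel with S/N = 1000 (i.e. 30 dB).
    print(shannon_capacity(3000, 1000))   # about 29,900 bps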

  36. Signal-to-noise ratio • In communications (digital or analog), the signal-to-noise ratio, denoted S/N or SNR, measures a signal’s strength (power) relative to the background noise.

  37. Decibels • Sometimes the signal-to-noise ratio is reported in decibels: (S/N) in decibels = 10 × log10(S/N). • When using Shannon’s theorem we do not want S/N expressed in decibels. • In some cases, one can produce a larger signal, but that requires more energy and hence costs more money. • Lowering the temperature sometimes helps since some noise is thermal.
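
A small helper, sketched in Python, for undoing the decibel conversion before plugging S/N into Shannon's formula:

    def db_to_linear(snr_db: float) -> float:
        """Convert an S/N figure quoted in decibels back to a plain power ratio."""
        return 10 ** (snr_db / 10)

    # 30 dB corresponds to a power ratio of 1000; feed the 1000 (not the 30)
    # into C = B * log2(1 + S/N).
    print(db_to_linear(30))   # 1000.0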

  38. Local Communication • Local communication is the exchange of information between two things which are close – here it typically refers to exchange between a computer and one of its peripheral devices. • Data from a keyboard or mouse does not flow at steady rate but rather is intermittent. • Thus asynchronous communication is appropriate.

  39. Fig. 5-1 A possible approach: bit-at-a-time

  40. Pros and Cons of the bit-at-a-time approach • +15 volts represents 0; -15 volts represents 1; and 0 volts corresponds to not transmitting a bit. • Pro: very lenient, not a lot of rules. • Con: wastes a lot of time; there is an extra state (0 V) that does not correspond to any information in the signal, and one must spend time in this state to distinguish, for example, between two consecutive 1’s. • And while keyboard data is intermittent, it does tend to come in bytes instead of bits.

  41. Fig. 5.2 Framing a group of bits: a byte-plus-at-a-time

  42. Protocol • In order to improve on the speed of the bit-at-a-time approach: • We want to reduce the number of states to two. • We want to eliminate the time between bits within the group of bits that makes up one character. • We need rules! • We need physical rules so one device does not “fry” the other. • We need logical rules so the devices can agree on the information they are transmitting.

  43. Rules • In order to distinguish between two consecutive 1’s, we use timing: each bit lasts a set amount of time. • If the signal is at –15V for (roughly) twice this interval, that corresponds to two consecutive 1’s. • Since the signal is asynchronous, we also need to distinguish between no signal and a signal of all 1’s.

  44. Frame • All 1’s is distinguished from no signal in that the all-1’s data will be enclosed in a frame. • In ordinary language, a frame is “an open border or case for enclosing a picture, mirror, etc.” • We “frame” the information, i.e. place it within a border that indicates where it begins and ends.
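
A toy sketch of framing one byte for asynchronous transmission in Python; the LSB-first bit order and the start-bit-0 / stop-bit-1 convention are the usual UART practice, assumed here rather than taken from the slides:

    # Idle line = 1 (mark). A frame is: start bit (0), 8 data bits, stop bit (1).
    def frame_byte(byte: int) -> list[int]:
        data_bits = [(byte >> i) & 1 for i in range(8)]   # least significant bit first
        return [0] + data_bits + [1]                      # start + data + stop

    print(frame_byte(ord("A")))   # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]

The start bit guarantees a transition away from the idle state, so even a byte of all 1's is distinguishable from silence.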

  45. RS-232 • Short for Recommended Standard 232 (revision C, hence RS-232C). • Approved by the Electronic Industries Association (EIA) for connecting serial devices. • In 1987, the EIA released a new version under the name EIA-232-D. • In 1991, the EIA teamed up with the Telecommunications Industry Association (TIA) and issued yet another version that goes by EIA/TIA-232-E. • But it is still commonly called RS-232.

  46. Newer standards • While EIA-232 remains the most common standard for serial communication, the EIA has developed successors called RS-422 and RS-423. • These newer standards are “backward compatible” so that devices adhering to the old standard (RS-232) can be used in a new RS-422 port.

  47. Serial Port • Most personal computers have an RS-232 serial port. • Serial refers to sending data bit-by-bit (as opposed to parallel, in which several bits are sent at once). • A (physical) port is a computer interface to which one can connect a peripheral device (e.g. mouse, keyboard, modem, etc.).
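
As a concrete illustration, a hedged sketch using the third-party pySerial library to open a serial port with the classic “9600 8N1” settings; the device name is an assumption for a typical Linux setup (on Windows it might be "COM1"):

    import serial   # third-party pySerial package

    # 9600 baud, 8 data bits, no parity, 1 stop bit (the port name is assumed).
    ser = serial.Serial(port="/dev/ttyUSB0", baudrate=9600,
                        bytesize=serial.EIGHTBITS,
                        parity=serial.PARITY_NONE,
                        stopbits=serial.STOPBITS_ONE,
                        timeout=1)

    ser.write(b"hello")    # bytes go out one bit at a time on the wire
    reply = ser.read(5)    # read up to five bytes back (or time out)
    ser.close()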

  48. Serial ports
