
Computer Architecture Lecture 3 Combinational Circuits


Presentation Transcript


  1. NYU Computer Architecture, Lecture 3: Combinational Circuits. Ralph Grishman, September 2015

  2. Time and Frequency • time = 1 / frequency • frequency = 1 / time • units of time • millisecond = 10^-3 second • microsecond = 10^-6 second • nanosecond = 10^-9 second • picosecond = 10^-12 second • units of frequency • kilohertz (kHz) = 10^3 cycles / second • megahertz (MHz) = 10^6 cycles / second • gigahertz (GHz) = 10^9 cycles / second

  3. Today’s Problem • A typical clock frequency for current PCs is 2 GHz. What is the corresponding clock period? (a) 200 ps (b) 500 ps (c) 2 ns (d) 5 ns

  4. Solution • Frequency = 2 GHz = 2 * 10^9 Hz • Period = 1 / frequency = 1 / (2 * 10^9) sec = (1 / 2) * (1 / 10^9) sec = 0.5 * 10^-9 sec = 0.5 ns = 500 * 10^-12 sec = 500 ps
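
A quick way to check this arithmetic (not part of the slides) is to compute it directly; the short Python sketch below converts the 2 GHz frequency to a period in seconds, nanoseconds, and picoseconds.

```python
# Check of the period/frequency arithmetic above (illustration only).
frequency_hz = 2e9                 # 2 GHz
period_s = 1 / frequency_hz        # period = 1 / frequency

print(period_s)                    # 5e-10 seconds
print(period_s * 1e9, "ns")        # 0.5 ns
print(period_s * 1e12, "ps")       # 500.0 ps
```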

  5. Assignment #1 • various short questions about combinational circuits

  6. Design tools • see lecture outline

  7. Propagation Delay • the delay of an individual transistor -- how fast it can switch -- is determined by physical factors (e.g., size) • the speed of its transistors determines the speed of a gate [figure: input and output voltage plotted against time, showing the switching delay]

  8. Propagation Delay • the propagation delay (speed) of a combinational circuit is the length of time from the moment when all input signals are stable until the moment when all outputs have stabilized

  9. Propagation Delay • the propagation delay of a combinational circuit can be estimated as the longest path (in number of gates) from any input to any output [figure: example circuit whose longest input-to-output path is 2 gates, delay = 2]
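
As an illustration of this rule (a sketch, not from the slides, assuming one unit of delay per gate), the Python function below counts the longest input-to-output path in a small circuit described as a dictionary; the example circuit computing (a AND b) OR c has a delay of 2.

```python
# Sketch: estimate propagation delay as the longest input-to-output path,
# counted in gates, assuming every gate contributes one unit of delay.
# A circuit is a dict mapping each gate to the list of signals feeding it;
# names that are not keys of the dict are primary inputs (delay 0).
def longest_path(circuit, signal, cache=None):
    if cache is None:
        cache = {}
    if signal not in circuit:                     # primary input
        return 0
    if signal not in cache:
        cache[signal] = 1 + max(longest_path(circuit, src, cache)
                                for src in circuit[signal])
    return cache[signal]

# Example: out = (a AND b) OR c  ->  longest path is 2 gates
circuit = {"and1": ["a", "b"], "or1": ["and1", "c"]}
print(longest_path(circuit, "or1"))               # 2
```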

  10. A Very Rough Estimate • after a transistor switches, it has to charge the output wires • this may be a large part of the total delay • so assuming all gate delays are equal produces only a very rough estimate of circuit delay • but it is good enough for understanding the principles of circuit design • so we will make that assumption in this course

  11. Fan-in • sum-of-products form suggests any combinational function can be computed in 3 gate delays (one delay for inverters, one for ANDs, one for OR)
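
For illustration (a sketch, not from the slides), the Python function below evaluates an example sum-of-products function, f = (NOT a AND b) OR (b AND c), in exactly three levels that mirror the three gate delays: inverters, then AND gates, then a single OR gate.

```python
# Sketch: a sum-of-products function evaluated level by level.
def sop(a, b, c):
    not_a = not a            # level 1: inverters
    p1 = not_a and b         # level 2: AND gates form the product terms
    p2 = b and c
    return p1 or p2          # level 3: a single OR gate sums the products

print(sop(False, True, False))   # True  (the a'b term is 1)
print(sop(True, False, True))    # False
```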

  12. Fan-in • but gates are limited in their fan-in (the number of inputs a gate has)

  13. Fan-in • for example, if the fan-in is f, it takes log (base f) of n gate delays to OR or AND together n inputs [figure: tree of 2-input gates combining 8 inputs; log2 8 = 3 gate delays]
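
A sketch of that relationship (not from the slides): the helper below computes how many levels of gates with a given fan-in are needed to combine n inputs, i.e. the ceiling of log base f of n, using integer arithmetic.

```python
# Sketch: gate delays needed to OR (or AND) together n inputs
# when each gate has at most fan_in inputs: ceil(log base fan_in of n).
def tree_delay(n, fan_in):
    delay = 0
    covered = 1                  # number of inputs reachable with this many levels
    while covered < n:
        covered *= fan_in        # each extra level multiplies the reach by fan_in
        delay += 1
    return delay

print(tree_delay(8, 2))          # 3 gate delays, as on the slide
print(tree_delay(8, 4))          # 2 gate delays with 4-input gates
```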

  14. Adders • The simplest case: adding two one-bit numbers • Sum = A xor B • Carry = A and B
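
A minimal Python sketch of this one-bit (half) adder, treating A and B as 0/1 values (illustration only):

```python
# Half adder: Sum = A xor B, Carry = A and B (inputs are 0 or 1).
def half_adder(a, b):
    return a ^ b, a & b          # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
# 0 0 (0, 0)   0 1 (1, 0)   1 0 (1, 0)   1 1 (0, 1)
```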

  15. n-bit Adder • adding multi-bit numbers: • have to keep track of a carry out of one bit position and into the next position to the left • example: 0011 + 0001 = 0100

  16. n-bit Adder • Do this with full adders, which have 3 inputs: A, B, and Cin, and 2 outputs, Sum and Cout.

  17. Full Adder • We will show the connections of the full adder as follows: [figure: full-adder block with inputs A, B, and Cin and outputs Sum and Cout]

  18. n-bit Adder • Then we can draw a 3-bit adder like so: [figure: three full adders with inputs A2 A1 A0 and B2 B1 B0, each Cout chained to the next stage’s Cin, outputs Sum2 Sum1 Sum0]
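
A Python sketch of the same structure (not from the slides): a full adder built from the usual sum and carry logic, and a ripple-carry adder that chains the Cout of each bit into the Cin of the next; the example adds 011 and 001 as in the earlier slide.

```python
# Sketch: full adder (inputs A, B, Cin; outputs Sum, Cout) and a
# ripple-carry adder that chains the carries, as in the diagram.
def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def ripple_adder(a_bits, b_bits, cin=0):
    """a_bits and b_bits are lists of 0/1, least significant bit first."""
    sum_bits = []
    carry = cin
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        sum_bits.append(s)
    return sum_bits, carry

# 011 (3) + 001 (1) = 100 (4); bits are listed low-order first
print(ripple_adder([1, 1, 0], [1, 0, 0]))    # ([0, 0, 1], 0)
```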

  19. n-bit adder: delay • ripple-carry adder: the carry ripples from bit 0 to the high-order bit • total delay (for large n) = n * delay(Cin -> Cout)

  20. Signed Numbers • So far we have assumed the bits represent positive numbers:

  21. Signed Numbers • We could use some of the bit patterns to represent negative numbers, like so: sign and magnitude

  22. Signed Numbers • Or like so: two’s complement
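
To make the two encodings concrete (a sketch, not from the slides), this loop prints every 3-bit pattern with its sign-and-magnitude value and its two's-complement value.

```python
# Sketch: the value of each 3-bit pattern under the two signed encodings.
for pattern in range(8):
    sign = pattern >> 2                            # top bit
    magnitude = pattern & 0b011                    # low two bits
    sign_mag = -magnitude if sign else magnitude   # sign and magnitude
    twos_comp = pattern - 8 if sign else pattern   # top bit has weight -4
    print(format(pattern, "03b"), sign_mag, twos_comp)
# e.g. 010 -> +2, +2;  101 -> -1, -3;  111 -> -3, -1
```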

  23. Signed Numbers • Or even like so:

  24. Why do we prefer two’s complement?

  25. Why do we prefer two’s complement? • Can use the same logic as for unsigned addition
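
A one-line check of that claim (illustration only, 3-bit values): adding the patterns for -1 (111) and -2 (110) with ordinary unsigned addition and discarding the carry out of the top bit leaves 101, which is -3 in two's complement.

```python
# Sketch: unsigned addition gives the right two's-complement result
# once the carry out of the top bit is dropped (keep only 3 bits).
print(format((0b111 + 0b110) & 0b111, "03b"))   # 101, i.e. -3
```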

  26. Computing two’s complement • Given the representation of v, how to compute the representation of –v?

  27. Computing two’s complement • Given the representation of v, how to compute the representation of –v: • flip every bit in the representation of v • add 1
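
A Python sketch of that rule for 3-bit values (illustration only): invert every bit, add 1, and keep only the low 3 bits.

```python
# Sketch: negate a 3-bit two's-complement value by flipping the bits and adding 1.
def negate3(x):
    flipped = x ^ 0b111            # flip every bit
    return (flipped + 1) & 0b111   # add 1, keep 3 bits

print(format(negate3(0b010), "03b"))   # 110, the pattern for -2
print(format(negate3(0b110), "03b"))   # 010, negating -2 gives +2 back
```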

  28. Computing two’s complement [figure: three full adders with carries chained, adding the complemented bits of A to the constant 001 to produce Acomp2 Acomp1 Acomp0]

  29. Subtracting • B – A = B + (-A) [figure: the two’s-complement circuit for A from the previous slide combined with an adder stage for B2 B1 B0, computing B + (-A)]

  30. Can we simplify this?

  31. Subtracting: B – A = B + (-A) [figure: a single row of three full adders; the inverted bits of A and the bits B2 B1 B0 are fed in, the carries are chained, and the low-order Cin is set to 1, so the sum outputs give the difference]
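
A sketch of the simplified subtractor (not from the slides): feeding B together with the bitwise complement of A into the adder and setting the low-order Cin to 1 computes B + (NOT A) + 1, which is B + (-A) = B – A in 3-bit two's complement.

```python
# Sketch: subtraction with one adder row -- complement A and set Cin = 1.
def subtract3(b, a):
    return (b + (a ^ 0b111) + 1) & 0b111   # B + not(A) + 1, kept to 3 bits

print(format(subtract3(0b011, 0b001), "03b"))   # 010  (3 - 1 = 2)
print(format(subtract3(0b001, 0b011), "03b"))   # 110  (1 - 3 = -2)
```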
