
Cache Memory



  1. Cache Memory Yi-Ning Huang

  2. Principle of Locality • The phenomenon that a recently used memory location is likely to be used again soon.

  3. What is cache memory? • Based on the principle of locality, designers added cache memory to make memory access more efficient. • The cache contains the information most frequently needed by the processor. • Cache memory is a small but fast memory module inserted between the processor and the main memory.

  4. Example of grocery shopping • Suppose you buy only the single item you need immediately at the grocery store. • Then you will probably have to go back to the store again soon. • That wastes time and gas (assuming you drive to the store).

  5. Example of grocery shopping • What most people do instead: • Buy the items you need immediately, plus additional items you will most likely need in the near future.

  6. Example of grocery shopping • The grocery store is like main memory. • Your home is like the cache. • To speed up access time, the cache stores the information the computer needs immediately and will most likely need in the near future.


  8. Three general functions of a cache • Address translation function • Address mapping function • Replacement algorithm

  9. Mapping Cache • Determines where blocks are to be placed in the cache. • Three types of mapping: • Fully associative mapping • Direct mapping • Set associative mapping

  10. Mapping Cache: Direct mapping • Each main-memory block address maps to a unique cache block. • If the cache has N blocks, then block address X of main memory maps to cache block Y = X mod N. • (Cache block = block address mod number of cache blocks)
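The mod rule above can be sketched in a few lines; the cache size here is an illustrative assumption, not a value from the slides:

```python
# Direct mapping: cache block = block address mod number of cache blocks.
# NUM_CACHE_BLOCKS is an illustrative assumption.
NUM_CACHE_BLOCKS = 8

def direct_map(block_address: int) -> int:
    """Return the unique cache block this main-memory block maps to."""
    return block_address % NUM_CACHE_BLOCKS

# Blocks 3, 11, and 19 all compete for the same cache block:
print(direct_map(3), direct_map(11), direct_map(19))  # 3 3 3
```

Because the mapping is fixed, two blocks whose addresses differ by a multiple of the cache size always collide, which is the weakness discussed on a later slide.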

  11. Mapping Cache: Direct mapping [Figure: a 64 KB main memory (frames 0–8191) mapped onto a 256-byte direct-mapped cache (blocks 0–31); the address is divided into tag, block, and byte fields; the tag area is not shown.]


  13. Mapping Cache: Direct mapping • Use the least significant b bits of the frame address to indicate the cache block in which a frame can reside; the tags then need to be only (p − n − b) bits long. • The p-bit main-memory address is divided into a tag of (p − n − b) bits, a block field of b bits, and a word field of n bits.
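The field widths above can be checked with a little bit arithmetic; this sketch assumes p = 16, n = 8, and b = 3 (so the tag is 5 bits), matching the widths shown on the surrounding slides:

```python
# Split a p-bit main-memory address into tag / block / word fields.
# Assumed widths: p = 16, b = 3, n = 8, so the tag is p - n - b = 5 bits.
P, B, N = 16, 3, 8

def split_address(addr: int):
    word  = addr & ((1 << N) - 1)         # low n bits: byte within the block
    block = (addr >> N) & ((1 << B) - 1)  # next b bits: cache block index
    tag   = addr >> (N + B)               # remaining p - n - b bits: the tag
    return tag, block, word

tag, block, word = split_address(0xABCD)
```

Reassembling `(tag << (N + B)) | (block << N) | word` gives back the original address, a quick sanity check that the field widths sum to p.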

  14. Mapping Cache: Fully associative mapping • A main-memory block can occupy any of the cache blocks. • Disadvantage: all tags must be searched to determine a hit or miss. If the number of tags is large, this search can be time consuming.

  15. For example, a system with 64 KB of primary memory, a 1 KB cache, and 8 bytes per block. [Figure: the cache holds 128 blocks (0–127) and main memory holds 8192 frames (0–8191); the 16-bit main-memory address is split into a 13-bit tag and a 3-bit word field, and a comparator checks the address tag against the cache's tag area to signal hit or miss.]
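The 13-bit tag / 3-bit word split from this example, and the exhaustive tag search that makes fully associative lookup expensive, can be sketched as:

```python
# Fully associative split from this example: 16-bit address,
# 8 bytes per block -> 13-bit tag + 3-bit word.
WORD_BITS = 3

def tag_and_word(addr: int):
    """Split a 16-bit address into (tag, word-within-block)."""
    return addr >> WORD_BITS, addr & ((1 << WORD_BITS) - 1)

def lookup(stored_tags, addr):
    """Hit if the address tag matches any stored tag -- conceptually
    a search over all 128 cache tags, the slide's noted drawback."""
    tag, _ = tag_and_word(addr)
    return tag in stored_tags
```

`stored_tags` here stands in for the cache's tag area; real hardware does this comparison in parallel with one comparator per block.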

  16. Disadvantages of direct mapping and fully associative mapping • Disadvantage of fully associative mapping: all tags must be searched to determine a hit or miss; if the number of tags is large, this search can be time consuming. • Disadvantage of direct mapping: it avoids the tag search of fully associative mapping, but it keeps replacing blocks that map to the same location even when other cache blocks are free.

  17. Mapping Cache: Set associative mapping • Set associative mapping is a compromise between direct mapping and fully associative mapping, balancing their advantages and disadvantages. • A block can be placed in a restricted set of places in the cache. • It divides the cache blocks into K sets.

  18. Mapping Cache: Set associative mapping [Figure: a 64 KB main memory (frames 0–8191) mapped onto a 4-way set-associative cache (sets 0–31, 4 slots per set); the address is divided into tag number, set number, and word fields.]

  19. Mapping Cache: Set associative mapping • The set is usually chosen by bit selection: set = (block address) MOD (number of sets in the cache).
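The bit-selection rule and the restricted placement it allows can be sketched as follows; the set count and associativity are illustrative assumptions:

```python
# Set-associative placement: the set is chosen by bit selection,
# set = block address mod number of sets. Sizes are illustrative.
NUM_SETS, WAYS = 8, 4
cache = [[None] * WAYS for _ in range(NUM_SETS)]

def place(block_address: int) -> bool:
    """Put a block in any free slot of its set; False if the set is full."""
    s = block_address % NUM_SETS          # bit selection
    for way in range(WAYS):
        if cache[s][way] is None:
            cache[s][way] = block_address
            return True
    return False                          # set full: replacement algorithm needed
```

Blocks 5, 13, 21, and 29 all map to set 5 and fill its four slots; a fifth block mapping to that set forces the replacement algorithm described on the later slides.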



  22. Mapping Cache: Set associative mapping • Direct mapping is simply one-way set associative. • A fully associative cache with m blocks could be called m-way set associative.

  23. Replacement algorithm • On a cache miss, space is needed for the new block. • The replacement algorithm determines which block is replaced by the new block. • With direct mapping there is only one cache block a frame can occupy, so no replacement algorithm is needed in that case.

  24. Replacement algorithm • There are three replacement algorithms: • LRU (Least recently used) • FIFO (First in, first out) • Random

  25. Replacement algorithm: LRU • Replaces the least recently used block in the cache. • To determine which block is least recently used, a counter can be associated with each cache block. • Advantage: the algorithm follows the locality principle, so it limits the number of times a soon-needed block is replaced. • Disadvantage: implementation is more complex.
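The counter scheme described above can be sketched in a few lines; the cache size is an illustrative assumption, and the counter here is a simple access clock:

```python
# LRU sketch: one counter per cached block, as described on the slide.
# The block with the smallest counter is the least recently used.
class LRUCache:
    def __init__(self, num_blocks: int):
        self.capacity = num_blocks
        self.counter = {}   # block tag -> time of last use
        self.clock = 0

    def access(self, tag) -> bool:
        """Return True on a hit, False on a miss (evicting if full)."""
        self.clock += 1
        if tag in self.counter:                 # hit: refresh the counter
            self.counter[tag] = self.clock
            return True
        if len(self.counter) >= self.capacity:  # miss in a full cache:
            lru = min(self.counter, key=self.counter.get)
            del self.counter[lru]               # evict least recently used
        self.counter[tag] = self.clock
        return False
```

With a 2-block cache, accessing A, B, A, C evicts B rather than A, because A's counter was refreshed by the second access; this is the locality-following behavior the slide credits to LRU.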

  26. Replacement algorithm: FIFO • The block that entered the cache first is replaced first. • In other words, the block that has been in the cache the longest is replaced. • Advantage: easy to implement. • Disadvantage: under some access patterns, blocks are replaced too frequently.

  27. References • Computer Organization, Design, and Architecture by Sajjan G. Shiva • http://www.cs.sjsu.edu/~lee/cs147/cs147.htm • http://www.cs.iastate.edu/~prabhu/Tutorial/CACHE/bl_place_applet.html • http://www.articlesbase.com/hardware-articles/cache-memory-675304.html
