
10/16: Lecture Topics


Presentation Transcript


  1. 10/16: Lecture Topics • Memory problem • Memory Solution: Caches • Locality • Types of caches • Fully associative • Direct mapped • n-way associative

  2. Memory Problem • If all memory accesses (lw/sw) accessed main memory, programs would run 20 times slower • And it’s getting worse • processors speed up by 50% annually • memory accesses speed up by 9% annually • it’s becoming harder and harder to keep these processors fed

  3. Solution: Memory Hierarchy • Keep all data in the big, slow, cheap storage • Keep copies of the “important” data in the small, fast, expensive storage • (diagram: fast, small, expensive storage at the top; slow, large, cheap storage at the bottom)

  4. Memory Hierarchy

  5. What is a Cache? • A cache allows for fast accesses to a subset of a larger data store • Your web browser’s cache gives you fast access to pages you visited recently • faster because it’s stored locally • subset because the web won’t fit on your disk • The memory cache gives the processor fast access to memory that it used recently • faster because it’s located on the CPU

  6. Locality • Temporal locality: the principle that data being accessed now will probably be accessed again soon • Useful data tends to continue to be useful • Spatial locality: the principle that data near the data being accessed now will probably be needed soon • If data item n is useful now, then it’s likely that data item n+1 will be useful soon

  7. Memory Access Patterns • Memory accesses don’t look like this • random accesses • Memory accesses do look like this • hot variables • step through arrays
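As a hypothetical illustration (not from the slides), the short C loop below shows both patterns the slide mentions: the running total sum is a hot variable reused on every iteration (temporal locality), and the array is stepped through element by element (spatial locality).

```c
#include <stdio.h>

int main(void) {
    int a[1024];
    for (int i = 0; i < 1024; i++)
        a[i] = i;

    int sum = 0;                    /* "hot" variable: reused every iteration (temporal locality) */
    for (int i = 0; i < 1024; i++)
        sum += a[i];                /* steps through the array in order (spatial locality) */

    printf("sum = %d\n", sum);
    return 0;
}
```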

  8. Cache Terminology • Hit—the data item is in the cache • Miss—the data item is not in the cache • Hit rate—the percentage of accesses for which the data item is in the cache (hit ratio) • Miss rate—the percentage of accesses for which the data item is not in the cache (miss ratio) • Hit time—the time required to access data in the cache • Miss time—the time required to access data not in the cache (miss penalty) • Access time—the time required to access a level of the memory hierarchy

  9. Effective Access Time • aka Average Memory Access Time (AMAT) • t_effective = h × t_cache + (1 − h) × t_memory • where t_cache is the cache access time, t_memory is the memory access time, h is the cache hit rate, and (1 − h) is the cache miss rate
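As a small sketch of the formula (the numbers are placeholders, not from the slides):

```c
#include <stdio.h>

/* Effective access time per the slide: t_eff = h*t_cache + (1-h)*t_memory */
double effective_access_time(double hit_rate, double t_cache, double t_memory) {
    return hit_rate * t_cache + (1.0 - hit_rate) * t_memory;
}

int main(void) {
    /* example numbers only: 2 ns cache, 50 ns memory, 95% hit rate -> 4.40 ns */
    printf("%.2f ns\n", effective_access_time(0.95, 2.0, 50.0));
    return 0;
}
```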

  10. Access Time Example • Suppose tmemory for main memory is 50ns • Suppose tcache for the processor cache is 2ns • We want an effective access time of 3ns • What hit rate h do we need?
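One way to work the example out (a sketch, simply rearranging the formula above): solving t_eff = h·t_cache + (1 − h)·t_memory for h gives h = (t_memory − t_eff) / (t_memory − t_cache).

```c
#include <stdio.h>

/* Rearranging t_eff = h*t_cache + (1-h)*t_memory for the required hit rate h */
double required_hit_rate(double t_eff, double t_cache, double t_memory) {
    return (t_memory - t_eff) / (t_memory - t_cache);
}

int main(void) {
    /* the slide's numbers: 50 ns memory, 2 ns cache, 3 ns target */
    printf("h = %.4f\n", required_hit_rate(3.0, 2.0, 50.0));  /* 47/48, about 0.979 */
    return 0;
}
```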

  11. Cache Contents • When do we put something in the cache? • when it is used for the first time • When do we take something out of the cache? • when it hasn’t been used for a long time • the cache holds only a subset because all of memory won’t fit on the CPU

  12. Concepts in Caching • Assume a two-level hierarchy: • Level 1: a cache that can hold 8 words • Level 2: a memory that can hold 32 words • (diagram: the cache alongside memory, whose word-aligned byte addresses run 0000000, 0000100, 0001000, ..., 1111100)

  13. Fully Associative Cache • In a fully associative cache, • any memory word can be placed in any cache line • each cache line stores which address it contains • accesses are slow (but not as slow as you would think)
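A minimal sketch of a fully associative lookup (hypothetical C, not from the slides): every line stores the full address it holds, and a lookup has to compare the incoming address against all of them, which is what makes the access slow.

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_LINES 8

/* One cache line: a valid bit, the full address it holds, and the data word */
struct line {
    bool     valid;
    uint32_t addr;
    uint32_t data;
};

static struct line cache[NUM_LINES];

/* Fully associative lookup: the word can be in any line, so check them all */
bool lookup(uint32_t addr, uint32_t *data_out) {
    for (int i = 0; i < NUM_LINES; i++) {
        if (cache[i].valid && cache[i].addr == addr) {
            *data_out = cache[i].data;
            return true;   /* hit */
        }
    }
    return false;          /* miss */
}

int main(void) {
    cache[3] = (struct line){ true, 0x1F4, 42 };   /* pretend address 0x1F4 was cached earlier */
    uint32_t d;
    return lookup(0x1F4, &d) ? 0 : 1;              /* hit: returns 0 */
}
```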

  14. Direct Mapped Caches • Fully associative caches are too slow • With direct mapped caches the address of the item determines where in the cache to store it • In this case, the lower five bits of the address dictate the cache entry

  15. Direct Mapped Cache • The lower 5 bits of the address determine where it is located in the table

  16. Address Tags • A tag is a label for a cache entry indicating where it came from • The upper bits of the data item’s address
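To make the index/tag split concrete, here is a hedged sketch (hypothetical C, matching the slide's 32-entry, lower-five-bits example and assuming word addresses, so there is no byte-offset field): the low bits of the address pick the entry, and the stored tag, the remaining upper bits, is compared to confirm the entry really holds that address.

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_ENTRIES 32            /* 2^5 entries, so the index is the lower 5 bits */
#define INDEX_BITS  5

struct entry {
    bool     valid;
    uint32_t tag;                 /* upper bits of the address */
    uint32_t data;
};

static struct entry cache[NUM_ENTRIES];

bool lookup(uint32_t addr, uint32_t *data_out) {
    uint32_t index = addr & (NUM_ENTRIES - 1);   /* lower 5 bits pick the entry */
    uint32_t tag   = addr >> INDEX_BITS;         /* upper bits become the tag   */
    if (cache[index].valid && cache[index].tag == tag) {
        *data_out = cache[index].data;
        return true;              /* hit: right entry and the tag matches */
    }
    return false;                 /* miss: empty entry, or a different address maps here */
}
```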

  17. Cache with Address Tag

  18. Reference Stream Example • Reference stream (5-bit addresses): 11010, 10111, 00001, 11010, 11011, 11111, 01101, 11010 • Cache table columns: index | v (valid bit) | tag | data, with index values 000 through 111
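A possible way to trace the slide's stream (a sketch, assuming the lower 3 bits are the index into the 8-entry cache and the upper 2 bits are the tag): with that breakdown, only the repeats of 11010 hit, and 11111 evicts 10111 because they share index 111.

```c
#include <stdio.h>
#include <stdbool.h>

struct entry { bool valid; unsigned tag; };

int main(void) {
    /* the slide's stream: 11010, 10111, 00001, 11010, 11011, 11111, 01101, 11010 */
    unsigned stream[] = { 26, 23, 1, 26, 27, 31, 13, 26 };
    struct entry cache[8] = { 0 };          /* 8 entries, index = lower 3 bits */

    for (int i = 0; i < 8; i++) {
        unsigned addr  = stream[i];
        unsigned index = addr & 0x7;        /* lower 3 bits */
        unsigned tag   = addr >> 3;         /* upper 2 bits */
        bool hit = cache[index].valid && cache[index].tag == tag;
        if (!hit) {                         /* on a miss, the new address takes over the entry */
            cache[index].valid = true;
            cache[index].tag   = tag;
        }
        printf("addr %2u -> index %u, tag %u: %s\n",
               addr, index, tag, hit ? "hit" : "miss");
    }
    return 0;
}
```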

  19. Sample L1 Cache • Suppose a L1 cache has the following characteristics • one word stored per entry • 1024 entries • direct mapped • How many total bytes are stored in the cache? • How many bits are used for the index and tag fields of the address? • How many total bits are stored in the cache?
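A hedged worked answer (assuming 32-bit byte addresses and 4-byte words, which the slide does not state): the sketch below computes the three quantities, giving 4096 bytes of data, a 10-bit index with a 20-bit tag, and 54,272 total bits of storage.

```c
#include <stdio.h>

int main(void) {
    const int entries     = 1024;                 /* direct mapped, one word per entry  */
    const int word_bytes  = 4;                    /* assumption: 4-byte words           */
    const int addr_bits   = 32;                   /* assumption: 32-bit byte addresses  */

    const int index_bits  = 10;                   /* log2(1024 entries)                 */
    const int offset_bits = 2;                    /* log2(4 bytes within a word)        */
    const int tag_bits    = addr_bits - index_bits - offset_bits;       /* 20 bits       */

    const int data_bytes     = entries * word_bytes;                    /* 4096 bytes    */
    const int bits_per_entry = word_bytes * 8 + tag_bits + 1;           /* data+tag+valid */
    const int total_bits     = entries * bits_per_entry;                /* 54,272 bits   */

    printf("data stored: %d bytes\n", data_bytes);
    printf("index: %d bits, tag: %d bits\n", index_bits, tag_bits);
    printf("total storage: %d bits\n", total_bits);
    return 0;
}
```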

  20. N-way Set Associative Caches • Direct mapped caches cannot store two addresses with the same index • If two addresses collide, then you have to kick one of them out • 2-way associative caches can store two different addresses with the same index • Reduces misses due to conflicts • Also, 3-way, 4-way and 8-way set associative • Larger sets imply slower accesses
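A minimal sketch of a 2-way set-associative lookup (hypothetical C, with a made-up size, not from the slides): each index now selects a set of two entries and both tags are checked, so two addresses that share an index can coexist. On a miss, a replacement policy such as LRU would pick which way to evict; that part is omitted here.

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_SETS   512            /* hypothetical size */
#define WAYS       2
#define INDEX_BITS 9              /* log2(NUM_SETS) */

struct entry { bool valid; uint32_t tag; uint32_t data; };

static struct entry cache[NUM_SETS][WAYS];

bool lookup(uint32_t addr, uint32_t *data_out) {
    uint32_t index = addr & (NUM_SETS - 1);      /* choose the set       */
    uint32_t tag   = addr >> INDEX_BITS;         /* remaining upper bits */
    for (int way = 0; way < WAYS; way++) {       /* check both ways      */
        if (cache[index][way].valid && cache[index][way].tag == tag) {
            *data_out = cache[index][way].data;
            return true;                         /* hit in this way      */
        }
    }
    return false;                                /* miss: neither way holds the address */
}
```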

  21. 2-way Associative Cache

  22. Associativity Spectrum • Direct mapped: fast to access, conflict misses • N-way associative: slower to access, fewer conflict misses • Fully associative: slow to access, no conflict misses
