1 Lecture: Virtual Memory, DRAM Main Memory Topics: virtual memory, TLB/cache access, DRAM intro (Section 2.2)

2 TLB Since the number of pages is very high, the page table capacity is too large to fit on chip. A translation lookaside buffer (TLB) caches the virtual-to-physical page number translation for recent accesses. A TLB miss requires us to access the page table, which may not even be found in the cache – two expensive memory look-ups to access one word of data! A large page size can increase the coverage of the TLB and reduce the size of the page table, but it also increases memory waste.
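
To make the lookup flow concrete, here is a minimal Python sketch (an illustration, not from the slides) that models a TLB as a small dictionary in front of a page-table lookup; the page size, TLB capacity, eviction policy, and page-table contents are all simplifying assumptions.

```python
# Minimal sketch of a TLB in front of a page table (illustrative only).
from collections import OrderedDict

PAGE_SIZE = 4096          # bytes per page (assumed)
TLB_ENTRIES = 64          # assumed TLB capacity

page_table = {}           # virtual page number -> physical page number (assumed contents)
tlb = OrderedDict()       # small cache of recent translations

def translate(vaddr):
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    if vpn in tlb:                      # TLB hit: one fast lookup
        ppn = tlb[vpn]
    else:                               # TLB miss: walk the page table (extra memory accesses)
        ppn = page_table[vpn]           # KeyError here would correspond to a page fault
        if len(tlb) >= TLB_ENTRIES:
            tlb.popitem(last=False)     # evict the oldest entry (FIFO, for simplicity)
        tlb[vpn] = ppn
    return ppn * PAGE_SIZE + offset

# Example: map virtual page 5 to physical page 9, then translate an address in that page.
page_table[5] = 9
print(hex(translate(5 * PAGE_SIZE + 0x10)))   # -> 0x9010
```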

3 TLB and Cache Is the cache indexed with the virtual or the physical address? To index with a physical address, we must first look up the TLB and then the cache, which lengthens access time. To index with a virtual address: multiple virtual addresses can map to the same physical address – can we ensure that these different virtual addresses map to the same location in the cache? Otherwise, there will be two different copies of the same physical memory word. Does the tag array store virtual or physical addresses? Since multiple virtual addresses can map to the same physical address, a virtual tag comparison can flag a miss even when the correct physical memory word is present in the cache.
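
To see the aliasing concern concretely, here is a tiny sketch (illustrative addresses, 4 KB pages and 1-byte blocks assumed) showing that two virtual addresses sharing a physical page always pick the same cache set when the index bits come entirely from the page offset, but can pick different sets when the index extends beyond it.

```python
PAGE_OFFSET_BITS = 12              # 4 KB pages (assumed)
BLOCK_OFFSET_BITS = 0              # 1-byte blocks, to keep the example tiny

def cache_set(addr, index_bits):
    """Set index: the index_bits just above the block offset."""
    return (addr >> BLOCK_OFFSET_BITS) & ((1 << index_bits) - 1)

# Two virtual addresses (synonyms) that the OS maps to the same physical address (assumed).
va1 = 0x00A000 + 0x123             # virtual page 0x00A, offset 0x123
va2 = 0x01B000 + 0x123             # virtual page 0x01B, same offset, same physical page

for bits in (12, 16):
    print(f"{bits} index bits: sets {cache_set(va1, bits)} and {cache_set(va2, bits)}")
# 12 index bits (within the page offset): both synonyms land in the same set.
# 16 index bits: the synonyms can land in different sets, creating two copies.
```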

4 TLB and Cache

5 Virtually Indexed Caches Example: a 24-bit virtual address and a 4 KB page size give 12 bits of page offset and 12 bits of virtual page number. To handle the example in the figure – a data cache that needs 16 index bits, e.g. 64 KB direct-mapped or 128 KB 2-way – the cache must instead be designed to use only 12 index bits, for example by making the 64 KB cache 16-way. Alternatively, page coloring can ensure that the extra index bits of the virtual and physical addresses match. [Figure: a virtual address and the corresponding page in physical memory; the index bits above the page offset can differ between the two, so a virtually indexed cache must not use them.]
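
As a sanity check on the numbers above, the short Python sketch below (an illustration, not part of the slides) computes how many index bits a cache configuration needs and whether they fit inside the page offset; the 1-byte block size is assumed to match the figure's "16 index bits for a 64 KB direct-mapped cache".

```python
import math

def index_bits(cache_bytes, block_bytes, ways):
    """Number of set-index bits for a set-associative cache."""
    sets = cache_bytes // (block_bytes * ways)
    return int(math.log2(sets))

PAGE_OFFSET_BITS = 12   # 4 KB pages, as in the slide

# Assumed 1-byte blocks so the numbers match the figure.
for size, ways in [(64 * 1024, 1), (128 * 1024, 2), (64 * 1024, 16)]:
    bits = index_bits(size, 1, ways)
    verdict = "fits in the page offset" if bits <= PAGE_OFFSET_BITS else "needs translated bits"
    print(f"{size // 1024} KB, {ways}-way: {bits} index bits -> {verdict}")
```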

6 Cache and TLB Pipeline [Figure: a virtually indexed, physically tagged cache. The virtual page number goes to the TLB while the virtual index, taken from the page offset, selects a set in the tag and data arrays; the TLB returns the physical page number, and the physical tag comparison determines hit or miss.]

7 Problem 1 Assume that the page size is 16 KB and the cache block size is 32 B. If I want to implement a virtually indexed physically tagged L1 cache, what is the largest direct-mapped L1 that I can implement? What is the largest 2-way cache that I can implement?

8 Problem 1 Assume that the page size is 16 KB and the cache block size is 32 B. If I want to implement a virtually indexed physically tagged L1 cache, what is the largest direct-mapped L1 that I can implement? What is the largest 2-way cache that I can implement? There are 14 page offset bits. If 5 of them are used for the block offset, there are 9 more that I can use for the index. 512 sets gives a 16 KB direct-mapped or a 32 KB 2-way cache.
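
The same calculation can be generalized; below is a hypothetical helper (not from the slides) that returns the largest virtually indexed, physically tagged cache for a given page size, block size, and associativity.

```python
def max_vipt_cache(page_bytes, block_bytes, ways):
    """Largest VIPT cache: all index and block-offset bits must come from the
    page offset, so the per-way size can be at most one page."""
    sets = page_bytes // block_bytes      # sets addressable with untranslated bits
    return sets * block_bytes * ways      # equivalently, page_bytes * ways

# Numbers from Problem 1: 16 KB pages, 32 B blocks.
print(max_vipt_cache(16 * 1024, 32, 1))   # 16384 -> 16 KB direct-mapped
print(max_vipt_cache(16 * 1024, 32, 2))   # 32768 -> 32 KB 2-way
```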

9 Protection The hardware and operating system must co-operate to ensure that different processes do not modify each other's memory. The hardware provides special registers that can be read in user mode, but modified only by instructions in supervisor mode. A simple solution: the OS divides physical memory between processes in contiguous chunks and stores the bounds in special registers; the hardware checks every program access to ensure it is within bounds. With paging, protection bits are instead tracked in the TLB on a per-page basis.
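
A minimal sketch of the base-and-bounds idea described above (the names and the fault behavior are illustrative assumptions, not any real ISA's mechanism):

```python
# Illustrative base-and-bounds protection check (conceptual sketch only).
class ProtectionFault(Exception):
    pass

def check_access(addr, base_reg, bound_reg):
    """Allow the access only if it falls inside [base, base + bound);
    a real machine performs this check in hardware on every access."""
    if not (base_reg <= addr < base_reg + bound_reg):
        raise ProtectionFault(f"access to {addr:#x} outside "
                              f"[{base_reg:#x}, {base_reg + bound_reg:#x})")
    return addr

# Example: a process confined to a 64 KB region starting at 0x10000.
check_access(0x10FF0, base_reg=0x10000, bound_reg=0x10000)      # OK
# check_access(0x30000, base_reg=0x10000, bound_reg=0x10000)    # would raise ProtectionFault
```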

10 Superpages If a program's working set size is 16 MB and the page size is 8 KB, there are 2K frequently accessed pages – a 128-entry TLB will not suffice. By increasing the page size to 128 KB, the working set fits in 128 TLB entries and TLB misses are eliminated – the disadvantages are memory waste and an increase in the page fault penalty. Can we change the page size at run-time? Note that a single page has to be contiguous in physical memory.
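
A quick check of the arithmetic above (the TLB entry count comes from the slide; everything else follows from it):

```python
working_set = 16 * 1024 * 1024      # 16 MB working set
tlb_entries = 128

for page_size in (8 * 1024, 128 * 1024):
    pages = working_set // page_size            # frequently accessed pages
    covered = tlb_entries * page_size           # memory the TLB can map at once
    print(f"{page_size // 1024:>4} KB pages: {pages} pages in the working set, "
          f"TLB covers {covered // (1024 * 1024)} MB -> "
          f"{'fits' if pages <= tlb_entries else 'does not fit'}")
# 8 KB pages: 2048 pages, TLB covers only 1 MB.
# 128 KB pages: 128 pages, TLB covers the full 16 MB working set.
```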

11 Superpages Implementation At run-time, build superpages if you find that contiguous virtual pages are being accessed at the same time. For example, if a run of 16 contiguous virtual pages is frequently accessed, coalesce these pages into a single 128 KB superpage that has a single entry in the TLB. The physical superpage has to be in contiguous physical memory – the 16 physical pages have to be moved so they are contiguous. [Figure: the virtual-to-physical mapping before and after the pages are promoted to a superpage.]

12 Ski Rental Problem Promoting a series of contiguous virtual pages into a superpage reduces TLB misses, but has a cost: copying physical memory into contiguous locations. Page usage statistics can determine whether pages are good candidates for superpage promotion, but if the cost of a TLB miss is x and the cost of copying pages is Nx, when do you decide to form a superpage? Analogy: if ski rentals cost $50 and new skis cost $500, when do I decide to buy new skis? If I rent 10 times and then buy skis, I am guaranteed to spend no more than twice the optimal amount.
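
Applied to superpage promotion, the classic rent-or-buy rule looks roughly like the sketch below (an illustration of the 2-competitive strategy, not the policy of any particular OS): promote once the accumulated TLB-miss cost equals the copy cost Nx.

```python
# Ski-rental-style promotion decision (illustrative sketch).
# x = cost of one TLB miss ("rental"); N*x = cost of copying pages to form a superpage ("buying skis").
def should_promote(misses_so_far, N):
    """Promote once the cost already paid in misses equals the promotion cost.
    This guarantees a total cost at most twice that of an omniscient policy."""
    return misses_so_far >= N

# Example with the slide's numbers: rentals $50, skis $500 -> N = 10.
N = 500 // 50
for misses in range(8, 13):
    print(misses, "misses:", "promote" if should_promote(misses, N) else "keep waiting")
```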

13 DRAM Main Memory Main memory is stored in DRAM cells, which have much higher storage density than SRAM. DRAM cells lose their state over time and must be refreshed periodically, hence the name "dynamic". DRAM access suffers from long access time and high energy overhead.

14 Memory Architecture [Figure: processor and memory controller connected to DIMMs by an address/command bus and a data bus; each DIMM contains banks with row buffers.] DIMM: a PCB with DRAM chips on the back and front. Rank: a collection of DRAM chips that work together to respond to a request and keep the data bus full; a 64-bit data bus will need 8 x8 DRAM chips, or 4 x16 DRAM chips, etc. Bank: a subset of a rank that is busy during one request. Row buffer: the last row (say, 8 KB) read from a bank; it acts like a cache.
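
The rank composition mentioned above is just the data bus width divided by the chip width; a tiny sketch (the chip widths considered are illustrative):

```python
DATA_BUS_BITS = 64                 # width of the DIMM data bus

for chip_bits in (4, 8, 16):       # x4, x8, x16 DRAM chips
    chips = DATA_BUS_BITS // chip_bits
    print(f"x{chip_bits} chips: {chips} chips per rank to fill the 64-bit bus")
```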

15 DRAM Array Access A 16 Mb DRAM array is a 4096 x 4096 array of bits. The 12 row address bits arrive first (Row Access Strobe, RAS); the selected row of 4096 bits is read out into the row buffer. The 12 column address bits arrive next (Column Access Strobe, CAS), and the column decoder selects the subset of those bits that is returned to the CPU.
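
A small sketch of how a bit address within this array splits into row and column (the bit-level addressing and the field ordering are assumptions for illustration):

```python
ROW_BITS = 12      # 4096 rows
COL_BITS = 12      # 4096 columns

def split_dram_address(bit_addr):
    """Split a bit address within the 16 Mb array into (row, column).
    RAS delivers the row first; CAS then selects the column within the open row."""
    row = (bit_addr >> COL_BITS) & ((1 << ROW_BITS) - 1)
    col = bit_addr & ((1 << COL_BITS) - 1)
    return row, col

print(split_dram_address(0x123456))   # -> (0x123, 0x456): row 291, column 1110
```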
