
Cache performance
CS 147, Prof. Lee
Hai Lin Wu

Cache performance
- Introduction
- Primary components
  - Cache hits (hit ratio)
  - Cache misses
- Average memory access time

Why do we use cache memory in a computer? The primary reason is to improve system performance by reducing the time needed to access memory.

Cache hits
- Definition: Every time the CPU accesses memory, it checks the cache. If the requested data is in the cache, the CPU accesses the data in the cache rather than physical memory; this is a cache hit.

A simple example
Every time the CPU accesses memory, it checks the cache first.
[Diagram: the CPU checks the cache; if the data is not in the cache, it goes on to physical memory.]

Cache misses
- Definition: If the data is not in the cache, the CPU accesses the data from main memory (and usually writes the data into the cache as well). This is a cache miss.

Hit ratio
- Definition: The hit ratio is the percentage of memory accesses that are served from the cache.
  Hit ratio = (number of hits) / (total number of memory accesses)

Average memory access time
- Symbol: T_M

Formula for T_M
  T_M = h * T_C + (1 - h) * T_P
where
- T_C = cache access time
- T_P = physical (main) memory access time
- h = hit ratio
(Note: T_C and T_P are always given.)
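As a quick illustration, here is a minimal Python sketch of this formula (the function name is mine; the access times reuse the T_C = 10 ns and T_P = 60 ns values given in the FIFO example later in the deck). It also previews the trend on the next slide: raising the hit ratio lowers T_M.

```python
def avg_memory_access_time(h, t_cache, t_phys):
    """Average memory access time: T_M = h*T_C + (1 - h)*T_P."""
    return h * t_cache + (1 - h) * t_phys

# Access times in ns (taken from the FIFO example later in this deck).
T_C, T_P = 10, 60

# Increasing the hit ratio reduces the average memory access time.
for h in (0.0, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"h = {h:.2f}  ->  T_M = {avg_memory_access_time(h, T_C, T_P):.1f} ns")
```

With these values, h = 1.0 gives 10 ns (pure cache speed) and h = 0.0 gives 60 ns (pure main-memory speed).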

Relationship between hit ratio and average memory access time
[Table: h versus T_M (ns).]
So, increasing the hit ratio reduces the average memory access time.

Calculating h and T_M
- We will learn how to calculate h and T_M for two different types of cache:
  - Associative cache (FIFO replacement)
  - Two-way set-associative cache (LRU replacement)
- Besides these, there is also the direct-mapped cache.

Example of associative cache (FIFO)
[Table of memory references and the resulting cache contents: 18 accesses, 7 of them hits.]

Example, continued
- Given: T_C = 10 ns, T_P = 60 ns
- From the previous table: hits = 7, inputs (total accesses) = 18
- h (hit ratio) = hits / inputs = 7/18 ≈ 0.389
- T_M = h * T_C + (1 - h) * T_P = 0.389 * 10 + 0.611 * 60 ≈ 40.6 ns
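Since the reference-string table for this example did not survive the transcript, here is a rough Python sketch of how the FIFO hit count, hit ratio, and T_M could be computed for a fully associative cache. The reference string and the 4-line cache size below are hypothetical, not the textbook's.

```python
from collections import deque

def fifo_hits(references, num_lines):
    """Count hits for a fully associative cache with FIFO replacement."""
    cache = deque()              # oldest entry at the left
    hits = 0
    for block in references:
        if block in cache:
            hits += 1            # hit: FIFO order is not updated on a hit
        else:
            if len(cache) == num_lines:
                cache.popleft()  # evict the block that entered first
            cache.append(block)
    return hits

# Hypothetical reference string of 18 accesses and a 4-line cache.
refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5, 1, 2, 3, 0, 4, 5]
hits = fifo_hits(refs, 4)
h = hits / len(refs)
print(f"hits = {hits}/{len(refs)}, h = {h:.3f}, T_M = {h*10 + (1-h)*60:.1f} ns")
```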

Two-way set-associative cache (LRU)
- Hit ratio h = …
- T_M = … ns

A simple example for LRU
- Since the example from the book is somewhat hard to follow, consider the following simpler example.

Simple example of LRU
- 3 frames
- 10 inputs: 1, 2, 3, 0, 2, 3, 1, 3, 0, 4

Construct a table (3 frames, LRU; * marks a hit)
Data:     1   2   3   0   2   3   1   3   0   4
Frame 1:  1   1   1   0   0   0   1   1   1   4
Frame 2:      2   2   2   2   2   2   2   0   0
Frame 3:          3   3   3   3   3   3   3   3
Hit:                      *   *       *
Hits = 3 out of 10 inputs, so h = 3/10 = 0.3.
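A short Python sketch (the helper name is chosen here for illustration) that simulates LRU replacement on the reference string above and confirms the 3 hits counted in the table:

```python
def lru_hits(references, num_frames):
    """Count hits for a cache with num_frames frames and LRU replacement."""
    frames = []                    # least recently used block kept at index 0
    hits = 0
    for block in references:
        if block in frames:
            hits += 1
            frames.remove(block)   # move the block to the most-recent position
        elif len(frames) == num_frames:
            frames.pop(0)          # evict the least recently used block
        frames.append(block)
    return hits

refs = [1, 2, 3, 0, 2, 3, 1, 3, 0, 4]
print(lru_hits(refs, 3))           # prints 3, so h = 3/10 = 0.3
```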

Direct-mapped cache
- Hit ratio h = …
- T_M = … ns
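For completeness, a sketch of how a direct-mapped cache could be simulated: each block can live in exactly one line (line = block number mod number of lines). The 4-line cache and the reuse of the earlier reference string are assumptions for illustration, not the slide's actual example.

```python
def direct_mapped_hits(references, num_lines):
    """Count hits for a direct-mapped cache: each block maps to one fixed line."""
    lines = [None] * num_lines
    hits = 0
    for block in references:
        index = block % num_lines    # the only line this block may occupy
        if lines[index] == block:
            hits += 1
        else:
            lines[index] = block     # replace whatever was in that line
    return hits

refs = [1, 2, 3, 0, 2, 3, 1, 3, 0, 4]   # reuse the LRU example's reference string
h = direct_mapped_hits(refs, 4) / len(refs)
print(f"h = {h}, T_M = {h*10 + (1-h)*60} ns")
```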

The End!!

BUT… you have to remember
- How to set up the table for FIFO and LRU
- How to calculate:
  - Hit ratio
  - Average memory access time
Good luck on finals!!