
1 CSC 213 – Large Scale Programming

2 Today’s Goals
• Consider what new does & how Java works
  ◦ What are the traditional means of managing memory?
  ◦ Why did they change how this was done for Java?
  ◦ What are the benefits & costs of these changes?
• Examine real-world use of graphs & its benefits
  ◦ How do all of those graph algorithms get used?
  ◦ Can we take advantage of this knowledge somehow?
  ◦ What occurs in the real world that we have not covered?
  ◦ And why is beer ALWAYS the answer to life’s problems?

3-4 Explicit Memory Management
• Traditional form of memory management
  ◦ Used a lot, but has fallen out of favor
• malloc / new
  ◦ Commands used to allocate space for an object
• free / delete
  ◦ Return memory to the system using these commands
• Simple to use, but tricky to get right
  ◦ Forget to free → memory leak
  ◦ free too soon → dangling pointer
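A minimal C++ sketch of the "forget to free" case; the Widget struct and the function names are hypothetical, invented only for this illustration (the dangling-pointer case is walked through on the slides that follow):

    #include <cstdlib>

    struct Widget { int data; };

    void leaky() {
        // Explicit allocation: the programmer now owns this memory.
        Widget* w = static_cast<Widget*>(std::malloc(sizeof(Widget)));
        w->data = 42;
        // Missing std::free(w): when leaky() returns, the pointer is lost
        // but the memory stays allocated. That is a memory leak.
    }

    void tidy() {
        Widget* w = new Widget{42};
        // ... use w ...
        delete w;   // return the memory to the system, exactly once
    }

    int main() {
        leaky();
        tidy();
        return 0;
    }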

5-10 Dangling Pointers
    Node x = new Node(“happy”);
    Node ptr = x;
    delete x;                    // But I’m not dead yet!
    Node y = new Node(“sad”);
    cout << ptr.data << endl;    // sad :(
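The slide code is pseudocode; a compilable C++ version of the same scenario might look like the sketch below. The Node struct is a hypothetical stand-in, and reusing the freed memory for y is something the allocator may happen to do, not something the language guarantees; dereferencing ptr after the delete is undefined behavior.

    #include <iostream>
    #include <string>

    struct Node {
        std::string data;
        explicit Node(const std::string& d) : data(d) {}
    };

    int main() {
        Node* x = new Node("happy");
        Node* ptr = x;                 // second reference to the same object
        delete x;                      // object destroyed, but ptr still points at it
        Node* y = new Node("sad");     // allocator may reuse the freed memory
        std::cout << ptr->data << std::endl;   // undefined behavior: may print "sad", may crash
        delete y;
        return 0;
    }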

11 Solution: Garbage Collection
• Allocate objects into the program’s heap
  ◦ No relation to the heap implementing a priority queue
  ◦ This heap is simply a “pile of memory”
• Garbage collector scans objects on the heap
  ◦ Starts at references in the program stack & static fields
  ◦ Finds objects reachable from those program roots
• We consider the unreachable objects “garbage”
  ◦ Cannot be used again, so safe to remove from the heap
• The need to include a free command is eliminated
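A minimal sketch of the reachability scan the slide describes: a mark phase that walks the object graph starting from the roots. The Obj struct and the roots list are hypothetical stand-ins for the collector’s real bookkeeping.

    #include <vector>

    // Hypothetical heap object: a mark bit plus outgoing references.
    struct Obj {
        bool marked = false;
        std::vector<Obj*> refs;
    };

    // Mark everything reachable from one object by following references.
    void mark(Obj* obj) {
        if (obj == nullptr || obj->marked) return;
        obj->marked = true;                  // reachable, so not garbage
        for (Obj* child : obj->refs)
            mark(child);                     // transitive closure of the roots
    }

    // Roots come from the program stack and static fields.
    void markPhase(const std::vector<Obj*>& roots) {
        for (Obj* root : roots)
            mark(root);
    }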

12-17 No More Dangling Pointers
    Node x = new Node(“happy”);
    Node ptr = x;
    // x reachable through ptr, so cannot reclaim!
    Node y = new Node(“sad”);
    cout << ptr.data << endl;    // happy!

18-28 Garbage Collection
• Statics & locals are called root references
• Must compute the objects in their transitive closure
• (HEAP diagram walkthrough: the objects reachable from the roots are marked, one step per slide)

29-31 Garbage Collection
• Remove unmarked objects from the heap
• New objects allocated into the empty spaces
• (HEAP diagram: unmarked objects reclaimed, leaving holes for new allocations)
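Continuing the mark-phase sketch from above, a sweep pass over the same hypothetical heap list would free the unmarked objects and clear the marks on the survivors for the next collection cycle:

    #include <vector>
    // (Obj as defined in the mark-phase sketch above.)

    // Sweep: delete every object the mark phase did not reach, and
    // reset the mark bit on the survivors for the next GC cycle.
    void sweepPhase(std::vector<Obj*>& heap) {
        std::vector<Obj*> survivors;
        for (Obj* obj : heap) {
            if (obj->marked) {
                obj->marked = false;         // still live: keep it
                survivors.push_back(obj);
            } else {
                delete obj;                  // unreachable: reclaim the space
            }
        }
        heap.swap(survivors);                // freed slots can be reused
    }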

32 Why Not Always Use GC?
• Garbage collection has obvious benefits
  ◦ Eliminates some errors that often occur
  ◦ Added benefit: also makes programming easier

33-36 Why Not Always Use GC?
• GC also has several drawbacks
  ◦ Reachable objects could, not will, be used again
  ◦ More memory needed to hold the extra objects
  ◦ It takes time to compute the reachable objects
• (Diagram: an object goes from reachable and live, to reachable but dead, to unreachable. With explicit management, obj = new Object() can be explicitly freed with free(obj) as soon as it is dead; a garbage collector can only reclaim it once it becomes unreachable.)
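A small C++ sketch of that reachable-but-dead gap; the Blob type and the cache vector are hypothetical. With explicit management the programmer frees the object at its last use, while under a garbage collector the same object would survive every collection until the reference is actually dropped.

    #include <vector>

    struct Blob { char bytes[1 << 20]; };   // roughly 1 MB of payload

    int main() {
        std::vector<Blob*> cache;
        cache.push_back(new Blob());        // allocate and hold a reference

        // ... last real use of cache[0] happens somewhere here ...

        // Explicit management: we know the object is dead, so free it now.
        delete cache[0];
        cache[0] = nullptr;

        // Under a garbage collector there is no delete. The object would stay
        // reachable through the vector, so every collection keeps it alive
        // until the program clears or drops the reference.
        return 0;
    }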

37-38 Cost of Accessing Memory
• How long a memory access takes is also important
  ◦ Makes a major difference in the time a program takes
• Imaginary scenario used to consider this effect: “I want a beer”

39-44 Registers and Caches
• Inside the CPU we find the first levels of memory
• At the lowest level are the processor’s registers
  ◦ Very, very fast but…
  ◦ … the number of beers held is limited
• Caches are used at the next level for the dearest memory
  ◦ More space than registers, but…
  ◦ … not as fast (a walk across the room)
  ◦ Will need more beer if the party is good

45-47 Horrors!
• The processor does its best to keep memory local
  ◦ Caches are organized to hold the memory needed soon
  ◦ Makes guesses, since this requires predicting the future
• Will eventually drink all the beer in the house
  ◦ 12MB is the largest cache size at the moment
  ◦ Many programs need more than this
  ◦ What do we do?

48 When the House Runs Dry…
• What do you normally do when all the beer is gone?
  ◦ Must go to the store to get more…
  ◦ … but we do not want a DUI, so we must walk to the store
• The processor uses RAM to store data that cannot fit
  ◦ RAM sizes are much, much larger than caches
  ◦ 100x slower to access, however
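A hedged C++ sketch of why the hierarchy matters in practice: the two loops below touch exactly the same array elements, but the column-order walk jumps across memory, defeats the caches, and typically runs several times slower; the array size N is an arbitrary choice for illustration.

    #include <cstdio>
    #include <vector>

    int main() {
        const int N = 4096;                       // arbitrary size for illustration
        std::vector<int> grid(N * N, 1);          // one big N x N array of ints

        long long sum = 0;

        // Row-order walk: consecutive addresses, so each cache line is fully used.
        for (int row = 0; row < N; ++row)
            for (int col = 0; col < N; ++col)
                sum += grid[row * N + col];

        // Column-order walk: successive accesses land N ints apart, so nearly
        // every access misses the cache and has to reach out to RAM.
        for (int col = 0; col < N; ++col)
            for (int row = 0; row < N; ++row)
                sum += grid[row * N + col];

        std::printf("sum = %lld\n", sum);         // keep the work observable
        return 0;
    }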

49 When Store Is Out Of Beer...

51 “Ein Glas Bier, bitte” (a glass of beer, please)
• Get the SCUBA gear ready for the WALK to Germany
  ◦ Should find enough beer to handle any situation
  ◦ But the buzz is destroyed by the very long wait per glass
• If Germany runs out, you’re drinking too much

52-62 Non-Beer Example of Hierarchy
(RAM / hard-disk diagram walkthrough)
• Program takes 4 pages, but only 3 can fit in RAM
• GC follows references from object to object
• Bringing memory into RAM when references require it
• But we don’t have space, so another page is evicted
• Now GC can continue
• Which requires walking to Germany again
• Before GC continues to find the reachable objects
• And we find that we again walk to Germany
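A toy C++ sketch of the situation this walkthrough describes: a 4-page program, 3 page frames in RAM, and a GC-style pointer chase whose access pattern keeps forcing evictions. The page access order and the FIFO replacement policy are made up purely for illustration; real operating systems use more sophisticated policies.

    #include <cstdio>
    #include <deque>
    #include <algorithm>

    int main() {
        const int FRAMES = 3;                      // only 3 pages fit in RAM
        // Pages touched as the GC chases references (hypothetical order).
        int accesses[] = {0, 1, 2, 3, 1, 0, 3, 2};

        std::deque<int> ram;                       // pages currently resident
        int faults = 0;

        for (int page : accesses) {
            if (std::find(ram.begin(), ram.end(), page) != ram.end()) {
                std::printf("page %d: hit\n", page);
                continue;                          // already in RAM
            }
            ++faults;                              // the walk to Germany
            if (static_cast<int>(ram.size()) == FRAMES) {
                std::printf("evict page %d to make room\n", ram.front());
                ram.pop_front();                   // FIFO eviction
            }
            ram.push_back(page);
            std::printf("page %d: fault, load from disk\n", page);
        }
        std::printf("%d page faults total\n", faults);
        return 0;
    }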

63 Walking To Germany Is Slow…

64 What Does This Mean?
• Large data sets require more thought & care
  ◦ Start with, but do not end at, big-Oh notation
  ◦ Consider memory costs and how to limit them
• Most data structures do not grow this large
  ◦ STACK, QUEUE, SEQUENCE rarely get above 1GB
  ◦ Using a very, very large GRAPH is not typical
• Databases are the largest data sets anywhere
  ◦ Which data structures & implementations are affected?

65 For Next Lecture
• Remember, the design for your program #3 is due
  ◦ Does it make sense? Think before submitting
• Reading on the memory hierarchy for Wednesday
  ◦ How can we use the experience of wanting a beer?
  ◦ Useful ways to see what happens & take advantage of it

