Introduction to Algorithms


Introduction to Algorithms 6.046J/18.401J
LECTURE 7: Hashing I
Direct-access tables; Resolving collisions by chaining; Choosing hash functions; Open addressing
Prof. Charles E. Leiserson
October 3, 2005. Copyright © 2001-5 by Erik D. Demaine and Charles E. Leiserson.

Symbol-table problem
Symbol table S holding n records. Each record x has a key key[x] and other fields containing satellite data.
Operations on S: INSERT(S, x), DELETE(S, x), SEARCH(S, k).
How should the data structure S be organized?
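To fix notation for the sketches that follow, a record can be modelled as a key plus satellite data. This is a minimal illustrative sketch in Python, not code from the lecture:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Record:
    key: int          # key[x]
    data: Any = None  # other fields containing satellite data

# The dictionary operations the lecture implements on a set S of such records:
#   INSERT(S, x), DELETE(S, x), SEARCH(S, k)
```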

Direct-access table
IDEA: Suppose that the keys are drawn from the set U ⊆ {0, 1, …, m–1}, and that the keys are distinct. Set up an array T[0 . . m–1]:
T[k] = x if x ∈ K and key[x] = k, NIL otherwise.
Then, operations take Θ(1) time.
Problem: The range of keys can be large: 64-bit numbers (which represent 18,446,744,073,709,551,616 different keys), character strings (even larger!).
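A minimal direct-addressing sketch in Python, assuming integer keys drawn from {0, …, m−1} and the illustrative Record type above; the class name and layout are not from the lecture:

```python
class DirectAddressTable:
    def __init__(self, m):
        self.T = [None] * m      # T[0 .. m-1]; None plays the role of NIL

    def insert(self, x):         # Theta(1)
        self.T[x.key] = x

    def delete(self, x):         # Theta(1)
        self.T[x.key] = None

    def search(self, k):         # Theta(1); None means "no record with key k"
        return self.T[k]
```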

Hash functions
Solution: Use a hash function h to map the universe U of all keys into {0, 1, …, m–1}. [Figure: keys k1, …, k5 of the stored set S ⊆ U hashed into slots of T, with h(k2) = h(k5).] As each key is inserted, h maps it to a slot of T.
When a record to be inserted maps to an already occupied slot in T, a collision occurs.

Resolving collisions by chaining
Link records in the same slot into a list. [Figure: keys 49, 86, and 52 with h(49) = h(86) = h(52) = i, so slot i holds the chain 49 → 86 → 52.]
Worst case: Every key hashes to the same slot. Access time = Θ(n) if |S| = n.
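A minimal chaining sketch, assuming Python lists as the per-slot chains and Python's built-in hash reduced mod m as a stand-in for h; the names are illustrative:

```python
class ChainedHashTable:
    def __init__(self, m):
        self.m = m
        self.table = [[] for _ in range(m)]   # one chain per slot

    def _slot(self, key):
        return hash(key) % self.m             # stand-in for h(k)

    def insert(self, key, value):
        chain = self.table[self._slot(key)]
        for i, (k, _) in enumerate(chain):    # overwrite if the key is already present
            if k == key:
                chain[i] = (key, value)
                return
        chain.append((key, value))

    def search(self, key):
        for k, v in self.table[self._slot(key)]:
            if k == key:
                return v
        return None                           # unsuccessful search

    def delete(self, key):
        j = self._slot(key)
        self.table[j] = [(k, v) for k, v in self.table[j] if k != key]
```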

Average-case analysis of chaining
We make the assumption of simple uniform hashing: Each key k ∈ S is equally likely to be hashed to any slot of table T, independent of where other keys are hashed.
Let n be the number of keys in the table, and let m be the number of slots.
Define the load factor of T to be α = n/m = average number of keys per slot.

Search cost
The expected time for an unsuccessful search for a record with a given key is Θ(1 + α): Θ(1) to apply the hash function and access the slot, plus Θ(α) to search the list.
Expected search time = Θ(1) if α = O(1), or equivalently, if n = O(m).
A successful search has the same asymptotic bound, but a rigorous argument is a little more complicated. (See textbook.)

Choosing a hash function
The assumption of simple uniform hashing is hard to guarantee, but several common techniques tend to work well in practice as long as their deficiencies can be avoided.
Desiderata: A good hash function should distribute the keys uniformly into the slots of the table. Regularity in the key distribution should not affect this uniformity.

Division method
Assume all keys are integers, and define h(k) = k mod m.
Deficiency: Don't pick an m that has a small divisor d. A preponderance of keys that are congruent modulo d can adversely affect uniformity.
Extreme deficiency: If m = 2^r, then the hash doesn't even depend on all the bits of k: if k = 1011000111011010₂ and r = 6, then h(k) = 011010₂.
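A small sketch of the division method and the extreme deficiency above, using the slide's bit pattern; the prime 701 at the end is just an illustrative choice:

```python
def h_division(k, m):
    """Division method: h(k) = k mod m."""
    return k % m

# With m = 2**r the hash keeps only the low r bits of k and ignores the rest:
k = 0b1011000111011010
r = 6
assert h_division(k, 2**r) == 0b011010   # only the low 6 bits survive

# A prime m not too close to a power of 2 or 10 mixes in all the bits, e.g. m = 701.
print(h_division(k, 701))
```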

Division method (continued)
h(k) = k mod m.
Pick m to be a prime not too close to a power of 2 or 10 and not otherwise used prominently in the computing environment.
Annoyance: Sometimes, making the table size a prime is inconvenient. But this method is popular, although the next method we'll see is usually superior.

Multiplication method
Assume that all keys are integers, m = 2^r, and our computer has w-bit words. Define
h(k) = (A·k mod 2^w) rsh (w – r),
where rsh is the "bitwise right-shift" operator and A is an odd integer in the range 2^(w–1) < A < 2^w. Don't pick A too close to 2^(w–1) or 2^w.
Multiplication modulo 2^w is fast compared to division. The rsh operator is fast.
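A sketch of the multiplication method as stated on this slide, assuming w = 64-bit words and r = 20 (so m = 2^20); the particular odd constant A is an illustrative choice, not one prescribed by the lecture:

```python
W = 64                       # word size in bits
R = 20                       # table size m = 2**R
A = 0x9E3779B97F4A7C15       # an odd 64-bit constant with 2**63 < A < 2**64; illustrative

def h_multiplication(k, w=W, r=R, a=A):
    """h(k) = (A*k mod 2**w) rsh (w - r)."""
    return ((a * k) & ((1 << w) - 1)) >> (w - r)
```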

Multiplication method example
h(k) = (A·k mod 2^w) rsh (w – r)
Suppose that m = 8 = 2^3 and that our computer has w = 7-bit words, with A = 1011001₂ and k = 1101011₂. Then A·k = 10010100110011₂; keeping the low w = 7 bits gives 0110011₂, and shifting right by w – r = 4 bits leaves h(k) = 011₂ = 3. [Figure: the binary multiplication written out, and a "modular wheel" with the 8 slot positions showing A, 2A, 3A, … and where h(k) lands.]
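The example can be checked mechanically with the same A, k, w, and r (a quick sanity check, not part of the lecture):

```python
w, r = 7, 3                  # m = 8 = 2**3
A = 0b1011001                # 89: odd, with 2**6 < A < 2**7
k = 0b1101011                # 107
product = A * k              # 0b10010100110011
low_w_bits = product & ((1 << w) - 1)   # A*k mod 2**w
h = low_w_bits >> (w - r)               # rsh (w - r)
print(bin(product), bin(h))  # h(k) = 0b11 = 3
```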

Resolving collisions by open addressing
No storage is used outside of the hash table itself.
Insertion systematically probes the table until an empty slot is found.
The hash function depends on both the key and the probe number:
h : U × {0, 1, …, m–1} → {0, 1, …, m–1}.
The probe sequence h(k,0), h(k,1), …, h(k,m–1) should be a permutation of {0, 1, …, m–1}.
The table may fill up, and deletion is difficult (but not impossible).
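A minimal open-addressing sketch, assuming linear probing (introduced two slides below) as the concrete probe sequence and a DELETED tombstone to keep search correct after deletions; the class and marker names are illustrative:

```python
EMPTY, DELETED = object(), object()   # slot markers; DELETED is a "tombstone"

class OpenAddressTable:
    def __init__(self, m):
        self.m = m
        self.slots = [EMPTY] * m

    def _probe(self, key, i):
        # Probe sequence h(k, i); linear probing here, purely for illustration.
        return (hash(key) + i) % self.m

    def insert(self, key):
        for i in range(self.m):
            j = self._probe(key, i)
            if self.slots[j] is EMPTY or self.slots[j] is DELETED:
                self.slots[j] = key
                return j
        raise RuntimeError("hash table overflow")   # the table may fill up

    def search(self, key):
        for i in range(self.m):
            j = self._probe(key, i)
            if self.slots[j] is EMPTY:    # empty slot: unsuccessful search
                return None
            if self.slots[j] == key:      # found the key at slot j
                return j
        return None

    def delete(self, key):
        j = self.search(key)
        if j is not None:
            self.slots[j] = DELETED       # tombstone, so later searches keep probing
```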

Example of open addressing
Insert key k = 496 into a table T that already holds keys 586, 133, 204, and 481:
0. Probe h(496,0): the slot holds 204, a collision.
1. Probe h(496,1): the slot holds 586, another collision.
2. Probe h(496,2): the slot is empty, so 496 is inserted there.

Example of open addressing (continued)
Search for key k = 496: probe h(496,0), h(496,1), h(496,2) as before. Search uses the same probe sequence, terminating successfully if it finds the key and unsuccessfully if it encounters an empty slot.

Probing strategies
Linear probing: Given an ordinary hash function h(k), linear probing uses the hash function
h(k,i) = (h(k) + i) mod m.
This method, though simple, suffers from primary clustering, where long runs of occupied slots build up, increasing the average search time. Moreover, the long runs of occupied slots tend to get longer.
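As a function, linear probing is one line (h is any ordinary hash function; the sketch is illustrative):

```python
def linear_probe(h, k, i, m):
    # i-th probe for key k: (h(k) + i) mod m
    return (h(k) + i) % m
```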

Probing strategies (continued)
Double hashing: Given two ordinary hash functions h1(k) and h2(k), double hashing uses the hash function
h(k,i) = (h1(k) + i·h2(k)) mod m.
This method generally produces excellent results, but h2(k) must be relatively prime to m. One way is to make m a power of 2 and design h2(k) to produce only odd numbers.
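A double-hashing sketch following the slide's recipe, with m a power of 2 and h2 forced to be odd so it is relatively prime to m; the particular h1 and h2 below are illustrative stand-ins:

```python
def double_hash_probe(k, i, m):
    """i-th probe for key k with double hashing; m must be a power of 2."""
    h1 = hash(k) % m
    h2 = (hash((k, "salt")) % m) | 1   # setting the low bit makes h2 odd
    return (h1 + i * h2) % m
```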

Analysis of open addressing
We make the assumption of uniform hashing: Each key is equally likely to have any one of the m! permutations as its probe sequence.
Theorem. Given an open-addressed hash table with load factor α = n/m < 1, the expected number of probes in an unsuccessful search is at most 1/(1–α).

Proof of the theorem
Proof. At least one probe is always necessary.
With probability n/m, the first probe hits an occupied slot, and a second probe is necessary.
With probability (n–1)/(m–1), the second probe hits an occupied slot, and a third probe is necessary.
With probability (n–2)/(m–2), the third probe hits an occupied slot, etc.
Observe that (n – i)/(m – i) < n/m = α for i = 1, 2, …, n.

Proof (continued)
Therefore, the expected number of probes is
\[
1 + \frac{n}{m}\left(1 + \frac{n-1}{m-1}\left(1 + \frac{n-2}{m-2}\left(\cdots\left(1 + \frac{1}{m-n+1}\right)\cdots\right)\right)\right)
\le 1 + \alpha\bigl(1 + \alpha\bigl(1 + \alpha\bigl(\cdots(1 + \alpha)\cdots\bigr)\bigr)\bigr)
\le 1 + \alpha + \alpha^{2} + \alpha^{3} + \cdots
= \sum_{i \ge 0} \alpha^{i}
= \frac{1}{1-\alpha}.
\]
The textbook has a more rigorous proof and an analysis of successful searches.

Implications of the theorem
If α is constant, then accessing an open-addressed hash table takes constant time.
If the table is half full, then the expected number of probes is 1/(1–0.5) = 2.
If the table is 90% full, then the expected number of probes is 1/(1–0.9) = 10.
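The 1/(1–α) bound is easy to sanity-check numerically. A small simulation under the idealized uniform-hashing model (each probe sequence a uniformly random permutation), purely illustrative:

```python
import random

def avg_unsuccessful_probes(m, n, trials=2000):
    """Average probes to reach an empty slot when n of m slots are occupied,
    with the probe sequence modelled as a uniformly random permutation."""
    total = 0
    for _ in range(trials):
        occupied = set(random.sample(range(m), n))
        for probes, slot in enumerate(random.sample(range(m), m), start=1):
            if slot not in occupied:          # empty slot ends the search
                break
        total += probes
    return total / trials

for alpha in (0.5, 0.9):
    m = 1000
    n = int(alpha * m)
    print(f"alpha={alpha}: measured {avg_unsuccessful_probes(m, n):.2f}, "
          f"bound 1/(1-alpha) = {1/(1-alpha):.2f}")
```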