## Cuckoo Hashing Part 2 - Intro to Parallel Programming


Showing Revision 6 created 05/24/2016 by Udacity Robot.

No, it definitely will not always succeed. There are some nice probabilistic guarantees about how often it will succeed, depending on the size and number of the hash tables, but the easy counterexample is this: here we have 2 hash tables. If we had 3 items, each of which had the same H1 and H2 (for instance, 3 items where H1 and H2 were both 0), there's no possible way to fit them into the hash tables, because we only have 2 slots where any hash function is equal to 0. So in practice we choose a maximum number of iterations, and we continue to iterate, trying to fill up this hash table, until we decide that we've done too many iterations. If that's the case, then we just stop, choose new hash functions, and start over. Again, there are very nice probabilistic guarantees about how often this is going to finish. In the research that inspired this work, the guarantee that we tried to use was that construction would fail less than 1 out of every million times.
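The insert-evict loop with an iteration cap and a restart under fresh hash functions can be sketched as follows. This is a minimal illustration, not the course's implementation: the table size, the iteration cap, and the salted hash functions are all assumptions chosen for clarity.

```python
import random

class CuckooHashTable:
    """Sketch of 2-table cuckoo hashing with a bounded eviction chain.
    Table count, table size, and the hash family are illustrative."""

    def __init__(self, size=8, max_iterations=32):
        self.size = size
        self.max_iterations = max_iterations
        self._new_hash_functions()
        self.tables = [[None] * size, [None] * size]

    def _new_hash_functions(self):
        # Random salts stand in for drawing new functions from a hash family.
        salts = [random.getrandbits(32) for _ in range(2)]
        self.hashes = [lambda key, s=s: hash((s, key)) % self.size
                       for s in salts]

    def insert(self, key):
        for _ in range(self.max_iterations):
            for t in range(2):
                slot = self.hashes[t](key)
                # Swap the incoming key with whatever occupies the slot.
                key, self.tables[t][slot] = self.tables[t][slot], key
                if key is None:      # evicted an empty slot: we're done
                    return True
        # Too many iterations: choose new hash functions and start over.
        self._rehash()
        return self.insert(key)

    def _rehash(self):
        # May recurse if the new functions are also unlucky.
        old = [k for table in self.tables for k in table if k is not None]
        self.tables = [[None] * self.size, [None] * self.size]
        self._new_hash_functions()
        for k in old:
            self.insert(k)
```

The eviction swap is what gives cuckoo hashing its name: an incoming key kicks out the current occupant, which then tries its slot in the other table, and so on until the chain terminates or the iteration budget runs out.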
So once we construct the hash table, the lookup procedure is really simple. We calculate all the hash functions for the item that we want to look up. For instance, if I want to look up item B, I calculate item B's hash functions: here hash function 1 is equal to 0, and hash function 2 is equal to 1. Then I look in all the tables, using the corresponding hash functions, until I find what I'm looking for. First I look in table 1, and I know to look in slot 0; I look in slot 0 and say, "Wait a second, that's not B." So then I go to table 2, see that B's second hash value is equal to 1, look in slot 1 of table 2, and say, "Ah, there's B!" I've now found the value I'm looking for.
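The lookup walk just described, one probe per table, can be sketched as a short function. The function name and the tiny example tables are illustrative; the hash values mirror the lecture's item B, where hash function 1 gives 0 and hash function 2 gives 1.

```python
def cuckoo_lookup(key, tables, hash_functions):
    """Probe each table exactly once, at the slot its hash function names."""
    for table, h in zip(tables, hash_functions):
        if table[h(key)] == key:
            return True          # found in this table
    return False                 # probed all T tables: not present

# Mirroring the lecture's example: table 1 slot 0 holds A, table 2 slot 1 holds B.
tables = [["A", None], [None, "B"]]
h1 = lambda key: 0   # toy hash: every key maps to slot 0 in table 1
h2 = lambda key: 1   # toy hash: every key maps to slot 1 in table 2
assert cuckoo_lookup("B", tables, [h1, h2])
```

Looking up B first probes table 1 slot 0 (finds A, not B), then table 2 slot 1 (finds B), exactly as in the walkthrough above.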
Now, if we don't find it in any of these locations, it's just not in the hash table. The nice part here is that this is a constant-time lookup: it requires T lookups, and T is a constant; it might be 2, it might be 3, and so on. This is different from chaining, which has a variable-time lookup. With chaining, the cost depends on how many items are in the bucket, and if there are many items and we have to look all the way to the end of the chain, it can potentially take a very long time, whereas with cuckoo hashing we can guarantee exactly how much work is required, and it's a constant amount of work to look up any item in these hash tables.
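For contrast, here is a minimal sketch of the chained lookup being compared against (the function name and bucket layout are illustrative). Its cost is a walk down one bucket's list, so it grows with the chain length instead of being a fixed T probes.

```python
def chained_lookup(key, buckets, h):
    """Chaining: hash once, then scan the bucket's chain item by item.
    The scan length varies with load, unlike cuckoo hashing's T probes."""
    for item in buckets[h(key)]:
        if item == key:
            return True
    return False

# A skewed table: one long chain, one empty bucket.
buckets = [["x", "y", "z"], []]
h = lambda key: 0    # toy hash: everything lands in bucket 0
assert chained_lookup("z", buckets, h)   # walks the whole chain
```

Finding "z" here takes three comparisons because it sits at the end of its chain; a cuckoo lookup over 2 tables would never take more than 2 probes.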