Undergraduate Cracks 40-Year-Old Puzzle and Creates a Faster Hash Table!
In a surprising turn of events, a young computer science student, Andrew Krapivin, and his colleagues have made a groundbreaking discovery in the world of hash tables, a fundamental data structure used in computing. Their new approach challenges decades-old assumptions and could drastically improve the speed of searches in hash tables. Here’s how it all came about and what it means for the future of computer science.
A Life-Changing Paper
Back in 2021, Andrew Krapivin, an undergraduate student at Rutgers University, stumbled upon a paper titled “Tiny Pointers.” Initially, the paper didn’t capture his attention. However, two years later, he revisited it “just for fun” and began experimenting with its ideas. This would lead him down a path that would change the way we think about searching for data.

The core idea of the paper was to make pointers, which are references to data in memory, smaller so they consume less space. But Krapivin quickly realized that in order to make the pointers smaller, he would need to rethink how the data they point to is organized.
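One rough way to picture the idea in Python (this is only an illustration of why narrower references save space, not the construction from the “Tiny Pointers” paper; the record store, variable names, and sizes are invented for this example):

```python
from array import array
import sys

# Hypothetical store of 1,000 records; a "pointer" here is just an index into it.
records = [f"record-{i}" for i in range(1000)]

# Full-width references: a Python list keeps one 8-byte machine pointer per entry
# (on a typical 64-bit build).
fat_refs = list(range(1000))

# Tiny references: 16-bit unsigned indices are plenty to address 1,000 records.
tiny_refs = array("H", range(1000))

print(sys.getsizeof(fat_refs), "bytes to hold 1,000 full-width references")
print(sys.getsizeof(tiny_refs), "bytes to hold 1,000 16-bit indices")
```

Shrinking the references is the easy part; the harder question, as Krapivin found, is how to organize the data they point to so that such small references still work.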
Enter the Hash Table
To solve this problem, Krapivin turned to hash tables, a common and widely used data structure in computer science. Hash tables allow for fast lookups, insertions, and deletions, operations that are essential to many computing applications.
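For readers who want something concrete, here is a minimal open-addressing hash table with linear probing, written in Python. It is a textbook-style sketch meant only to show what lookups and insertions look like; it is not the data structure from Krapivin’s paper, and the class and method names are invented for this example.

```python
class LinearProbingTable:
    """Minimal open-addressing hash table (illustrative textbook version).

    This sketch assumes the table is never allowed to fill up completely.
    """

    def __init__(self, capacity=16):
        self.slots = [None] * capacity  # each slot holds (key, value) or None

    def _find_slot(self, key):
        """Walk the slots from the key's hash until we hit the key or an empty slot."""
        i = hash(key) % len(self.slots)
        while self.slots[i] is not None and self.slots[i][0] != key:
            i = (i + 1) % len(self.slots)
        return i

    def insert(self, key, value):
        self.slots[self._find_slot(key)] = (key, value)

    def lookup(self, key):
        slot = self.slots[self._find_slot(key)]
        return slot[1] if slot else None


table = LinearProbingTable()
table.insert("apple", 3)
table.insert("pear", 7)
print(table.lookup("apple"))  # 3
print(table.lookup("plum"))   # None
```

As the table fills up, those probing walks get longer, and that growing cost is exactly what the rest of this story is about.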
As Krapivin worked through his experiments, he realized he had inadvertently created a new kind of hash table, one faster and more efficient than anything previously considered possible. The design, however, still had to be proven viable.

A Historic Discovery
Krapivin wasn’t sure whether his new design was groundbreaking, so he turned to his former professor, Martín Farach-Colton, and his colleague William Kuszmaul for guidance. Farach-Colton was initially skeptical; after all, hash tables are among the most studied structures in computer science. But Kuszmaul, who has extensive experience in the field, recognized the significance of Krapivin’s work right away: the new design didn’t just offer a neat improvement, it disproved a 40-year-old conjecture about hash table operations.
This was no small feat. The conjecture, put forward by computer scientist Andrew Yao in 1985, stated that for a widely used class of hash tables, the worst-case time to search for an element (or to find an empty slot) could not beat a bound tied to how full the table is. Yao’s conjecture had stood for decades, and most computer scientists accepted it as fact. But Krapivin’s new hash table design challenged this notion.

Breaking the Boundaries of Speed
In a paper published in January 2025, Krapivin, Farach-Colton, and Kuszmaul demonstrated that the new hash table could indeed perform searches much faster than previously thought possible. In traditional hash tables, the cost of these operations grows with a quantity x that tracks how full the table is (a table that is 99 percent full corresponds to x = 100, for example). Krapivin’s new hash table allows for search and insertion times that scale far more gracefully, proportional to (log x)² rather than to x.
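To get a feel for the gap, here is a quick back-of-the-envelope comparison in Python. It omits constant factors and uses base-2 logarithms purely for illustration; the specific numbers are not from the paper.

```python
import math

# x grows as the table fills up: x = 100 corresponds to a table
# that is roughly 99 percent full.
for x in (10, 100, 1_000, 10_000):
    old_bound = x                  # cost that grows in proportion to x
    new_bound = math.log2(x) ** 2  # cost that grows in proportion to (log x)^2
    print(f"x = {x:>6}:  x-style cost ~ {old_bound:>6},  (log x)^2-style cost ~ {new_bound:7.1f}")
```

At x = 10,000 (a table that is 99.99 percent full), the (log x)² figure is smaller than x by more than a factor of fifty.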
This discovery doesn’t just challenge old assumptions; it redefines the limits of hash table performance. The team’s work has far-reaching implications for any system that relies on hash tables, from databases to search engines.
A Deeper Dive Into Yao’s Conjecture
The breakthrough goes beyond just disproving Yao’s conjecture about worst-case scenarios. In his 1985 paper, Yao also addressed the average time it takes to perform queries in hash tables. He had shown that for so-called greedy hash tables, which always place a new element in the first open slot they find, the average query time could never be better than logarithmic in x.

However, Krapivin’s team found that this limitation doesn’t apply to non-greedy hash tables, which are free to pass over an open slot and place an element elsewhere. They demonstrated that there are scenarios where the average query time can remain constant, regardless of how full the hash table is. This was another surprising result, one that even the authors themselves didn’t anticipate.
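The greedy/non-greedy distinction is easy to state in code: a greedy insertion takes the first empty slot on a key’s probe sequence, while a non-greedy insertion is allowed to pass one by. The Python sketch below illustrates only that freedom; the skipping rule is invented for this example and is not the strategy from the paper.

```python
def greedy_insert(table, probe_seq):
    """Greedy: take the FIRST empty slot along the key's probe sequence."""
    for pos in probe_seq:
        if table[pos] is None:
            return pos
    raise RuntimeError("no empty slot on this probe sequence")


def non_greedy_insert(table, probe_seq, lookahead=4):
    """Non-greedy (illustrative rule only): look a few probes ahead and
    deliberately skip earlier empty slots, e.g. to keep them free for
    keys inserted later."""
    window = [pos for pos in probe_seq[:lookahead] if table[pos] is None]
    if window:
        return window[-1]  # pass over the earlier empty slots in the window
    return greedy_insert(table, probe_seq[lookahead:])


table = [None] * 8
probe_seq = [3, 5, 0, 6, 1]
print(greedy_insert(table, probe_seq))      # 3 -- the first empty slot
print(non_greedy_insert(table, probe_seq))  # 6 -- slots 3, 5, 0 are skipped
```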
The Importance of the Discovery
While these results may not immediately lead to practical applications, they have profound implications for the future of computer science. Alex Conway of Cornell Tech emphasized that understanding these data structures more thoroughly can unlock new ways to improve computing systems down the line. Even if this breakthrough doesn’t lead to an immediate change in real-world applications, it pushes the boundaries of what we know about hash tables and how data can be organized and accessed more efficiently.
Conclusion
Krapivin’s work, along with Farach-Colton and Kuszmaul’s contributions, has rewritten the rulebook for hash tables. By showing that searches can be much faster than previously believed, they have not only disproven a 40-year-old conjecture but also set the stage for future innovations in data storage and retrieval systems. This research exemplifies how a fresh perspective can lead to breakthroughs that change our understanding of long-established problems.