DEV Community

Moein Salari
Why Read/Write in Hash Maps Has O(1) Time Complexity⁉️🚀

Hash maps are one of the most powerful data structures in programming, known for their efficient read and write operations. But why do these operations have an average time complexity of O(1)? Let’s break it down step by step. 🚀


What Is a Hash Map?

A hash map (in JavaScript, a plain object or the built-in Map) is a data structure that stores data in key-value pairs. You can think of it as a super fast lookup table that maps each key to its corresponding value.

const myMap = {
  key1: "value1",
  key2: "value2",
};

console.log(myMap.key1); // Output: value1

How Does a Hash Map Work?

1. Hashing and the Hash Function

When you insert a key-value pair into a hash map:

  1. A hash function takes the key and computes an integer (the hash). A good hash function spreads keys evenly, but hashes are not guaranteed to be unique.
  2. The hash is then used to determine the index in an internal array where the value is stored.

For example:

function hash(key) {
  let hashValue = 0;
  for (let i = 0; i < key.length; i++) {
    hashValue += key.charCodeAt(i);
  }
  return hashValue % 100; // Modulo to fit array size
}

console.log(hash("apple")); // Output: 30 (the character codes sum to 530, and 530 % 100 = 30)

If the hash map’s internal array has 100 slots, the value for "apple" will be stored at index 30.

2. Direct Array Access

Accessing a value from an array using an index is a constant-time operation O(1). Hash maps leverage this feature by converting keys into indices.
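To see why this matters, here is a minimal illustration (the array size and index are arbitrary examples):

```javascript
// Direct array access: the index tells the runtime exactly where to look,
// so the cost does not depend on how many slots the array has.
const buckets = new Array(100).fill(null);
buckets[42] = "value for some key";

// Reading by index is a single step, whether the array
// has 100 slots or 1,000,000.
const value = buckets[42]; // "value for some key"
```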


Why Is Read/Write O(1) on Average?

Insert (Write) Operation

When you insert a key-value pair:

  1. The key is hashed to find the appropriate array index.
  2. The value is stored at that index.

const myMap = {};
myMap.name = "Moein"; // 'name' is hashed, and the value is stored
console.log(myMap); // Output: { name: 'Moein' }

Lookup (Read) Operation

When you retrieve a value:

  1. The key is hashed again to find the same array index.
  2. The value at that index is directly accessed.

console.log(myMap.name); // Output: Moein

Both insert and lookup involve only a few fixed steps (hashing the key and accessing an array index), each of which is O(1).
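The insert and lookup flow can be sketched as a toy map built on the additive hash function shown earlier. This is a simplified illustration (with chaining for collisions) — real JavaScript engines implement objects and Maps very differently:

```javascript
// A toy hash map sketch: hash → index → store/read.
// Not how JS engines work internally; illustration only.
function hashKey(key, size) {
  let hashValue = 0;
  for (let i = 0; i < key.length; i++) {
    hashValue += key.charCodeAt(i);
  }
  return hashValue % size; // index into the internal array
}

class ToyHashMap {
  constructor(size = 100) {
    this.buckets = new Array(size); // internal array of buckets
  }

  set(key, value) {
    const index = hashKey(key, this.buckets.length); // O(1): hash + index
    this.buckets[index] = this.buckets[index] || [];
    const pair = this.buckets[index].find(([k]) => k === key);
    if (pair) pair[1] = value; // overwrite an existing key
    else this.buckets[index].push([key, value]);
  }

  get(key) {
    const index = hashKey(key, this.buckets.length); // same index as set
    const pair = (this.buckets[index] || []).find(([k]) => k === key);
    return pair ? pair[1] : undefined;
  }
}

const toy = new ToyHashMap();
toy.set("name", "Moein");
console.log(toy.get("name")); // Output: Moein
```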


What About Collisions?

Collisions Happen!

Sometimes, two different keys produce the same hash (called a collision). With the additive hash function above, any two anagrams collide, because their character codes sum to the same value:

hash("apple") === hash("elppa"); // true — both hash to index 30

When this happens, multiple key-value pairs need to be stored at the same index. To handle collisions, hash maps use techniques like:

  1. Chaining: a linked list (or another data structure) is stored at the index to hold multiple key-value pairs.

const hashMap = [
  [], // Index 0
  [], // Index 1
  [["apple", "fruit"], ["banana", "yellow fruit"]], // Index 2 (collision)
];
  2. Open Addressing: the hash map probes for the next available slot and stores the value there.
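A minimal sketch of open addressing with linear probing (the table size, hash, and probe step here are simplifying assumptions, and deletion — which real implementations must handle carefully — is omitted):

```javascript
// Open addressing with linear probing: on a collision, step to the
// next slot until a free one (or the matching key) is found.
// Assumes the table never fills up completely.
const SIZE = 8;
const keys = new Array(SIZE).fill(null);
const values = new Array(SIZE).fill(null);

function hashCode(key) {
  let h = 0;
  for (let i = 0; i < key.length; i++) h += key.charCodeAt(i);
  return h % SIZE;
}

function put(key, value) {
  let i = hashCode(key);
  while (keys[i] !== null && keys[i] !== key) {
    i = (i + 1) % SIZE; // probe the next slot
  }
  keys[i] = key;
  values[i] = value;
}

function get(key) {
  let i = hashCode(key);
  while (keys[i] !== null) {
    if (keys[i] === key) return values[i];
    i = (i + 1) % SIZE; // follow the same probe sequence
  }
  return undefined;
}

put("ab", "first");
put("ba", "second"); // same character sum as "ab" → collision, probes forward
console.log(get("ba")); // Output: second
```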

Resizing and Amortized O(1)

When the hash map becomes too full (exceeding a load factor threshold), it resizes:

  1. A larger internal array is created.
  2. All existing keys are rehashed and reinserted into the new array.

function resizeHashMap(oldMap) {
  const newMap = new Array(oldMap.length * 2);
  oldMap.forEach(bucket => {
    if (bucket) {
      bucket.forEach(([key, value]) => {
        // Note: this assumes hash(key) returns the raw integer hash;
        // the earlier example already applies % 100, which would defeat the resize.
        const newIndex = hash(key) % newMap.length;
        newMap[newIndex] = newMap[newIndex] || [];
        newMap[newIndex].push([key, value]);
      });
    }
  });
  return newMap;
}

Resizing is an expensive operation, but it happens infrequently. The cost is spread over many operations, so the average complexity is still O(1).
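The resize trigger can be sketched as a simple load-factor check (0.75 is a common threshold, but the exact value varies by implementation):

```javascript
// Resize when entries / slots exceeds the load factor.
// 0.75 is a common default, assumed here for illustration.
const LOAD_FACTOR = 0.75;

function shouldResize(entryCount, slotCount) {
  return entryCount / slotCount > LOAD_FACTOR;
}

console.log(shouldResize(70, 100)); // false — still room
console.log(shouldResize(80, 100)); // true — time to grow and rehash
```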


Worst-Case Scenario

In the worst case, if:

  • The hash function is poorly designed.
  • All keys hash to the same index.
  • The hash map is overloaded.

Then every read or write requires traversing all stored items, degrading to O(n). 😢
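A deliberately bad constant hash makes the degradation easy to demonstrate — every key piles into a single bucket, and lookup becomes a linear scan:

```javascript
// A pathologically bad hash: every key lands in the same bucket,
// so the map degenerates into one long chain — O(n) lookups.
function badHash(_key) {
  return 0; // every key collides at index 0
}

const badBuckets = [[]];

function badSet(key, value) {
  badBuckets[badHash(key)].push([key, value]);
}

function badGet(key) {
  // With one giant chain, lookup must scan it: O(n).
  const pair = badBuckets[badHash(key)].find(([k]) => k === key);
  return pair ? pair[1] : undefined;
}

for (let i = 0; i < 1000; i++) badSet(`key${i}`, i);
console.log(badBuckets[0].length); // 1000 — every entry in one bucket
```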

To avoid this:

  • Use a well-designed hash function.
  • Maintain a low load factor by resizing as needed.

Key Takeaways

  • Hash maps achieve O(1) read/write time on average by leveraging hashing and direct array access.
  • Collisions are handled using techniques like chaining or open addressing.
  • Proper design and resizing ensure efficient performance.

What are your thoughts on hash maps? Do you have any tips or tricks for using them efficiently? Let me know in the comments! 💬
