Hey, fellow devs👋!
I just tackled a problem that taught me a valuable lesson about time complexity: always consider the worst-case scenario, even when average-case performance seems good ⚠️.
Here's the gist:
- Problem: Find a subarray with a sum of zero.
- Constraints:
  - Array size: 1 to 2*10^5
  - Element range: 1 to 10^9
  - Total sum of elements: ≤ 2*10^5
My initial approach:
- Used an unordered_map to store prefix sums for efficient lookup.
- Average-case lookup: O(1), so the whole solution runs in O(n) on average ✨
- Worst-case lookup: O(n) per query, which is what led to the TLE ⚠️
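
Roughly, my first attempt looked like this (a minimal sketch of the prefix-sum idea, not my exact submission — assuming the task only asks whether such a subarray exists):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Returns true if some non-empty subarray of `a` sums to zero.
// Idea: if the same prefix sum appears twice, the elements between
// those two positions must sum to zero.
bool hasZeroSumSubarray(const vector<long long>& a) {
    unordered_map<long long, int> seen; // prefix sum -> first index seen
    long long prefix = 0;
    seen[0] = -1;                       // empty prefix before index 0
    for (int i = 0; i < (int)a.size(); ++i) {
        prefix += a[i];
        if (seen.count(prefix)) return true; // repeated prefix sum found
        seen[prefix] = i;
    }
    return false;
}
```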
The TLE surprise:
- Got a Time Limit Exceeded (TLE) verdict, specifically during the lookup operations in the unordered_map.
- Why? unordered_map lookups degrade to O(n) when many keys collide into the same bucket, so under the given constraints the program's worst case becomes O(n^2) — far too slow for n up to 2*10^5.
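
To see that degradation in action, here's a small experiment. It assumes GCC's libstdc++, where std::hash for integers is the identity function, so keys that are all multiples of the bucket count pile into a single bucket (this is the classic "anti-hash test" trick judges and hackers use to break hash-map solutions):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Hypothetical demo, assuming libstdc++ (GCC): std::hash<long long> is
    // the identity, so keys that are all multiples of bucket_count() fall
    // into one bucket and every insert/lookup walks a long chain.
    unordered_map<long long, int> bad;
    bad.reserve(1 << 15);                  // fix the bucket count up front
    const long long step = bad.bucket_count();

    auto time_inserts = [](unordered_map<long long, int>& mp,
                           long long stride, int n) {
        auto t0 = chrono::steady_clock::now();
        for (long long i = 1; i <= n; ++i) mp[i * stride] = 1;
        auto t1 = chrono::steady_clock::now();
        return chrono::duration_cast<chrono::milliseconds>(t1 - t0).count();
    };

    unordered_map<long long, int> good;
    good.reserve(1 << 15);
    cout << "spread-out keys: " << time_inserts(good, 1, 10000) << " ms\n";
    cout << "colliding keys:  " << time_inserts(bad, step, 10000) << " ms\n";
}
```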
The solution:
- Switched to map (std::map, a self-balancing binary search tree with O(log n) worst-case per operation).
- Worst-case time complexity of my code became O(n log n): O(n) to traverse the array and O(log n) per lookup.
- Accepted solution!
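
The fix itself is a one-line change in the sketch above — swap the container, since std::map keeps its keys in a balanced tree and there are no hash collisions to exploit:

```cpp
map<long long, int> seen; // red-black tree: O(log n) per operation, even in the worst case
```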
Key takeaways:
Compare the two containers:
- unordered_map: O(1) lookup on average, but O(n) in the worst case.
- map: O(log n) lookup, even in the worst case.

These complexities tell you a lot about when to use which container. map stores its keys in sorted order while unordered_map does not; this problem doesn't need sorted data, which is why I reached for unordered_map in my first attempt — but its worst case is what cost me. And the constraints cut both ways: if the array had around 10^9 elements instead of 2*10^5, even the O(n log n) map solution would be too slow. So:
- Always analyze worst-case time complexity, especially when dealing with large inputs.
- Don't rely solely on average-case performance.
- Choose data structures wisely based on problem constraints and potential worst-case scenarios.
Happy coding!!