In this video I explain why, in a language with normalization (also known as hash-consing), Maps and Sets are much faster.
The idea is that you can avoid arbitrarily long structural equality tests entirely.
An object is normalized if it is deeply immutable and we are guaranteed that there is no other structurally equivalent immutable object in memory. Thus all pointers to a value with that structure point to the same object in memory.
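A minimal sketch of this normalization invariant, assuming a hypothetical immutable `Pair` type interned through a pool at construction time (the names `Pair` and `_pool` are my own, not from the video):

```python
# Hash-consing sketch: every structurally equal Pair is interned,
# so at most one copy of each structure exists in memory.
class Pair:
    __slots__ = ("head", "tail")
    _pool = {}  # interning table: structural key -> canonical Pair

    def __new__(cls, head, tail):
        # Children that are Pairs are already interned, so their id()
        # is a valid stand-in for their structure in the key.
        key = (id(head) if isinstance(head, Pair) else head,
               id(tail) if isinstance(tail, Pair) else tail)
        cached = cls._pool.get(key)
        if cached is not None:
            return cached  # reuse the one canonical object
        obj = super().__new__(cls)
        obj.head = head
        obj.tail = tail
        cls._pool[key] = obj
        return obj

a = Pair(1, Pair(2, None))
b = Pair(1, Pair(2, None))
assert a is b  # structurally equal, therefore the very same object
```

Because children are interned before their parents, comparing child identities is enough to decide structural equality of the whole key, so construction never does a deep traversal.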
If a hashmap/hashset has only normalized keys, we can check for pointer equality instead of calling 'equals'.
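To illustrate, here is a toy identity-keyed map, assuming every key has already been normalized (I use Python's `sys.intern` on strings as a stand-in for a normalizing runtime; `IdentityDict` is my own name for the sketch):

```python
import sys

# A map that hashes and compares keys by identity only.
# Correct ONLY if all keys are normalized: structural equality
# of keys then coincides with pointer equality.
class IdentityDict:
    def __init__(self):
        # id(key) -> (key, value); storing the key keeps it alive,
        # so its id() cannot be recycled for another object.
        self._slots = {}

    def __setitem__(self, key, value):
        self._slots[id(key)] = (key, value)

    def __getitem__(self, key):
        return self._slots[id(key)][1]

d = IdentityDict()
k1 = sys.intern("some long normalized key")
k2 = sys.intern("some long normalized key")  # interning: k1 is k2
d[k1] = 42
assert d[k2] == 42  # O(1) identity lookup, no character-by-character compare
```

The lookup never inspects the key's contents: however large the normalized structure is, hashing and comparison are constant-time pointer operations.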