This is an extension of a problem we discussed in an earlier post.
I like this problem more because there is more thinking involved, and it forces us to write the algorithm without using extra space.
The first hint to solve this problem: how could we utilize the fact that the input array is SORTED?
Whenever I see SORTED in a problem statement, 'Binary Search' is the first thing that comes to my mind, but it won't work in this case because we have to find all the duplicates that exist inside the array, so we have to visit every item.
Visiting all 'n' items in the array gives us O(n) time complexity, which defeats the purpose of using Binary Search, famous for its logarithmic O(log n) time complexity.
Could we utilize the SORTED order of the items to get a linear-time algorithm without using extra space and without using Binary Search?
How about we compare each element with its neighbor: if they are equal, add the element to the list of duplicates; otherwise, write the distinct element back into the array.
We are going to use two pointers to solve this: one pointer compares an element with its neighbor, and the other maintains the position of the last distinct element.
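The original code listing did not survive, so here is a minimal sketch of the two-pointer idea described above (the function name and variable names are my own):

```python
def find_duplicates_sorted(arr):
    """Scan a sorted list with two pointers.

    `write` marks the last distinct element kept so far; `read` walks
    the array comparing each element with that neighbor. Distinct
    elements are written back in place, duplicates are collected.
    """
    if not arr:
        return []
    duplicates = []
    write = 0  # index of the last distinct element
    for read in range(1, len(arr)):
        if arr[read] == arr[write]:
            # Neighbors match: record the duplicate once.
            if not duplicates or duplicates[-1] != arr[read]:
                duplicates.append(arr[read])
        else:
            # Distinct element: move it next to the previous distinct one.
            write += 1
            arr[write] = arr[read]
    del arr[write + 1:]  # keep only the distinct prefix
    return duplicates
```

For example, calling it on `[1, 1, 2, 3, 3, 3, 4]` returns the duplicates `[1, 3]` and compacts the array to `[1, 2, 3, 4]`, in a single O(n) pass with only two index variables of extra state.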
Let's walk through the code above with an example:
We utilized the SORTED property of the array by comparing neighboring elements with each other. If the array were unsorted, we would need an extra data structure like a Set and would not be able to do an in-place algorithm in linear time.
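For contrast, a sketch of the Set-based approach the unsorted case would force on us (names are illustrative): still O(n) time, but it pays O(n) extra space for the two sets.

```python
def find_duplicates_unsorted(arr):
    """Set-based duplicate finding for unsorted input.

    `seen` tracks every value encountered; a repeat lands in `dups`.
    Linear time, but requires O(n) auxiliary space.
    """
    seen, dups = set(), set()
    for x in arr:
        if x in seen:
            dups.add(x)
        else:
            seen.add(x)
    return sorted(dups)
```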
There are a couple more variations of the find-duplicates problem that I want to try and write about in the coming days.
Stay tuned. :)