Yesterday, I came across a problem on the HackerRank platform that needed to be solved as part of a pair-programming challenge.
Sharing the problem and a solution with all of you.
A friend has gifted Alex a movie collection, and Alex is excited to watch them all as quickly as possible. The durations of the movies are given in an array durations[n], where n is the number of movies, and each movie duration lies between 1.01 and 3.00 units of time (up to two decimal places). Every day, Alex wants to spend no more than 3.00 units of time watching movies, but also wants to finish the collection in the fewest days possible. Alex does not leave a movie unfinished: once Alex picks up a movie, Alex watches it to the end on the same day. Find the minimum number of days needed to watch all the movies.
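The key observation is that every movie lasts more than 1.00 unit, so at most two movies can fit into a 3.00-unit day. That reduces the problem to pairing: sort the durations, then greedily match the shortest remaining movie with the longest one that still fits. Here is a sketch in Python (my choice of language; the original challenge did not specify one). To sidestep floating-point comparison issues, it works in integer hundredths of a unit:

```python
def min_days(durations):
    # Each movie exceeds 1.00 unit, so three movies would exceed
    # 3.03 units; a day holds at most two movies. Work in integer
    # hundredths to avoid floating-point comparison surprises.
    movies = sorted(round(d * 100) for d in durations)
    lo, hi = 0, len(movies) - 1
    days = 0
    while lo <= hi:
        # Always watch the longest remaining movie today; also watch
        # the shortest one if both fit within 3.00 units (300 hundredths).
        if lo < hi and movies[lo] + movies[hi] <= 300:
            lo += 1
        hi -= 1
        days += 1
    return days

print(min_days([1.01, 2.40, 1.01, 1.01, 1.40]))  # -> 3
```

Sorting dominates the running time, so this runs in O(n log n) time and O(n) space. The greedy pairing is the same exchange argument used in the classic "boats to save people" problem: if the longest movie can share a day with anything, it can share with the shortest, so pairing them is never worse than any other pairing.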
I hope some of you will have a more optimized solution.
Looking forward to hearing your thoughts on it.