Focusing on performance from the beginning often has little benefit. Worse: it makes your code more complicated, and you should fight that instead of trying to get back some nanoseconds.
I agree that over-optimization can be counter-productive. I will say, though, that the kind of performance "optimization" that squeezes out nanoseconds or milliseconds is not the same as algorithmic optimization measured in big-O. The latter often makes the difference between getting a result within a reasonable time frame and not getting one at all.
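To make that concrete, here is a rough back-of-the-envelope sketch (plain Python, illustrative numbers only, not from the original discussion) of how fast each class grows:

```python
import math

# Rough operation counts per complexity class. At ~10^9 simple
# operations per second, anything much beyond ~10^12 operations
# stops fitting in "a reasonable time frame".
for n in (10, 20, 30):
    print(f"n={n:>2}  n*log2(n)={n * math.log2(n):>6.0f}  n!={math.factorial(n):.2e}")
```

Already at n=20, n log n is under a hundred operations while n! is about 2.4e18, which would take decades at a billion operations per second. No amount of nanosecond-level tuning closes that gap.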
It depends on the context. Going from linear time to logarithmic time, for example, won't bring much unless your input is very big. Of course, going from factorial time to logarithmic time will do more than improve your execution time: it will allow you to run your algorithm at all :D
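For illustration, a minimal sketch of that linear-vs-logarithmic point (standard-library Python; the function names and the ten-million-element input are mine, chosen to make the gap visible):

```python
import bisect

data = list(range(10_000_000))  # sorted input

def linear_search(items, target):
    # O(n): walks the list element by element
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):
    # O(log n): halves the search space each step (requires sorted input)
    i = bisect.bisect_left(items, target)
    return i if i < len(items) and items[i] == target else -1

# On ten million elements, binary search needs ~23 comparisons where
# the linear scan may need millions; on a ten-element list the
# difference is unmeasurable.
assert linear_search(data, 9_999_999) == binary_search(data, 9_999_999)
```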
Logarithmic time is actually faster than linear time. You are of course correct that these differences are meaningful only when the input is big.
Oops, you're right. My bad. I've edited my answer :) Thanks!