Data Structures and Algorithms

This might be a bit of a rambling post. I'm new to this. I'll try my best.

Do you want to work for one of the big boys?

Then you better get good at data structures and algorithms (DSA). Google, Facebook, Apple, etc. - from what I've gathered, they all run you through a gauntlet of questions and toy problems to test your reasoning skills and your ability to apply algorithms to coding challenges. Think LeetCode, etc.

Those problems are really hard.

Do I want to work for one of those companies?

I'm not sure. But maybe I will when my bootcamp is over, and if I do, I know I'll thank myself for getting an early start.

Let's start at the start.

While researching Hack Reactor I came across a great YouTube resource: a guy who interviews graduates of many coding bootcamps. During his Hack Reactor interview (video below), all of the panelists said they wished they had started learning DSA sooner. So I'm like, ok! This sounds like advice.

After getting through Precourse, I had a week before the immersive began - a week of free time - so I set about getting the ball rolling. There are a lot of learning resources out there, and I had a bit of analysis paralysis choosing among them, so I went with the shortest one that covered what I wanted to learn.

I've only gotten through Binary Trees (maybe a third of the way through?), but I think I've got an understanding of what this intimidating-sounding stuff is all about.

Binary Trees

Binary trees are pretty intuitive (at the level I understand them). Take an ordered list and search through it for a specific item (a number). But don't start at the front and work through to the end. Instead, examine the halfway item: if it's smaller than the number you're looking for, throw out the first half of the list; if it's too big, throw out the second half. And repeat. (This halving strategy is binary search, and it's the same idea a binary search tree is built around.)
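To make that concrete, here's a minimal sketch of that halving search in TypeScript. This is just my own illustration of the idea, not code from the course:

```typescript
// Minimal binary search sketch: returns the index of `target`
// in a sorted array, or -1 if it isn't there.
function binarySearch(sorted: number[], target: number): number {
  let low = 0;
  let high = sorted.length - 1;

  while (low <= high) {
    const mid = Math.floor((low + high) / 2);

    if (sorted[mid] === target) {
      return mid; // found it
    } else if (sorted[mid] < target) {
      low = mid + 1; // halfway item too small: throw out the first half
    } else {
      high = mid - 1; // halfway item too big: throw out the second half
    }
  }
  return -1; // ran out of list: not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3
console.log(binarySearch([1, 3, 5, 7, 9, 11], 4)); // -1
```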

Ok cool. Why search in that way? What's wrong with traversing a list from front to back? Well, for whatever reason, my brain remembered a video I had watched some time ago, so I gave it a re-watch. Here are two Google engineers role-playing their interview process:

The 'interviewee' refers to quadratic, linear, and logarithmic solutions and describes how some of these are quicker and less computationally expensive for a computer to process. What do those terms mean? Well, I went on a hunt to find out and found these articles that helped me make some sense of it all:

- algorithm-time-complexity-and-big-o-notation
- a-beginners-guide-to-big-o-notation
- how-to-calculate-time-complexity-with-big-o-notation

Ok, I think I get it. Binary search is O(log n). I'm missing some words in that sentence but you get the drift: its running time grows only logarithmically as the data set gets large.

Remember how the list is ordered? If our target number is smaller than the halfway item, it's also smaller than all of the numbers to the right of that item. In this way, we can throw away the list one half at a time without wasting cycles looking through numbers that are either too big or too small.

What does that mean? Well, let's say it takes one second to go through a computing cycle. If I have a list that is 10 items long, it'll take 10 seconds to find an item at the end of the list. If I extend the list to 20 items, it'll take 20 seconds to traverse the whole thing. The computing time grows linearly with the size of the list.
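Here's a quick sketch of that front-to-back search (again, my own illustration):

```typescript
// Linear search sketch: walk the list front to back.
// Worst case (target at the end, or missing) we touch every item,
// so the work grows in step with the length of the list.
function linearSearch(list: number[], target: number): number {
  for (let i = 0; i < list.length; i++) {
    if (list[i] === target) {
      return i; // found it
    }
  }
  return -1; // not in the list
}
```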

Well, with binary search, as the list grows, the computing time only grows logarithmically, which, as the numbers get large, provides a huge savings in cycles. Here's what I mean. Each cycle (which we were very flexibly saying took one second) throws away half the list, so finding one number in a list of ten takes about four seconds: 10 items, then 5, then 3, then 2, then 1. How long will it take to find one item out of twenty? Not twice as long, not even close. It takes just one extra cycle, because a single halving takes our list from twenty items back down to ten.
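If you want to see those halvings for yourself, here's a tiny sketch that just counts them:

```typescript
// Count how many halvings it takes to shrink a list of n items down to 1.
function halvingSteps(n: number): number {
  let steps = 0;
  while (n > 1) {
    n = Math.ceil(n / 2); // each cycle throws away half the list
    steps++;
  }
  return steps;
}

for (const n of [10, 20, 1_000, 1_000_000]) {
  console.log(`${n} items -> ${halvingSteps(n)} halving steps`);
}
// 10 items -> 4 halving steps
// 20 items -> 5 halving steps
// 1000 items -> 10 halving steps
// 1000000 items -> 20 halving steps
```

Twenty steps for a million items. That's the logarithm doing the work.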

[chart: log vs. linear growth]

This is dope.

So alright, that makes sense, let's apply it.

That's the cool thing about code. You learn something, you want to try it out. Open a code editor and boom. And a great way to learn something isn't just to read about it. It's to do it. It's way too tempting to read something and, because you can follow the explanation, assume you know it. But I don't think I really know something unless I can explain it or, in the case of code, implement it on my own.

In my next post I'll share some of the code I wrote to compare the speeds of binary and linear solutions to the simple task of finding one single number in an ordered list of numbers.

Ok, I think that's enough for now.

Conclusion

I really, really enjoyed the process I wrote about above. It answered so many questions and opened up so many more. What is space complexity? Is recursion expensive in time? Why was it so hard to write those simple search functions, and will it be easier the next time I write them?

So yeah, I think I'm going to take that advice and start studying this stuff. It might lead to more employment opportunities, and besides -- debugging and thinking pain aside -- it's fun to make things work the way you want them to.

My understanding is that smaller companies place less of an emphasis on DSA in the interview process. What if I go in that direction in my job search? Will I have wasted my time studying DSA? Nope. First of all, I may be wrong about that. And besides, it means I'll be ready for whatever they throw at me. I'll have a deeper understanding of programming and of how to write efficient code, and shoot, the nerd in me is loving it.
