I wasn't able to write that code during an interview. I'd been asked to scrape 200 unique posts from a JSON feed.
require 'net/http'
require 'json'

url = "https://url-to-json.feed"
posts = []

loop do
  body = JSON.parse(Net::HTTP.get(URI(url)))

  body["data"].each do |new_post|
    # Linear scan of everything collected so far for a matching ID
    id_exists = posts.detect { |post| post["id"] == new_post["id"] }
    posts << new_post unless id_exists
  end

  break if posts.count >= 200

  # Follow the feed's pagination link to the next page
  url = body["paging"]["next"]
end
Instead of that dumb and simple 15 lines of code, I started creating functions, piping them into each other, doing some bullshit abstractions, and finally fucked it up.
Top comments (4)
One thing that basically jumps out at you when you look at that code is how you do a linear search through the array for every new element, so if there aren't any repeats you end up checking the IDs of 199 + 198 + 197 + ... + 2 + 1 elements (nearly 20,000 comparisons).
This means your solution will scale poorly if you start dealing with larger numbers. Maybe not something you'd worry about in the real world*, but definitely something an interviewer might point out.
The solution would be to use a hash to keep track of IDs you've already seen :D (there's a sketch below the footnote)
* I probably made it sound like this isn't something to worry about in the real world at all, but that isn't true. I often end up dealing with files that have several thousand if not millions of records, and sometimes things like searching for duplicates need to happen before feeding the whole thing into a database. So this is definitely a real world issue, just not necessarily for every job.
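A minimal sketch of that hash-based approach, assuming the same feed shape as the post above ("data" plus "paging"/"next"; the URL is still a placeholder):

require 'net/http'
require 'json'

url = "https://url-to-json.feed"  # placeholder feed URL from the post
posts = []
seen_ids = {}  # hash membership checks are O(1) on average

loop do
  body = JSON.parse(Net::HTTP.get(URI(url)))

  body["data"].each do |post|
    next if seen_ids[post["id"]]  # skip IDs we've already collected
    seen_ids[post["id"]] = true
    posts << post
  end

  break if posts.count >= 200

  url = body["paging"]["next"]  # follow the pagination link
end

Same control flow as the original, but each duplicate check is a constant-time hash lookup instead of a scan over everything collected so far, so the total work grows linearly with the number of posts fetched.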
Remember that when you don't achieve what you went for, you can either look at it as a "failure" or a "lesson".
If you learn from the "lessons" you'll achieve your goal faster than otherwise.
So keep learning and moving forward!
You'll get them next time!
Embrace How Random the Programming Interview Is
Ben Halpern ・ Mar 4 '17 ・ 2 min read
You failed so that you'll learn and do better next time. There's always a light at the end of the tunnel.