These aren't always useful on the job, but they get thrown around the industry anyway. If you're someone who already knows HTML, CSS, and JS fundamentals and is trying to get up to scratch like me... maybe you'll find this useful. I don't claim to know all of the following areas; I lifted them from job posts, chats with peers, and hearsay while in junior roles. Take it with a grain of salt.
Several years ago I drank the Kool-Aid and did a bootcamp. I shopped around carefully and wanted to specialize in JavaScript, because as an artist I wanted to animate things in the browser. Having been an overachiever up until that point, I believed I could learn anything I put my mind to. I was wrong: despite my enthusiasm for the subject, I just wasn't able to carry over intermediate information and draw logical conclusions from it. My teacher picked up that I had ADHD (seeking treatment for it afterwards has also helped me cope with mounting responsibilities, etc.).
Aditya Y. Bhargava's "Grokking Algorithms" has been particularly helpful in learning the first four concepts.
If anyone has anything else to add or wants to correct my assumptions of what's fundamental to a dev gig, feel free to add!
Binary Trees
Sorting algorithms/methods
Whiteboarding
Big-O notation
GitLab and Git Flow
BEM and modular CSS
JavaScript patterns
Functional Programming
State management
Unit Testing
Test Driven Development
Style Guides
Virtual machines and containerization
WebSockets/webhooks
"Nice to haves":
GraphQL
D3.js or WebGL
Deployment experience with Amazon Web Services
Latest comments (33)
I do know of bootcamps that don't cover that kind of thing and only do jQuery and WordPress. I definitely shopped for something I didn't already know, but there are only so many waking, productive hours even in 8-12 weeks. I didn't go from 0 to 1 (I'd played with code quite a bit before that), and I'm astounded by those who walk in without knowing a lick of code and come out as pro devs. Good for them!
The most valuable skills as a developer are getting things done and weighing the time spent on an issue against what it earns plus how easy the result is to refactor and maintain.
There's a nice-looking book for gap-filling called The Imposter's Handbook. I haven't read it yet, but I did read something similar years ago called Theoretical Introduction to Programming (that Goodreads review is mine).
I think these topics mostly divide into: pure-CS, practical computer skills, and knowledge of specific tools.
IMO the best reason to learn CS is that it's awesome. Someone who's interested in it for its own sake will likely become a better and more versatile coder than the person who always asks "is this gonna be on the test?"
There's also the Blub problem, which I'll argue is real: once you need a thing, unless you've seen it before, you won't know what to look for. I've spent some time helping high schoolers learn how to do this work, and I see that play out often. We learn much faster when we already understand a related topic.
As a specific example -- my understanding of git went from murky to crystal clear the moment I read that git just traverses a directed acyclic graph. I'd coded up DAGs before, so that realization was all it took to end 6 months of fumbling. What's worse is that, if you'd asked me during that time, I would have said that I understood git.
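To make that concrete, here's a toy sketch in JavaScript of the same idea (not git's actual internals, just the shape of it): commits are nodes that point at their parents, and walking the history is a graph traversal.

```javascript
// Toy model of a commit history as a directed acyclic graph.
// Each commit points at its parent(s); branches and merges are just extra edges.
const commits = {
  a1: { message: "initial commit", parents: [] },
  b2: { message: "add homepage", parents: ["a1"] },
  c3: { message: "fix typo", parents: ["b2"] },
  d4: { message: "feature branch work", parents: ["b2"] },
  e5: { message: "merge feature", parents: ["c3", "d4"] }, // merge commit: two parents
};

// Roughly what `git log` does conceptually: start at a commit and follow
// parent pointers, visiting each commit once (real git orders the output
// more carefully, by topology and date).
function log(startId) {
  const seen = new Set();
  const stack = [startId];
  while (stack.length > 0) {
    const id = stack.pop();
    if (seen.has(id)) continue;
    seen.add(id);
    console.log(`${id} ${commits[id].message}`);
    stack.push(...commits[id].parents);
  }
}

log("e5");
```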
Re: practical computer knowledge -- you need it, full stop. I worked as a sysadmin before programming and that comes in handy constantly. More than once, in my last web application job, knowing how to build from C source, for a specific architecture, including customizing the build, ended up making a difference in what I was able to accomplish. None of that is terribly difficult, but these days a large number of people have never even attempted it. Computer networking is very important too, assuming you deliver your product over a network.
The specific tools are what they are. It mostly comes down to reading the docs. But that combo of CS concepts and practical computer knowledge will give you a much better starting point for understanding what a particular tool does, and why.
Git was covered. Just not GitLab or Git Flow. I heard about those at my jobs ;)
Hey Jen,
There are many startups in Toronto with middle managers who are Waterloo graduates and use CompSci as their qualifier, but on the job you would never use these skills.
Here is what I think is fundamental:
❌ Binary Trees
❌ Sorting algorithms/methods
❌ Whiteboarding
❌ Big-O notation
❌ GitLab and ✅ Git Flow
❌ BEM and modular CSS
🤷 JavaScript patterns
❌ Functional Programming
❌ State management
✅ Unit Testing
✅ Test Driven Development
🤷 Style Guides
🤷 Virtual machines and containerization
🤷 WebSockets/webhooks
I could definitely produce a list of what is missing from the Toronto bootcamps and what the Toronto tech market really wants as fundamental skills.
Does the ❌ mean fundamental or nah?
I've done about a dozen interviews and had 2 entry level jobs... the bar is all over the place 🤷
It seems like having experience with OSS is also a good look.
But maybe first, I should pass the test.
❌ means you don't need it.
From the post it seems like git itself was likely used, just not GitLab or Git Flow, which are a hosting service and a branching workflow built on top of git. I looked at GitLab once and don't know what Git Flow is.
I learned about the concept of TDD at my bootcamp circa 2016, but we wrote no tests and relied on manual testing. I've heard a bunch about testing, but we don't do it at work either :/ Yeah, that's not optimal, but it's the reality for a lot of places.
Yeah, same: they didn't have enough time to cover TDD in mine. I've learned about testing through the Koans learning methodology, though Jest/Mocha still isn't entirely clear to me.
The world truly is my oyster when everything is out there and I just have to sit down and do it.
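For anyone else in the same boat, a single Jest unit test is smaller than it sounds. A minimal sketch (the `add` function and file names are made up for illustration; `test` and `expect` are Jest's real globals):

```javascript
// add.js — the code under test (a hypothetical example function)
function add(a, b) {
  return a + b;
}
module.exports = add;

// add.test.js — Jest picks up files ending in .test.js when you run `npx jest`
const add = require("./add");

test("adds two numbers", () => {
  expect(add(2, 3)).toBe(5); // assertion: the test fails if the result isn't 5
});
```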
I would say it's more about being aware of these topics, unless the role you're performing requires that specific knowledge. Otherwise, let your job teach you the long-term knowledge in your field rather than being expected to know all of those items up front, even when they're not needed in your current role.
As far as what you should know: problem solving and the process of problem solving in the context of designing and writing software.
This comes from practice and having a good foundation of how to approach problems.
For example: coderhood.com/5-problem-solving-sk...
I think another “nice to have” skill would be a general understanding of how computers work under the hood. Things like how RAM stores information, how bits and bytes work, how the CPU processes data, what the heap and the stack are, etc.
This kind of knowledge definitely isn't necessary to be a developer, and most devs won't ever need to worry about such low-level concepts, but since I started learning them a few weeks ago, I've found that a lot of the things I actually do work with on a daily basis make a lot more sense.
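For a tiny, concrete taste of the bits-and-bytes part from inside JavaScript (a sketch, not the full picture): typed arrays expose raw bytes, and bitwise operators work on the bits inside them.

```javascript
// A Uint8Array is a block of raw bytes; each slot holds an integer from 0 to 255.
const bytes = new Uint8Array(2);
bytes[0] = 0b1010; // binary literal: 8 + 2 = 10
bytes[1] = 256;    // one more than a byte can hold...

console.log(bytes[0]);             // 10
console.log(bytes[0].toString(2)); // "1010" — the underlying bits
console.log(bytes[1]);             // 0 — a byte wraps around past 255

// Bitwise operators manipulate those bits directly.
console.log(bytes[0] & 0b0010); // 2  (AND keeps only the bits set in both)
console.log(bytes[0] | 0b0001); // 11 (OR switches an extra bit on)
```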
I'm wondering what you mean by "low level" as opposed to "high level", since I hear those terms used a lot.
Both terms are kind of relative, but basically the closer you get to the computer hardware, the lower the level.
So when talking about languages, machine code (binary) is the lowest-level language, assembly is slightly "higher" but still extremely low, and C is an even "higher" level language, but still relatively "low" compared to Ruby or Python, which both take care of things like garbage collection and memory allocation for you.
I used it to refer to the actual hardware (RAM, CPU, etc.), which is about as “low” as you can get.
Also, like I mentioned above, it's usually a relative term, which sometimes makes it hard to pin down. Someone who works in Python might refer to C as a low-level language, while someone who works in assembly might call C a high-level language. What they really mean is that it's a "higher" or "lower" level language than what they're used to writing.
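A small sketch of the "takes care of memory for you" part, using JavaScript as the higher-level language (the comparison to C in the comments is just for illustration):

```javascript
// In a higher-level language like JavaScript you just create values;
// memory is allocated for you, and the garbage collector reclaims it
// once nothing references it any more.
function makeUser(name) {
  return { name, createdAt: new Date() }; // allocation happens implicitly
}

let user = makeUser("Jen");
user = null; // no free()/delete needed; the GC cleans up the object eventually

// In a lower-level language like C you would typically malloc() the struct
// yourself and remember to free() it, and forgetting to do so leaks memory.
```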
Thank you for the detailed and comprehensive explanation! Term demystified :D
Maybe it's a bit of intergenerational tension, but for some reason I think of C as being from "my dad's time", even though of course people still use it. It's probably because my dad really hoped I would learn C when I was younger.
lol is "replacing an iPhone screen" a CV bullet point? 🤔
The best thing you can do when learning to code is learning how to learn. You don’t have to know everything all at once. In fact, many of the concepts you listed are popular as interview questions but less relevant as an employed developer. But feeling confident in your ability to identify patterns and apply them is key.
re: "learning how to learn" I agree! and learning to be patient with a learning curve. It's impossible to know everything but the breadth of requirements is definitely frustrating.
Interesting. I don't want to share a big rant, but I related to a bunch of your comments about the bootcamp experience: there was so much more to learn, and it became hard to gauge what was important or not.
I've worked as a web developer for two years now and I have never had to understand or apply Big-O notation or a bunch of the things on that list. I've looked at articles about Big-O plenty of times, but I've never had to use it in any meaningful way. Meanwhile, I've learned plenty of other things that aren't on this list.
Someone else said that what's fundamental will really be whatever particular tools are being used on the job, and I agree. As soon as you get to read the tech stack on the job posting, talk to people and research how you will apply those tools.
@codemouse92 @joshhadik @airbr I guess, as lament-y and laundry-list-like as this post sounds, I realize I'm at the juncture of "I know enough to get a job" but "I don't know enough to keep a career", and the only thing that will make me better is study, practice, patience, and taking it as it comes.
As long as you keep learning even after you get a job you should have everything you need to keep a career. Don’t make the same mistake I made after bootcamp (and am still recovering from lol) of trying to learn everything first. It’s good to have some idea of the skills you’re lacking and try to go over them in your off time, but real-world experience is way more important.
I wasted a lot of time trying to learn everything before applying for a job and realized that it's literally a never-ending process. For every one thing I learn I'm introduced to three or more new things I've never even heard of before.
SAME.