I have been a developer for 6 years now (not very long) and I have enjoyed every minute of it. Even though it was tough at times, I still wanted to continue. When I started my career (even back in university), there were already tools to automate some of the tedious parts of writing code. For example, I remember discovering IntelliJ's code generation tool for generating constructors, getters, and setters. Even with that available, most of the time I preferred to write them myself, so I would still know how when the tool wasn't at hand.
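For readers who haven't used that feature, this is the kind of boilerplate it produces. A minimal sketch (the `User` class and its fields are my own illustration, not from any particular project):

```java
// Boilerplate of the sort IDEs like IntelliJ can generate automatically:
// a constructor initializing all fields, plus a getter and setter per field.
public class User {
    private String name;
    private int age;

    public User(String name, int age) {
        this.name = name;
        this.age = age;
    }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

Trivial to write by hand, which is exactly why it is also the easiest thing to automate.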
When AI started getting popular, I approached it pretty conservatively, even though I really like testing new things. I chose this approach because I didn't see the value it brought at the time. But recently I gave in (since I can't escape it, it's everywhere) and decided to try a few tools to aid my development. I wasn't sure what to use since there is a lot to choose from (and I wanted to give AI a fair shot), so I tried the following:
- GitHub Copilot
- ChatGPT
- Claude 3
- Codeium
- Sourcegraph Cody
- Local LLMs (with ollama)
  - Codellama
  - Mixtral
At first I was impressed, as they were far better than I expected. They all offered very useful features such as code completion, code explanation, test generation, refactoring, etc. It was a far cry from the IntelliJ code generation tool I knew.
Having said all that, I still think they suck. Why? Let's start with the downsides.
First are the "hallucinations". When generating a piece of code, these tools will often imagine a library that doesn't exist but resembles a couple of real ones. This happened more often than I expected. Also, sometimes the code they suggest is simply incorrect. My point is that by the time the code is generated and you make the necessary tweaks and corrections, you could have written it yourself.
The second is that it just makes coding a lot less fun. I enjoy sitting down with a cup of coffee and just writing some code. When I get stuck, I never see it as a problem or get discouraged; I see it as a challenge to learn something new. So why waste an afternoon writing code when you can generate some with AI, fix it up, and ship it? It kills creativity, and it leads to a shallower understanding of the language and tools you are using (and of the thing you are developing).
The biggest problem I see is that developers, especially young developers at the start of their careers, rely too much on AI. I get the value it brings in helping with the tedious tasks that repeat in almost every project, but constantly using it for everything while developing is just wrong. An inexperienced developer can't tell whether the generated code is right or wrong. It may be right for some simple case they try, but it might cause more problems down the line. Even I noticed that after using AI for a couple of months, I had started to rely on it too much. An example of this is when the Copilot extension crashed: I found myself waiting about 5–6 seconds for the autocomplete of a simple function call because I was becoming lazy (to be honest, I could have typed it in the time I spent waiting).
Just to be clear, I am not dismissing AI outright, but I don't think it is at the level everyone is making it out to be. There is a lot of potential, but we are not there yet. These are just my thoughts; I would love to hear what other people think of this.
Top comments (4)
Great read and points shared. Thanks.
AI is currently most valuable for automating away lower-value, time-consuming, tedious, and repetitive tasks like testing. But the quality and understandability of the generated code are hugely important, as is the ML training method used.
That's why ML methods like reinforcement learning are far more promising than LLMs.
It's worth evaluating more specialist AI for Code products like Diffblue Cover and Tabnine.
I completely agree that when it comes to more specialized tasks like generating code, LLMs just fall apart; they are simply not made for that. The problem I have with them is that they are advertised as the solution to every problem, and people who don't understand them blindly believe everything they say.
Thanks for the suggestion on Diffblue and Tabnine, I will surely try them.
Well said about the advertising. From coding a neural network from scratch, I learned that AI doesn't actually have a brain. Even the term "neural network" has little to do with the human brain (see Mike X Cohen, a neuroscientist of many decades). It's just branding to make it sound great.