Jesus, take the wheel. 🚗
And GitHub Copilot, take the IDE. 💻
GitHub says 92% of US devs are using Copilot.
What. Seriously?!
When did you hear 92...
This article is the antithesis of what a real developer should be. In the first instance, take advantage of AI and learn from it; however, you should avoid taking what AI produces for granted.
You should start with the documentation and build up from there.
Personally, I don't feel threatened by AI, as I make decisions based on human experience, while AI only produces content as a reflection of past experience.
Should a dev be afraid of AI? Not at all.
Good take, Erick.
Devs just need to be responsible when introducing AI tools into their workflows: starting small and seeing how it works, what it's good at, and what it's not.
Though I am iffy on using terms like "real developer".
I slightly disagree; I think devs should make mistakes with AI and then maybe learn from them. But ofc mistakes could be costly for orgs. That's where PR reviews and senior expertise come in, right?!
Devs should definitely not avoid AI of course.
But I think maybe starting with a production-grade codebase isn't the best idea, regardless of whether senior devs are involved or not.
Devs should spend enough time learning how to use AI responsibly as a tool in their own personal projects, and get feedback from the community, friends, and coworkers before involving AI in production-grade code.
I am also among the 8% of devs not using Copilot, and my main concerns are the same as the ones mentioned here. But I still use AI every day at work to make my life easier. LLMs can boost your productivity 10x if you are able to understand the generated solutions and iterate on them to make them better. Great read :)
The trick is to not copy/paste production secrets into this thing. 🤣👇
Such a divisive topic. Clearly there are pros and cons, and both sides have valid points.
I would say that having an AI generate most of your code is not a good thing. And I would also say that not incorporating an AI would be a lost opportunity. So my take is to use it simply as a pair programmer. I'm driving; the AI might suggest some things I don't agree with, and I readily ignore those. Other times I might look at the code I've written and ask the AI if there is a simpler way to do something, and it, again, may come up with something valuable, or some utter nonsense that I don't want in my codebase.
I think there is real risk in letting it write your code for you, but I see real value in using it as a tool to potentially improve parts of your work while you're doing it. Pair programming is great, and now we all have access to a pair programmer. Just use the tool appropriately, because all it is is a tool.
Nuance appears to be a skill lost to time. 😄
Thank you for your balanced perspective.
The competencies of new code learners are getting weaker day by day.
Things like patience, searching for a month about an issue you faced while coding, watching a 100-video course or a 10-hour crash course (crash :) ):
all these wonderful moments, they will never get to live them.
I speak from my experience with my classmates, and I feel that the harm of AI is greater for beginners than for professionals.
Yes, they save time and submit their assignments in the best shape, but they lose skills over two years of studying.
AI has become a substitute mind that thinks instead of them.
I wouldn't say competencies are getting weaker for anyone...
As for whether patience is, well, there might be some merit in that, but it's probably not driven just by AI itself.
Before AI, copy-pasting from SO without understanding what it actually did was so common that it was a meme. IIRC, there was even a meme package that would import the code snippet from an accepted answer for a question or something...
Those who were responsible and cared about understanding what they were shipping did so even with SO or other pre-AI sources.
An argument could be made that they have a better chance at learning what their "copy-pasted" code does now because they could just ask the AI for an explanation (though that could be flawed too, but LLMs indeed have covered some distance).
=============
Where I am more aligned with you is that it's now far easier for someone to get through their work without learning things along the way and just deliver something that works "for now": meeting timelines at the cost of everything, even the learning that comes as a byproduct of researching solutions.
I've largely given up using Copilot Chat to generate code. For me, it is a documentation tool and something I use to validate my decisions. The autocomplete feature is handy, but mostly when doing repetitive work. Although not revolutionary (yet), I couldn't live without it!
Do you happen to use some other AI tool to generate code? Like, perhaps GPT or Gemini?
I'm a self-taught MERN stack developer. Without ChatGPT and Copilot, I wouldn't have been able to release a SaaS platform that's now generating some revenue for my startup. As a solo founder and developer, AI helps me write 96% of my code. I'm very comfortable with this approach because I review and correct the AI-generated code line by line before using it. I've become extremely efficient at prompting AI, and there's no going back for me. When I start hiring developers, those who don't use AI will be let go. I'm serious about this! 😄
Beautifully stated. As a MERN developer with years of experience from before ChatGPT and Copilot, I can say these tools are priceless in terms of the speed and assistance they provide when writing code.
Relying solely on AI for all your coding needs risks undermining your development as a programmer, potentially reducing you to a copy-and-paste operator rather than a thoughtful, creative coder. Personal coding style and logic are crucial for fostering innovation and problem-solving skills.
Fear surrounding the current state of AI is largely unfounded. Language models and generative models are simply tools, and refusing to use them out of skepticism is akin to using a rock to drive in a nail when you have a hammer available.
In reality, AI and language models are not inherently intelligent. Without creative input and context, they are essentially empty shells. This means that if you are not already skilled at something, AI is unlikely to make you significantly better.
However, AI and language models can be incredibly useful in augmenting our existing skills. For instance, I love writing code, but I often struggle with writing clear and concise documentation and comments. Here, AI and language models have been incredibly helpful. By providing the model with a rough idea of what I want to say, it can generate a draft that I can then refine and edit. This can save me a significant amount of time and mental energy, freeing me up to focus on other creative tasks.
In my view, the development of AI and language models should be encouraged, rather than stifled. By improving these tools, we can achieve more. As with any technology, there are certainly risks and challenges to be addressed, but I believe that the potential benefits far outweigh the drawbacks.
In short, AI and language models are not a solution to any of our problems, but they can be incredibly useful tools when used in the right way, which is augmenting current skills and creative pursuits. By embracing these technologies and working to improve them, we can do more.
Uncle Bob observed that the number of people working in software development doubles every five years; therefore, at any given time, half of them have less than 5 years of experience.
That half has no experience with which to evaluate what the AI generates, so it sure looks like a doom loop to me.
As a friend of mine says:
"You didn't know how to use Stack Overflow to solve your problem; why do you think using AI will?"
I've already read Mark Manson's book a few times to develop the IDGAF muscle for cases like these.
Imho, using AI tools extensively as a developer changes you, and not in a good way.
I used them for a while. Meh. 🫤
I prefer: think yourself, look up yourself. Stay sharp.
Could you elaborate a bit?
When did you last use it? What AI tool was it? And what was your experience like?
Sure. About 1.5 years ago, GitHub Copilot, for about 3 months. On VS Code, for JavaScript / React work. It was definitely worth seeing how it worked, with its pros and cons.
It was fun seeing a tool help with code completion, especially a new tool and technology. It also sometimes felt a little bit like pair programming, but with a junior developer, which can be fun, but only to some extent.
Then I stopped for a little while, and I realised it was making my brain lazy. It wasn't producing any better code. It wasn't really saving me time in the short term, as I had to check everything anyway, and often undo what it produced. A lot of the code completions and the produced code were not as good as those of a senior developer. It felt like copy/paste rather than senior-developer-level thinking.
Then I uninstalled it, and never reinstalled it. I don't miss it. It gave me a better perspective on what an AI tool is and can do, and on how a developer's thinking process differs from an AI tool's. I think I am a better developer without it.
Personally, I think the AI tool is a glorified copy/paste and automatic code completion tool, too much of which will adversely affect your thinking process and coding skills.
That's my experience and impression at the moment. Whether the quality of AI code completion has changed recently, I don't know.
I use AI because it's way less mental struggle to correct bad code than it is to write code from scratch. ADHD is funny that way.
I use ChatGPT-4 as a helper, on the other hand, but I don't like GitHub Copilot, and I think people should write code on their own, not prompt things like "Write an alternative to Finder in Python".
I don't think most people actually use Copilot like that. While it may be used as a "do my job for me" tool to an extent, most people seem to be using it as a glorified autocomplete.
Which can yield hilarious results sometimes. 🤣
reddit.com/r/ProgrammerHumor/comme...