DEV Community

Best Codes


Should Coders Feel Guilty About Using GPT? 🤖 😔


The rise of GPT-like large language models (LLMs) has sent ripples through the coding community. These AI assistants can generate code, translate languages, and even write different kinds of creative content. But with this newfound power comes a question: should coders feel bad about using GPT to write code for them?

The answer, like most things in tech, isn't a simple yes or no. Let's delve deeper into the potential benefits and drawbacks of this coder-AI partnership.

Benefits of GPT for Coders:

  • Increased Productivity: GPT can automate repetitive tasks like boilerplate code generation, freeing up coders for more strategic problem-solving and design.
  • Reduced Errors: GPT can catch common syntax errors, saving debugging time and frustration.
  • Exploration of New Technologies: Struggling with a new library or framework? GPT can help you get started by generating basic code structures.
  • Learning Tool: Analyzing GPT's code generation can give you new perspectives and approaches to problems.

Drawbacks of GPT for Coders:

  • Limited Understanding: GPT can't grasp the full context of a project or the underlying logic behind the code. It might generate code that works syntactically but doesn't achieve the desired functionality.
  • Security Risks: Blindly trusting GPT-generated code opens the door to potential security vulnerabilities. Always review and understand the code before deploying it.
  • Ethical Concerns: Who owns the code generated by GPT? Is it considered plagiarism to use AI-written code without attribution? These are questions with no easy answers.
  • Over-reliance on AI: While GPT can be a valuable tool, relying on it too heavily can hinder a coder's ability to learn and grow independently.
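The "limited understanding" drawback is easiest to see with a concrete case. Below is a hypothetical sketch (invented for illustration, not real model output) of code that is syntactically valid but functionally wrong — exactly the kind of bug that only a human review catches:

```python
def median(values):
    """Plausible-looking generated code: it runs without errors, but for
    even-length input it returns the upper-middle element instead of the
    average of the two middle elements."""
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

# The odd-length case works, which can lull a reviewer into trusting it:
print(median([1, 3, 5]))     # 3 -- correct
# The even-length case exposes the bug (the true median is 2.5):
print(median([1, 2, 3, 4]))  # 3 -- wrong
```

Nothing in the syntax hints at the problem; only knowing what "median" is supposed to mean, and testing the even-length case, reveals it.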

What I think

The Verdict: Collaboration, Not Replacement

Here's the key takeaway: GPT shouldn't replace coders; it should collaborate with them.

Think of GPT as a skilled but inexperienced assistant. It can handle repetitive tasks and highlight potential solutions, but it needs your guidance and expertise to deliver robust, secure, and well-structured code.

Moving Forward with GPT:

  • Focus on GPT's strengths: Leverage it for repetitive tasks, initial code drafts, and exploring new technologies.
  • Maintain code ownership and responsibility: Always review, understand, and test GPT-generated code before using it.
  • Embrace the learning opportunity: Use GPT to explore new approaches and challenge your existing coding paradigms.
  • Engage in the ethical discussion: As the role of AI in coding evolves, participate in discussions about ownership and attribution.
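"Review, understand, and test" doesn't have to mean a heavyweight process. A minimal sketch of what it can look like, assuming a `slugify` function whose body came back from a prompt (the function and its edge cases are illustrative, not from the article):

```python
import re

def slugify(title):
    # Imagine this body was generated from the prompt
    # "write a slugify function in Python".
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics
    return slug.strip("-")

# Before trusting the generated code, exercise the edge cases
# you actually care about:
assert slugify("Hello, World!") == "hello-world"
assert slugify("  spaces  ") == "spaces"
assert slugify("") == ""
```

A few assertions like these take a minute to write and turn "it looks right" into "it behaves right for my inputs" — which is the whole point of keeping ownership of the code.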

Ultimately, the decision to use GPT is yours. By understanding its limitations and using it strategically, you can harness its power to become a more efficient and effective coder.

So, what do you think?

Let everyone know in the comments!


Article by BestCodes.

Top comments (21)

Oscar

Yes. You should feel at least mildly guilty about using a technology such as GPT to learn. Learning with such a technology leads to a very basic and simplistic understanding of concepts, not the deep understanding that you should strive for. For instance, I personally feel bad for the people who come into forums for help and say something along the lines of "I'm trying to implement this, this is what ChatGPT gave me but it's not working, help!"

It's not that it's bad to ask for help or not understand something, everyone was there once! And let's be honest; a lot of us are still in that phase, and there's nothing wrong with that. What I consider to be unacceptable -- and bad -- is that they need help because they've dug themselves into a bit of a hole because they view Chat GPT and other AI tools as shortcuts to a deep understanding of a certain concept.

All that being said, there's no reason for anyone to feel guilty about using it to work. Learning and working are two very different things. When you're working, you don't really need to worry about getting a "deep understanding" of whatever technology you're working with, as there's a pretty good chance that you already do. All you need to do is finish whatever feature you're working on, test it, merge it with your staging branch, etc, etc. You don't need to have a life changing realization every time you git commit :).

TLDR; don't feel guilty about using AI if you're using it responsibly.

Best Codes

TLDR; don't feel guilty about using AI if you're using it responsibly.

I agree with that. 🙂
Thanks for commenting!

Mike Talbot ⭐

Here's another related question: should I feel guilty about using AI to do a job I'd otherwise have to employ 10 people for 3 months to do? I don't. I happily fling out a prompt and a bit of context and solve it, just the same way I'd have tried with NLP or some other set of algorithms, with the added benefit that it's about 10x more reliable and I can actually use it. But should I? How long before I automate something my existing colleagues do, and then we don't need them? Should I worry about it? I think not, but it's becoming an ethical dilemma.

Best Codes

I think it's an ethical dilemma in much the same way as robot factory automation was.

Mike Talbot ⭐

Indeed, but I was never in a position to make a difference there; the costs and investments were so high, and that wasn't a programmer's job. These things are happening subtly and without coordination because we can just do it....

Mike Talbot ⭐ • Edited

I use ChatGPT for pretty much everything I do now. Not necessarily for the code (pretty much always for calling someone else's API I don't know), but rather to perform the research, to inform me. I'll happily sit in the bath and talk to the ChatGPT app about programming concepts and ways I'm intending to solve problems and it's a super helpful sidekick. My rubber duck better watch out.

Best Codes

I use GPT to generate base / boilerplate code, to help me with new concepts / languages, and so I have something to rant about coding with.

MartinJ

Personally I've found ChatGPT hugely helpful. Sure, it lies from time to time, but I've learnt to tell when it's feeding me a big fib. I particularly appreciate its limitless patience and unfailing courtesy. Why ever should one feel guilty about using its assistance?

Best Codes

I use it in much the same way. I think a lot of people feel guilty about using it because they think it will put other coders out of a job.

MartinJ

Yes, I can see that. But surely there's no virtue in pointless work? Our job as system developers isn't just to write code, it's to produce systems. ChatGPT helps me produce better systems more quickly. I'm guessing this means more employment opportunities rather than fewer.

Meanwhile, I'm wondering how I can focus my development responsibilities purely on "system design" while leaving the grunt coding to a bot. How exactly to do that is the big question.

Pedro Masson

If you can't think or resolve problems w/o ChatGPT, then yes, you should feel worried or guilty. It was created to be a tool, not to replace you or your thinking process. It helps you with some code or when you get stuck, nothing deeper than that.

Best Codes

Interesting…
I think that if you are new to coding it's OK to be fairly dependent on GPT, but someone who has been 'coding' a while with GPT and is not learning / progressing should be worried.

Anthony J. Borla

I disagree. If you are new to anything, especially coding, you need to learn from first principles.

This means you first research the topic (and researching means reading articles and relevant sections from books, not asking an AI) and, in the case of learning a programming language, practice by writing code. Lots of code. From scratch. Unassisted.

Just think back to learning mathematics in primary (elementary) school: learn by first understanding, then reinforce the learning via repetition. There are no shortcuts to this process.

Using an AI to perform this learning may seem more efficient because it gives the appearance of being more involving or active. However, it is not so. You are cruising along passively with the AI, not really performing the problem solving that will cement the learning.

Admittedly, my assertion is based on experiential intuition rather than hard numbers from formal studies (and I eagerly look forward to the results of such research in the near future).

Regardless, learning, that is solidly acquiring knowledge and skills, especially in new (to the student) knowledge areas, is a difficult task. Just do it, and don't look for crutches, or shortcuts.

Best Codes

Thanks for sharing! In my experience, AI bots like GPT have done a pretty good job of filling their assistant role and teaching me. But, I'd have to agree with some of what you said.

Thanks all you guys for the feedback!

Fatemeh Paghar

Using GPT in coding can be helpful, but it's important to be careful. It can make work faster and reduce mistakes. However, coders need to be aware of its limits.

Feeling bad about using GPT isn't necessary, but it's important to be cautious and responsible. Coders should use GPT to help them, not replace their skills. They should check the code it makes, watch out for problems, and think about the ethics of using AI-made code.

GPT can be useful for coders, but they need to use it wisely and know its limits. As long as coders use it responsibly, there's no need to feel guilty. Let's use AI to help us, but also be careful.

Best Codes

Agreed.

Leonardo Schmitt

Although it can be bad for beginners who don't really grasp some concepts, GPT is really useful for searching for something you kind of already know. But learning a new language on top of GPT, for example, is something I wouldn't recommend: its responses often change for the same input, and once you get the answer you basically just copy and paste, which may be cruel to a beginner in the long term.

Best Codes

In my experience, it is good for beginners who don't grasp some concepts because (at least, again, in my experience) it kinda taught me the concepts.

Eric Kim • Edited

I don't think using ChatGPT for programming is necessarily a bad thing. But I would like to point out that just copying and pasting code generated by ChatGPT without understanding its logic is a bad thing. People should think of it as an interactive learning tool.

Montegasppα Cacilhας • Edited

Generative A.I.s are tools, and tools are meant to be used.

No, you shouldn’t feel guilty about using ChatGPT, however you ARE guilty if you’re fully relying on it.
