Okay.
There is something that has been bugging me and I'd like to hear the community's opinion on it.
Why is the use of AI widely encouraged for code generation, while everyone frowns at it for (written) content generation?
I have come across a lot of posts that simply demonise writers for using AI to generate their content. It even goes beyond the written content. There's a whole campaign now on blacklisting writers who use DALL-E images for their article/writing covers.
When it comes to code generation, however, I saw a founder saying he won't hire you if you don't show sufficient proof that you use AI for code generation.
Let's talk about this.
Just going to ride off of the popularity of this post... sorry.
Have you checked out The Handy Developers Guide? The best non-video developer repository out there!
Top comments (23)
There is a cultural expectation that "content", or human-to-human communication, is directly created by humans. And there are lots of expectations hanging in the balance because of this.
Code is compiled and read by computers, without a direct cultural connotation.
With that said, I do think some types of content could easily have a culture shift towards accepted AI generation, but I don't think it would be right to shift everything in this direction. For example, I would be okay with AI-generated "docs". But I am not so much okay with AI-generated content where the authorship is assigned to a person.
I understand this, but isn't technical content supposed to serve the aim of educating? If the content (AI-generated) educates you, doesn't it mean that the purpose was achieved?
The real issue is flooding the Internet with low-quality, low-effort content by people who use AI to build a "portfolio" and engagement. I'm sick of those autogenerated "10 reasons why you should use framework X" junk articles. It's just a new form of spam.
Wellllll. So long as knowledge is passed, I guess?
Some might also argue, there's nothing new to write about anymore.
Do you agree?
AI-generated articles are copy-pasted knowledge. They add nothing to the overall pool of human knowledge, and no new knowledge is passed on.
If there is nothing new to write about - one should go for a walk instead.
Fair: "one should go for a walk." I agree.
Is it widely encouraged in code generation?
Outside of marketing hype, nobody has encouraged me to use Copilot. Visual Studio forces it upon me, but that's not the same as a person saying I should use it.
I think we should encourage shunning the use of generative AI.
It creates drab marketing, homogeneous "art", and difficult-to-assess code.
Also it comes from a place of worker exploitation at a level beyond even the most miserly Victorian workhouse master's dreams.
Let's kick the habit together!
People are so dependent on AI now.
I'm not even excluding myself. It got a lot worse when Meta added the AI assistant to WhatsApp.
I was having a conversation with someone a few days ago and they said that they wouldn't want to live in a world without AI. They wouldn't know what to do.
I can't think of one person in my personal or professional life who is dependent on generative AI.
Amazing how different our perspectives are, and the social circles we have.
I mean, generative AI, or AI in itself, is beyond just using the chatbot. Almost all the applications we use these days rely on artificial intelligence in one way or another.
That is generally followed but I wonder why people would be against using AI for covers.
Let me explain...
Imagine reading a tutorial written in a convoluted AI-generated tone... it would be super irritating. That is why it's recommended to avoid using AI for content.
When you include your own experience or opinions, on the other hand, the content feels authentic and much more credible.
On the other hand with code, the goal is simply to have a functional website or product, users don't really care about the underlying codebase. So using AI there is perfectly acceptable imo.
It's completely fine to use DALL-E images as covers as long as they align with the content (nothing clickbait). I use a static banner on all my posts just to maintain a consistent style.
Yeah, writing is more relatable when there are human experiences or anecdotes attached to it. I totally agree.
However, I believe that the primary aim of technical content is to educate. If it does this (educating), then what's the problem?
That's how I see it.
AI generated content is the new form of plagiarism. Let's say I want to write an article about the print function in Python, and I find Bob's (imaginary person) pre-existing article about the print function in Python. If I take half of his article, edit it a little bit, and plop it into my own article, is that plagiarism (yes, of course)?
So what's the difference if you use AI? You're still taking someone else's work, editing it a little bit, and calling it your own. Even if you credit ChatGPT, you've still just copied and pasted a work that isn't yours. All that, on top of the fact that AI is biased, prone to hallucination, and wildly inaccurate on occasion.
As for the difference between AI generated writing & code, I'll refer to the example that I initially made. If I take Bob's code, does anyone really care? Not really. Of course, Bob would have a problem if I copied his codebase and called it my own, but a few lines of code really isn't plagiarizing. Same thing with AI generated code.
Edit: all that being said, there are occasions where I absolutely approve of the use of AI. I know someone whose first language isn't English, and AI really helps them to "clean up" their articles.
Thank you for sharing; the plagiarism point resonates with me.
Is it better, then, when you generate an article with AI, to ask the AI for its references and cite them? Does that make it any less plagiarism?
I mean, if someone wanted to go that far and directly insert quotations like this:
ChatGPT said that:
Then yes, I would say that this isn't plagiarism. But at some point, the writer must ask themselves if quoting ChatGPT (or some other LLM) is really necessary. Why not just go to the project's documentation?
But to be fair, I'm coming from an academic standpoint, and plagiarism is much different in academics than it is in technical writing. You don't need to cite a source for common knowledge, and there's lots of common knowledge in tech. I don't need to go to the `man` page for `ls` to describe what it does. As a side note, I normally work off of this definition of plagiarism. Just wanted to drop that in here :p
Wow. This is some real insight into plagiarism. I found out things I didn't know.
If AI generates content for you, you can ask it to provide its sources. You add those as your references.
Most AI-generated content is of low quality, lacking details and specificity. It's no surprise that Google has started to devalue websites filled with such low-quality AI-generated content in its search results.
But if it answers the "question" or the topic it was written for, would it still be low quality?
@Pabian, "nothing new to write about" doesn't mean one shouldn't write about one's experience of using what's already known... and using AI to edit, or even to write about, one's experience doesn't mean one should go for a walk instead. If one needs something new to write about, one should do a bit more research to find it...
I would even challenge that "AI is widely encouraged in code generation". And if a founder says "he won't hire you if you don't show sufficient proof that you use AI for code generation" I'd gladly walk away... because he does not need me to be his AI-babysitter.
Oh, trust me, it is encouraged.
Some companies go as far as paying for tools like Copilot or Cursor for their engineering teams.
AI-generated content might be considered deceitful. Code isn't seen the same way.
This is true. This much I agree with.