Programming with Shahan


StackOverflow is dying

Hey everyone, it's November 2023, and I've got some out-of-this-world news for you – aliens are real! Yep, according to some folks who used to work for the government, they found bodies of the pilots who flew these alien spaceships. But that's not even the wildest part – they also discovered some super weird code inside these space vehicles.

Image of weird code among alien crowds

This code was so bizarre that they posted it on a website called Stack Overflow, where programmers usually go for help. But here's the twist – it got taken down because it looked too much like another coding language. If you're a bit older, you might remember Stack Overflow as the place to be for coding assistance.

Image of alien crowds worried about new updates

But here's the scoop: Stack Overflow isn't as popular as it once was. In the last eight months, fewer people have visited the site. So, why is this happening? Well, there's this thing called artificial intelligence, or AI. It's like a super-smart computer that can answer coding questions.

Guess what? Stack Overflow created its own AI called OverflowAI. This AI can come up with answers based on all the questions and answers people have posted on Stack Overflow over the years. So now, some programmers are scratching their heads, wondering, "Why bother answering questions when AI can do it for me?"

Image of AI running through the buildings

This drop in Stack Overflow's popularity is kind of like what happens in nature. Imagine a hungry and stressed-out snake in the desert; it might start eating itself out of desperation. But of course, that can't go on forever. And it's not just Stack Overflow – other big AI models, like GPT-4, have already used up a lot of the data from Stack Overflow to train themselves.


Summary

So, it seems like we're entering a new era where AI is taking over. It's a bit like a big change, similar to when people found out about aliens. Funny how these things happen around the same time.

Tools you may like to use: Figma
Figma's AI design copilot is on the way, and Figma is already one of the most popular tools for UI/UX designers and developers. Many of them use FigJam (part of Figma) for whiteboard collaboration with other team members. You can sign up here.


My Socials: LinkedIn | X

Buy Me a Coffee

Thanks for taking the time to read this article. Stay tuned for the latest updates on programming.

Top comments (8)

Jason Espin • Edited

As someone who has worked in the field since 2010, I'm not that impressed with ChatGPT. I've used it out of curiosity but dropped it within a few days. The answers it produces are largely incorrect or flawed and luckily, having been around the block for a good amount of years, I was able to identify these flaws quite easily.

However, many of the younger, more inexperienced developers I have worked with seem to rely on it heavily, which is incredibly concerning given that a major factor of being a good developer is being able to problem-solve. This overreliance on ChatGPT, and taking it as gospel, is incredibly dangerous.

At least with StackOverflow there is a peer rating system. Yes, it's sometimes a toxic place, with some very jumped-up developers on there now making younger devs feel terrible for some of the most basic questions (which I think is completely abhorrent; thank god it wasn't like that back when I was starting out), but I still think it's a much better place for people to go for answers than ChatGPT. In about 5-10 years, I can see there being a bit of a crisis, with all of the younger devs now being at a senior level and not actually knowing how to develop due to their overreliance on having things done for them.

Mike Talbot ⭐ • Edited

I'm a pretty senior developer and I find it highly constructive. You have to treat it like a person who might make a few mistakes, but by discussing the answer it usually helps me, like a slightly more junior colleague who happens to know all the APIs I've not looked at recently. Maybe it matters what you expect. I use ChatGPT exclusively now over SO; SO isn't answering my questions, but GPT is getting me there.

Explicitly: GPT just helped me build a Babel plugin and a Webpack plugin in about 15 minutes that do some things I couldn't find on SO. Super useful, not critical to my work, but a massive benefit to it.
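For context, a Babel plugin is mostly boilerplate, which is exactly the kind of thing an assistant can scaffold quickly. The sketch below is purely illustrative, since Mike's actual plugin isn't described here; it just renames identifiers called `foo` to `bar`.

```typescript
// Minimal Babel plugin skeleton: a function returning a visitor over the AST.
// Illustrative only; this version renames identifiers named `foo` to `bar`.
import type { PluginObj } from "@babel/core";

export default function renameFooToBar(): PluginObj {
  return {
    name: "rename-foo-to-bar",
    visitor: {
      // Called for every identifier node Babel traverses.
      Identifier(path) {
        if (path.node.name === "foo") {
          path.node.name = "bar";
        }
      },
    },
  };
}
```

Registered under `plugins` in a Babel config, a plugin like this runs on every file Babel compiles.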

Jason Espin

I'm generally at the level where, day to day, I don't need either of them. But if I really am stuck, I'd always go to StackOverflow over ChatGPT.

Saurabh Rai

That is really well explained, @jason_espin.

"The answers it produces are largely incorrect or flawed and luckily, having been around the block for a good amount of years, I was able to identify these flaws quite easily."

This statement sums it all up: ChatGPT or any AI autocomplete isn't a great swap for Stack Overflow.

What Stack Overflow needs to do is:

  • Reduce the pain of searching for multiple answers and bring them together in a simple search interface.
  • Provide AI-generated insights based on those answers (only): RAG, but restrictive (see the sketch below).
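Here is a minimal sketch of what that restricted RAG flow could look like. The `Answer` shape and the injected `searchAnswers`/`generate` functions are hypothetical stand-ins for a real answer index and model call; this is not OverflowAI's actual implementation.

```typescript
// Sketch of "RAG but restrictive": only retrieved Stack Overflow answers are
// allowed into the prompt, so the model summarizes peer-rated content instead
// of free-associating. The search and generate functions are injected because
// the real answer index and model are not part of this sketch.

interface Answer {
  id: number;
  body: string;
  score: number; // community votes, i.e. the existing peer-rating signal
}

type SearchFn = (question: string, topK: number) => Promise<Answer[]>;
type GenerateFn = (prompt: string) => Promise<string>;

async function restrictedInsight(
  question: string,
  searchAnswers: SearchFn,
  generate: GenerateFn
): Promise<string> {
  // 1. Retrieve only existing, peer-rated answers for the question.
  const answers = await searchAnswers(question, 5);
  if (answers.length === 0) {
    return "No existing Stack Overflow answers cover this question.";
  }

  // 2. Build a prompt that forbids knowledge from outside the retrieved set.
  const context = answers
    .map((a) => `[Answer ${a.id}, score ${a.score}]\n${a.body}`)
    .join("\n\n");
  const prompt =
    "Using ONLY the Stack Overflow answers below, answer the question. " +
    "If they do not cover it, say so.\n\n" +
    `${context}\n\nQuestion: ${question}`;

  // 3. Generate the combined insight from that restricted context.
  return generate(prompt);
}
```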
Lev Nahar

Could not have explained it better myself.

Jon Randy 🎖️ • Edited

Another senior dev here...

I've fiddled with ChatGPT a few times. I've found that when asking it about code, it is wrong or subtly wrong most of the time. This is what you get when you build something that is basically glorified autocomplete trained on 'all' programming knowledge, both correct and incorrect.

I guess it could be useful, as others have said here, for fleshing out the basics of something... but you really need to understand what it is, and its limitations. ChatGPT understands nothing about code, or anything. It is a mathematical model that spits out what it deems to be statistically the most likely output based on the input (large oversimplification, I know, but that is the crux of it) - it neither knows nor cares if that output is right or wrong. I would say it is quite a dangerous tool for beginners if they automatically assume that the output is always going to be right.
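To make the "statistically most likely output" point concrete, here is a toy frequency-based next-word picker. It is a deliberate oversimplification, not how ChatGPT actually works, but it shows why frequency rather than correctness decides the output.

```typescript
// Toy illustration of "statistically the most likely output": pick the next
// word purely from observed frequencies, with no notion of right or wrong.
// A real model uses a neural network over tokens, but the point stands.

function nextWord(
  history: string,
  counts: Map<string, Map<string, number>>
): string {
  const candidates = counts.get(history);
  if (!candidates) return "<unknown>";

  // Choose the most frequent continuation, correct or not.
  let best = "<unknown>";
  let bestCount = -1;
  for (const [word, count] of candidates) {
    if (count > bestCount) {
      best = word;
      bestCount = count;
    }
  }
  return best;
}

// If the training data mostly pairs "use" with a sloppy suggestion, the model
// keeps producing it, because frequency decides the output, not correctness.
const counts = new Map<string, Map<string, number>>([
  ["use", new Map([["==", 7], ["===", 3]])],
]);
console.log(nextWord("use", counts)); // "=="
```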

The advantage SO has is that - as others also already mentioned - the answers are peer reviewed. They're not always right, but it is easier to go straight to the better answers.

Also, if SO (and other dev related sites) die, or just get filled with people posting 'articles' written by generative AI - where will ChatGPT get actual 'new' training data? From other similar AIs that also lack real understanding, and are also subtly wrong most of the time?

The whole thing risks becoming just an echo chamber with no real understanding... where the most common answers - right or wrong - become the 'accepted' ones, and just become further and further amplified. Real knowledge and understanding dies, leaving future generations reliant upon 'the machine' for their 'understanding'. Wait a minute, I just described the modern internet!

Dusan Petkovic

I also feel like Stack Overflow will slowly fall in popularity, with all the new communities, platforms, and AI tools that are available today.

Pranava Bhat

I feel that people will still have some room for human-run communities like Stack Overflow.