The Forem codebase, which powers DEV and other networks, is open source, and is therefore great for demonstration purposes.
Rewriting existing code, or refactoring, is a really interesting use case for ChatGPT, and the chat interface may be the cleanest way to apply it while still maintaining some common-sense control over the process.
Here were my results:
Refactor this Ruby on Rails controller:
(Pasted code from here)
This controller can be refactored to make it more readable and maintainable. Here are a few suggestions:
1. Move the DEFAULT_HOME_FEED_ATTRIBUTES_FOR_SERIALIZATION constant into its own class or module, so that it is not defined within the StoriesController. This will make the controller easier to read and understand, and will allow the constant to be easily reused in other parts of the application.
2. Use a before_action callback to set the @page instance variable, rather than defining it in the index action. This will make the index action easier to read and understand, and will avoid repeating the same code in multiple actions.
3. Extract the handle_base_index methods into their own private methods, so that the index action is not cluttered with a large amount of code. This will make the index action easier to read and understand, and will allow the code to be more easily reused in other parts of the controller.
4. Extract the redirect_to_changed_username_profile methods into their own private methods, so that the show action is not cluttered with a large amount of code. This will make the show action easier to read and understand, and will allow the code to be more easily reused in other parts of the controller.
5. Use a before_action callback to authenticate the user, rather than defining the authentication logic in multiple actions. This will make the controller easier to read and understand, and will avoid repeating the same code in multiple actions.
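For context, the first suggestion is at least mechanically straightforward. Here is a minimal sketch of what it would look like in plain Ruby; the module name and the attribute list are my own invention for illustration, not Forem's actual code:

```ruby
# Hypothetical sketch: moving a constant out of the controller into its
# own module so other parts of the app can reuse it.
module HomeFeed
  DEFAULT_HOME_FEED_ATTRIBUTES_FOR_SERIALIZATION = %i[id title path].freeze
end

class StoriesController
  # The controller now references the shared constant instead of
  # defining it inline.
  def serialization_attributes
    HomeFeed::DEFAULT_HOME_FEED_ATTRIBUTES_FOR_SERIALIZATION
  end
end
```

Whether that move actually improves the real controller is exactly the kind of judgment call the chat interface leaves to you.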
This definitely validated my feelings that this could be a useful exercise, and could lead to major changes in how we do our work.
Would it be useful to blindly implement these suggestions? No. Absolutely not (they're not even really correct, technically) — but it certainly introduces a tool which is... going to be a thing, one way or another.
It is dangerous to get too caught up in what this type of technology can or cannot do well today, because some of those gaps can be closed with targeted application. ChatGPT as an application of GPT-3 is an example of just this. ChatGPT is less powerful than GPT-3, but demonstrated a tremendously powerful application in how it is being used.
There is much more exploration to be done.
Top comments (16)
Wow, almost all are bad ideas :D
If they make a StackOverflowGPT bot, the bot will tell you that your question has already been asked, or that it is not there to code for you, or that the question in general does not make sense and you need to read some manual.
I do think, however, that SO will eventually have to implement an automated AI-generated answer, with a disclaimer that it might just be nonsense. The danger of negative feedback loops by ingesting its own garbage, and the vast amount of questions that are just chaff to mark as duplicates or otherwise close will lead to some need for a "first, check if the AI can at least give you the right nudge" approach sooner rather than later.
StackOverflow introduced a reminder that says: "We do not currently allow content pasted from ChatGPT on Stack Overflow; read our policy here."
[I didn't use the tool ..]
I wonder how it will react when "refactor yourself for double speed" is asked.
Whether you want it or not, some company will publish a product or service that does just that: code suggestions. And your boss will make you use it...
I have no doubt he will - because big corporate entities just love throwing heaps of money out of the window!
I did something similar with a PHP exception. Loved when GPT told me that it made an already private property private.
Even better, I rolled it again and it told me that there was nothing to change because the implementation followed best practices…
(It was most definitely not best practice code)
The robots aren’t taking our jobs any time soon ;)
Hi Ben. As a developer that builds my own products, I want to be able to give it a spec and let it create most of the code, or take my .Net Framework code and migrate it to .Net Core for certain projects. Another good use case would be creating a front end from backend code or vice versa. Maybe something to look into for next year.
GPT isn't accurate yet, but it will be someday. Eventually.
From my experience with ChatGPT, it's 50/50, similar to GitHub Copilot. The more information and good prompting I give ChatGPT, the better its answers and suggestions get.
It is a very helpful tool, and we should utilize it and benefit from it for our productivity.
I don’t get it. Unless I am missing something, those methods that it wants to “extract into their own private methods” are already private methods.
And it doesn’t look like the show or index methods will change at all.
I mean, Forem is already good code. Maybe this would be more actionable on an example of “bad code”? But since it’s all based on a language model, I honestly don’t think so :-)
Yeah, it's possible that it doesn't understand how to parse a Ruby file with the private keyword where it is.
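For anyone following along: a bare private keyword in Ruby makes every method defined after it private, which is likely the layout being discussed. A quick standalone illustration (the class and method names are made up):

```ruby
# Methods defined after a bare `private` keyword are private
# instance methods of the class.
class Example
  def entry
    helper # private methods can be called internally without a receiver
  end

  private

  # Everything from here down is private.
  def helper
    "reached the private helper"
  end
end

Example.new.entry                 # => "reached the private helper"
Example.new.respond_to?(:helper)  # => false
```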
Neat! Very cool experiment!
Cool experiment, maybe if we could train the AI just for refactoring...