Devarshi Shimpi

Everything You Need To Know About GitHub Copilot!!!

Not so long ago, artificial intelligence was the main antagonist of sci-fi movies, and today it is all around us. Almost every product that comes out of the technology world seems to have some element of artificial intelligence. AI has a significant impact on our daily lives, but have you ever wondered what AI is really capable of? A great example of a futuristic vision becoming reality is GitHub Copilot.

Copilot Homepage

GitHub Copilot is super cool, but what is it?

Copilot is a tool built into your code editor that can write code itself, based on the code you’ve already written in your project. The only thing you need to do is type a function name or a comment, and Copilot will automatically fill in the implementation. The tool processes your input in the cloud and comes back with a snippet that you can accept, decline, or replace by asking for further suggestions.
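As a rough illustration (this is a hypothetical example I made up, not an actual Copilot transcript), the workflow in Python might look like this: you type the comment and the function signature, and Copilot offers a body as an inline suggestion.

```python
# You type the comment and the signature...
# calculate the average of a list of numbers, ignoring None values
def average(numbers):
    # ...and Copilot proposes a body roughly like this as an inline suggestion,
    # which you can accept (Tab), dismiss, or cycle through alternatives for.
    values = [n for n in numbers if n is not None]
    if not values:
        return 0
    return sum(values) / len(values)
```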

But What's the Price?

GitHub Copilot licenses, which are currently only available to individual users, cost $10 per month or $100 per year. However, students enrolled in GitHub's Global Campus program and maintainers of popular open-source GitHub projects (identified when a user navigates to the GitHub Copilot subscription page) can still use the tool for free.

Can GitHub Copilot be a threat to developers?

AI suggestion by Copilot

Since Copilot hit the programmers' market, it has been widely debated whether it is a big step toward the death of computer programming or just another autocomplete tool on steroids. Well, building software still requires some serious knowledge. Even with Copilot’s help, you need to know what you are doing and to verify and understand the generated code. It’s not possible (yet) for non-programmers to jump on Copilot and build whatever they want.

The tool is not perfect. It doesn’t always generate correct code, and there is a lot of bad practice and deprecated code out there for it to learn from. Even worse, Copilot can write security vulnerabilities, especially in memory-unsafe languages. It isn’t 100% reliable yet, so you absolutely need to review Copilot’s code. This is AI, and AI needs to learn. It will get better with time, but we need to wait.
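Memory-safety bugs are the classic case in C or C++, but the same review habit applies in any language. Here is a made-up Python sketch of the kind of suggestion you should catch in review: the first function builds SQL by string formatting (an injection risk), while the reviewed version uses a parameterized query.

```python
import sqlite3

# Hypothetical example of a completion that looks plausible but should be
# rejected in review: interpolating user input straight into SQL invites injection.
def find_user_unsafe(conn: sqlite3.Connection, username: str):
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

# The reviewed version: a parameterized query keeps the input as data, not SQL.
def find_user_safe(conn: sqlite3.Connection, username: str):
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()
```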

Conclusion

Copilot is just one of many tools that improve, and will keep improving, our work. At the same time, relying too much on tools like this might lead to unnecessary work or even serious problems.

Thank you for reading this far. Meanwhile, you can check out my other blog posts and visit my GitHub.

Top comments (9)

Zoppatorsk

For what it is, I think Copilot is great. Sure, it does generate a lot of garbage code, but I can overlook that since it helps a lot with repetitive tasks. If I, for example, start to write some mock data to test a function or something, it usually picks up on it and helps generate it.

Also, for common algorithms it's really good.. It's like you know the algorithm exists but can't really remember the details of how it should be written... it helps with that and saves me a trip to Stack Overflow.. hahaha..

But yes, of course you need to know and understand code to be able to use it in a good way. I think it will be a very long time until you can let an AI reliably write a whole piece of software, and the question is: do we really want AIs to be able to write software on their own??

Devarshi Shimpi

True.
Your question remains a mystery until we can determine how well AI can write reliable code without security bugs.

Zoppatorsk

Yeah... and how can we make sure it only writes "good code"? Like, imagine you have access to an AI that can write all kinds of advanced code; what prevents anyone from using it to write, say, a worm that infects bank systems and crashes the whole economy, or overloads nuclear plants, and so on?

At the same time, you could argue that the tool the AI wrote is made for penetration testing, but in the wrong hands the same tool could be used to penetrate systems and cause harm.

The whole AI discussion is not only a technical discussion, it's also a moral, ethical and philosophical discussion.

Devarshi Shimpi

Makes sense. At the same time, since an AI has written the code, a human brain can definitely find some exploit and hack into it.

ElenaMeas

I am glad you are taking the time to share valuable information with us.

Devarshi Shimpi

Thank you for your kind words 😁😁

Sindou Koné


Devarshi Shimpi

Haha