
Programmers use AI to write AI code and "kill" other programmers?

Author: Taste Play

Use AI to help programmers write AI code? For many programmers, the wish of "you are already a mature AI, it is time to learn to complete the code yourself" may be about to come true.

In June of this year, GitHub and OpenAI jointly launched an AI tool called GitHub Copilot. Copilot auto-completes code from context, including docstrings, comments, function names and existing code; given a hint from the programmer, the tool can complete an entire function.
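A typical interaction looks something like this: the programmer writes only a function signature and a docstring, and Copilot proposes the body. The snippet below is a hand-written illustration of the kind of suggestion Copilot produces, not a captured output of the tool, and the function name is only an example.

```python
from datetime import date


# The programmer types only the signature and the docstring;
# the body is the kind of multi-line suggestion Copilot offers.
def days_between(start: str, end: str) -> int:
    """Return the number of days between two ISO dates, e.g. '2021-06-29'."""
    d1 = date.fromisoformat(start)
    d2 = date.fromisoformat(end)
    return abs((d2 - d1).days)


if __name__ == "__main__":
    print(days_between("2021-06-29", "2021-10-27"))
```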


What is more striking is that Copilot is still evolving. At the recent GitHub Universe 2021 developer conference, GitHub announced that Copilot now supports multi-line code completion in languages such as Java, C, C++ and C#, and has added support for Neovim and the JetBrains IDEs, including IntelliJ IDEA and PyCharm, editors commonly used by developers.


GitHub says that 30% of newly written code today is completed with the help of the AI programming tool Copilot. Is Copilot really that powerful? On social networks, many bloggers who obtained early access to the Copilot preview have shared their impressions of using it.


Of course, besides bloggers who write serious reviews of the Copilot experience, there are also programmers who use it to slack off at work and get caught...


Copilot is powerful, but like most AI tools it rests on a model, OpenAI's Codex, that needs a huge amount of code to train. That is not a problem for OpenAI and GitHub, both of which have Microsoft behind them: Microsoft spent $7.5 billion in 2018 to acquire GitHub, a code-sharing site with about 50 million users worldwide, which means the Codex model on which Copilot relies has been trained on billions of lines of public code.

In fact, even before Copilot was born, OpenAI had launched GPT-3, an AI model with 175 billion parameters. GPT-3 was trained, at a cost of tens of millions of dollars, on a massive corpus of human poetry, novels, news and other natural-language text (mainly English), so it has a certain degree of understanding of natural language. Geoffrey Hinton, often called the father of neural networks, quipped after GPT-3 appeared: "The answer to life, the universe and everything is actually just 4.398 trillion parameters."


Codex is trained on top of GPT-3, and Greg Brockman, co-founder and chief technology officer of OpenAI, has called Codex a descendant of GPT-3. As a result, Codex can translate instructions written in plain English into code, and some media outlets have even claimed that Codex lowers the threshold for writing code to simply knowing English.
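The idea can be pictured with a simple example: the programmer supplies only an English instruction as a comment, and the model fills in the code. The sketch below is illustrative of that workflow, not an actual Codex response; the file name `data.csv` and the function name are assumptions made up for the example.

```python
# Instruction written in plain English:
# "Read data.csv and print the average of the second column."
# The code below is the kind of completion Codex aims to generate.
import csv


def average_second_column(path: str = "data.csv") -> float:
    """Average the second column of a CSV file."""
    with open(path, newline="") as f:
        values = [float(row[1]) for row in csv.reader(f) if len(row) > 1]
    return sum(values) / len(values)


if __name__ == "__main__":
    print(average_second_column())
```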


In the official demo of Codex, the programmer only needs to type "make it be smallish" into the editing interface and the large spaceship in the example shrinks accordingly. Throughout the process the programmer does not enter a single line of code; Codex writes the corresponding program automatically.


Today, the capabilities of Codex, trained on countless lines of code and a great deal of money, power the AI tool Copilot. They are what make Copilot's code completion and function suggestions feel magical, but they have also dragged Copilot into a series of public controversies.

As Copilot became popular with more and more programmers, Nat Friedman, CEO of GitHub, said excitedly: "Hundreds of GitHub developers use Copilot every day, and if the preview goes well, we plan to expand it into a paid product at some point in the future."

Nat Friedman's words made Copilot a lot less appealing: after this maneuver by GitHub and OpenAI, a paid Copilot would effectively commercialize the collective knowledge of the 50 million users of GitHub, the world's largest code-sharing site. The focus of the controversy is that Copilot commercializes derivatives of open-source code released under the GPL. The GPL (General Public License) is a family of free-software licenses that guarantee users the freedom to run, study, share and modify software. Correspondingly, any copy or distribution of a derivative of GPL-licensed code must be released under the same or equivalent license terms.

To put it simply: my software is open source, and you may use the code as you like, but if you use it, you must also keep your work open source, and anyone else may use your code or software for free. What angered the GitHub community is that Copilot launders open-source code into a commercial product, ignoring the open-source spirit of building a rich and open programming world; many programmers have stated publicly on social media that they will no longer use GitHub to host their code.


GitHub officially explains that Copilot "usually does not accurately copy code blocks", and some people argue that what Copilot learns from massive amounts of code is much like what a human learns: humans also internalize other people's knowledge before producing their own work, so it is hard to dismiss the output of a model trained this way as simple copy and paste.

However, many people reject this explanation: when Copilot is asked to solve certain classic programming problems, it can be seen reproducing a classic piece of code from GitHub almost verbatim. This means that once Copilot becomes a commercial product, users who apply its output in their own products may inadvertently violate the GPL and face the risk of being sued, and some technology companies have already explicitly banned employees from using Copilot.

Copilot faces more problems in practice than this. As programmers got to know Copilot better, they found that it is far from perfect and still has many flaws. The Codex model behind Copilot was trained not only on large amounts of text but also on a huge amount of code scraped from the internet, so some of the code Copilot outputs may not look so good, carrying privacy-leak and security risks. Chen Rui, the boss of Bilibili, was once caught in the crossfire when his personal information appeared in Copilot-suggested code, although some netizens quickly pointed out that the date of birth was wrong.


Some netizens also say that "Copilot is fun for a moment, but debugging is a crematorium": it is not easy to describe clearly and unambiguously what the target function is supposed to do, and while using Copilot you constantly have to review and check whether the AI-generated code is correct, which easily interferes with your original train of thought while programming.

At present, GitHub Copilot is still in a free, application-based technical preview, and the debate about it online continues. As AI tools grow more powerful, the number of similar problems humans will face in the future will only increase.
