A Blog by Jonathan Low

 

Aug 5, 2020

Will the Latest AI - GPT3 - Kill Coding?

We've always known that technology might one day render technologists obsolete. With AI, that possibility looms closer.

Whether it will 'kill' human coding is doubtful, but whether it can supplant many tasks currently done by humans is beyond doubt. JL

Frederik Bussler reports in Towards Data Science:

Machine-dominated coding is almost at our doorstep. To direct GPT-3 to a specific language task, you simply feed it an example of what you hope to achieve. So, while you can direct GPT-3 to write code, you can also direct it to write poetry, music, social media comments, or any other text. The network constrains itself to the task at hand when given instructions. Its main breakthrough is eliminating the need for task-specific fine-tuning. The same model can be trained on pixel sequences instead of text. GPT doesn’t just have the potential to one day replace coders, but entire industries.
In 2017, researchers asked: Could AI write most code by 2040? OpenAI’s GPT-3, now in use by beta testers, can already code in any language. Machine-dominated coding is almost at our doorstep.
GPT-3 was trained on hundreds of billions of words, or essentially the entire Internet, which is why it can code in CSS, JSX, Python — you name it.
Further, GPT-3 doesn’t need to be “trained” for various language tasks, since its training data is all-encompassing. Instead, the network constrains itself to the task at hand when given trivial instructions.
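To make that concrete, here is a minimal sketch of what those "trivial instructions" look like in practice: a plain-text prompt sent to the completion endpoint of OpenAI's 2020 beta API (access was invitation-only at the time). The translation task and sampling settings below are illustrative assumptions, not taken from the article.

```python
# Minimal prompting sketch: the task is specified entirely in the input
# text, with no task-specific fine-tuning. Uses the 2020-era beta API
# (openai.Completion); the example task and settings are illustrative.
import openai  # pip install openai; beta access was required in 2020

prompt = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "cheese =>"
)

completion = openai.Completion.create(
    engine="davinci",   # the GPT-3 base model exposed in the beta
    prompt=prompt,
    max_tokens=16,
    temperature=0.0,    # near-deterministic output for a lookup-style task
    stop=["\n"],        # stop at the end of the answer line
)
print(completion.choices[0].text.strip())  # expected: "fromage"
```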

The Evolution of GPT-n

GPT achieved state-of-the-art in language tasks by pairing supervised learning with unsupervised pre-training (that is, using the parameters from an unsupervised step as the starting point for the supervised step). Compared to its successors, GPT was tiny: it was trained on just a few thousand books, using an 8-GPU machine.
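As a toy illustration of that recipe (the model, data, and sizes below are stand-ins, not the paper's setup), the key point is that phase two starts from phase one's weights:

```python
# A toy illustration (not the paper's code) of GPT-1's recipe: unsupervised
# next-token pre-training, then supervised fine-tuning that starts from the
# pre-trained weights. Model, data, and sizes are stand-ins.
import torch
import torch.nn as nn

VOCAB, CLASSES, CTX = 100, 2, 8
backbone = nn.Sequential(
    nn.Embedding(VOCAB, 32), nn.Flatten(), nn.Linear(32 * CTX, 64)
)

# Phase 1: unsupervised pre-training with a language-modeling head.
lm_head = nn.Linear(64, VOCAB)
opt = torch.optim.Adam(list(backbone.parameters()) + list(lm_head.parameters()))
tokens = torch.randint(0, VOCAB, (16, CTX + 1))   # dummy "book" text
x, next_tok = tokens[:, :CTX], tokens[:, CTX]     # context -> next token
opt.zero_grad()
nn.functional.cross_entropy(lm_head(backbone(x)), next_tok).backward()
opt.step()

# Phase 2: supervised fine-tuning on a labeled task, reusing the backbone.
clf_head = nn.Linear(64, CLASSES)
opt = torch.optim.Adam(list(backbone.parameters()) + list(clf_head.parameters()))
x = torch.randint(0, VOCAB, (16, CTX))
labels = torch.randint(0, CLASSES, (16,))
opt.zero_grad()
nn.functional.cross_entropy(clf_head(backbone(x)), labels).backward()
opt.step()
```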
GPT-2 drastically scaled things up, containing 10X the parameters and fed with more than 10X the training data. Still, the dataset was relatively limited, and it was trained specifically on “outbound links from Reddit which received at least 3 karma.” GPT-2 was described as a “chameleon-like” synthetic text generator, but it wasn’t state-of-the-art in downstream tasks like question answering, summarization, or translation.
GPT-3 is the latest and greatest in the AI world, achieving state-of-the-art in a range of tasks. Its main breakthrough is eliminating the need for task-specific fine-tuning. In terms of size, the model drastically scales up once again, reaching 175 billion parameters, or 116x the size of its predecessor.
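A quick back-of-the-envelope check on those scale figures, using the parameter counts reported for each model (roughly 117M, 1.5B, and 175B):

```python
# Back-of-the-envelope check of the scale-ups cited above, using the
# parameter counts reported for each model.
gpt1, gpt2, gpt3 = 117e6, 1.5e9, 175e9
print(f"GPT-2 vs GPT:   {gpt2 / gpt1:.0f}x")   # ~13x (the "10X" above is rounded)
print(f"GPT-3 vs GPT-2: {gpt3 / gpt2:.1f}x")   # ~116.7x, the "116x" cited
```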

Evolve or Die

Here’s the situation: beta testers are using GPT-3 to generate working code, with only trivial knowledge required. The examples range from buttons to data tables to a re-creation of Google’s homepage, all done with zero-shot learning.
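The exact prompts behind those demos were never published; a plausible sketch, with an invented priming pair for illustration, looks something like this:

```python
# A sketch of a "describe a UI, get JSX" prompt in the spirit of the beta
# demos mentioned above. The description/code pair used to prime the model
# is invented for illustration, not taken from any tester's actual tool.
import openai

prompt = (
    "description: a red button that says Stop\n"
    "code: <button style={{backgroundColor: 'red', color: 'white'}}>Stop</button>\n"
    "description: a button that says Subscribe\n"
    "code:"
)

completion = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=60,
    temperature=0.0,
    stop=["\n"],
)
print(completion.choices[0].text.strip())  # e.g. <button>Subscribe</button>
```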
Besides the rapid evolution of AI, two other major tech trends are compounding the reality that programming jobs won’t be safe in the future: No-code and AutoML.
No-code refers to visual tools that make it easier for anyone to build new products, whether it’s websites, designs, data analyses, or models. WordPress, Wix, and Shopify are good examples of no-code tools that enabled millions of people to do things on their own rather than hire a developer or a designer.
The second trend is AutoML, or automated machine learning, which drastically shortens the time it takes to bring AI to production.
Tools like Apteo combine these trends, enabling anyone to deploy AI models with no coding skills required.
GPT-3 will spark an additional wave of no-code and AutoML tools. Many would-be employers will opt for these tools rather than hire expensive programmers.
Naturally, the lowest-skilled swaths of programmers will be the first to go, while experts will enjoy job security for longer — the same as in any field.

More Than Code — GPT-3 Applied to Any Language Task

To direct GPT-3 to a specific language task, you simply feed it an example of what you hope to achieve. So, while you can direct GPT-3 to write code, you can also direct it to write poetry, music, social media comments, or any other text.
For instance, if you want to generate a news article, you might input a title like “United Methodists Agree to Historic Split” and a subtitle like “Those who oppose gay marriage will form their own denomination.”
With this input, GPT-3 yields the following:
“After two days of intense debate, the United Methodist Church has agreed to a historic split — one that is expected to end in the creation of a new denomination, one that will be “theologically and socially conservative,” according to The Washington Post. The majority of delegates attending the church’s annual General Conference in May voted to strengthen a ban on the ordination of LGBTQ clergy and to write new rules that will “discipline” clergy who officiate at same-sex weddings. But those who opposed these measures have a new plan: They say they will form a separate denomination by 2020, calling their church the Christian Methodist denomination…”
Only 12% of humans correctly stated that this was written by an AI. 88% were fooled.
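The input side of that experiment is worth noting: generation is conditioned on nothing but the title and subtitle. One plausible prompt assembly (the template is an assumption; the GPT-3 paper does not publish its exact formatting):

```python
# Assembling the news-article prompt described above. The template is an
# assumption; the GPT-3 paper conditions on a title and subtitle but does
# not publish the exact formatting it used.
title = "United Methodists Agree to Historic Split"
subtitle = "Those who oppose gay marriage will form their own denomination."
prompt = f"Title: {title}\nSubtitle: {subtitle}\nArticle:"
# From here, generation proceeds exactly as in the earlier completion sketch.
```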
Like a human, GPT-3 can be taught new words with just one example. For instance, given the context:
A “Burringo” is a car with very fast acceleration. An example of a sentence that uses the word Burringo is: ____________
GPT-3 outputs:
In our garage we have a Burringo that my father drives to work every day.
These results are incredibly impressive. Bear in mind, too, AI’s inevitable evolution: any criticism of current performance will soon come to naught.

More Than Language — GPT Applied to Images

GPT can write code (or, well, just about any text), but it can also generate images.
How is this possible?
The same model architecture can be trained on pixel sequences instead of text encodings, generating novel images instead of novel text. In fact, it’s so good at this that it’s competitive with top CNNs.
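As a toy sketch of that idea (the palette and image size here are simplifications; the actual Image GPT work clusters RGB values into its own 512-entry palette and trains on much longer sequences):

```python
# A toy sketch of the Image GPT idea referenced above: treat an image as a
# 1-D sequence of discrete values and train the same next-"token" objective
# used for text. The quantization scheme below is a simplification.
import numpy as np

image = np.random.randint(0, 256, size=(32, 32, 3))   # stand-in RGB image

# Quantize each pixel to a coarse palette so pixels behave like vocabulary tokens.
levels = image // 32                                   # 8 levels per channel
tokens = levels.reshape(-1, 3)                         # raster-scan order
sequence = tokens[:, 0] * 64 + tokens[:, 1] * 8 + tokens[:, 2]  # 512-word "vocab"

# A GPT-style model would then be trained to predict sequence[i+1] from
# sequence[:i+1], exactly the next-token objective used for text.
context, target = sequence[:-1], sequence[1:]
print(len(sequence), context[:5], target[:5])
```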
I mention this because it shows that GPT (and its successors) doesn’t just have the potential to one day replace coders, but entire industries, given its versatility.

Conclusion

GPT-3’s mind-boggling performance has convinced many that superintelligence is closer than we think — or at least, that AI-generated code is closer than we think. It generates creative, insightful, deep, and even beautiful content.
