“Playing with GPT-3 feels like seeing the future,” tweeted Arram Sabeti, a San Francisco-based developer and artist, in mid-July. That pretty much sums up the social media response of the past few days to the latest language-generating AI from OpenAI.
OpenAI aims to develop artificial general intelligence, reports Technology Review in its new issue (available at newsstands or to order online). With GPT-3, the company has at least managed to create the most powerful language model to date, initially only in English. OpenAI first described the program in a research paper published in May, and selected users have been able to test a beta version since mid-July. A commercial product is expected this year. But since a tool like this has both good uses (from better chatbots to programming help) and bad ones (such as better bots for spreading misinformation), the important question is: what can it really do?
Its predecessor, GPT-2, released last year, could already produce compelling streams of text in various styles when prompted with an introductory sentence. By comparison, GPT-3 is a huge leap forward. The model has 175 billion parameters (the values a neural network tries to optimize during training), compared to the already huge 1.5 billion of GPT-2. What exactly goes on inside GPT-3, however, is not clear. What it does seem to do well is synthesize text it has found elsewhere on the internet: from millions upon millions of text snippets, it glues together a kind of huge scrapbook on demand.
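To make the notion of a "parameter" concrete: it is simply a trainable weight. The following toy sketch (not GPT-3's actual architecture, just a small fully connected network) shows how such counts are tallied:

```python
# Toy illustration of what "parameters" means: counting the trainable
# weights of a small fully connected network. GPT-3's transformer
# architecture is far more complex, but the counting principle is the same.

def dense_layer_params(n_in: int, n_out: int) -> int:
    """A dense layer has one weight per input-output pair, plus one bias per output."""
    return n_in * n_out + n_out

def mlp_params(layer_sizes: list) -> int:
    """Total parameter count of a multilayer perceptron with the given layer widths."""
    return sum(dense_layer_params(a, b) for a, b in zip(layer_sizes, layer_sizes[1:]))

total = mlp_params([784, 512, 256, 10])
print(total)  # prints 535818 -- each of these values is adjusted during training
```

GPT-3 has roughly 300,000 times as many such adjustable values as this toy network.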
Artist Sabeti, for example, used the AI to generate short stories, songs, press releases, technical manuals and more. GPT-3 can also mimic specific writers. Mario Klingemann, an artist who works with machine learning, created a short story entitled “The importance of being on Twitter” in the style of Jerome K. Jerome (see picture).
This article is from issue 9/2020 of Technology Review, available from August 13, 2020 in stores and directly in the heise shop.
The text reads: “It is a strange fact that the last remaining form of social life that people in London are still interested in is Twitter. I noticed this strange fact when I went to the seaside on one of my regular vacation trips and saw the whole place twittering like a cage full of starlings.” Klingemann says he gave the AI only the title, the author’s name and the opening word “It”. There is even a reasonably informative article on GPT-3 written entirely by GPT-3.
GPT-3 is not trained only on human language; it can also generate other kinds of symbolic notation, including guitar chords and computer code. The web developer Sharif Shameem, for example, prompted GPT-3 in such a way that it generates HTML instead of natural language. He managed to create website layouts by giving it commands like “a button that looks like a watermelon” or “big text in red that says WELCOME TO MY NEWSLETTER”.
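Shameem has not published his exact setup, but demos like his typically rely on few-shot prompting: a handful of description-to-HTML pairs are shown to the model, which then continues the pattern for a new description. A minimal sketch of that technique (the example pairs and function names here are hypothetical, not Shameem's actual prompts) might look like this:

```python
# Hypothetical sketch of few-shot prompting a text-completion model to emit HTML.
# The example pairs below are illustrative; the actual prompts used in the
# demo described above are not public.

EXAMPLES = [
    ("a button that says Subscribe",
     "<button>Subscribe</button>"),
    ("big text in red that says WELCOME TO MY NEWSLETTER",
     '<h1 style="color: red;">WELCOME TO MY NEWSLETTER</h1>'),
]

def build_prompt(request: str) -> str:
    """Concatenate description/HTML pairs, then the new request, so the model
    continues the pattern by emitting HTML for the final description."""
    parts = []
    for description, html in EXAMPLES:
        parts.append(f"description: {description}\nhtml: {html}\n")
    parts.append(f"description: {request}\nhtml:")
    return "\n".join(parts)

prompt = build_prompt("a button that looks like a watermelon")
# This prompt would then be sent to the language model's completion endpoint;
# whatever the model appends after the final "html:" is the generated markup.
print(prompt)
```

The point of the technique is that no retraining is involved: the model infers the description-to-HTML pattern purely from the examples embedded in the prompt.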
The legendary coder John Carmack, pioneer of 3D computer graphics and currently a consultant to the virtual reality company Oculus VR, finds the capabilities uncanny: “The fact that GPT-3 can sort of write code does generate a slight shiver.”
But despite its new tricks, GPT-3 still has serious weaknesses. For example, it is prone to hateful sexist and racist statements. Jerome Pesenti, head of artificial intelligence at Facebook, tweeted: “GPT-3 is surprising and creative, but it’s also unsafe due to harmful biases. Prompted to write tweets from one word – Jews, Black, women, Holocaust – it came up with these results (see picture). We need more progress on responsible AI.”
Sam Altman, who co-founded OpenAI with Elon Musk, also tempers expectations: “GPT-3 is impressive, but it still has serious weaknesses and sometimes makes very silly mistakes. We still have a lot to figure out.”
So GPT-3 is indeed a big leap forward. But its human-like output and astonishing versatility are the result of excellent engineering, not genuine intelligence, with all the flaws and limitations that entails. The AI still makes ridiculous errors that reveal a total lack of common sense. And even its successes lack depth: they read more like cut-and-paste jobs than original compositions.