A few months ago, OpenAI's text generator was already the subject of wide media coverage. The artificial intelligence research lab, co-founded by Elon Musk, has now opened the intelligent text generator to the public and started the beta phase of GPT-3. Although access to GPT-3 still requires an application, the first users are already hooked on the AI-based text generator.
OpenAI’s system uses over 175 billion parameters, which not only enables very precise results but also greatly broadens the range of applications. Some authors have already used the AI to write unique stories and articles. As several press reviews show, Mario Klingemann has experimented with it as well, and one article about GPT-3 was even written entirely by GPT-3 itself.
Here are a few examples of GPT-3 in use:
This is mind blowing.
With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you.
W H A T pic.twitter.com/w8JkrZO4lk
— Sharif Shameem (@sharifshameem) July 13, 2020
Here’s a sentence describing what Google’s home page should look like, and here’s GPT-3 generating the code for it nearly perfectly. pic.twitter.com/m49hoKiEpR
— Sharif Shameem (@sharifshameem) July 15, 2020
Here’s #gpt3 writing some SQL for me. pic.twitter.com/JVeyijV2MX
— Ayush Patel (@ayushpatel34) July 19, 2020
These few examples make the use cases of the AI text generator immediately apparent. Whether in journalism, programming, or authorship, the AI can be a real help with complex content.
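The demos in the tweets above all follow the same pattern: GPT-3 is given a plain-language description (of a layout, a web page, a SQL query) and completes it with matching code. A common way to steer the model toward this behavior is "few-shot" prompting, where the prompt contains a handful of example pairs before the new request. The sketch below only assembles such a prompt as a string; the function name and example data are illustrative and not part of OpenAI's actual API.

```python
# Minimal sketch of a few-shot prompt for text-to-SQL, in the style
# of the GPT-3 demos above. The model would be asked to complete the
# text after the final "SQL:" line. All names here are hypothetical.

def build_sql_prompt(examples, request):
    """Assemble a few-shot prompt mapping English questions to SQL."""
    parts = []
    for question, sql in examples:
        parts.append(f"Question: {question}\nSQL: {sql}")
    # The new request ends with an open "SQL:" for the model to fill in.
    parts.append(f"Question: {request}\nSQL:")
    return "\n\n".join(parts)

examples = [
    ("List all users", "SELECT * FROM users;"),
    ("Count orders placed in 2020",
     "SELECT COUNT(*) FROM orders WHERE year = 2020;"),
]

prompt = build_sql_prompt(examples, "Show the ten newest customers")
print(prompt)
```

The resulting string would then be sent to the GPT-3 beta API as the prompt, with the model's completion taken as the generated SQL.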