Natural Language Programming is evolving – and how!
“Artificial intelligence programs lack consciousness and self-awareness. They will never be able to have a sense of humor. They will never be able to appreciate art, or beauty, or love. They will never feel lonely. They will never have empathy for other people, for animals, for the environment. They will never enjoy music or fall in love, or cry at the drop of a hat.” — GPT-3
It’s a moment of pure wonder when a researcher looks at a program’s output and thinks: “There’s no way it just wrote that!” But, as it turns out, that’s exactly what OpenAI’s GPT-3 is doing – mimicking general intelligence to an extent that surprises even its own creators and testers. So much so that it has even taken to ranting about the futility of artificial intelligence, and how computers will never truly be intelligent – as the quote above shows. Astounding!
The GPT-N series
GPT-3 (short for ‘Generative Pre-trained Transformer 3’) builds directly on its predecessor, GPT-2, and was launched in June 2020 by San Francisco-based AI research laboratory OpenAI, founded by Elon Musk and Sam Altman, among others. Although GPT-2 (released in 2019) was a clear improvement on its precursor – the first in the series – it wasn’t half as flexible as its 2020 counterpart.
Its ‘memory window’ – the ability to retain context – was small, its architecture was older, and its number of machine learning parameters was less than one-hundredth of the current system’s. Still, it must be regarded as an essential stepping-stone in this series of evolutions. It could formulate decent paragraphs from words and phrases – even citing imaginary sources and organisations. But it wasn’t really what one would consider ‘intelligent’. GPT-3, in this regard, is a marked improvement.
Essentially the third generation in a series of language prediction models, GPT-3 uses deep learning to generate near-human text. At full capacity, it uses 175 billion machine learning parameters to train its language representations – an increase of almost 11,500% over its predecessor. In fact, the entire English Wikipedia constitutes only 0.6% of GPT-3’s total training data.
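To see where that “almost 11,500%” figure comes from, here is the back-of-the-envelope arithmetic, using OpenAI’s published parameter counts of roughly 1.5 billion for GPT-2 and 175 billion for GPT-3:

```python
# Order-of-magnitude comparison of GPT-2 and GPT-3 parameter counts
# (1.5 billion and 175 billion respectively, per OpenAI's published figures).
gpt2_params = 1.5e9
gpt3_params = 175e9

ratio = gpt3_params / gpt2_params      # roughly 117x the parameters
percent_increase = (ratio - 1) * 100   # roughly 11,567%, i.e. "almost 11,500%"

print(f"GPT-3 has about {ratio:.0f}x the parameters of GPT-2 "
      f"(a {percent_increase:,.0f}% increase)")
```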
Learning on its own
GPT-3 is a prime example of how effective unsupervised machine learning is going to be in the future. Interestingly, it learns much the way human beings acquire knowledge. With unsupervised learning, there is no need to label all the data (to form structured data), which reduces the programmer’s input. The system learns on its own to generalise across tasks from unstructured data. For example, to generate a paragraph of text about what a politician might say on a certain policy, the system draws on its vast trove of unstructured data – which includes everything from recipes, news articles, coding manuals, novels, poetry, religious texts and journals to literally anything else available in a digitised format – and produces the paragraph from nothing more than a headline or an opening sentence.
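The core idea – learning word statistics from raw, unlabelled text and then continuing a prompt – can be sketched in miniature. The toy bigram counter below is only an illustration of the principle (GPT-3 itself is a Transformer trained on hundreds of billions of words, not a bigram model), and the corpus and function names are invented for this example:

```python
from collections import Counter, defaultdict

# Unlabelled "training" text: no annotations, just raw words.
corpus = (
    "the minister said the policy will help the economy and "
    "the minister said the policy will reduce costs"
)

# "Training": count which word follows which, with no labels at all.
next_words = defaultdict(Counter)
tokens = corpus.split()
for current, following in zip(tokens, tokens[1:]):
    next_words[current][following] += 1

def continue_prompt(prompt, length=6):
    """Greedily extend a prompt with the most frequent next word."""
    words = prompt.split()
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(continue_prompt("the policy"))
```

Feeding in an opening phrase yields a plausible-sounding continuation built purely from statistics of the unstructured corpus – the same principle, at vastly greater scale and sophistication, behind GPT-3 completing a paragraph from a headline.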
This approach has its own set of issues too. The depth and complexity of the data sources allow bias to seep into the output, leading to significant inaccuracies as well. Imagine conspiracy theories, racist manifestos, pseudo-scientific textbooks and the like being used as input data. A system for partially structuring the data may thus become more prudent as use-cases for GPT-3 grow more critical – policy or investment analysis, for example.
Despite these noted drawbacks, the wonders of GPT-3 are numerous. It can power question-based search engines, answer medical queries, autocomplete images, and write fiction or even music. A particularly nifty feature is its auto-chatbot, which lets the user ‘talk’ to historical figures, generating automated replies from the trove of digitised books and journals in its database. A great example is a highly intriguing automated dialogue between Claude Shannon and Alan Turing which eventually gets interrupted by – you guessed it – Harry Potter, of course!
Indeed, GPT-3 is rapidly propelling the entire landscape of NLP (Natural Language Processing) systems into an uncharted – and slightly daunting – new reality.