How does it work?

There are currently two major competing approaches, BERT and GPT. Both are language models built on the Transformer architecture, and they are trained in much the same way.

First of all, you need to assemble a substantial corpus of text, then hide some of its words behind a special mask token. The model then tries to predict the hidden words and compares its guesses with the original text.
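The masking step above can be sketched in a few lines of plain Python. This is an illustrative simplification, not either model's actual pipeline: the `mask_words` helper, the `[MASK]` string, and the masking rate are all assumptions for the sake of the example (real models operate on subword tokens, not whitespace-split words).

```python
import random

def mask_words(text, mask_rate=0.15, seed=0):
    """Hide a fraction of words behind [MASK], keeping the originals
    as the prediction targets the model must recover."""
    rng = random.Random(seed)
    words = text.split()
    masked, targets = [], {}
    for i, word in enumerate(words):
        if rng.random() < mask_rate:
            targets[i] = word        # the answer the model is scored against
            masked.append("[MASK]")
        else:
            masked.append(word)
    return " ".join(masked), targets

masked, targets = mask_words("the cat sat on the mat because it was tired")
```

Training then consists of asking the model to fill in each `[MASK]` and comparing its answer with the stored target.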

This step enables the model to learn connections between words, which it stores as numerical weights.
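To give a feel for what "weights between words" means, here is a crude, count-based stand-in: scoring word pairs by how often they appear near each other. This is only an analogy I am adding for illustration; the actual weights in BERT and GPT are learned parameters of a neural network, not co-occurrence counts.

```python
from collections import Counter

def cooccurrence_weights(sentences, window=2):
    """Count how often two words appear within `window` words of each
    other: a simple numerical 'association strength' per word pair."""
    weights = Counter()
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            for v in words[i + 1 : i + 1 + window]:
                weights[tuple(sorted((w, v)))] += 1
    return weights

weights = cooccurrence_weights(["the cat sat", "the cat ran"])
```

Pairs that co-occur often get larger weights, just as strongly associated words end up strongly connected inside a trained model.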

BERT has also been trained on a second task, predicting whether one sentence plausibly follows another, which helps it adapt to context.

BERT: Bidirectional Encoder Representations from Transformers

GPT has been trained to predict the word that follows a given sequence of text. It can therefore generate whole texts even when the prompt it is given is very short.
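The next-word idea can be illustrated with a toy bigram model: count which word tends to follow which, then extend a short prompt one word at a time. The `train_bigrams` and `generate` helpers are assumptions made up for this sketch; GPT uses a deep neural network over a huge corpus, not a lookup table, but the training objective is the same in spirit.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """For each word, count which words follow it: a toy version of
    the next-word prediction objective."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for cur, nxt in zip(words, words[1:]):
            follows[cur][nxt] += 1
    return follows

def generate(follows, start, length=5):
    """Greedily extend a short prompt, always picking the most
    frequent continuation."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

model = train_bigrams(["the cat sat on the mat"])
```

Even this tiny model can continue a one-word prompt; scale the corpus and the model up enormously and you get the fluent text GPT is known for.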

GPT: Generative Pre-trained Transformer

Giving artificial intelligences the ability to understand human beings is a fully fledged field of computer science known as natural language processing (NLP). The discipline is as old as the Turing test.
