
The dark side of the GPT-3 transformer

2021/03/11 Roa Zubia, Guillermo - Elhuyar Zientzia

Ed. Flickr

We are living through an artificial intelligence revolution, whether we are aware of it or not. That is why it is essential to talk about how artificial intelligence works and the benefits it brings (machine translation through neural networks is a clear example). But when the benefits are mentioned, the risks must be mentioned too.


In the field of artificial intelligence, an old idea is lately becoming reality. Twentieth-century science fiction often proposed that machines would come to produce creative works. Call it a robot, call it a computer… whatever you want. Today there are many machines capable of writing literature or composing music, for example. Better or worse. But they are able, in a way, to be creative. The writer Isaac Asimov was once told that a machine is not intelligent because "it is not able to write a symphony". And Asimov asked back: "Are you able to write a symphony?"


The truth is that artificial intelligence keeps getting better. A striking example is the GPT-3 program. It writes. Well, very well. It produces literature, and most likely neither you nor I would be able to tell the difference between literature written by a person and literature written by GPT-3.


However, it is not perfect. It has weaknesses in its writing; so says the head of OpenAI itself, the organization that created GPT-3. From time to time it makes absurd mistakes (and who doesn't!). Even so, the way GPT-3 writes is surprising.

The idea is that the program has been trained (in fact, the GPT-3 model itself is not public, but its basic operation is well known). As a text takes shape, it is able to guess, so to speak, the next word that will come. In this way it learns to imitate styles. In 2017, computer scientists invented a technique called the Transformer to make this learning more efficient. It uses several processors in parallel to increase speed (in reality it is more complicated than that). The name GPT stands roughly for "Generative Pre-trained Transformer", and GPT-3 is the third version.
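A rough idea of that "guess the next word" operation can be seen in a few lines of code. GPT-3 itself is only reachable through OpenAI's service and its model is not publicly downloadable, so the sketch below uses its freely available predecessor, GPT-2, through the Hugging Face transformers library as a stand-in; the prompt and parameters are purely illustrative.

# Minimal sketch: next-word prediction with a Transformer language model.
# GPT-3 is not publicly available, so GPT-2 is used here as a stand-in.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Artificial intelligence will"
inputs = tokenizer(prompt, return_tensors="pt")

# The model assigns a probability to every possible next token;
# here we simply keep the most likely one at each step (greedy decoding).
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=20, do_sample=False)

print(tokenizer.decode(output[0], skip_special_tokens=True))

Run on large amounts of text during training, this same next-word objective is what lets the model pick up the styles, and the biases, of whatever it reads.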


Where is the risk? The risk is the same as with artificial intelligence in almost any area. If it is trained on inappropriate texts (for example, racist texts), GPT-3 will tend to produce texts of that kind. It is clear, then, that one has to watch what texts it is trained on; but to build a good program, billions of texts must be used in training, and not all of them can be checked beforehand. It will therefore absorb the virtues and the harms of everything it "devours". Programmers should look for ways to steer that learning, at least in part, in an ethical direction. And that is not easy. It is a great challenge for those who work in computing.




More content on this topic

Technology developed by Elhuyar