A code of ethics for robots
2007/06/02 Galarraga Aiestaran, Ana - Elhuyar Zientzia
This Fujitsu robot, in addition to performing household tasks, assists its owner. (Photo: Fujitsu)
In South Korea, in some US states, and also in Europe, notably in Britain, documents setting out the obligations and rights of robots are being studied.
Among the issues that most concern the experts, safety comes first. Robots are increasingly autonomous, able to make decisions to a certain extent, and they will become ever more intelligent. The question, therefore, is who is responsible if a robot's own decision ends up hurting someone: the robot's maker, its owner, or the robot itself?
The truth is that some people are already asking this question, for example when a robot in the financial world, in this case a complex computer program, makes an erroneous decision. Who is responsible for a bad investment?
The Three Laws
Isaac Asimov reflected on this topic in the 1940s. In his book I, Robot, he wrote that intelligent robots should be programmed to obey three laws. First, a robot may not injure a human being or, through inaction, allow a human being to come to harm. Second, a robot must obey the orders given by humans, except where such orders conflict with the first law. Finally, a robot must protect its own existence, as long as doing so does not conflict with the first or the second law.
These three laws look reasonable enough if the aim is to prevent robots from harming people. But programming robots to comply with them is no easy task. Even the first law, which seems simple, poses problems for a robot: for a person it is easy to tell a human being apart from a chimpanzee or a mannequin, but for a robot the three look very much alike.
Even so, that is not the biggest obstacle; engineers will eventually get robots to make that distinction, to understand people's orders properly, and to be programmed to comply with all three laws. But that will not be enough.
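Purely as an illustration of what "programming the laws" would even mean, the three laws can be read as a strict priority ordering over a robot's candidate actions. The toy Python sketch below is a minimal sketch under that reading, not anything a real robot implements; the names (Action, permitted, the example action) are all hypothetical, and the hard part the article points to, deciding whether an action actually harms a human, is hidden inside a single flag.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """Hypothetical candidate action with its predicted consequences (toy model)."""
    description: str
    harms_human: bool       # would carrying it out injure a human being?
    ordered_by_human: bool  # was it ordered by a human?
    endangers_robot: bool   # would it damage the robot itself?

def permitted(action: Action) -> bool:
    """Check one candidate action against a strict priority ordering of the laws."""
    # First law (highest priority): never injure a human being.
    if action.harms_human:
        return False
    # Second law: obey human orders unless they conflict with the first law;
    # conflicting orders were already rejected above, so ordered actions pass.
    if action.ordered_by_human:
        return True
    # Third law (lowest priority): avoid self-damaging actions that serve
    # neither an order nor a human's safety.
    return not action.endangers_robot

# An action ordered by a human and harming no one is allowed,
# even if it puts the robot itself at risk:
print(permitted(Action("carry a hot pan", harms_human=False,
                       ordered_by_human=True, endangers_robot=True)))  # True
```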
According to experts, the future roles of robots will raise new questions. Robots have already been built for military applications, for example: Samsung has developed a sentry robot to monitor the border between North Korea and South Korea.
This robot has two cameras and a gun. If the robot decides to use the gun and kills someone, it has to be determined who the killer is. The example is extreme, but it highlights the magnitude of the problem.
Questions, doubts, concerns
In any case, the issue may be even more complicated in civilian uses than in military ones. Researchers believe that in the future we may see robots dispersing protesters with water cannons. It is situations like these that worry them, not the ones that sound like science fiction.
In fact, researchers have criticized a report published by the British Government's Office of Science and Innovation in December 2006. Titled "Utopian dream or rise of the machines?", the report foresees that in the future robots will demand the same rights as people, such as the right to housing or to health care.
Quickplacer, the fastest robot. (Photo: Fatronic Foundation)
According to experts, such claims are not based on science and are sensationalist. However, they acknowledge that the report may have done some good, because it has generated debate and thereby helped raise social awareness of the issue.
In fact, researchers believe that other issues matter more to people, and that now is the time to discuss them. In Japan, for example, robots are already being used to care for people. For now they only monitor elderly people's heart rate and the like, but it is reasonable to expect that robots will soon be able to do much more and will therefore take over a large part of the care of the elderly.
This has its advantages: in nursing homes, care robots will be cheaper than human workers. But it also raises concern, since many people do not find it acceptable to leave the elderly in the hands of machines.
The use of robots for sexual relations is also under discussion and, in general, experts consider that human-shaped robots can cause confusion.
Robot manufacturers, however, have not yet gone into such considerations. The aim of the robot rights document being prepared in South Korea is, above all, to prevent illegal use, to protect the data collected by robots, and to make robots identifiable and traceable. In other words, practical and basic problems are being addressed first.
But the rest will come too. The skeptics are reminded of those who raised concerns about animal rights decades ago: experts believe the question of robots will follow a similar evolution. Time will tell.
Published in Gara.