
Automobiles?

2015/12/01 Leturia Azkarate, Igor - Computer scientist and researcher, Elhuyar Hizkuntza eta Teknologia. Source: Elhuyar magazine

Cars do not actually drive themselves, contrary to what their name suggests. They were named that way because, as far as motive power goes, no horse or donkey had to pull them; a person still has to drive. Truly autonomous vehicles, however, are at the door: several manufacturers have shown that the technology is advanced enough for cars to drive themselves in the near future. But as with every change that happens very quickly, the practical and philosophical implications it may have are perhaps not being taken sufficiently into account.
Ed. © Dollarphotoclub/Martialred

At present it is already common for cars to carry more and more electronic components, sensors and software, letting them perform various tasks on their own: parking, emergency braking, warning when drifting toward the edge of the road… The race for fully autonomous vehicles took off in 2012, when Google announced it was working on one. Since then almost all manufacturers have joined the race, and all of them hope to put autonomous vehicles on the street sometime in the next decade. Before that, however, they will roll out specific advanced functions such as autonomous driving on motorways. In October, Tesla updated the software of its Model S, making it the first car with autonomy level 2 (on this scale, level 0 is a car that needs the driver at all times and level 4 drives entirely by itself; at level 2, the car can operate at least two controls simultaneously).
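The autonomy scale mentioned above can be sketched as a simple enumeration. This is an illustrative sketch of the 0-4 classification the text describes, not an implementation of any standard; the class and function names are hypothetical:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Sketch of the 0-4 autonomy scale described in the text."""
    NO_AUTOMATION = 0         # the driver controls everything at all times
    FUNCTION_SPECIFIC = 1     # a single automated function, e.g. cruise control
    COMBINED_FUNCTION = 2     # at least two controls operated simultaneously
    LIMITED_SELF_DRIVING = 3  # the car drives itself under some conditions
    FULL_SELF_DRIVING = 4     # the car drives itself entirely

def needs_driver_attention(level: AutonomyLevel) -> bool:
    """Below level 3, the driver must keep monitoring the road."""
    return level < AutonomyLevel.LIMITED_SELF_DRIVING
```

On this scale, Tesla's October update would put the Model S at `AutonomyLevel.COMBINED_FUNCTION`: two controls at once, driver attention still required.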

It is clear that autonomous vehicles are moving very fast and cannot be stopped. But perhaps they are moving too fast, since they will bring many changes on many levels (practical, moral, legal…), and perhaps not all of them good. I am not sure we have reflected much on this, or that we are prepared for those changes.

Much to take into account

To begin with, cars need complex systems to be autonomous: sensors, communication systems, software… On the one hand, this makes cars and their repairs very expensive; on the other, it makes cars depend on more and more things in order to work. Is it practical for a car to become unusable just because one of these components fails? The truth is that even if driving by hand the old way remains possible for now, the day may come when that option is removed, or when we no longer know how to drive by hand.

On the other hand, there is the question of the quality of the autonomous driving system. How do we know whether a system is good or bad, whether it drives correctly? Buying a car that drives badly is not like buying a bad dishwasher, because our safety and that of others is at stake. And even if we knew that one system is better than another, they will surely differ in price. How, then, do we judge whether the quality of the driving system is worth the difference in cost?

In addition, there will be people who do not trust driving systems and never want to ride in an autonomous vehicle. Personally, I think autonomous cars will be much safer than many of the humans who drive on our roads. So, if it is shown that these cars are safer and always comply with traffic rules, should all cars be required to be autonomous?

IT security

Since these cars are controlled by software and equipped with communication systems, they have the same security problems as any computer. In July, two hackers showed that they could take full control of a car with ease. That particular security hole was soon fixed, but new ones will keep appearing, just as they do on computers. In addition, it is said that the way to optimize the operation of autonomous vehicles would be to implement communication systems between cars. Intercepting those communications would make it possible to track a car's trajectory. And, as with computers or phones, there will be viruses, with cars infecting one another or being infected by the computers in repair shops…
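A standard defence against spoofed or tampered car-to-car messages is cryptographic authentication. Below is a minimal sketch using a shared-key HMAC; the key and message contents are hypothetical, and real vehicle-to-vehicle systems use public-key certificates rather than a shared fleet key:

```python
import hmac
import hashlib
from typing import Optional

# Illustration only: a real deployment would never hard-code a shared key.
SHARED_KEY = b"hypothetical-fleet-key"
TAG_LEN = 32  # length of a SHA-256 HMAC tag in bytes

def sign_message(payload: bytes) -> bytes:
    """Append an HMAC tag so receivers can detect tampering or spoofing."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_message(message: bytes) -> Optional[bytes]:
    """Return the payload if the tag checks out, otherwise None."""
    payload, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None
```

Authentication of this kind stops an attacker from injecting fake braking or position messages, but note that it does nothing against the tracking problem mentioned above: the messages are still observable.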

But perhaps the most dangerous enemies are not external but internal. How do you know that a car's software does what the manufacturer says it does? There is the Volkswagen case. Nor would it be surprising if manufacturers put "back doors" in the software for security agencies, governments or the police, so that our cars can be tracked or taken control of.

Given that car software can have security problems, intentional or not, the logical thing would be for the software to be inspected as well; like the mechanical components, the software too should have to pass a technical review, right? However, car software, like any other proprietary software, is protected by copyright law, and no one can force the manufacturer to authorize its review. For this reason, some are asking for car software to be open source.

Another debate concerns the freedom of the user: just as we can install whatever operating system or software we want on our computer, do we have the right to change or modify the software in our car? For the moment, in the US, yes; but that too can change.

It is also debated who is responsible for accidents when the vehicle drives itself. Or who has to pay for insurance, if the car always drives on its own.

The most difficult and interesting dilemma is how to program these machines so that they do not harm humans. Asimov formulated the three laws of robotics long ago: first, a robot may not harm a human being or, through inaction, allow a human being to come to harm; second, a robot must obey the orders of human beings, except where this would contradict the first law; third, a robot must protect its own existence, as long as this does not violate the first or second law. But things are not so simple. These laws run into problems that the trolley dilemma makes visible: five people are on the track and the tram cannot stop. The only thing that can be done is to press the button that switches the track. Unfortunately, there is another person on the other track. In this case, both action and inaction will harm people, so it is impossible to comply with Asimov's first law. Cases of this kind can arise in everyday use, and what should the car's software do then? Do nothing, killing those who would have died anyway, or take action, killing those who would otherwise have been saved? Save as many people as possible? Should age be taken into account? How can the age or number of those people even be determined? And in the case of autonomous vehicles, should the software save the largest number of people, or give priority to those inside the vehicle, or those outside it, or those who are not at fault in the accident? On the other hand, who is at fault in an accident between two autonomous vehicles?
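The dilemma above can be framed as choosing among actions with different predicted outcomes. Here is a deliberately naive sketch that simply minimizes predicted casualties; the action names and numbers are hypothetical, and the text's point is precisely that no such simple rule settles the question:

```python
def choose_action(options: dict) -> str:
    """Pick the action with the lowest predicted casualty count.

    `options` maps an action name to its predicted number of casualties.
    A purely utilitarian rule: it ignores age, fault, and whether the
    casualties are inside or outside the vehicle.
    """
    return min(options, key=options.get)

# The tram dilemma from the text: doing nothing kills five people,
# switching the track kills one.
tram_dilemma = {"do_nothing": 5, "switch_track": 1}
```

Under this rule the car always switches the track. Every objection raised in the paragraph above (Should age count? Should occupants be favoured? Who is at fault?) amounts to replacing the casualty count with some other valuation, and choosing that valuation is an ethical decision, not a programming one.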

As you can see, autonomous vehicles still raise many doubts, but there is time to make decisions. In any case, we will have to pay attention to the issue so that, in a matter as important as this, decisions are not taken by companies and governments alone, with society, as usual, paying the price.
