
"The problem is not in the weapon but in the si": The Fourth World War will be conducted with stones and sticks

This opinion was expressed by Lyran Antebi, a researcher at the Institute for National Security Studies (INSS). Our fear of drones is misplaced: the real danger lies in the technology that launches them, says Antebi. "I do not know what weapons the Third World War will be fought with, but the Fourth World War will be waged with sticks and stones," the researcher writes.

As 2023 approached, it became clear that the world of hostilities had shifted from the threat of nuclear attack to cyberspace. Artificial intelligence is penetrating every aspect of our lives, and it was only a matter of time before it began to be used in weapons systems, according to CTech. Antebi believes that the weapons themselves cause less concern than the autonomy of AI systems. "We have to get a grip on the technology before it's too late," she said.

"We need to deeply delve into the subject, as well as understand how the" Ecosystem "of Shi-technologies works. Autonomy AI is relevant today for both the armed forces and corporations and for small companies. Everyone has its own tasks for AI. " Of course, artificial intelligence is already in our daily life, automating some of its aspects. Private companies create autonomous vehicles, autonomous household electronics, a "smart" house, etc. And people like it, because it is so convenient.

However, Antebi notes that when autonomy is discussed in relation to weapons, people's views change dramatically. "When we talk about the military question, we talk about human lives," she explained. "Where there is firepower, there is a problem of survival, a problem of death, and a problem of saving lives."

When it comes to lethal autonomous systems, people perceive them very critically, because they believe that autonomous weapons are, first of all, weapons, that is, something that kills, whereas an autonomous car or drone, in their view, cannot kill. Antebi studies the opportunities that autonomy opens up. In the past, she advised the UN as part of IPRAW (the International Panel on the Regulation of Autonomous Weapons) and advised the Israeli Ministry of Defense on these issues.

"We must be sure that AI is flawless before letting it drive anything at all, and it does not matter whether that is a weapon or a car, because a mistake can harm or even kill. I am talking about autonomous cars like Tesla, and about all kinds of robotics." The researcher explained that the "ethical decisions" made by AI "independently" are the problem and responsibility of the private companies that create the technologies.

Systems that are expected to carry children in autonomous vehicles or to carry out drone strikes on the battlefield must be flawless before they can operate and earn the trust of citizens and governments. The familiar dilemmas of whom a robot will save during a fire, or how a drone will attack to hit its target, will require strict supervision and regulation so that autonomy remains convenient but not deadly.