“Killer robots”: drones deployed in Ukraine herald a new era of warfare

Ukrainian soldiers launch a drone against Russian positions near Bakhmut, in the Donetsk region (AP/Libkos)

Technological advances in the drones used in Ukraine have accelerated a trend that could soon bring the first fully autonomous combat robots to the battlefield, inaugurating a new era in warfare.

The longer the war lasts, the more likely it becomes that drones will be used to identify, select and attack targets without human assistance, according to military analysts, combatants and artificial intelligence (AI) researchers.

That would mark a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and AI-powered counter-drone weapons. Russia also claims to possess AI weaponry, although such claims are unproven. But there are no confirmed cases of a nation deploying robots that have killed humans entirely on their own.

Experts say it may be just a matter of time before Russia or Ukraine, or both, deploy them.

“A lot of states are developing this technology,” said Zachary Kallenborn, an arms innovation analyst at George Mason University. “Obviously it’s not that difficult.”

That feeling of inevitability extends to activists, who have tried for years to ban killer drones but now feel they must settle for trying to restrict the offensive use of these weapons.

Police officers fire at an Iranian drone during a Russian attack in Kyiv (Reuters/Stringer)

Ukrainian Minister of Digital Transformation Mykhailo Fedorov agrees that fully autonomous lethal drones are the “logical and inevitable next step” in weapons development. He said Ukraine has been doing “a lot of research and development in this direction.”

“I think the potential for this is great in the next six months,” Fedorov told The Associated Press in a recent interview.

Ukrainian Lt. Col. Yaroslav Honchar, co-founder of Aerorozvidka, a non-profit working on combat drone innovation, said in a recent interview near the front lines that human combatants simply cannot process information and make decisions as quickly as machines.

Ukrainian military leaders currently prohibit the use of fully independent lethal weapons, although that could change, he noted.

“We haven’t crossed this line yet, and I say ‘yet’ because I don’t know what will happen in the future,” said Honchar, whose group has spearheaded drone innovation in Ukraine, turning cheap commercial drones into lethal weapons.

Russia could get autonomous AI from Iran or other countries. Iran-supplied Shahed-136 long-range explosive drones have damaged Ukrainian power plants and terrorized civilians, but they are not particularly smart. Iran has other drones in its arsenal that it says incorporate AI.

Ukraine could, without much difficulty, make its semi-autonomous attack drones fully independent in order to better survive electronic jamming on the battlefield, according to Western manufacturers.

Those drones include the US-made Switchblade 600 and the Polish-made Warmate, both of which currently require a human to choose targets over a live video feed; the AI finishes the job. The drones, technically known as “loitering munitions,” can hover over a target for several minutes, waiting for a clean shot.

A Switchblade drone (@aerovironmentinc)

“The technology to achieve a fully autonomous mission with Switchblade already exists,” said Wahid Nawabi, CEO of AeroVironment, its manufacturer. That will require a policy change, removing humans from the decision-making loop, which he estimates is three years away.

Drones can already recognize targets such as armored vehicles using cataloged images. But there is disagreement over whether the technology is reliable enough to ensure the machines won’t make mistakes and kill non-combatants.

The AP asked the defense ministries of Ukraine and Russia whether they have used autonomous weapons offensively and whether they would commit not to use them if the other side refrained as well. Neither responded.

If either side went on the attack with full AI, it might not even be the first time.

An inconclusive UN report suggests that lethal autonomous weapons (or “killer robots”) debuted in Libya’s internal conflict in 2020, when Turkish-made Kargu-2 drones in fully automatic mode killed an unknown number of combatants.

The autonomous armed drone Kargu during a field exercise. The Bulletin of the Atomic Scientists considers that the Turkish-made drone could be the first known case of AI-based autonomous weapons used to kill (Europa Press/Sebastian Carrasco)

A spokesman for STM, the manufacturer, said the report was based on “speculative and unverified” information and “should not be taken seriously.” He told the AP that the Kargu-2 cannot attack a target until the operator orders it to do so.

Fully autonomous AI is already being used in the defense of Ukraine. Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars and UAVs, both powered by AI. Radars are designed to identify enemy drones, which UAVs then take out by dropping nets on them, all without human intervention.

AI drones are becoming more common. Israel has exported them for decades. Its Harpy drone can hover in the air for up to nine hours, waiting for an enemy anti-aircraft radar to switch on so the drone can destroy it.

Other examples include China’s Blowfish-3 unmanned helicopter gunship. Russia has long been working on Poseidon, an AI-equipped underwater drone armed with nuclear warheads. The Dutch are testing a ground robot armed with a .50-caliber machine gun.

Honchar believes that Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have already used autonomous killer drones if it had them.

“I don’t think they’d have any scruples,” agreed Adam Bartosiewicz, vice president of WB Group, maker of the Warmate.

US naval drones in the Persian Gulf

AI is a priority for Russia. President Vladimir Putin said in 2017 that whoever masters that technology will rule the world. In a speech on December 21, he expressed confidence in the Russian arms industry’s ability to add AI to war machines, stressing that “the most effective weapons systems are those that work quickly and practically in automatic mode.”

Russian officials already claim that their Lancet drone can operate in complete autonomy.

“It’s not going to be easy to know if and when Russia crosses that line,” said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Center for Artificial Intelligence.

Switching a remote-controlled drone to full autonomy might not be noticeable. To date, drones capable of operating in both modes have performed better when piloted by a person, Allen said.

The technology isn’t that complicated, said Professor Stuart Russell of the University of California, Berkeley, one of the field’s leading AI researchers. In the mid-2010s, Russell consulted colleagues, who agreed that graduate students could build, in a single semester, an autonomous drone “capable of locating and killing an individual, for example, inside a building,” he said.

So far, attempts to set basic international rules for the use of military drones have been unsuccessful. Nine years of informal United Nations talks in Geneva have made little progress, with major powers including the United States and Russia opposing a moratorium. The last session, held in December, ended without a new round being scheduled.

Washington officials say they will not agree to a moratorium because rival drone developers cannot be trusted to use them ethically.

Drones during a military exercise at an undisclosed location in Iran (Iranian Army/WANA (West Asia News Agency)/Handout via Reuters)

Toby Walsh, an Australian academic who campaigns against killer robots, hopes to reach a consensus on some limits, including a ban on systems that use facial recognition and other data to identify or attack individuals or categories of people.

“If we’re not careful, they will proliferate much more easily than nuclear weapons,” said Walsh, author of “Machines Behaving Badly.” “If you can make a robot kill one person, you can make it kill 1,000.”

Scientists also worry that AI weapons could be repurposed by terrorists. In one feared scenario, the US military spends hundreds of millions of dollars writing code to power killer drones, only for the program to be stolen and copied, handing terrorists the same weapon.

To date, the Pentagon has not clearly defined “an autonomous AI weapon” or authorized a single such weapon for use by US troops, said Allen, the former Defense Department official. Any proposed system would need the approval of the chairman of the Joint Chiefs of Staff and two deputy secretaries.

But that hasn’t stopped such weapons from being developed in the United States. The Defense Advanced Research Projects Agency (DARPA), military laboratories, academic institutions, and the private sector all have projects underway.

The Pentagon has emphasized using AI to assist human combatants. The Air Force is studying ways for drones to fly as wingmen alongside piloted aircraft. Former Deputy Secretary of Defense Robert O. Work, who promoted the idea, said in a report last month that “it would be crazy not to pursue an autonomous system” once AI systems outperform humans, a threshold he said was crossed in 2015, when computer vision surpassed that of humans.

Humans have already been taken out of the loop in some defensive systems. Israel’s Iron Dome missile shield is authorized to fire automatically, although a person is said to monitor operations and can intervene if the system goes after the wrong target.

Several countries and all branches of the US military are developing drones capable of striking in deadly synchronized swarms, said Kallenborn, the George Mason analyst.

Will future wars turn into a fight to the last drone?

That is what Putin predicted in a 2017 televised talk with engineering students: “When the drones of one side are destroyed by the drones of the other, you will have no choice but to surrender.”

(With information from AP)


Source: www.infobae.com