"New Scientist on Friday claimed that Ukrainian attack drones
guided by artificial intelligence (AI) “are now finding and attacking targets
without human assistance.”
The report appears to describe history’s first combat use of fully
autonomous weapons and, if the targeted vehicles were occupied, the first
kills by Lethal Autonomous Weapon Systems (LAWS), or “killer robots.”
The drone in question is reportedly a new quadcopter model
called the Saker Scout, an indigenously produced unmanned aerial vehicle (UAV)
that can be folded up and lugged around in a suitcase-sized container.
Saker is a young Ukrainian company founded in 2021 to
produce commercial AI and drone solutions.
“Your support will help us make a ‘bird’ and deliver it on
your behalf to the defenders of Ukraine!” the company said in its pitch to
investors, making a little pun on its corporate name and purpose, as the Saker
falcon is a hunting bird found in the steppes of Ukraine.
Developed with remarkable speed and initially deployed in
September 2023, the Saker Scout is a rugged, barebones-looking UAV
with a prominent camera assembly slung beneath its belly and an explosive charge
strapped to its back. The Saker platform is distinguished, its creators say, by its
AI control system. The drones operate as an autonomous network in flight,
hunting down camouflaged enemy vehicles with recon units and then sending in
bomb-laden drones to make the kill.
Defense Express noted when the Saker Scout was unveiled in
September that its advanced artificial intelligence was not cheap or easy to
produce, but the drones themselves cost a relative pittance. Swarms of drones
working together to find and eliminate targets without human intervention are
very cost-effective compared to traditional infrared or radar-homing missiles.
Taking human operators, and the radio control links they rely on, out of the
equation makes the Saker airborne network very difficult to jam.
New Scientist noted that Saker Scouts can be piloted by
human operators in the traditional manner, with the AI picking out 64 different
types of Russian “military objects” and inviting the operator to commit bomb
drones to attack them. Input from the drone groups is fed into Delta, the
Ukrainian military’s situational awareness system, which builds highly
detailed real-time battlefield maps by collating data from a wide range of
sensors and devices.
According to a Saker representative who spoke with New
Scientist, the company’s drones have now been deployed in autonomous mode “on a
small scale,” with human operators taken out of the loop.
Military analysts greeted the long-anticipated arrival of
killer robots with trepidation. Humanitarian groups worry that autonomous
systems might be less scrupulous about avoiding civilian casualties, as they
could act with lethal force against “false positive” targets without human
oversight. Military experts fear the coming of autonomous weapons that might
escalate a conflict very rapidly, leaving their human masters behind as they
duel with each other at lightning speed.
The Saker spokesperson told New Scientist that “using the
drone in autonomous mode is less reliable than having a human in the loop,” but
the system has proven effective so far, and the Ukrainian military believes it
is “more important to deploy a weapon that worked now, rather than waiting
until it was perfect.”
“Ultimately, Ukraine will need large numbers of drones that
can operate autonomously, locating and striking targets at machine speed,” the
spokesperson said.
Foreign Policy (FP) noted in May that there had been rumors of
autonomous weapons before, stretching back to the deployment of Turkish Kargu-2
drones in Libya in 2020. Turkey’s wildly popular Bayraktar TB2 drones, which
Ukraine used against Russia in the early days of the war, reputedly have
limited autonomous capability.
Unless some of those earlier suppositions and rumors are
confirmed, the Ukrainians and their Saker drones will take the prize for using
the first fully AI-controlled weapons platform in battle and possibly for
causing the first AI-directed fatalities.
United Nations Secretary-General Antonio Guterres went further
than many other skeptics in 2019 when he said autonomous weapons were
“politically unacceptable, morally repugnant, and should be prohibited by
international law.”
Among the issues Guterres referred to were the danger that
autonomous weapons could be programmed to “target individuals or groups that
fit certain descriptions” and the possibility that nation-states or rogue
actors could launch LAWS attacks without revealing their identities. This would
make AI weapons ideally suited for hate crimes, ethnic cleansing campaigns,
and political assassinations, and for authoritarian regimes looking for a
quick and deniable way to wipe out dissidents.
FP warned:
Once these
technologies have spread widely, they will be difficult to control. The world
thus urgently needs a new approach to LAWS. So far, the international community
has done nothing more than agree that the issue needs to be discussed. But what
it really needs to do is take a page from the nuclear playbook and establish a
nonproliferation regime for LAWS.
Killer drones, if left unchecked, will destroy humanity.
Therefore, those who release killer drones should be stopped by any
means necessary, using nuclear weapons if needed.