
Monday, 17 March 2025

A drone conflict

"IT HAS become a cliché to note that the war in Ukraine is a drone war. But two recent studies shed light on what that means in practice. In mid-February the Royal United Services Institute (RUSI), a think-tank in London, published the latest in a series of papers taking stock of tactical developments in Ukraine over the preceding year. On March 6th the Centre for Strategic and International Studies (CSIS) released another paper looking at Ukraine’s capacity and plans for war-fighting with artificial-intelligence (AI) tools. Together they paint a picture of a battlefield that is increasingly saturated with, and dominated by, uncrewed machines.

The first observation is that drones have become the most lethal weapons in Ukraine. “Tactical” drones, those with ranges in the low tens of kilometres, are now responsible for 60-70% of damaged and destroyed systems, says RUSI. A growing proportion of these are equipped with AI guidance, allowing them to lock on to targets in the final phase of flight even if the link between pilot and drone is jammed. The automatic guidance can kick in at distances of 2km or more, depending on conditions, notes CSIS, and can raise the hit rate from 10-20% (for manually piloted drones) to 70-80%. That means that one or two drones can do work that would previously have taken eight or nine. AI can also counter decoys and camouflage that would trick humans.

The second is that armies plan to go much further. The aim, says CSIS, “is to remove warfighters from direct combat and replace them with autonomous unmanned systems”. That includes not only aerial drones, but also their equivalents on the ground and at sea. Thirty-three ground robotic systems were approved in the first nine months of last year.

A third finding is that armies have embraced the idea of software-defined weapons, whose operating code matters more than their physical design. They make “modules”, typically chips loaded with software that are smaller than a bar of soap and which can slot into a wide range of different platforms, including drones, vehicles or gun turrets, to enable target recognition or other tasks. Advanced capabilities can thus be fitted or retrofitted to cheap and mass-produced hardware.

Armies are also attempting to be efficient in their use of AI. Newer models can be trained quickly on relatively small amounts of data.

Fourth, there are still limits to all this. Some 60-80% of “first-person view” or FPV strike drones fail to reach their targets, writes RUSI, depending on the pilot’s skill and target location.

(For remote-controlled drones, which need a radio signal, swarm attacks are difficult to pull off because the signals tend to interfere with one another.)

Of the 20-40% of FPV drones that do get through, a majority fail to destroy armoured vehicles—though they are good at wounding infantry, which helps explain the astronomically high casualty numbers. More than 50% of injuries are caused by drones, up from 25% at the end of 2023.

These figures must be put in context. It is still the interplay between drones and guns that often makes the difference. An FPV drone might immobilise a vehicle, for instance, with shellfire used to kill the infantry who dismount. Drone operations can also be time-consuming and complex. It takes “hours” to halt a tank with an FPV drone, compared with the two minutes needed to knock out three tanks with five precision-guided anti-tank shells after they were spotted by a drone.

Although AI models are performing a growing range of military tasks, humans are still closely involved in the decision to use force. Personnel can “override autonomous functions” when needed, notes CSIS. In ground systems, which face a more cluttered and complex environment than aerial ones, autonomy “remains largely unexplored by defence companies”. This means that one of the greatest potential benefits of automated combat—a reduction in human casualties—is some way off. With the pervasive drone threat keeping machinery at least 7km behind the front line, observes RUSI, soldiers have to dig trenches using picks and shovels. Some minefields are still cleared by hand. The grim irony is that AI and robotics have produced a more lethal battlefield for the men unfortunate enough to be deployed to its edge.” 
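The hit-rate arithmetic in the article can be sanity-checked in a few lines of Python. This is an editorial sketch, not from the article: it simply assumes each strike attempt succeeds independently with probability p, so the expected number of drones expended per hit is 1/p.

```python
# Editorial sketch (not from the article): if each attempt succeeds
# independently with probability p, attempts-until-first-hit follows a
# geometric distribution with mean 1/p.
def drones_per_hit(hit_rate: float) -> float:
    """Expected number of drones expended per successful hit."""
    return 1.0 / hit_rate

# Hit-rate ranges cited by CSIS: 10-20% manual, 70-80% with AI guidance.
manual = (drones_per_hit(0.20), drones_per_hit(0.10))  # 5 to 10 drones
guided = (drones_per_hit(0.80), drones_per_hit(0.70))  # about 1.3 to 1.4

print(f"manual piloting: {manual[0]:.0f}-{manual[1]:.0f} drones per hit")
print(f"AI guidance:     {guided[0]:.1f}-{guided[1]:.1f} drones per hit")
```

Under this simple model, AI guidance cuts expenditure from roughly five to ten drones per hit down to one or two, which is broadly consistent with the article's "eight or nine" versus "one or two" claim.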

 

1. “Engines of conflict”. The Economist, London. Vol. 454, Iss. 9439 (Mar 15, 2025): 76-77.
