The next generation of drones doesn’t just fly fast — it thinks fast and kills without waiting for human orders, writes Patrick Drennan.
AUTONOMOUS KILLER DRONES are on the horizon.
Russia and Ukraine are engaged in a frantic technological race to develop drones with artificial intelligence (AI) and machine learning capabilities.
On 14 April 2025, the world’s top drone racers competed for a $1 million prize pool in the Abu Dhabi Autonomous Racing League (A2RL) Drone Race. The event was hosted at Marina Hall in Abu Dhabi and brought together 14 teams from across the world, including South Korea, Mexico, Turkey, China and the United States.
The drones raced at speeds of over 150 km/h through a tricky, winding indoor track filled with wide gates, uneven lighting and very few visual markers. The most astonishing result: for the first time, an autonomous AI drone beat human-piloted drones to win the event.
Delft University of Technology's MAVLab AI drone wasn’t just fast — it was smart. It completed two laps of a 170-metre course in only 17 seconds. This was enough to win not just the AI Grand Challenge but also a one-on-one race against a professional human drone racer.
This has major implications for military drones. For drones to operate autonomously, they must be able to navigate their environment efficiently.
AI algorithms in drones function by processing large amounts of data from onboard sensors, cameras and communication systems. These algorithms analyse environmental inputs, make predictions, and execute real-time decisions based on programmed logic and learned experiences.
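That sense-decide-act cycle can be sketched in miniature. The sensor fields, threshold value and action names below are illustrative assumptions for this article, not drawn from any real drone system:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One snapshot of onboard sensor data (hypothetical fields)."""
    obstacle_distance_m: float   # nearest obstacle, e.g. from a depth camera
    target_confidence: float     # classifier score between 0.0 and 1.0

def decide(frame: SensorFrame, confidence_threshold: float = 0.8) -> str:
    """Toy decision step: programmed logic applied to sensed inputs.

    A real system would replace these hand-written rules with learned
    models; the 0.8 threshold here is purely an illustrative assumption.
    """
    if frame.obstacle_distance_m < 2.0:
        return "evade"    # avoid an imminent collision before anything else
    if frame.target_confidence >= confidence_threshold:
        return "engage"   # act only on a high-confidence identification
    return "loiter"       # otherwise keep observing and gathering data
```

The point of the sketch is the structure, not the rules: each cycle turns raw sensor inputs into a single action, fast enough to repeat many times per second.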
In Ukraine, that race has moved from the laboratory to the battlefield, with both sides rushing to deploy drones with AI and machine learning (ML) capabilities.
According to the Institute for the Study of War (ISW):
‘The successful integration of AI/ML drones could enable Russian and Ukrainian forces to reduce their reliance on human drone operators and defenders, bypass electronic warfare, including jamming, reduce human limitations in target identification, and speed up the decision-making processes involved in drone warfare.’
AI models can manage swarms of drones against a target and allow for advanced drone-to-drone interoperability. Ant Colony Optimisation (ACO) helps drones find the most efficient routes by mimicking how ants search for food. This is useful in applications like logistical delivery and cooperative search-and-destroy missions.
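The ACO idea can be shown in a hedged sketch (the graph, parameter values and scoring rule below are illustrative assumptions, not a military implementation): simulated ants walk a weighted graph, deposit "pheromone" on the edges of short routes, and later ants preferentially follow strongly marked edges.

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=20, n_iters=50,
                      evaporation=0.5, seed=0):
    """Minimal Ant Colony Optimisation sketch for route-finding.

    graph: dict mapping node -> {neighbour: distance}. Parameter
    defaults are arbitrary illustrative choices.
    """
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_len = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            path, node, visited = [start], start, {start}
            while node != goal:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:        # dead end: this ant gives up
                    path = None
                    break
                # prefer edges with more pheromone and shorter distance
                weights = [pheromone[(node, v)] / graph[node][v]
                           for v in choices]
                node = rng.choices(choices, weights)[0]
                path.append(node)
                visited.add(node)
            if path is None:
                continue
            length = sum(graph[a][b] for a, b in zip(path, path[1:]))
            if length < best_len:
                best_path, best_len = path, length
            # deposit pheromone inversely proportional to route length
            for a, b in zip(path, path[1:]):
                pheromone[(a, b)] += 1.0 / length
        # evaporation lets the colony forget stale trails and adapt
        for edge in pheromone:
            pheromone[edge] *= (1.0 - evaporation)
    return best_path, best_len
```

On a small test graph the colony converges on the shortest route without any ant knowing the whole map — the appeal for swarm coordination is that the "intelligence" lives in shared edge markings, not in any individual agent.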
In May 2025, Russia increased serial production of its Tyuvik AI attack drone, which is equipped with a target-homing system, resistant to electronic warfare (EW) interference and has a range of about 30 kilometres.
Reportedly, Ukraine has a shortage of computing power and professionals with AI experience. Nevertheless, on 26 May, Ukraine launched its GOGOL-M AI-powered mothership drone on its first autonomous mission against Russian targets. The GOGOL-M mothership can deliver two FPV attack drones and launch a precision strike at a range of 300 kilometres. Ukraine claims to be able to produce 50 GOGOL-M mothership drones and up to 400 compatible FPV drones per month.
After Ukraine’s audacious and ingenious drone attack on Russia’s air bases on 1 June 2025, which destroyed at least 14 irreplaceable Russian bombers, Western militaries became more aware of their vulnerability to long-distance AI drones. The Ukrainian quadcopter drones were programmed with AI target recognition.
Parking expensive military aircraft in the open at bases such as Whiteman Air Force Base and Joint Base San Antonio in the United States, and RAAF Base Amberley in Australia, leaves them in very vulnerable positions.
In response, the U.S. Army Corps of Engineers (USACE) developed upgrades for a family of modular, rapidly deployable protective structures to shield against drone attacks.
The development and integration of military AI into future weapon systems is inevitable. While the United States leads in technological innovation, it faces limitations in testing these advancements under real combat conditions. Collaboration with Ukraine, rather than disengagement, presents a unique and mutually beneficial opportunity to bridge this gap.
Despite the rapid advancements in these AI-powered drones, several challenges and limitations impact their efficiency, reliability and widespread adoption. These challenges stem from hardware constraints, data limitations and ethical considerations.
Most AI drone models demand high-performance graphics processing units (GPUs) and specialised AI chips, yet most drones currently have only limited onboard computing capability.
The ethical implications of AI drones are multifaceted, encompassing concerns about privacy, bias, accountability and human oversight. Specifically, the use of AI in drone warfare raises questions about the delegation of life-or-death decisions to machines, the potential for unintended consequences and the erosion of human control in combat.
On the other hand, artificial intelligence may be what saves humanity rather than what destroys it. AI drones offer enhanced disaster response, improved agricultural practices and increased operational efficiency. They can quickly assess damage, deliver supplies and provide real-time information during emergencies.
Additionally, drones are used for environmental conservation, wildlife monitoring and mapping fragile ecosystems.
Hopefully, humanity will prevail.
Patrick Drennan is a journalist based in New Zealand, with a degree in American history and economics.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Australia License