Improving airfield repairs with AI drones

22 November 2017 (Last Updated November 22nd, 2017 10:27)

The US Navy is looking for a drone equipped with artificial intelligence to speed up the repair of damaged airfields. Claire Apthorp looks at the current airfield maintenance process to find out where the weaknesses lie, and asks how intelligent unmanned systems could improve the speed and efficiency of repairs.

The US military is now looking to move ahead with new technologies to enhance how quickly and efficiently it is able to assess the damage. Image: US Army – Tech Sgt Matt Hecht US Air National Guard.

Airfield facilities are a vital piece of the puzzle for military operations. Whether located at a home station or deployed as part of a forward operating base, the ability for aircraft to land and take off safely and efficiently is vital to ensure that personnel, equipment and supplies are in the right place at the right time, and that offensive and defensive aircraft can complete tactical and strategic operations.

When an airfield is attacked and rendered unusable by enemy mortars, rockets, artillery or other weapons, damage must be repaired as quickly as possible in order to sustain the mission and minimise costly delays. This process generally includes a damage assessment so that craters, debris, and other impairments to the runways, flight line and supporting infrastructure can be identified quickly – along with any unexploded ordnance – and a repair plan can be put into place to get the airfield up and running again in as short a time as possible.

New approaches are being developed for this, the biggest of which is underway with the US Air Force, which is currently transitioning from Rapid Runway Repair (RRR) to the modernised Airfield Damage Repair (ADR) model. ADR improves on the RRR standard of completing four crater repairs in eight hours, raising it to as many as 126 small crater repairs in six and a half hours under perfect weather conditions.
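As a rough, back-of-the-envelope comparison of the two standards (using only the headline figures above, and ignoring that ADR's figure covers small craters only, so this is not a like-for-like comparison), the throughput improvement can be worked out directly:

```python
# Rough throughput comparison of the two repair standards, using only
# the headline figures quoted in the article.
rrr_rate = 4 / 8          # RRR: four crater repairs in eight hours
adr_rate = 126 / 6.5      # ADR: up to 126 small crater repairs in 6.5 hours

print(f"RRR: {rrr_rate:.1f} repairs/hour")    # → RRR: 0.5 repairs/hour
print(f"ADR: {adr_rate:.1f} repairs/hour")    # → ADR: 19.4 repairs/hour
print(f"Improvement: ~{adr_rate / rrr_rate:.0f}x")  # → Improvement: ~39x
```

Even allowing for the difference in crater size, the order-of-magnitude jump in repair rate is the point of the new baseline.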

According to Lance Filler, the Air Force Civil Engineer Center ADR modernization team lead, the ADR guidelines are the new baseline for airfield recovery capabilities for all air force civil engineers.

‘The new programme is a response to the air force’s changing mission worldwide… We have new threats so we have to have a new response posture. We must be able to fly different types of aircraft and do so more sustainably than RRR allows.’

Eye in the sky

While the ADR modernisation programme focuses mainly on how to repair the damage following an attack, the US military is now looking to move ahead with new technologies to enhance how quickly and efficiently it is able to assess the damage to begin with. In June, speaking at the Global Explosive Ordnance Disposal Symposium and Exhibition, the head of the Navy Expeditionary Combat Command (NECC) said that unmanned aerial systems (UAS) are at the top of the wish list.

Rear Adm. Brian Brakke told the session, as reported by Defense News, that the current thinking centres on how to use unmanned technologies including UAS to run initial damage assessment on an airfield, to identify unexploded munitions, determine how many craters there are and how they can be filled, and then map out a plan for commanders that will have the airfield repaired and ‘get that runway back up in an expedient fashion’.

Some steps are already being taken in this direction. UAS were used during a US Navy Field Training Exercise (FTX) in Fort Hunter Liggett, California, in May, as part of work to test Naval Mobile Construction Battalion (NMCB) 4’s ability to repair runways following a simulated attack, enhance interoperability within the NECC, and adopt new technologies and techniques established under the air force’s ADR programme.

NMCB 4’s Lt. Jeremiah Gill said: ‘The ultimate goal for the evolution was to improve efficiency by reducing the time it takes to complete airfield repairs; to advance technologies by providing input and recommendations to higher on the different equipment and materials; and to gain a joint operating picture with other units to ultimately establish a tactical training procedure for the Naval Construction Force to utilise.’

Following the simulated attack, an Aerovironment Puma AE UAS was deployed over the site, allowing personnel to pre-emptively see what damage and debris was on the field.

NMCB 4’s LTJG Ian Jordon said: ‘Using the Puma… [also] showed us if there was a serious issue, like unexploded ordnance, and we could take special security precautions to clear the area.’

Following range clearance operations carried out by explosive ordnance disposal teams, NMCB 4 was able to accomplish its mission of repairing the damage to the runway. Craters were repaired with fibreglass panels and concrete. Over the three-day exercise a total of 79 spalls and nine craters were filled within 16 hours.

Artificial intelligence

The challenge for the NECC is taking this capability beyond what it can currently offer – essentially hi-res mapping and geo-location capabilities – and making the leap toward what the NECC is actually looking for. For a UAS to not only create a map but to put together a plan of action to get the airfield back into a serviceable state would require a higher level of artificial intelligence (AI) than is currently available. UAS can create high-resolution maps that flag up potential craters and munitions in great detail, but their actual identification still requires a human in the loop, as does any true analysis based on that data.
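The division of labour described here, where the UAS flags candidates and a human confirms them, can be sketched as a simple confidence-threshold triage. The detection format, field names and thresholds below are illustrative assumptions, not part of any fielded system:

```python
# Illustrative triage of UAS detections: high-confidence crater detections
# are queued for repair planning, everything else is routed to a human
# analyst. Detection format and thresholds are assumptions for this sketch.

def triage(detections, auto_threshold=0.9):
    """Split detections into auto-accepted craters and human-review items."""
    auto_accept, needs_review = [], []
    for det in detections:
        # Suspected unexploded ordnance always goes to a human (and EOD),
        # regardless of how confident the model is.
        if det["kind"] == "possible_uxo" or det["confidence"] < auto_threshold:
            needs_review.append(det)
        else:
            auto_accept.append(det)
    return auto_accept, needs_review

detections = [
    {"kind": "crater", "confidence": 0.97, "pos": (120, 45)},
    {"kind": "crater", "confidence": 0.62, "pos": (310, 88)},
    {"kind": "possible_uxo", "confidence": 0.95, "pos": (205, 17)},
]
auto, review = triage(detections)
print(len(auto), "auto-accepted,", len(review), "for human review")
# → 1 auto-accepted, 2 for human review
```

The point of the sketch is the bottleneck the NECC wants to remove: everything ambiguous or dangerous still waits on a person.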

What is needed is an intelligence that goes beyond an unmanned system that simply follows algorithms or rules. Earlier this year, the US military’s research arm, the Defense Advanced Research Projects Agency (DARPA), released a video in which it laid out what AI can currently achieve and how it is going about taking it to the next level.

AI currently exists in two waves. The first wave of systems relies on handcrafted knowledge, which enables reasoning over narrowly defined problems, but such systems have no learning capability and handle uncertainty poorly. The second wave is capable of statistical learning and visual perception – as in facial or voice recognition – but has poor abstraction and reasoning capability.
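The distinction between the two waves can be made concrete with a toy example: a handcrafted rule versus a decision boundary learned from labelled data. Everything below is an illustrative sketch, not code from DARPA:

```python
# First wave: handcrafted knowledge, a fixed, expert-written rule.
def first_wave_is_crater(diameter_m):
    return diameter_m > 1.0   # threshold chosen by a human; never changes

# Second wave: statistical learning. The decision boundary is fitted
# to labelled examples (here, the midpoint between the class means).
def fit_second_wave(examples):
    craters = [d for d, is_crater in examples if is_crater]
    spalls = [d for d, is_crater in examples if not is_crater]
    boundary = (sum(craters) / len(craters) + sum(spalls) / len(spalls)) / 2
    return lambda d: d > boundary

training = [(0.2, False), (0.4, False), (2.5, True), (3.1, True)]
second_wave_is_crater = fit_second_wave(training)

# The two systems can disagree: the learned boundary (1.55m here)
# sits wherever the data put it, not where the expert did.
print(first_wave_is_crater(1.2), second_wave_is_crater(1.2))  # → True False
```

Neither system can explain itself or reason about context; that limitation is exactly what the rest of the video addresses.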

‘To characterise this current very powerful second wave of technologies, we would say they have nuanced ability to classify data and even to predict the consequences of data, but they don’t really have any ability to understand the context in which they are taking place and minimal capability to reason,’ John Launchbury, the director of DARPA’s Information Innovation Office (I2O), said.

These first and second wave technologies are currently being combined to create very powerful platforms that have the potential to reshape defence missions. A prime example is DARPA’s Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV) technology demonstration vessel, recently launched to spend months alone at sea with no human operator giving it directions – understanding what other vessels are doing, navigating sea lanes and carrying out its tasks.

Second wave systems

Second wave systems still face challenges though. While they might perform well the vast majority of the time, mistakes happen and the results could be extremely serious.

‘They turn out to be statistically impressive but individually unreliable,’ Launchbury said. ‘There are also challenges with systems that are intended to learn over time… we have to be very cautious about what data they get hold of [to learn from], because skewed training data creates maladaption. These challenges tell us that we need to move beyond these simple spreadsheet style calculations.’
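Launchbury’s warning about skewed training data is easy to reproduce with a toy classifier. In this sketch (all numbers invented for illustration), small craters happen to be missing from the skewed training set, which shifts the learned boundary and causes a real small crater to be missed:

```python
# Sketch of "skewed training data creates maladaption": a simple
# mean-midpoint classifier learns a bad boundary when the training
# sample is unrepresentative. All numbers are made up for illustration.

def fit_boundary(craters, non_craters):
    """Decision boundary: midpoint between the two class means (metres)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(craters) + mean(non_craters)) / 2

# Representative data: craters of all sizes, including small ones.
fair = fit_boundary(craters=[1.2, 1.8, 2.5, 3.0], non_craters=[0.2, 0.3, 0.5])
# Skewed data: only large craters happened to be in the training set.
skewed = fit_boundary(craters=[2.5, 3.0, 3.4], non_craters=[0.2, 0.3, 0.5])

small_crater = 1.4  # a real crater the skewed model will miss
print(small_crater > fair, small_crater > skewed)  # → True False
```

The model trained on skewed data is not wrong in some exotic way; it simply generalised from what it was shown, which is the caution Launchbury raises.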

This next wave is about contextual adaptation. Here, systems will construct explanatory models that will allow them to characterise real world phenomena.

Launchbury uses the example of a system that is intended to classify images. Give such a system an image of a cat and it will say, ‘That’s a cat’. When asked why it is a cat, the system will say, ‘I did my calculations and cat came out as highest’.

‘That’s not satisfactory. We’d much prefer the system to be able to respond to us and say, ‘Well, it has ears, and paws, and fur, and these other features,’ Launchbury said. ‘Building this kind of ability in these systems to understand, or have clarity as to, why they are making these decisions is going to be very important.’
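The contrast Launchbury draws, a bare ‘highest score’ answer versus one grounded in features, can be sketched like this (the feature lists and classes are invented for illustration):

```python
# A toy classifier that can report *why* it chose a label: it scores each
# class by counting matching features, then returns both the winning label
# and the features that supported it. All data here is illustrative.

CLASS_FEATURES = {
    "cat": {"ears", "paws", "fur", "whiskers"},
    "dog": {"ears", "paws", "fur", "snout"},
    "car": {"wheels", "doors", "windows"},
}

def classify_with_explanation(observed):
    """Return the best-matching label and the evidence behind it."""
    scores = {label: len(feats & observed) for label, feats in CLASS_FEATURES.items()}
    best = max(scores, key=scores.get)            # the bare argmax answer
    evidence = sorted(CLASS_FEATURES[best] & observed)  # the 'why'
    return best, evidence

label, why = classify_with_explanation({"ears", "paws", "fur", "whiskers"})
print(label, "because it has:", ", ".join(why))
# → cat because it has: ears, fur, paws, whiskers
```

A second-wave statistical model produces only the first half of that output; attaching the second half, the evidence, is the explainability Launchbury is asking for.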

The third wave of artificial intelligence will be built around contextual models: over time, the system will learn how that model is structured; it will perceive the world in terms of that model, use it to reason and make decisions, and even abstract from it to take the data further.

Viewed in this light, the US Navy’s wish for UAS capable of undertaking airfield damage assessment presents an interesting challenge, and as Launchbury says, ‘There’s a whole lot of work to be done to be able to build these systems.’