
Automation: A Blind Reliance?

Automation is finding its way into more aspects of Air Force technologies. It can significantly speed up manual processes and free up aviators' mental capacity to tackle greater issues. But is automation the perfect solution we're led to believe it is? Warrant Officer Xavier Sherriff delves into the risks automation can bring when reliance is not coupled with understanding.


We live in a world where automation is becoming more prominent. Examples range from the 'do not disturb' function on the iPhone through to advanced flight automation systems that use artificial intelligence. Automation can be seen as a 'magic potion' for human-induced error, but does this sword have two edges?


While automation offers a range of benefits, it may also erode our depth of understanding, and it almost certainly introduces new risks. But does that matter?

Image credit: DLR German Aerospace Center

The ATSB investigation report into Emirates Flight EK407 provides an example of where fundamental trust in automation was misplaced.


The investigation into this incident found that the payload data was entered into the aircraft Flight Management System (FMS) incorrectly, 100 metric tonnes lower than intended. This had several flow-on effects on the aircraft's take-off parameters.


The Emirates Flight EK407 incident is a classic case of 'garbage in, garbage out'; however, this is by no means the first time data entry errors have occurred, and it is unlikely to be the last. The FMS operated exactly as it should. The data was input by the co-pilot, but the lack of a cross-check meant the error was missed, creating a situation in which the aircraft would overrun the end of the runway.
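To make the 'garbage in, garbage out' point concrete, here is a minimal sketch of the kind of independent cross-check that can trap a gross data-entry error. The function, figures, and tolerance are illustrative assumptions, not the actual FMS or performance-tool logic:

```python
# Illustrative sketch only: a gross-error check on a manually entered
# take-off weight, inspired by the EK407 data-entry error. All names
# and numbers are hypothetical, not any real aircraft's data.

def check_takeoff_weight(entered_tow_t: float,
                         zero_fuel_weight_t: float,
                         fuel_on_board_t: float,
                         tolerance_t: float = 5.0) -> None:
    """Compare an entered take-off weight against an independently
    derived figure and raise if they disagree by more than tolerance_t."""
    derived_tow_t = zero_fuel_weight_t + fuel_on_board_t
    if abs(entered_tow_t - derived_tow_t) > tolerance_t:
        raise ValueError(
            f"Entered TOW {entered_tow_t:.1f} t disagrees with derived "
            f"TOW {derived_tow_t:.1f} t - cross-check before departure"
        )

# An EK407-style slip: a figure 100 t lower than intended.
try:
    check_takeoff_weight(entered_tow_t=262.9,
                         zero_fuel_weight_t=230.0,
                         fuel_on_board_t=132.9)
except ValueError as err:
    print(err)  # the independent derivation traps the 100 t discrepancy
```

The point is not the specific arithmetic but the principle: a second, independently derived value gives the error somewhere to surface.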


This incident clearly demonstrates how reliance on automation induces new risk if due diligence and cross-checks do not occur, with results that can be disastrous to safe and effective flying operations.


There are many ways automation can present hazards to flying operations. Misuse and hidden errors are two of the most common. Misuse occurs when humans apply automation incorrectly or rely on it in ways it was not designed for. Hidden errors, by contrast, are often associated with software that doesn't undergo certification compliance, or with a fault that isn't identified during the compliance activity.


Misuse

As a Royal Australian Air Force loadmaster, I have seen examples where automation has greatly improved tactical effectiveness, allowing the aviator to 'hand off' laborious work so that attention can be focused elsewhere. Examples include systems that enhance situational awareness and software that automatically calculates aircraft weight and balance. But this automation also creates the risk of a shallower understanding of core concepts while still achieving mission outcomes. It may be limiting our ability to identify errors and fostering a 'going through the motions' mentality.


Is a loadmaster's understanding of core concepts like weight and balance any less important than a pilot's understanding of the theory of flight? I would argue no: a deep understanding of core concepts is vital in any high-risk profession, because it increases the likelihood of identifying errors and determining their cause efficiently.


With the introduction of electronic weight and balance, loadmasters no longer need to complete manual calculations after their initial qualification course. Believe me, nobody is happier about this than I am! However, it also means that loadmasters are no longer practising these skills without a conscious effort to do so. This appears to be degrading our depth of knowledge of core loadmaster concepts, such as weight and balance theory, which forms the basis for everything we do.
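For readers outside the trade, the theory the application automates is essentially a weighted average of moments. A minimal sketch, with invented station names, arms, and limits that do not reflect the C-130J-30 or any real aircraft:

```python
# Illustrative weight-and-balance arithmetic: the centre of gravity is
# the total moment divided by the total weight. All figures invented.

stations = [
    # (name, weight in kg, arm in metres from the reference datum)
    ("basic aircraft", 35_000, 12.0),
    ("fuel",           15_000, 11.5),
    ("cargo pallet 1",  4_000, 10.0),
    ("cargo pallet 2",  4_000, 14.5),
]

total_weight = sum(w for _, w, _ in stations)
total_moment = sum(w * arm for _, w, arm in stations)
cg = total_moment / total_weight  # centre of gravity, metres from datum

FWD_LIMIT, AFT_LIMIT = 11.2, 12.4  # hypothetical CG envelope
print(f"Gross weight: {total_weight} kg, CG: {cg:.2f} m")
print("Within limits" if FWD_LIMIT <= cg <= AFT_LIMIT else "OUT OF LIMITS")
```

Simple as it is, being able to run this arithmetic by hand is exactly the skill that fades when the application always does it for you.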


Hidden Errors

Consider this example: when completing the weight and balance using the electronic application, I was presented with an error. The application indicated forward and aft centre of gravity limitations that were 1.1% different to the FMS. All my data entry was correct and cross-checked, yet I was still unable to identify the error! My unwavering faith that the application was doing everything correctly was automation bias at work.

As events turned out, the application had a limitation that was not yet known: it didn't factor in the fuel contained within the external tanks (a recent modification at the time of the event), and this fuel has a tangible effect on centre of gravity limitations. When the tanks were fitted, gap training was provided to make aircrew aware of how they affected the limits; however, the application did not receive an update, nor were these internal limitations known or recorded. Members might assume the application was functioning correctly, and this error could cause significant delays.

Through a process of error detection, I determined the problem and made manual amendments after consulting the Aircraft Flight Manual for the correct limitation. This demonstrates that core skills and knowledge are vital for flight safety and in preventing automation bias.
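The lesson generalises: when two independent sources disagree, the disagreement itself is the warning. A minimal sketch of that check follows; the field names, values, and tolerance are assumptions, with only the 1.1% discrepancy taken from the event above:

```python
# Illustrative sketch: flag disagreement between two independent
# sources rather than silently trusting either one. Hypothetical values.

def limits_agree(app_limit_pct: float,
                 fms_limit_pct: float,
                 tolerance_pct: float = 0.5) -> bool:
    """Return True if two independently computed CG limits agree
    within tolerance_pct, else False."""
    return abs(app_limit_pct - fms_limit_pct) <= tolerance_pct

# The application, unaware of the external tank fuel, disagreed with
# the FMS by 1.1% - outside tolerance, so stop and investigate.
if not limits_agree(app_limit_pct=23.4, fms_limit_pct=24.5):
    print("CG limit mismatch between application and FMS - "
          "revert to the Aircraft Flight Manual and check manually")
```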


Identification and Recovery

Identifying and resolving the error took time, but luckily on this occasion, time is what I had. Given my aviation knowledge, skills, and experience of calculating weight and balance manually, I was able to ascertain the issue and solve the problem. Reflecting on the event later, I considered that a less experienced loadmaster with a greater reliance on the automated system might not have identified it at all. In the end, I reverted to manually calculating the forward and aft limitations. The application issue has subsequently been identified and published to the crews.


This event was by no means catastrophic: the aircraft FMS would have advised the captain of the true limitation and highlighted my error. In a different situation, however, this bias could result in a disastrous oversight.


What now?

How do we maintain these skills? And further, who is responsible? As an Air Force aviator, there are many training and assessment evolutions over the course of a career. The current method of assuring technical knowledge is annual category assessments; including manual weight and balance exercises in these exams is one method that could be explored.


Annual performance assessments are conducted by observing an aviator's application of knowledge during standard operations; current policy does not mandate a question-and-answer or discussion style of evaluation, with the exception of emergency procedures. These exams and assessments are part of an Aviation Safety Management System, but are they contemporary, and do they meet the intent? Do mindset and attitude play a part? Should we be doing more problem-resolution training? Can we teach people to have a passion for self-learning?


Final Thoughts

Automation is and always will be a force multiplier. It enables the crew to be more efficient and effective. However, blind reliance on automation has risks that are often hidden under an exterior of perfection. Becoming aware of your own reliance on automation and the mission creep that results from this reliance is a step towards becoming a safer aviator.



Warrant Officer Xavier Sherriff is a Category B C-130J-30 Hercules loadmaster with 15 years' experience. Xavier is currently the training systems development Warrant Officer for the C-130J-30 Block 8.1 upgrade. He has performed postings in air movements, C-130J-30 operations, ab initio training, and flight test with the Aircraft Research and Development Unit (ARDU) as Operational Evaluator Aircrew conducting OT&E. You can follow him on Twitter at @FTLM2021

