An Ethical Statement on Incidents

Donald Gotterbarn and Keith W. Miller wrote about the Software Engineering Code of Ethics in the June 2009 issue of IEEE Computer magazine. They illustrate the application of their principles with some case studies, including Case Study 2: Who Is In Control?

They consider first the October 2008 Qantas accident, concerning which an
interim factual report is available from the ATSB. Gotterbarn and Miller say

[begin quote]

“The software on this Airbus 330-303 implemented a decision to give instant control to the plane’s flight control system when the autopilot shut off because of computer system failures. The resulting nosedive suggests that this decision was not in the best interest of the public, especially members of the public in or below this airplane.

There are good reasons to have the flight control system protect the jet from dangerous conditions. But this incident illustrates that the decision to turn over control to the flight control system should take into account the current state of inputs into that system. The flight control system should have been more sensitive to the quality of its inputs and to the possibility of disastrous consequences for instantly reacting to apparent conditions that were based on erroneous inputs.”

[end quote]

This is philosophy without understanding. I am not even sure the authors know what a flight control system is.

They then go on to consider the infamous incident in Russia, during which a pilot let his children into the cockpit and gave them a hand in flying the airplane. There was an upset and the airplane crashed. Gotterbarn and Miller say

[begin quote]

“After such a disaster, we would expect the developers of subsequent Airbus autopilot software to be particularly sensitive to issues of control transfer between pilots and autopilots.

In the Aeroflot crash, much of the publicity focused on the judgement of the pilot in inviting his children into the cockpit. While that appears to have been a contributing factor in the tragedy, the autopilot design was at least as significant.”

[end quote]

What an extraordinary comment! I think nothing more needs to be said about it than what I say below, in my letter to the Editor-in-Chief of IEEE Computer:

[begin letter]

Dear Professor Carver,

Professors Gotterbarn and Miller (The Public is the Priority, IEEE Computer, June 2009:66-73) omit one important ethical principle favored by those of us who analyse incidents: refrain from making public statements on technical matters about which you know little.

The authors illustrate well the reasons for this principle through their Case Study 2. They introduce the 2008 Qantas accident and suggest that a

“…decision to give … control to the … flight control system when the autopilot shut off because of … system failures … was not in the best interest of the public.”

Autoflight systems have been doing exactly this since they were invented over half a century ago, and no pilot or engineer I know would have it otherwise.

The ATSB preliminary analysis hints rather at an obscure bug in the Flight Control Primary Computer, as well as an as-yet-undiagnosed fault in one of the air data subsystems. Let us hope that our colleagues at the companies concerned are able to discover what and how, and devise remedies.

One can indeed hold moral views arising from this and other incidents, such as that critical software and interfaces should be rigorously proven free of every possible source of error. However, most software engineers would agree that best practice is still some way from that ideal; back when flight control systems were cables and pulleys, we were not close to it either.

Concerning the Aeroflot upset, I feel strongly that children should not be placed at the controls of commercial passenger jets in flight, and that it is silly to suggest that the system design should accommodate such an eventuality.

Sincerely,

Peter Bernard Ladkin

[end letter]
