Human Centered Design applied to “pilot-vehicle” interface

Certain interfaces are extremely complex due to the sheer number of activities going on sequentially or in parallel. Even so, I have always believed that there is a simpler way of modeling an activity, no matter how complex the system is.

But I never thought that a system as complex as a Flight Management System (FMS for short) could be simplified any further. I was pleasantly surprised, though, when I read the article “Towards an Improved Pilot-Vehicle Interface for Highly Automated Aircraft: Evaluation of the Haptic Flight Control System” by Paul Schutte and Kenneth Goodrich from NASA Langley Research Center.

The abstract drew me in instantly when it stated:

“The control automation and interaction paradigm (e.g., manual, autopilot, flight management system) used on virtually all large highly automated aircraft has long been an exemplar of breakdowns in human factors and human-centered design.”…

Given the number of switches and dials to operate, the manual mode may be prone to human errors, some of which could be the result of badly designed cockpit interfaces. These “breakdowns in human factors and human-centered design” are obvious in manual flight, but how can fully automated flight be susceptible to the same design flaws?

[Image: Cockpit view — a kazillion knobs and dials]

To answer this, we need to understand how a person’s situational awareness diminishes as they move from performing manual tasks to monitoring automated ones. In a fully automated mode, human intervention is minimal or absent, leading to possible lapses in periodically monitoring the system. The authors clearly state:

“Humans become complacent with reasonably reliable pre-programmed automation. But one of their primary roles is to monitor the mission progress and the automation…”

With this, the authors indicate how their proposed system, the Haptic Flight Control System (HFCS for short), can overcome complacency along with the other issues that arise in either the manual or the Fully Automated (FA) mode of flight:

“An alternative paradigm is the Haptic Flight Control System (HFCS) that is part of NASA Langley Research Center’s Naturalistic Flight Deck Concept. The HFCS uses only stick and throttle for easily and intuitively controlling the actual flight of the aircraft without losing any of the efficiency and operational benefits of the current paradigm”

While reading this article I couldn’t help but think about the keyword “situational awareness” and where in the world of mass-market software products it would apply. A while back I was working on a redesign of a call-center intranet web application. The ‘users’ were call-center employees who had to juggle a legacy, complicated screen (with a dozen radio buttons, twenty-odd select boxes, and tonnes of other interface elements crying for attention) while answering calls, understanding the callers’ requests, and feeding that information into the system or querying for related information at the same time.

With on-site observation and interview data at hand, we knew that at any given point during a call, the employee needed to be fully aware of what information the phone system was feeding the app, as well as what information the caller was giving at the same time. We had to ensure that appropriate cues were in place to help the user stay aware of what information was flowing, and when.

The HFCS system claims to improve the situational awareness of the pilot, along with other parameters like appropriate workload, graceful degradation and improved pilot acceptance.

It was really interesting to see how the authors used the perspective of ‘languages’ and ‘interfaces’ to frame the problems faced by the pilots and to illustrate how the human factors break down. ‘Language’ here is simply how the pilot commands the aircraft. One such language manipulates the aircraft’s orientation and propulsion: the pilot uses ‘commands’ like ‘pitch-up’ or ‘bank-left’, and the corresponding interaction would be, for instance, pulling back on the stick to pitch up. Framing the problem this way provides an easier way to visualize the problem at hand and then apply a solution to it.
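To make the framing concrete, here is a loose sketch of my own (not from the paper) of the ‘language’ idea: each physical interaction on the stick or throttle maps to a command in the aircraft-control language. The interaction and command names are illustrative, not the actual HFCS vocabulary.

```python
# Illustrative mapping from a pilot's physical interaction to a
# command in the aircraft-control "language". Names are made up
# for illustration; they are not taken from the paper.
INTERACTION_TO_COMMAND = {
    "pull_stick_back": "pitch-up",
    "push_stick_forward": "pitch-down",
    "tilt_stick_left": "bank-left",
    "tilt_stick_right": "bank-right",
    "advance_throttle": "increase-thrust",
}

def interpret(interaction: str) -> str:
    """Translate a physical interaction into its command, if one exists."""
    try:
        return INTERACTION_TO_COMMAND[interaction]
    except KeyError:
        raise ValueError(f"No command mapped to interaction: {interaction}")

print(interpret("pull_stick_back"))  # pitch-up
```

Seen this way, a human-factors breakdown is simply a poor mapping: an interaction whose command is unintuitive, or a command with no direct interaction at all.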

The HFCS offers a novel solution to the problem: it reduces the controls to just two, a stick and a throttle. It is a simple “point and shoot” system that uses waypoints to figure out airways; all the pilot has to do is nudge the stick in the right direction. As the authors mention:

“The HFCS appears to be increasing pilot workload back to the levels of manual flight before automation. The purpose of this study is to explore how pilots feel about this new control concept and to see how it affects their workload and their situation awareness.”

The fascinating part is that even though the authors know this new system might increase the pilot’s workload, due to heightened situational awareness and regular intervention, they still want to see how pilots react to it.

The user observations the authors conducted provide better insight into where the HFCS outperforms the manual and FA systems, and where it has limitations. Even with those limitations, it is important to observe pilots’ interactions with the system to understand what can be done to improve it.