Proceedings of the 8th International Symposium on Aviation Psychology (April 24-27,1995). Edited by R.S. Jensen and L.A. Rakovan, pages 310-314

DEVELOPING THE ELECTRONIC COCOON

Ronald L. Small

Search Technology, Inc. Norcross, Georgia
INTRODUCTION

Wiener (1987) coined the term electronic cocoon in reference to avionics software systems that would help pilots recognize and avoid dangerous flying situations. Search Technology began designing and developing such a cocoon in the mid-1980s as part of the Pilot's Associate (PA), a joint Defense Advanced Research Projects Agency (DARPA) and Air Force program whose goal was to develop an electronic back-seater for an advanced, single-seat, tactical fighter. Search's role was to design, develop, and test the pilot-vehicle interface portion of PA. That is, we were responsible for integrating the information from PA, an intelligent pilot decision aiding system, and presenting it in meaningful ways. We also translated pilot actions into intentional goals and plans for dissemination throughout the PA system. As with most DARPA (now ARPA) projects, PA was very risky and enjoyed only limited success: designs were successfully implemented and the concept was proven feasible; however, schedule and budget constraints prevented thorough testing, so results were inconclusive. Nevertheless, many pilots, engineers, and scientists exposed to the program were convinced that the PA represented a worthwhile approach to the problem of proliferating avionics systems, each with separate controls and displays, and each incrementally eroding the pilots' authority.

PA represented the first concerted effort to develop a pilot-centered system, one that would require less net workload (i.e., the amount of total workload to control PA was to be less than the total workload required by pilots to accomplish tasks themselves), and one that left the pilots in control of their aircraft and mission. Simply put, PA was intended to rein in the trend of avionics automation run amok.

Search considered PA a success, despite limited software or pilot-in-the-loop testing. Many others involved also saw great promise in PA, and began pursuing spin-off opportunities. In particular, Search focused on one part of its PA pilot-vehicle interface (PVI), the Error Monitor. This focus was prompted by NASA's interest in detecting and remedying the problems commercial airline pilots experienced when programming their flight management systems (FMSs). These sophisticated navigation-autoflight systems require fairly complex preflight and inflight programming procedures to achieve optimal performance, and many FMS programming procedures are error prone, especially during time-compressed situations.

This first application of PA technology to commercial aircraft, the FMS Error Monitor (FMS-EM), was a success. In fact, the Phase I prototype FMS-EM exceeded expectations. Phase II is underway, as is an Air Force-funded application of error monitoring to an advanced C-130. Further applications of PA technology are planned for the next several years. This paper briefly highlights Search's portion of PA (the intelligent PVI), the application of error monitoring to civil and military aviation, and Search's plans for further development of the electronic cocoon.

PILOT'S ASSOCIATE OVERVIEW
The Purpose of the Pilot's Associate Program

The purpose of the program was to apply artificial intelligence to aid the pilots of single-seat tactical aircraft. Combat operation of a tactical aircraft taxes pilots to their limits. Pilots undergo physical stress due to high G-forces produced by sudden maneuvers. Pilots also undergo mental stress due to extremely rapid changes in the combat situation and the need to respond quickly and accurately, or face potentially fatal consequences. Decision aiding is an approach to overcoming some of these difficulties.

Increasing technological complexity of aircraft was another argument for decision aiding. One measure of complexity is exponential growth in functionality, displays and controls in the cockpit. Next generation aircraft, for which the Pilot's Associate was intended, are continuing this trend. Our associate system was intended to reverse this trend of increasing cockpit complexity. As automation, its purpose was to help the pilot perform the mission. While traditional automation can be valuable, it also presents problems with pilot interaction. Such automation often is difficult to understand and has no sense of judgment. The Pilot's Associate, therefore, needed to present information in a readily understood manner and use knowledge bases to decide when to aid.

The interface between the pilot and the aircraft was key to the problems of increased complexity and the inherent difficulty of tactical air combat. Most of the difficulties of operating the aircraft can be traced to difficulties in the interface. Much of the impressive functionality in complex aircraft, even today's generation, goes unused because it is too difficult for pilots to remain proficient with its full range of capabilities. One goal of the program was to find ways to utilize the full capability of the avionics while at the same time simplifying the interface.

Definition and Functions of an Associate System

The purpose of an associate system is to use intelligent automation to help the pilot by overcoming pilot limitations and enhancing pilot abilities. The purpose is not to automate more of the tasks performed on the aircraft, because this does not necessarily help the pilot. Traditional automation philosophy is to automate as much as possible, which results in the pilot being a monitor of automation, a task for which humans are not particularly well suited.

A primary function of the system was providing information that kept the pilot aware of the situation. Traditional interfaces deluge the pilot with data, much of it irrelevant at any particular time, forcing the pilot to integrate the data into useful information. In contrast, the associate system generated and managed the display of information to provide the right information in the right format at the right time. Some of the information generated by the associate system included judgments about the situation and recommended responses.

In addition to managing information that would be present in a traditional crewstation, the associate system also creates information. Judgments (or assessments) about the situation are provided because pilots expressed a desire for high level descriptions of the situation, as opposed to the flood of low level data that they traditionally receive. One task with which they want assistance is integrating the low level data into a higher level description of the world. The second type of information created is recommended responses to the situation. These responses, which internally are termed plans, deal with a variety of problems from routes to tactics to malfunction responses. Although pilots strongly prefer to choose responses, the recommendations are useful when (1) the response is computationally difficult to create; (2) a recommended, small variation in the pilot's plan produces a significant improvement; or (3) time is limited.

Under conditions approved in advance by the pilot, the associate system could perform actions—commands to the aircraft—on behalf of the pilot. Conditions under which this occurs include task overload or the allocation of low-importance tasks to automation. These tasks are the output of the response recommendation process described above. This path from response proposers to avionics commands allows intelligent automation to control selected aspects of the aircraft. The pilot remains in control, however, through authorization of the conditions under which specific tasks could be done automatically.
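The pre-authorization scheme described above can be sketched as a simple gate: the associate executes a task automatically only if the pilot has granted consent for that task under the current condition. A minimal sketch, assuming invented task and condition names (none of these identifiers come from the actual PA implementation):

```python
# Illustrative sketch of pilot pre-authorized task execution.
# Task and condition names are hypothetical, not from the Pilot's Associate.

# Task/condition pairs the pilot might authorize before the mission.
AUTHORIZED = {
    ("lower_landing_gear", "task_overload"): True,
    ("set_radio_frequency", "low_importance"): True,
    ("fire_weapon", "task_overload"): False,   # never automated without consent
}

def may_execute(task: str, condition: str) -> bool:
    """Return True only if the pilot pre-authorized this task under this condition."""
    return AUTHORIZED.get((task, condition), False)

print(may_execute("lower_landing_gear", "task_overload"))  # True
print(may_execute("fire_weapon", "task_overload"))         # False
print(may_execute("lower_landing_gear", "routine"))        # False: nothing on file
```

The key design point, per the text, is that the default is refusal: any task/condition pair the pilot never explicitly authorized falls through to `False`, keeping the pilot in control as the rule rather than the exception.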

An associate system is mixed-initiative. It does not merely react to its inputs; it also may take the initiative within the authority given it by the pilot. For example, it may change displays or present recommendations to the pilot on its own initiative. In some sense, the associate system is more like an electronic crew member than conventional automation. As will be shown later, this results in a demand for new types of knowledge in the design of the interaction between intelligent automation (associate systems) and human operators of complex systems.

In addition to the human-like behaviors described above, the associate system is also intelligent. The concept of intelligence is difficult to define, but an intuitive definition seems to be the sophistication of control over behavior. Thus, a system displaying the above behaviors would be an associate system if the behaviors are intelligently controlled and the purpose of the behaviors is to help the pilot. From the perspective of intelligence, the focus is on the choices made that control the behavior rather than on the behavior itself.

Intelligence in an associate system is not immediately visible. It is detected only by operating the system to observe the sophistication of its control over behavior. This lack of visibility has both good and bad points. Pilots wanted the associate to provide real assistance in their complex domain. They did not want to have to manage an additional avionics system; so an invisible associate was their preference. On the other hand, an invisible associate was difficult to demonstrate (and the program had many demonstrations).

The Associate System Components/Modules

This section describes the modules of the associate system, their interaction with each other, and their interaction with the conventional avionics of the aircraft. The overall architecture is shown in Figure 1. The aircraft avionics provide sensed data to the assessors, which produce higher level state descriptions for use by the planners and intelligent interface. The planners propose recommended responses, which are passed to the intelligent interface. The intelligent interface may instruct the display generator to produce displays about assessments, plans, or other information. It may execute a task on behalf of the pilot by issuing commands to the aircraft. It monitors for pilot error and, if an error is detected, advises the pilot via displays or possibly remediates the error by commands to the aircraft. It also determines the pilot's intentions and feeds this back to the assessors and planners to direct their processing. The pilot reads the displays and issues commands to the aircraft and the display system. Figure 1 shows only those data flows that invoke processing. There is also considerable data sharing that is not shown. For example, the aircraft state is made available and is widely used by all modules.

Figure 1. Associate System Architecture
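The data flow of Figure 1 can be sketched as a chain of functions: sensed data flows into the assessors, assessments feed the planners, and the intelligent interface decides what reaches the pilot. The module bodies below are invented placeholders (the real assessors and planners were knowledge-based systems); only the flow of data between them reflects the architecture described above.

```python
# Minimal data-flow sketch of the associate architecture in Figure 1.
# Module behaviors are hypothetical placeholders; only the flow is meaningful.

def assessors(sensed: dict) -> dict:
    """Turn raw sensed data into a higher-level state description."""
    return {"threat_level": "high" if sensed["radar_contacts"] > 2 else "low"}

def planners(assessment: dict) -> list:
    """Propose recommended responses based on the assessment."""
    return ["evasive_route"] if assessment["threat_level"] == "high" else []

def intelligent_interface(assessment: dict, plans: list) -> dict:
    """Select what to display and which commands to issue on the pilot's behalf."""
    return {
        "display": {"assessment": assessment, "recommendations": plans},
        "avionics_commands": [],  # issued only under pre-authorized conditions
    }

sensed = {"radar_contacts": 3}
assessment = assessors(sensed)
out = intelligent_interface(assessment, planners(assessment))
print(out["display"]["recommendations"])  # ['evasive_route']
```

Note how every planner output passes through the intelligent interface before reaching a display or the avionics; this is the deliberate bottleneck discussed in the next section.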

The Intelligent Pilot-Vehicle Interface

The intelligent interface is by design a bottleneck through which all output of the other associate system modules must flow. There are several reasons for this bottleneck. First, the intelligent interface is responsible for managing the displays, which requires that it inspect and select information from other modules for presentation. The intelligent interface is responsible for delaying or eliminating information irrelevant to the pilot's tasks. Second, the intelligent interface is responsible for executing actions on behalf of the pilot. This is more human-centered than allowing each subsystem to make its own decision because it ensures consistency and harmony; it maintains the pilot's own goals as the center of focus. Thus, it seems only natural for all of the final decisions to be made in the intelligent interface, the major topic of the rest of this paper.

The intelligent interface uses inputs, models, and knowledge bases to make its decisions.

The outputs resulting from the decisions by the intelligent interface are: display commands sent to the display generators; avionics commands sent to the appropriate avionics systems; and, a model of pilot intentions—as reflected in the active plan-goal graph—sent to the planners and assessors to focus their reasoning and outputs to support the pilot.

The modules of the intelligent interface are the intent inferencer, the resource model, the error monitor, the adaptive aider, the plan proposer, and the information manager.

Figure 2 illustrates the top level architecture of the intelligent interface, and the following subsections describe each of the PVI modules in more detail, emphasizing each module's purpose, theory, architecture, and developmental results.

Figure 2. Top Level Architecture of Intelligent Interface


A FLIGHT MANAGEMENT SYSTEM APPLICATION

Search Technology is now working on a NASA contract to apply the Hazard Monitor (HM) to a commercial B747-400 Flight Management System (FMS). This project involves the integration of our HM into a B747-400 simulator with an actual B747-400 FMS, as well as the development of an in-depth knowledge base for selected FMS activities. This project is the first serious test of the dual-use potential of HM.

Because of the shared bus architecture on the B747-400, HM can access the FMS and aircraft data that it needs to recognize and monitor for FMS-related hazards. In the future, HM should be able to access all needed data via the common data buses currently proposed for future commercial aircraft such as the B777.

Prior to the ongoing B747-400 work, we developed a compelling proof-of-concept demonstration that indicated that a Hazard Monitor for a Flight Management System is a viable technology. Continuing work indicates that HM can be integrated with an actual FMS with no changes to the HM algorithms; only the interface code is being modified. The primary change to HM in adapting it from the military fighter domain for the commercial transport cockpit involves the knowledge required by the Hazard Monitor in order to assist the pilots.
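The separation described above, with fixed monitoring algorithms driven by domain-specific knowledge, can be illustrated with a toy hazard rule evaluated against shared-bus data. The rule, field names, and tolerance below are invented for illustration; they are not drawn from the actual HM knowledge base.

```python
# Hypothetical sketch of a Hazard Monitor rule checked against shared-bus
# data. Field names and the tolerance value are illustrative assumptions.

def check_altitude_constraint(bus: dict) -> list:
    """Flag a hazard if the aircraft descends below the FMS altitude constraint."""
    hazards = []
    tolerance = bus.get("tolerance_ft", 100)  # assumed allowable deviation
    if bus["altitude_ft"] < bus["fms_constraint_ft"] - tolerance:
        hazards.append("altitude below FMS constraint")
    return hazards

print(check_altitude_constraint(
    {"altitude_ft": 9500, "fms_constraint_ft": 10000}))   # hazard flagged
print(check_altitude_constraint(
    {"altitude_ft": 10050, "fms_constraint_ft": 10000}))  # no hazard
```

The point of the sketch is that moving HM from the fighter domain to the transport cockpit would mean swapping the rule set (the knowledge), not the checking machinery, which matches the project experience reported above.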

A MILITARY TRANSPORT COCKPIT APPLICATION

Another application of the Hazard Monitor is its use on the flight deck of a military transport aircraft such as the advanced C-130. As part of a proof-of-concept effort for Wright-Patterson Labs, Search has begun development of an HM for potential application to the advanced C-130 cockpit; the C-130 has already adopted the shared-bus architecture that enables HM to monitor necessary aircraft data. Our initial focus was developing knowledge that demonstrates HM's ability to assist the crew in general cockpit operations; we concentrated on hazards during the descent phase of flight, as well as hazards related to emergency procedures.

Follow-on work has been proposed that would include the enhancement of the algorithmic portion of HM; not because this application requires different processing, but because our now-extensive experience with HM has revealed ways in which HM can be augmented to provide monitoring capabilities for more types of hazards. We intend to incorporate these HM enhancements into our applications for other domains.

ENHANCEMENTS
Adaptive Aider (AA) and Information Manager (IM) Enhancements

AA Purpose
The adaptive aider executes tasks automatically for the pilot under authority granted by the pilot prior to the mission. The adaptive aider only aids the pilot when pilot performance needs support to meet operational requirements, which can occur during combat engagements. This purpose reflects a pilot-centered design approach that emphasizes keeping the pilot in control as the rule rather than the exception.
Automating more of the pilot's tasks, while technologically possible, has some undesirable side effects. First, the pilot must then monitor the automation, a role for which humans are not well suited; furthermore, when humans take control back from automation, there is a period of poor system performance until the human becomes re-accustomed to operating the system. Second, pilots strongly prefer to remain in control. Finally, task automation may act mysteriously in certain conditions and thus be awkward to control (Andes & Rouse, 1992; Rouse, 1988).
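The adaptive aider's triggering logic can be sketched as a gate that requires both prior pilot authorization and a genuine need for support. The workload threshold and argument names below are invented for illustration; the actual PA criteria were knowledge-based rather than a single scalar cutoff.

```python
# Illustrative workload-triggered aiding gate. The threshold value and the
# scalar workload measure are assumptions, not the actual PA criteria.

OVERLOAD_THRESHOLD = 0.8  # assumed fraction of pilot capacity

def should_aid(workload: float, task_importance: str, authorized: bool) -> bool:
    """Aid only when pre-authorized AND the pilot is overloaded or the task is low importance."""
    return authorized and (workload > OVERLOAD_THRESHOLD or task_importance == "low")

print(should_aid(0.9, "high", authorized=True))   # True: overload
print(should_aid(0.5, "low", authorized=True))    # True: low-importance task
print(should_aid(0.9, "high", authorized=False))  # False: pilot never consented
```

The `authorized` conjunct is the essential feature: however high the workload, the aider never acts outside the authority the pilot granted before the mission, which is the pilot-centered rule stated above.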

The adaptive aider is intended to assist the pilot in situations such as task overload and the delegation of low-importance tasks to automation.

IM Purpose
The benefit of an automated information management scheme is that the pilot does not have to explicitly manipulate the displays in order to view information that might otherwise not be presented. Information management enhances the pilot's situational awareness, yielding more effective decisions by presenting the pilot with required information. It supports the pilot by intelligently presenting timely information in an easily interpreted form. In addition, information that is irrelevant to current and near-term plans and situations is not displayed, which results in an automatic declutter function (Small & Howard, 1991; Shalin et al., 1990; Howard, Hammer, & Geddes, 1988).
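The automatic declutter idea reduces to a relevance filter over the pilot's active plans: an item is shown only if it is tagged as relevant to some current or near-term plan. The item names and plan tags below are hypothetical, chosen only to make the sketch concrete.

```python
# Minimal relevance filter illustrating automatic declutter.
# Item names and plan tags are hypothetical.

def declutter(items: list, active_plans: set) -> list:
    """Keep only items relevant to the pilot's current or near-term plans."""
    return [item for item in items if item["relevant_to"] & active_plans]

items = [
    {"name": "fuel_state",     "relevant_to": {"divert", "cruise"}},
    {"name": "ils_frequency",  "relevant_to": {"approach"}},
    {"name": "threat_bearing", "relevant_to": {"evade"}},
]

shown = declutter(items, active_plans={"cruise"})
print([item["name"] for item in shown])  # ['fuel_state']
```

As the active plan set changes (say, from cruise to approach), the displayed set changes with it, with no explicit display manipulation by the pilot, which is the behavior the paragraph above describes.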

A description of the information manager must necessarily include what functions it should perform and how it should perform those functions in general terms. Before addressing the architecture, though, it is important to understand the information manager requirements from which the architecture was derived.

GOALS AND CONCLUSION

Once the safety back-up system, the Hazard Monitor, is developed and installed, we must then focus on Information Management that helps pilots filter the sometimes overwhelming amount of incoming data and information. Lastly, we will incorporate the functionality of Adaptive Aiding so that pilots can delegate tasks as desired to automation, thus freeing them to focus on the essential tasks for which they are ultimately responsible: safely and efficiently transporting passengers or cargo to the intended destination. When all the necessary pieces of the electronic cocoon are in place, single-pilot operations will be enabled for all but the most strenuous flights (e.g., trans-oceanic commercial flights or military Special Ops missions). The tangible benefits to pilots and aircraft owners will far outweigh the costs of developing such systems, as our preliminary results suggest.

ACKNOWLEDGEMENTS

The research efforts described in this paper were supported in part by USAF contracts F33615-85-C-3804 (Pilot's Associate) and F33615-94-C-3802 (An Adaptive Cockpit Error Monitoring System, Capt. Tony Moyers, Project Engineer), and NASA contracts NAS1-19898 and NAS1-20210 (A Prototype Flight Management System Error Monitor, Phases I and II, Mr. Terry Abbott, Contracting Officer Technical Representative).

REFERENCES

Wiener, E.L. (1987). Fallible Humans and Vulnerable Systems: Lessons Learned from Aviation. In Information Systems: Failure Analysis. Edited by J.A. Wise and A. Debons. NATO ASI Series, Vol. F32. Berlin: Springer-Verlag.

Skidmore, M.D., Zenyuh, J.P., Small, R.L., Quinn, T.J., and Moyers, A.D. (1995). Dual Use Applications of Hazard Monitoring: Commercial and Military Aviation and Beyond. In Proceedings of NAECON '95. Dayton, Ohio. May 23-27, 1995.

Hammer, J.M., and Small, R.L. (1995). An Intelligent Interface in an Associate System. In Human/Technology Interaction in Complex Systems (Vol. 7). Edited by W.B. Rouse. Greenwich, CT: JAI Press.