Human Factors in Head-Up Synthetic Vision Displays 2001-01-2652
Enabling technologies needed to provide pilots with a synthetic view of the outside world are expected to mature within the next three to five years. Such a synthetic view, a display based at least partially on a database, is expected to greatly increase aircrew situation awareness in reduced visibility conditions, with a consequent improvement in air safety. In particular, it is hoped that providing a synthetic view of the commanded flight path and the surrounding terrain will greatly reduce the incidence of controlled flight into terrain accidents. If a synthetic view can be provided as part of the pilot's head-up primary flight display, the potential exists to deliver these benefits while allowing the pilot's head, eyes, and attention to remain focused outside the cockpit.
This paper reviews the current status of enabling technologies for head-up synthetic vision displays and outlines ongoing efforts in the Air Force Research Laboratory (AFRL) to develop and test these displays, both in ground-based simulation and in flight. Human factors issues associated with the design of these displays are discussed in detail. Among these issues are pilot information requirements, the depth and self-motion cues provided, visual clutter in the head-up display (HUD), the format of the synthetic view underlying other HUD symbology, integration with head-down displays, integration of the synthetic view with sensor information, and attentional issues. Examples of how these issues are considered in the design of a synthetic vision HUD are given, and recommendations are made for the design of these displays based on this discussion and on findings from human factors studies conducted by AFRL.