Man-Machine Integration Design and Analysis System
NASA jointly initiated this research program in 1980 with the U.S. Army, San Jose State University, and Sterling Software/QSS/Perot Systems, Inc. This ongoing, workstation-based simulation system, known as the Man-Machine Integration Design and Analysis System (MIDAS), was designed to further develop human performance modeling. It links a "virtual human" of a given physical anthropometric description to a cognitive (visual, auditory, and memory) structure representative of human abilities and limitations. MIDAS then uses these human performance models to assess a system's procedures, displays, and controls. With these models, procedural and equipment problems can be identified and human-system performance measures established before more expensive testing with human subjects.[423] The aim of MIDAS is to "reduce design cycle time, support quantitative predictions of human-system effectiveness, and improve the design of crew stations and their associated operating procedures."[424] These models thus demonstrate the behavior that might be expected of human operators working with a given automated system, without the risk and cost of subjecting humans to those conditions. An important aspect of MIDAS is that it can be applied to any human-machine domain once adapted to the particular requirements of that system. It has in fact been employed in such varied applications as establishing baseline performance measures for U.S. Army crews flying Longbow Apache helicopters with and without chemical warfare gear, evaluating crew performance and workload issues for steep noise-abatement approaches into a vertiport, developing an advanced NASA Shuttle orbiter cockpit with an improved display/control design, and upgrading emergency 911 dispatch facilities and procedures.[425]
Controller-Pilot Data Link Communications
Research for this program, conducted under NASA's Advanced Transport Operating Systems (ATOPS) program, was initiated in the early 1980s to improve the quality of communication between aircrew and air traffic control personnel.[426] With increased aircraft congestion, radio frequency overload had become a potential safety issue. With so many pilots trying to communicate with ATC at the same time on the same radio frequency, the potential for miscommunication, errors, and even missed transmissions had grown increasingly great.
One solution to this problem was a two-way data link system, which allows communications between aircrew and controllers to be displayed on computer screens both in the cockpit and at the controller's station on the ground. There they can be read, verified, and stored for future reference. Additionally, flightcrew personnel flying in remote locations, well out of radio range, can communicate in real time with ground personnel via computers linked to a satellite network. The system also allows such enhanced capabilities as the transfer of weather data, charts, and other important information to aircraft flying at nearly any location in the world.[427]
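The display-and-store concept described above can be sketched in code. The class names, message fields, and sample clearances below are illustrative assumptions for this sketch, not part of any actual CPDLC implementation; the point is that, unlike a voice call, an uplinked message remains visible and logged until the crew confirms it.

```python
# Hypothetical sketch of a two-way data link message log (not the real
# CPDLC protocol): messages are displayed, stored, and tracked until
# acknowledged, so a missed transmission cannot silently disappear.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class DataLinkMessage:
    sender: str                     # "ATC" or an aircraft callsign
    text: str                       # e.g. "CLIMB TO AND MAINTAIN FL350"
    acknowledged: bool = False
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


class MessageLog:
    """Stores every transmission so it can be read back and verified later."""

    def __init__(self) -> None:
        self._messages: List[DataLinkMessage] = []

    def send(self, sender: str, text: str) -> DataLinkMessage:
        msg = DataLinkMessage(sender, text)
        self._messages.append(msg)      # stored for future reference
        return msg

    def acknowledge(self, msg: DataLinkMessage) -> None:
        msg.acknowledged = True

    def unacknowledged(self) -> List[DataLinkMessage]:
        # Anything not yet confirmed stays on the display.
        return [m for m in self._messages if not m.acknowledged]


log = MessageLog()
clearance = log.send("ATC", "CLIMB TO AND MAINTAIN FL350")
log.send("ATC", "CONTACT OAKLAND CENTER 128.45")
log.acknowledge(clearance)
print(len(log.unacknowledged()))  # one uplink still awaits confirmation
```

In a voice-only environment the second message above could simply be missed; here it persists in the log until the crew acknowledges it.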
Yet another aspect of this system allows computers in aircraft and on the ground to "talk" to one another directly. Controllers can thus arrange closer spacing and more direct routing for incoming and outgoing aircraft. This important feature has been calculated to save an estimated 3,000-6,000 pounds of fuel and up to 8 minutes of flight time on a typical transpacific flight.[428] Digitized voice communications have even been added to decrease the amount of aircrew "head-down" time spent reading messages on the screen. This system has gained support from both pilots and the FAA, especially after NASA investigations showed that the system decreased communication errors, aircrew workload, and the need to repeat ATC messages.[429]
NASA's Future Flight Central, which opened at NASA Ames Research Center in 1999, was the first full-scale virtual control tower. Such synthetic vision systems can be used by both aircraft and controllers to visualize clearly what is taking place around them in any conditions. NASA.