Synthetic Vision: An Overview
NASA’s early research in SVS concepts almost immediately influenced broader perceptions of the field. Working with NASA researchers who reviewed and helped write the text, the Federal Aviation Administration crafted a definition of SVS published in Advisory Circular 120-29A, describing it as “a system used to create a synthetic image (e.g., typically a computer generated picture) representing the environment external to the airplane.” In 2000, NASA Langley researchers Russell V. Parrish, Daniel G. Baize, and Michael S. Lewis gave a more detailed definition as “a display system in which the view of the external environment is provided by melding computer-generated external topography scenes from on-board databases with flight display symbologies and other information from on-board sensors, data links, and navigation systems. These systems are characterized by their ability to represent, in an intuitive manner, the visual information and cues that a flight crew would have in daylight Visual Meteorological Conditions (VMC).”[1136] This definition can
be expanded further to include sensor fusion, which provides the capability to blend, in real time and in varying percentages, the synthetically derived imagery with video or infrared sensor signals. The key requirements of SVS as stated above are to provide the pilot with an intuitive, equivalent-to-daylight VMC capability in all-weather conditions at any time on a tactical level (with present and near-future time and position portrayed on a head-up display [HUD] or primary flight display [PFD]) and far improved situation awareness on a strategic level (with future time and position portrayed on a navigation display [a NAV display, or ND]).
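The varying-percentage blending described above is, at its core, a weighted combination of image sources. The following is a minimal sketch of such a blend, not a flight-qualified implementation; the function name `fuse_frames` and the toy frame values are illustrative assumptions:

```python
import numpy as np

def fuse_frames(synthetic, sensor, weight):
    """Blend a synthetic terrain frame with a live sensor frame.

    weight: fraction of the synthetic image in the output (0.0 = pure
    sensor video, 1.0 = pure database imagery). A scalar blends the whole
    frame; an array of the same shape as the frames blends per pixel.
    """
    synthetic = np.asarray(synthetic, dtype=float)
    sensor = np.asarray(sensor, dtype=float)
    return weight * synthetic + (1.0 - weight) * sensor

# Toy 2x2 "frames": equal weighting simply averages the two sources.
syn = np.array([[100, 200], [50, 0]])
ir = np.array([[0, 100], [150, 200]])
print(fuse_frames(syn, ir, 0.5))  # [[ 50. 150.] [100. 100.]]
```

A real system would vary `weight` continuously, for example sliding toward the sensor image as its signal quality improves on short final.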
In the earliest days of proto-SVS development during the 1980s and early 1990s, the state of the art of graphics generators limited the terrain portrayal to stroke-generated line segments forming polygons to represent terrain features. Superimposing highway-in-the-sky (HITS) symbology on these displays was not difficult, but the level of situational awareness (SA) improvement was somewhat limited by the low-fidelity terrain rendering. In fact, the superposition on basic PFD displays of HITS projected flight paths, including a rectilinear runway presentation at the end of the approach segment, inspired the development of improved terrain portrayal by suggesting the simple polygon presentation of terrain. The development of raster graphics generators and texturing capabilities allowed these simple polygons to be filled, producing more realistic scenes. Aerial and satellite photography providing “photo-realistic” quality images emerged in the mid-1990s, along with improved synthetic displays enhanced by constantly improving databases. With vastly improved graphics generators (reflecting increasing computational power), the early concept of co-displaying the desired vertical and lateral pathway guidance ahead of the airplane in a three-dimensional perspective has evolved from the crude representations of just two decades ago to the present examples of high-resolution, photo-realistic, elevation-based three-dimensional displays, replete with overlaid pathway guidance, providing the pilot with an unobstructed view of the world. Effectively, then, the goal of synthetically providing the pilot with an effective daylight VMC view in all weather has been achieved.[1137]
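Displaying pathway guidance "ahead of the airplane in a three-dimensional perspective" rests on projecting world-referenced points (waypoints, terrain vertices, runway corners) into screen coordinates. The sketch below is a simplified pinhole-camera projection that ignores aircraft attitude; the function name and display parameters are illustrative assumptions, not drawn from the source:

```python
import numpy as np

def project_point(point_world, camera_pos, focal_px, screen_w, screen_h):
    """Project a world-space point into screen pixels with a pinhole model.

    Assumes the camera looks straight ahead with no rotation, using a
    body frame of x forward, y right, z down. A real SVS display would
    first rotate points by the aircraft's attitude (pitch, roll, yaw).
    """
    rel = np.asarray(point_world, float) - np.asarray(camera_pos, float)
    forward, right, down = rel
    if forward <= 0:
        return None  # point is behind the eye point; not drawable
    # Perspective divide: farther points move toward the screen center.
    u = screen_w / 2 + focal_px * right / forward
    v = screen_h / 2 + focal_px * down / forward
    return (u, v)

# A waypoint 2,000 m ahead, 100 m right, and 50 m below the aircraft,
# on a 1024x768 display with an 800-pixel focal length:
print(project_point((2000, 100, 50), (0, 0, 0), 800.0, 1024, 768))
# -> (552.0, 404.0)
```

Drawing each segment of a projected flight path this way, frame after frame, is what produces the "tunnel" or pathway effect described above.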
Though the expressions Synthetic Vision Systems, External Vision Systems (XVS), and Enhanced Vision Systems (EVS) have often been used interchangeably, each is distinct. Strictly speaking, SVS has come
to mean computer-generated imagery from onboard databases combined with precise Global Positioning System (GPS) navigation. SVS joins terrain, obstacle, and airport images with spatial and navigational inputs from a variety of sensor and reference systems to produce a realistic depiction of the external world. EVS and XVS employ imaging sensor systems such as television, millimeter wave, and infrared, integrated with display symbologies (altitude/airspeed tapes on a PFD or HUD, for example) to permit all-weather, day-night operations.[1138]
Confusion in terminology has characterized the field, particularly in the early years, with multiple terms in use. For example, in 1992, the FAA completed a flight test investigating millimeter wave and infrared sensors for all-weather operations under the name “Synthetic Vision Technology Demonstration.”[1139] SVS and EVS are often combined as one expression, SVS/EVS, and the FAA has coined another term as well: EFVS, for Enhanced Flight Vision System. Industry has undertaken its own developments, with its own corporate names and nuances. A number of avionics companies have implemented various forms of SVS technologies in their newer flight deck systems, and various airframe manufacturers have obtained certification of both an Enhanced Vision System and a Synthetic Vision System for their business and regional aircraft. But much still remains to be done, with NASA, the FAA, and industry having yet to fully integrate SVS and EVS/EFVS technology into a comprehensive architecture furnishing Equivalent Visual Operations (EVO), blending infrared-based EFVS with SVS and millimeter-wave sensors, thereby creating an enabling technology for the FAA’s planned Next Generation Air Transportation System.[1140]
The underlying foundation of SVS is a complete navigation and situational awareness system. This Synthetic Vision System consists mainly of integration of worldwide terrain, obstacle, and airport databases; real-time presentation of immediate tactical hazards (such as weather); an Inertial Navigation System (INS) and GPS navigation capability; advanced sensors for monitoring the integrity of the database and for object detection; presentation of traffic information; and a real-time synthetic vision display, with advanced pathway or equivalent guidance, effectively affording the aircrew a projected highway-in-the-sky ahead of them.[1141] Two enabling technologies were necessary for SVS to be developed: increased computer storage capacity and a global, real-time, highly accurate navigation system. The former has been steadily developing over the past four decades, and the latter became available with the advent of GPS in the 1980s. These enabling technologies utilized or improved upon the Electronic Flight Information System (EFIS), or glass cockpit, architecture pioneered by NASA Langley in the 1970s and first flown on Langley’s Boeing 737 Advanced Transport Operating System (ATOPS) research airplane. The research accomplishments of this airplane—Boeing’s first production 737—in its two decades of NASA service are legendary. These included demonstration of the first glass cockpit in a transport aircraft, evaluation of transport aircraft fly-by-wire technology, the first GPS-guided blind landing, the development of wind shear detection systems, and the first SVS-guided landings in a transport aircraft.[1142]
The development of GPS satellite navigation technology enabled the evolution of SVS. GPS began as an Air Force-Navy effort to build a satellite-based navigation system that could meet the needs of fast-moving aircraft and missile systems, something the older TRANSIT system, developed in the late 1950s, could not. After early studies by a variety of organizations—foremost of which was the Aerospace Corporation—the Air Force formally launched the GPS research and development program in October 1963, issuing hardware design contracts three years later. Known initially as the Navstar GPS system, the concept involved a constellation of 24 satellites orbiting 12,000 nautical miles above Earth, each transmitting a continual radio signal containing a
precise time stamp from an onboard atomic clock. By recording the arrival time of the signal from each of the required four satellites and comparing it with the embedded time stamp, a receiver could determine its position and altitude with high accuracy. The first satellite was launched in 1978, and the constellation of 24 satellites was complete in 1995. Originally intended only for use by the Department of Defense, GPS was opened for civilian use (though to a lesser degree of precision) by President Ronald Reagan after a Korean Air Lines Boeing 747 commercial airliner was shot down by Soviet interceptors in 1983 after it strayed miles into Soviet territory. The utility of the GPS satellite network expanded dramatically in 2000, when the United States cleared civilian GPS users to receive the same level of precision as military forces, thus increasing civilian GPS accuracy tenfold.[1143]
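The position solution described above, in which a receiver compares the time stamps of four satellite signals, can be illustrated with a simplified pseudorange solver. Four measurements are needed because the receiver's own clock error is a fourth unknown alongside the three position coordinates. This sketch ignores atmospheric delay, relativity, and satellite motion; the satellite geometry and function name are illustrative assumptions:

```python
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=20):
    """Estimate receiver position and clock bias from >= 4 pseudoranges.

    Each pseudorange is the speed of light times (receive time minus the
    satellite's broadcast time stamp). The receiver's clock error adds
    the same unknown bias (in meters) to every measurement, giving four
    unknowns (x, y, z, b) solved by Gauss-Newton iteration.
    """
    x = np.zeros(4)  # start at Earth's center with zero clock bias
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x[:3], axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian rows: unit vector from satellite toward the receiver
        # estimate, plus a column of ones for the clock-bias term.
        J = np.hstack([-(sat_positions - x[:3]) / ranges[:, None],
                       np.ones((len(ranges), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x[:3], x[3]

# Hypothetical geometry: a receiver on Earth's surface, a 4,500 m
# clock bias, and four well-spread satellites (positions in meters).
truth = np.array([1_113_000.0, -4_842_000.0, 3_985_000.0])
bias = 4_500.0
sats = np.array([[15e6, 10e6, 20e6], [-12e6, 18e6, 14e6],
                 [20e6, -5e6, 16e6], [-8e6, -14e6, 19e6]])
rho = np.linalg.norm(sats - truth, axis=1) + bias
pos, b = solve_position(sats, rho)
```

With noise-free measurements and good satellite geometry, the iteration recovers the true position and clock bias; real receivers must additionally model propagation delays and measurement noise.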
Database quality was essential for the development of SVS. The 1990s saw giant strides taken when dedicated NASA Space Shuttle missions digitally mapped 80 percent of Earth’s land surface and almost 100 percent of the land between 60 degrees north and south latitude. At the same time, radar mapping from airplanes contributed to the digital terrain database, providing resolution sufficient for SVS both en route and in specific terminal areas. The Shuttle Radar Topography Mission, flown aboard Endeavour in 2000, produced topographical maps far more precise than previously available. Digital terrain databases are now being produced by commercial and government organizations worldwide.[1144]
With the maturation of the enabling technologies in the 1990s and its prior experience in developing glass cockpit systems, NASA Langley was poised to develop the concept of SVS as a highly effective tool for pilots to operate aircraft more effectively and safely. This did not happen directly but was the culmination of experience gained by Langley research engineers and pilots on NASA’s Terminal Area Productivity (TAP)
and High-Speed Research (HSR) programs in the mid- to late 1990s. By 1999, when the SVS project of the Aviation Safety Program (AvSP) was initiated and funded, Langley had an experienced core of engineers and research pilots eager to push the state of the art.