
High-Speed Investigations

High-speed studies of dynamic stability were very active at Wallops. The scope and contributions of the Wallops rocket-boosted model research programs for aircraft configurations, missiles, and airframe components covered an astounding number of technical areas, including aerodynamic performance, flutter, stability and control, heat transfer, automatic controls, boundary-layer control, inlet performance, ramjets, and separation behavior of aircraft components and stores. As an example of test productivity, in just 3 years beginning in 1947, over 386 models were launched at Wallops to evaluate a single topic: roll control effectiveness at transonic conditions. These tests included generic configurations and models with wings representative of the historic Douglas D-558-2 Skyrocket, Douglas X-3 Stiletto, and Bell X-2 research aircraft.[471] Fundamental studies of dynamic stability and control were also conducted with generic research models to study basic phenomena such as longitudinal trim changes, dynamic longitudinal stability, control-hinge moments, and aerodynamic damping in roll.[472] Studies with models of the D-558-2 also detected unexpected coupling of longitudinal and lateral oscillations, a problem that would subsequently prove to be common for configurations with long fuselages and relatively small wings.[473] Similar coupled motions caused great concern in the X-3 and F-100 aircraft development programs and spurred on numerous studies of the phenomenon known as inertial coupling.

More than 20 specific aircraft configurations were evaluated during the Wallops studies, including early models of such well-known aircraft as the Douglas F4D Skyray, the McDonnell F3H Demon, the Convair B-58 Hustler, the North American F-100 Super Sabre, the Chance Vought F8U Crusader, the Convair F-102 Delta Dagger, the Grumman F11F Tiger, and the McDonnell F-4 Phantom II.


Shadowgraph of X-15 model in free flight during high-speed tests in the Ames SFFT facility. Shock wave patterns emanating from various airframe components are visible. NASA.

High-speed dynamic stability testing techniques at the Ames SFFT included studies of the static and dynamic stability of blunt-nose reentry shapes, including analyses of boundary-layer separation.[474] This work included studies of the supersonic dynamic stability characteristics of the Mercury capsule. Noting the experimental observation of nonlinear variations of pitching moment with angle of attack typically exhibited by blunt bodies, Ames researchers contributed a mathematical method for including such nonlinearities in theoretical analyses and predictions of capsule dynamic stability at supersonic speeds. During the X-15 program, Ames conducted free-flight testing in the SFFT to define stability, control, and flow-field characteristics of the configuration at high supersonic speeds.[475]
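The kind of nonlinearity at issue can be made concrete with a worked equation. As a minimal sketch, assuming a cubic fit to the static pitching moment (the text does not give the specific functional form Ames used), the planar pitch oscillation of a capsule with pitch inertia I_y, reference area S, reference length c, flying at speed V in air of density ρ, can be written as

$$ I_y\,\ddot{\alpha} \;=\; \tfrac{1}{2}\rho V^{2} S c\left[\,C_m(\alpha) \;+\; \big(C_{m_q}+C_{m_{\dot{\alpha}}}\big)\frac{c\,\dot{\alpha}}{2V}\right], \qquad C_m(\alpha)\;\approx\;C_{m_1}\,\alpha \;+\; C_{m_3}\,\alpha^{3}. $$

With the cubic term retained, the predicted growth or decay of an oscillation depends on its amplitude, which is precisely the behavior that a purely linear analysis of blunt-body dynamic stability cannot capture.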

Matching the Tunnel to the Supercomputer

The use of sophisticated wind tunnels and their accompanying complex mathematical equations led observers early on to call aerodynamics the

Matching the Tunnel to the Supercomputer

A model of the X-43A and the Pegasus Launch Vehicle in the Langley 31-Inch Mach 10 Tunnel. NASA.

"science” of flight. There were three major methods of evaluating an air­craft or spacecraft: theoretical analysis, the wind tunnel, and full-flight testing. The specific order of use was ambiguous. Ideally, researchers originated a theoretical goal and began their work in a wind tunnel, with the final confirmation of results occurring during full-flight testing. Researchers at Langley sometimes addressed a challenge first by study­ing it in flight, then moving to the wind tunnel for more extreme testing, such as dangerous and unpredictable high speeds, and then following up with the creation of a theoretical framework. The lack of knowledge of the effect of Reynolds number was at the root of the inability to trust wind tun­nel data. Moreover, tunnel structures such as walls, struts, and supports affected the performance of a model in ways that were hard to quantify.[602]

From the early days of the NACA and other aeronautical research facilities, an essential component of the science was the "computer." Human computers, primarily women, worked laboriously to finish the myriad of calculations needed to interpret the data generated in wind tunnel tests. Data acquisition became increasingly sophisticated as the NACA grew in the 1940s. The Langley Unitary Plan Wind Tunnel possessed the capability of remote and automatic collection of pressure, force, and temperature data from 85 locations at 64 measurements a second, which was undoubtedly faster than manual collection. Computers processed the data and delivered it via monitors or automated plotters to researchers during the course of the test. The near-instantaneous availability of test data was a leap from the manual (and visual) inspection of industrial scales during testing.[603]

Computers beginning in the 1970s were capable of mathematically calculating the nature of fluid flows quickly and cheaply, which contributed to the idea of what Baals and Corliss called the "electronic wind tunnel."[604] No longer were computers only a tool to collect and interpret data faster. With the ability to perform billions of calculations in seconds to mathematically simulate conditions, the new supercomputers potentially could perform the job of the wind tunnel. The Royal Aeronautical Society published The Future of Flight in 1970, which included an article on computers in aerodynamic design by Bryan Thwaites, a professor of theoretical aerodynamics at the University of London. His essay would be a clarion call for the rise of computational fluid dynamics (CFD) in the late 20th century.[605] Moreover, improvements in computers and algorithms drove down the operating time and cost of computational experiments. At the same time, the time and cost of operating wind tunnels increased dramatically by 1980. The fundamental limitations of wind tunnels centered on the age-old problems related to model size and Reynolds number, temperature, wall interference, model support ("sting") interference, unrealistic aeroelastic model distortions under load, stream nonuniformity, and unrealistic turbulence levels. Problematic results from the use of test gases were a concern for the design of vehicles for flight in the atmospheres of other planets.[606]
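The model-size limitation follows directly from the definition of the Reynolds number, the ratio of inertial to viscous forces in a flow:

$$ Re \;=\; \frac{\rho\,V\,L}{\mu} $$

where ρ is air density, V is flow speed, L is a characteristic length, and μ is viscosity. A 1/10-scale model tested at full-scale flight speed reaches only one-tenth of the flight Reynolds number, and the speed cannot simply be raised tenfold without distorting compressibility effects; tunnels instead raise density through pressurization or reduce viscosity through cryogenic operation, both of which add cost and complexity.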

Matching the Tunnel to the Supercomputer

The control panels of the Langley Unitary Wind Tunnel in 1956. NASA.

The work of researchers at NASA Ames influenced Thwaites's assertions about the potential of CFD to benefit aeronautical research. Ames researcher Dean Chapman highlighted the new capabilities of supercomputers in his Dryden Lecture in Research for 1979 at the American Institute of Aeronautics and Astronautics Aerospace Sciences Meeting in New Orleans, LA, in January 1979. To Chapman, innovations in computer speed and memory led to an "extraordinary cost reduction trend in computational aerodynamics," while the cost of wind tunnel experiments had been "increasing with time." He brought to the audience's attention that a meager $1,000 and 30 minutes of computer time allowed the numerical simulation of flow over an airfoil. The same task in 1959 would have cost $10 million and would have taken 30 years to complete. Chapman made it clear that computers could cure the "many ills of wind-tunnel and turbomachinery experiments" while providing "important new technical capabilities for the aerospace industry."[607]

The crowning achievement of the Ames work was the establishment of the Numerical Aerodynamic Simulation (NAS) Facility, which began operations in 1987. The facility's Cray-2 supercomputer was capable of a sustained 250 million computations a second, with bursts of 1.72 billion per second for short periods, and offered the possibility of expanding sustained capacity to 1 billion computations per second. That capability reduced the time and cost of developing aircraft designs and enabled engineers to experiment with new designs without resorting to the expense of building a model and testing it in a wind tunnel. Ames researcher Victor L. Peterson said the new facility, and those like it, would allow engineers "to explore more combinations of the design variables than would be practical in the wind tunnel."[608]

The impetus for the NAS program arose from several factors. First, its creation recognized that computational aerodynamics offered new capabilities in aeronautical research and development. Primarily, that meant the use of computers as a complement to wind tunnel testing, which, because of the relative youth of the discipline, also placed heavy demands on those computer systems. The NAS Facility represented the committed role of the Federal Government in the development and use of large-scale scientific computing systems dating back to the use of the ENIAC for hydrogen bomb and ballistic missile calculations in the late 1940s.[609]

It was clear to NASA that supercomputers were part of the Agency's future in the late 1980s. Futuristic projects that involved NASA supercomputers included the National Aero-Space Plane (NASP), which had an anticipated speed of Mach 25; new main engines and a crew escape system for the Space Shuttle; and refined rotors for helicopters. Most importantly from the perspective of supplanting the wind tunnel, a supercomputer generated data and converted them into pictures that captured flow phenomena that previously could not be simulated.[610] In other words, the "mind's eye" of the wind tunnel engineer could be captured on film.

Nevertheless, computer simulations were not to replace the wind tunnel. At a meeting sponsored by the Advisory Group for Aerospace Research & Development (AGARD) on the Integration of Computers and Wind Tunnel Testing in September 1980, Joseph G. Marvin, the chief of the Experimental Fluid Dynamics Branch at Ames, asserted that CFD was an "attractive means of providing that necessary bridge between wind-tunnel simulation and flight." Before that could happen, a careful and critical program of comparison with wind tunnel experiments had to take place. In other words, the wind tunnel was the tool to verify the accuracy of CFD.[611] Dr. Seymour M. Bogdonoff of Princeton University commented in 1988 that "computers can't do anything unless you know what data to put in them." The aerospace community still had to discover and document the key phenomena to realize the "future of flight" in the hypersonic and interplanetary regimes. The next step was inputting the data into the supercomputers.[612]

Researchers Victor L. Peterson and William F. Ballhaus, Jr., who worked in the NAS Facility, recognized the "complementary nature of computation and wind tunnel testing," where the "combined use" of each captured the "strengths of each tool." Wind tunnels and computers brought different strengths to the research. The wind tunnel was best for providing detailed performance data once a final configuration was selected, especially for investigations involving complex aerodynamic phenomena. Computers facilitated arriving at and analyzing that final configuration through several steps. They allowed development of design concepts such as the forward-swept wing or jet flap for lift augmentation and offered a more efficient process of choosing the most promising designs to evaluate in the wind tunnel. Computers also made the instrumentation of test models easier and corrected wind tunnel data for scaling and interference errors.[613]
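The last point, computers correcting tunnel data for interference, can be sketched in code. The following is a minimal illustration using the classical textbook solid-blockage estimate (ε = K × model volume / test-section area^1.5); the constant, function names, and correction steps are illustrative assumptions, not NASA's actual data-reduction software.

```python
# Illustrative sketch of a classical low-speed wall-interference correction.
# The blockage formula below is the standard textbook solid-blockage estimate;
# names and constants are assumptions, not NASA's data-reduction code.

def solid_blockage(model_volume_m3: float, section_area_m2: float,
                   k: float = 0.96) -> float:
    """Fractional increase in effective freestream velocity due to blockage."""
    return k * model_volume_m3 / section_area_m2 ** 1.5

def correct_measurements(v_measured: float, q_measured: float,
                         cd_measured: float,
                         model_volume_m3: float, section_area_m2: float):
    """Re-reference measured velocity, dynamic pressure, and drag coefficient."""
    eps = solid_blockage(model_volume_m3, section_area_m2)
    v_corr = v_measured * (1.0 + eps)           # effective velocity at model
    q_corr = q_measured * (1.0 + eps) ** 2      # dynamic pressure goes as V^2
    cd_corr = cd_measured / (1.0 + eps) ** 2    # coefficient re-referenced to q_corr
    return v_corr, q_corr, cd_corr

if __name__ == "__main__":
    # A 0.02 m^3 model in a 4 m^2 test section at 50 m/s:
    print(correct_measurements(50.0, 1531.0, 0.0300, 0.02, 4.0))
```

The point of the sketch is the workflow, not the particular formula: once such corrections are encoded, every tunnel run can be adjusted automatically as the data stream in, which is the capability Peterson and Ballhaus describe.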

On TARGIT: Civil Aviation Crash Testing in the

On December 1, 1984, a Boeing 720B airliner crashed near the east shore of Rogers Dry Lake. Although none of the 73 passengers walked away from the flaming wreckage, there were no fatalities. The occupants were plastic, anthropomorphic dummies, some of them instrumented to collect research data. There was no flight crew on board; the pilot was seated in a ground-based cockpit 6 miles away at NASA Dryden.

As early as 1980, Federal Aviation Administration (FAA) and NASA officials had been planning a full-scale transport aircraft crash demonstration to study impact dynamics and new safety technologies to improve aircraft crashworthiness. Initially dubbed the Transport Crash Test, the project was later renamed Transport Aircraft Remotely Piloted Ground Impact Test (TARGIT). In August 1983, planners settled on the name Controlled Impact Demonstration (CID). Some wags immediately twisted the acronym to stand for "Crash in the Desert" or "Cremating Innocent Dummies."[954] In point of fact, no fireball was expected. One of the primary test objectives included demonstration of anti-misting kerosene (AMK) fuel, which was designed to prevent formation of a postimpact fireball. While many airplane crashes are survivable, most victims perish in postcrash fire resulting from the release of fuel from shattered tanks in the wings and fuselage. In 1977, FAA officials looked into the possibility of using an additive called Avgard FM-9 to reduce the volatility of kerosene fuel released during catastrophic crash events. Ground-impact studies using surplus Lockheed SP-2H airplanes showed great promise, because the FM-9 prevented the kerosene from forming a highly volatile mist as the airframe broke apart.[955] As a result of these early successes, the FAA planned to implement the requirement that airlines add FM-9 to their fuel. Estimates calculated that adopting AMK would have meant a one-time cost to airlines of $25,000-$35,000 for retrofitting each high-bypass turbine engine and a 3- to 6-percent increase in fuel costs, which would drive ticket prices up by $2-$4 each. In order to definitively prove the effectiveness of AMK, officials from the FAA and NASA signed a Memorandum of Agreement in 1980 for a full-scale impact demonstration. The FAA was responsible for program management and providing a test aircraft, while NASA scientists designed the experiments, provided instrumentation, arranged for data retrieval, and integrated systems.[956] The FAA supplied the Boeing 720B, a typical intermediate-range passenger transport that entered airline service in the mid-1960s. It was selected for the test because its construction and design features were common to most contemporary U.S. and foreign airliners. It was powered by four Pratt & Whitney JT3C-7 turbine engines and carried 12,000 gallons of fuel. With a length of 136 feet, a 130-foot wingspan, and maximum takeoff weight of 202,000 pounds, it was the world's largest RPRV. FAA Program Manager John Reed headed overall CID project development and coordination with all participating researchers and support organizations.
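The cost figures above can be turned into a back-of-envelope estimate. In the sketch below, only the per-engine retrofit range and the 3- to 6-percent fuel increase come from the text; the fleet size and the fuel share of a ticket are illustrative assumptions.

```python
# Back-of-envelope reconstruction of the FAA's AMK cost-impact ranges.
# Only RETROFIT_PER_ENGINE and FUEL_COST_INCREASE come from the text;
# fleet size and per-ticket fuel share are illustrative assumptions.

ENGINES_PER_AIRCRAFT = 4
RETROFIT_PER_ENGINE = (25_000, 35_000)   # dollars, one-time (from the text)
FUEL_COST_INCREASE = (0.03, 0.06)        # fractional (from the text)

def retrofit_cost_range(n_aircraft: int) -> tuple:
    """One-time fleet retrofit cost, low and high bounds in dollars."""
    n_engines = n_aircraft * ENGINES_PER_AIRCRAFT
    return (n_engines * RETROFIT_PER_ENGINE[0],
            n_engines * RETROFIT_PER_ENGINE[1])

def ticket_increase(fuel_cost_per_seat: float) -> tuple:
    """Added fare if the fuel share of a ticket rises 3-6 percent."""
    return (fuel_cost_per_seat * FUEL_COST_INCREASE[0],
            fuel_cost_per_seat * FUEL_COST_INCREASE[1])

if __name__ == "__main__":
    print(retrofit_cost_range(2_000))   # notional 2,000-aircraft fleet
    print(ticket_increase(65.0))        # ~$2-$4, consistent with the text
```

With a roughly $65 fuel share per ticket, the 3- to 6-percent increase lands in the $2-$4 range the FAA quoted, which suggests how the published estimate may have been framed.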

Researchers at NASA Langley were responsible for characterizing airframe structural loads during impact and developing a data-acquisition system for the entire aircraft. Impact forces during the demonstration were characterized as being survivable for planning purposes, with the primary danger to be from postimpact fire. Study data to be gathered included measurements of structural, seat, and occupant response to impact loads, to corroborate analytical models developed at Langley, as well as data to be used in developing a crashworthy seat and restraint system. Robert J. Hayduk managed NASA crashworthiness and cabin-instrumentation requirements.[957] Dryden personnel, under the direction of Marvin R. "Russ" Barber, were responsible for overall flight research management, systems integration, and flight operations. These included RPRV control and simulation, aircraft/ground interface, test and systems hardware integration, impact-site preparation, and flight-test operations.

The Boeing 720B was equipped to receive uplinked commands from the ground cockpit. Commands providing direct flight path control were routed through the autopilot, while other functions were fed directly to appropriate systems. Information on engine performance, navigation, attitude, altitude, and airspeed was downlinked to the ground pilot.[958] Commands from the ground cockpit were conditioned in control-law computers, encoded, and transmitted to the aircraft from either a primary or backup antenna. Two antennas on the top and bottom of the Boeing 720B provided omnidirectional telemetry coverage, each feeding a separate receiver. The output from the two receivers was then combined into a single input to a decoder that processed uplink data and generated commands to the controls. Additionally, the flight engineer could select redundant uplink transmission antennas at the ground station. There were three pulse-code modulation systems for downlink telemetry: two for experimental data and one to provide aircraft control and performance data.
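The receiver-diversity idea described above can be sketched in a few lines. This is a conceptual illustration only: the signal-strength-based selection rule and the frame layout are assumptions, since the text does not describe the CID combiner's actual logic.

```python
# Conceptual sketch of two-receiver diversity combining feeding one decoder.
# Selection by signal strength and the frame layout are assumptions; the
# actual CID combiner logic is not described in the text.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ReceiverOutput:
    frame: bytes            # one encoded uplink command frame
    signal_strength: float  # relative received-signal quality
    frame_valid: bool       # sync/parity check passed

def combine(primary: ReceiverOutput, backup: ReceiverOutput) -> Optional[bytes]:
    """Select the best valid frame from the two receivers; None is a dropout."""
    candidates = [r for r in (primary, backup) if r.frame_valid]
    if not candidates:
        return None
    return max(candidates, key=lambda r: r.signal_strength).frame

def decode_command(frame: bytes) -> dict:
    """Hypothetical frame layout: 1-byte channel ID, 2-byte command value."""
    return {"channel": frame[0], "value": int.from_bytes(frame[1:3], "big")}

if __name__ == "__main__":
    a = ReceiverOutput(bytes([7, 0x01, 0x90]), 0.8, True)
    b = ReceiverOutput(bytes([7, 0x01, 0x90]), 0.9, True)
    best = combine(a, b)
    if best is not None:
        print(decode_command(best))   # {'channel': 7, 'value': 400}
```

The design point is that the decoder sees a single, already-arbitrated command stream, so antenna switching and fades on one receiver never reach the autopilot directly.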

The airplane was equipped with two forward-facing television cameras—a primary color system and a black-and-white backup—to give the ground pilot sufficient visibility for situational awareness. Ten high-speed motion picture cameras photographed the interior of the passenger cabin to provide researchers with footage of seat and occupant motion during the impact sequence.[959] Prior to the final CID mission, 14 test flights were made with a safety crew on board. During these flights, 10 remote takeoffs, 13 remote landings (the initial landing was made by the safety pilot), and 69 CID approaches were accomplished. All remote takeoffs were flown from the Edwards Air Force Base main runway. Remote landings took place on the emergency recovery runway (lakebed Runway 25).

Research pilots for the project included Edward T. Schneider, Fitzhugh L. Fulton, Thomas C. McMurtry, and Donald L. Mallick.

William R. "Ray" Young, Victor W. Horton, and Dale Dennis served as flight engineers. The first flight, a functional checkout, took place March 7, 1984. Schneider served as ground pilot for the first three flights, while two of the other pilots and one or two engineers acted as safety crew. These missions allowed researchers to test the uplink/downlink systems and autopilot, as well as to conduct airspeed calibration and collect ground-effects data. Fulton took over as ground pilot for the remaining flight tests, practicing the CID flight profile while researchers qualified the AMK system (the fire retardant AMK had to pass through a degrader to convert it into a form that could be burned by the engines) and tested data-acquisition equipment. The final pre-CID flight was completed November 26. The stage was set for the controlled impact test.[960]

The CID crash scenario called for a symmetric impact prior to encountering obstructions, as if the airliner were involved in a gear-up landing short of the runway or an aborted takeoff. The remote pilot was to slide the airplane through a corridor of heavy steel structures designed to slice open the wings, spilling fuel at a rate of 20 to 100 gallons per second. A specially prepared surface consisting of a rectangular grid of crushed rock peppered with powered electric landing lights provided ignition sources on the ground, while two jet-fueled flame generators in the airplane's tail cone provided onboard ignition sources.

On December 1, 1984, the Boeing 720B was prepared for its final flight. The airplane had a gross takeoff weight of 200,455 pounds, including 76,058 pounds of AMK fuel. Fitz Fulton initiated takeoff from the remote cockpit and guided the Boeing 720B into the sky for the last time.[961] At an altitude of 200 feet, Fulton lined up on final approach to the impact site. He noticed that the airplane had begun to drift to the right of centerline but not enough to warrant a missed approach. At 150 feet, now fully committed to touchdown because of activation of limited-duration photographic and data-collection systems, he attempted to center the flight path with a left aileron input, which resulted in a lateral oscillation.

The Boeing 720B struck the ground 285 feet short of the planned impact point, with the left outboard engine contacting the ground first.


NASA and the FAA conducted a Controlled Impact Demonstration with a remotely piloted Boeing 720 aircraft. NASA.

This caused the airplane to yaw during the slide, bringing the right inboard engine into contact with one of the wing openers and releasing large quantities of degraded (i.e., highly flammable) AMK and exposing them to a high-temperature ignition source. Other obstructions sliced into the fuselage, permitting fuel to enter beneath the passenger cabin. The resulting fireball was spectacular.[962]

To casual observers, this might have made the CID project appear a failure, but such was not the case. The conditions prescribed for the AMK test were very narrow and failed to account for a wide range of variables, some of which were illustrated during the flight test. The results were sufficient to cause FAA officials to abandon the idea of forcing U.S. airlines to use AMK, but the CID provided researchers with a wide range of data for improving transport-aircraft crash survivability.

The experiment also provided significant information for improving RPV technology. The 14 test flights leading up to the final demonstration gave researchers an opportunity to verify analytical models, simulation techniques, RPV control laws, support software, and hardware. The remote pilot assessed the airplane's handling qualities, allowing programmers to update the simulation software and validate the control laws. All onboard systems were thoroughly tested, including AMK degraders, autopilot, brakes, landing gear, nose wheel steering, and instrumentation systems. The CID team also practiced emergency procedures, such as the ability to abort the test and land on a lakebed runway under remote control, and conducted partial testing of an uplinked flight termination system to be used in the event that control of the airplane was lost. Several anomalies—intermittent loss of uplink signal, brief interruption of autopilot command inputs, and failure of the uplink decoder to pass commands—cropped up during these tests. Modifications were implemented, and the anomalies never recurred.[963]

Handling qualities were generally good. The ground pilot found landings to be a special challenge as a result of poor depth perception (because of the low-resolution television monitor) and lack of peripheral vision. Through flight tests, the pilot quickly learned that the CID profile was a high-workload task. Part of this was due to the fact that the tracking radar used in the guidance system lacked sufficient accuracy to meet the impact parameters. To compensate, several attempts were made to improve the ground pilot's performance. These included changing the flight path to give the pilot more time to align his final trajectory, improving ground markings at the impact site, turning on the runway lights on the test surface, and providing a frangible 8-foot-high target as a vertical reference on the centerline. All of these attempts were compromised to some degree by the low-resolution video monitor. After the impact flight, members of the control design team agreed that some form of head-up display (HUD) would have been helpful and that more of the piloting tasks should have been automated to alleviate pilot workload.[964]

In terms of RPRV research, the project was considered highly successful. The remote pilots accumulated 16 hours and 22 minutes of RPV experience in preparation for the impact mission, and the CID showed the value of comparing predicted results with flight-test data. U.S. Representative William Carney, ranking minority member of the House Transportation, Aviation, and Materials Subcommittee, observed the CID test. "To those who were disappointed with the outcome," he later wrote, "I can only say that the results dramatically illustrated why the tests were necessary. I hope we never lose sight of the fact that the first objective of a research program is to learn, and failure to predict the outcome of an experiment should be viewed as an opportunity, not a failure."[965]

JSC, the FAA, and Targets of Opportunity

As 2005 drew to a close, Michael Coffman at FAA Oklahoma City had convinced his line management that a flight demonstration of the sensor fusion technology would be a fine precursor to further FAA interest. FAA Oklahoma City had a problem: how best to protect the approach corridors flown by its flight-check aircraft while certifying instrument procedures for the Department of Defense in combat zones. Coffman and Fox had suggested sensor fusion. If onboard video sensors in a flight-check aircraft could image a terminal approach corridor with a partially blended synthetic approach corridor, any obstacle penetrating the synthetic corridor could be quickly identified. Coffman, using the MOU with NASA JSC signed just that July, suggested that an FAA Challenger 604 flight-check aircraft based at FAA Oklahoma City could be configured with SVS equipment to demonstrate the technology to NASA and FAA managers. Immediately, Fox, Coffman, Mike Abernathy of RIS, Patrick Laport and Tim Verborgh of AANA, and JSC electronics technician James Secor began discussing how to configure the Challenger 604. Fox tested his ability to scrounge excess material from JSC by acquiring an additional obsolete but serviceable Embedded GPS Inertial Navigation System (EGI) navigation processor (identical to the one used in the ACES van) and several processors to drive three video displays. Coffman found some FAA funds to buy three monitors, and Abernathy and RIS wrote the software necessary to drive three monitors with SVS displays with full-sensor fusion capability, while Laport and Verborgh developed the symbology set for the displays. The FAA bought three lipstick cameras, JSC's Jay Estes designed a pallet to contain the EGI and processors, and a rudimentary portable system began to take shape.[1192]

The author, now a research pilot at the Johnson Space Center, became involved assisting AANA with the design of a notional instrument procedure corridor at JSC's Ellington Field flight operations base. He also obtained permission from his management chain to use JSC's Aircraft Operations Division to host the FAA's Challenger and provide the jet fuel it required. Verborgh, meanwhile, surveyed a number of locations on Ellington Field with the author's help, using a borrowed portable DGPS system to create by hand a synthetic database of Ellington Field, the group not having access to expensive commercial databases. The author and JSC's Donald Reed coordinated the flight operations and air traffic control approvals, Fox and Coffman handled the interagency approvals, and by March 2006, the FAA Challenger 604 was at Ellington Field with the required instrumentation installed and ready for the first sensor fusion-guided instrument approach demonstration. Fox had borrowed helmet-mounted display hardware and a kneeboard computer to display selected sensor fusion scenes in the cabin, and five demonstration flights were completed for over a dozen JSC Shuttle, Flight Crew Operations, and Constellation managers. In May, the flights were completed to the FAA's satisfaction. The sensor fusion software and hardware performed flawlessly, and both JSC and FAA Oklahoma City management gained confidence in the team's capabilities, a confidence that would continue to pay dividends. For its part, JSC could not afford a more extensive, focused program, nor were Center managers uniformly convinced of the applicability of this technology to their missions. The team, however, had greatly bolstered confidence in its ability to accomplish critically significant flight tests, demonstrating that it could do so with "shoestring" resources and support. It did so by using a small-team approach, building strong interagency partnerships, creating relationships with other research organizations and small businesses, relying on trust in one another's professional abilities, and following rigorous adherence to appropriate multiagency safety reviews.
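The core sensor-fusion display idea, blending live video with a synthetic scene and flagging corridor penetrations, can be sketched conceptually. The arrays, alpha weight, and the simple height comparison below are illustrative assumptions; this is not RIS's LandForm implementation.

```python
# Conceptual sketch of SVS sensor fusion: alpha-blend a live camera frame
# with a rendered synthetic scene, and flag cells where a sensed obstacle
# penetrates the synthetic approach-corridor floor. All values illustrative.

import numpy as np

def fuse_frames(camera: np.ndarray, synthetic: np.ndarray,
                alpha: float = 0.6) -> np.ndarray:
    """Blend two 8-bit RGB frames; alpha weights the live camera imagery."""
    fused = (alpha * camera.astype(np.float32)
             + (1.0 - alpha) * synthetic.astype(np.float32))
    return fused.astype(np.uint8)

def corridor_violations(measured_height_m: np.ndarray,
                        corridor_floor_m: np.ndarray) -> np.ndarray:
    """Boolean mask of cells where an obstacle rises above the corridor floor."""
    return measured_height_m > corridor_floor_m

if __name__ == "__main__":
    cam = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    syn = np.zeros_like(cam)                 # stand-in synthetic render
    print(fuse_frames(cam, syn).shape)       # (480, 640, 3)

    heights = np.array([[5.0, 32.0], [8.0, 10.0]])   # sensed obstacle tops, m
    floor = np.full((2, 2), 30.0)                    # corridor floor, m
    print(corridor_violations(heights, floor))       # only the 32 m cell flags
```

The value of the blend is exactly what the team demonstrated: because the synthetic corridor is positioned by precise navigation data, anything in the live imagery that pokes through it is immediately conspicuous to the pilot.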

The success of the approach demonstrations allowed the team members to continue with the SVS work on a not-to-interfere basis with their regularly assigned duties. Fox persuaded his management to allow the ACES van to remotely control the JSC Scout simulated lunar rover on three trips to Meteor Crater, AZ, in 2005-2006, using the same sensor fusion software implementation as that on the Challenger flight test. Throughout the remainder of 2006, the team discussed other possibilities to demonstrate its system. Abernathy provided the author with a kneeboard computer, a GPS receiver, and the RIS's LandForm software (for which JSC had rights) with a compressed, high-resolution database of the Houston area. On NASA T-38 training flights, the author evaluated the performance of the all-aspect software in anticipation of an official evaluation as part of a potential T-38 fleet upgrade. The author had conversations with Coffman, Fox, and Abernathy regarding the FAA's idea of using a turret on flight-check aircraft to measure in real time the height of approach corridor obstacles. The conversations and the portability of the software and hardware inspired the author to suggest a flight test using one of JSC's WB-57F High-Altitude Research Airplanes with the WB-57F Acquisition Validation Experiment (WAVE) sensor as a proof of concept. The WB-57F was a JSC high-altitude research airplane capable of extended flight above 60,000 feet with sensor payloads of thousands of pounds and dozens of simultaneous experiments. The WAVE was a sophisticated, 360-degree slewable camera tracking system developed after the Columbia accident to track Space Shuttle launches and reentries.[1193]

The author flew the WB-57F at JSC, including WAVE Shuttle tracking missions. Though hardly the optimal airframe (the sensor fusion proof of concept would be flown at only 2,000 feet altitude), the combination of a JSC airplane with a slewable, INS/GPS-supported camera system was hard to beat. The challenges were many. The two WB-57F airframes at JSC were scheduled years in advance, they were expensive for a single
experiment when designed to share costs among up to 40 simultaneous experiments, and the WAVE sensor was maintained by Southern Research Institute (SRI) in Birmingham, AL. Fortunately, Mike Abernathy of RIS spoke directly to John Wiseman of SRI, and an agreement was reached in which SRI would integrate RIS’s LandForm software into WAVE for no cost if it were allowed to use it for other potential WAVE projects.

The team sought FAA funding on the order of $30,000-$40,000 to pay for the WB-57F operation and integration costs and transport the WAVE sensor from Birmingham to Houston. In January 2007, the team invited Frederic Anderson—Manager of Aero-Nav Services at FAA Oklahoma City—to visit JSC to examine the ACES van, meet with the WB-57F Program Office and NASA Exploration Program officials, and receive a demonstration of the sensor fusion capabilities. Anderson was convinced of the potential of using the WB-57/WAVE to prove that an object on the ground could be passively, remotely measured in real time to high accuracy. He was willing to commit $40,000 of FAA money to this idea. With one challenge met, the next challenge was to find a hole in the WB-57F's schedule.

In mid-March 2007, the author was notified that a WB-57F would be available the first week in April. In 3 weeks, Fox pushed through a Space Act Agreement to get FAA Oklahoma City funds transferred to JSC, with pivotal help from the JSC Legal Office. RIS and AANA, working nonstop with SRI, integrated the sensor fusion software into the WAVE computers. Due to a schedule slip with the WB-57, the team only had a day and a half to integrate the RIS hardware into the WB-57, with the invaluable help of WB-57 engineers. Finally, on April 6, on a 45-minute flight from Ellington Field, the author and WAVE operator Dominic Del Rosso of JSC for the first time measured an object on the ground (the JSC water tower) in flight in real time using SVS technology. The video signal from the WAVE acquisition camera was blended with synthetic imagery to provide precise scaling.

The in-flight measurement was within 0.5 percent of the surveyed data. The ramifications of this accomplishment were immediate and profound: the FAA was convinced of the power of the SVS sensor fusion technology and began incorporating the capability into its planned flight-check fleet upgrade.[1194]
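The text does not detail the measurement method, but the basic relation that calibrated blending enables is a simple proportionality between pixel extents in the fused image. As a hedged sketch, if a synthetic reference feature of known height renders at the same range as the target,

$$ h_{\text{object}} \;\approx\; \frac{p_{\text{object}}}{p_{\text{ref}}}\; h_{\text{ref}}, $$

where p_object and p_ref are the measured pixel heights of the real object and the synthetic reference, and h_ref is the reference's known height. Because the synthetic scene is driven by the EGI's precise position and attitude, the reference scale is known to high accuracy, consistent with the 0.5-percent agreement just cited.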


Building on this success, Fox, Coffman, Abernathy, and the author looked at new ways to showcase sensor fusion. In the back of their minds had been the concept of simulating a lunar approach into a virtual lunar base anchored over Ellington Field. The thought was to use the FAA Challenger 604 with the SVS portable pallet installed as before. The problem was money. A solution came from a collaboration between the author and his partner in a NASA JSC aircraft fleet upgrade study, astronaut Joseph Tanner. They had extra money from their fleet study budget, and Tanner was intrigued by the proposed lunar approach simulation because it related to a possible future lunar approach training aircraft. The two approached Brent Jett, who was Director of Flight Crew Operations and the sponsor of their study, in addition to overseeing the astronauts and flight operations at JSC. Jett was impressed with the idea and approved the necessary funds to pay the operational cost of the Challenger 604. FAA Oklahoma City would provide the airplane and crew at its expense.

Once again, RIS and AANA on their own modified the software to simulate a notional lunar approach designed by the author and Fox, derived from the performance of the Challenger 604 aircraft. Coffman was able to retrieve the monitors and cameras from the approach
flight tests of 2006. Jim Secor spent a day at FAA Oklahoma City reinstalling the SVS pallet and performing the necessary integration with Michael Coffman. The author worked with Houston Approach Control to gain approval for this simulated lunar approach with a relatively steep flightpath into Ellington Field and within the Houston Class B (Terminal Control Area) airspace. The trajectory commenced at 20,000 feet, with a steep power-off dive to 10,000 feet, at which point a 45-degree course correction maneuver was executed. The approach terminated at 2,500 feet at a simulated 150-foot altitude over a virtual lunar base anchored overhead Ellington Field. Because there was no digital database available for any of the actual proposed lunar landing sites, the team used a modified database for Meteor Crater as a simulated lunar site. The team switched the coordinates to Ellington Field so that the EGI could still provide precise GPS navigation to the virtual landing site anchored overhead the airport.
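The coordinate-swap trick is simple to illustrate. The sketch below shifts every vertex of a terrain database so its anchor moves from the real site (Meteor Crater) to the virtual one (overhead Ellington Field); a flat latitude/longitude offset is an illustrative simplification of whatever transform the team's tools actually applied, and the coordinates are approximate.

```python
# Minimal sketch of re-anchoring a terrain database from its real location
# to a virtual site, so onboard GPS/INS navigation still "finds" it. A flat
# lat/lon offset is an illustrative simplification; coordinates approximate.

OLD_ANCHOR = (35.0274, -111.0225)   # Meteor Crater, AZ (deg)
NEW_ANCHOR = (29.6073, -95.1588)    # Ellington Field, TX (deg)

def reanchor(vertices):
    """vertices: list of (lat, lon, alt) tuples; returns shifted copies."""
    dlat = NEW_ANCHOR[0] - OLD_ANCHOR[0]
    dlon = NEW_ANCHOR[1] - OLD_ANCHOR[1]
    return [(lat + dlat, lon + dlon, alt) for lat, lon, alt in vertices]

if __name__ == "__main__":
    crater_rim = [(35.0301, -111.0250, 1740.0), (35.0249, -111.0199, 1745.0)]
    print(reanchor(crater_rim))   # same terrain shape, now anchored near Ellington
```

The design choice is the point: rather than build a lunar database they did not have, the team moved a database they did have to where the navigation system expected the landing site to be.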


Screen shot of the SVS simulated lunar approach PFD. NASA.

In early February 2008, all was ready. The ACES van was used to validate the model, as there was no time (or money) to do it on the airplane. One instrumentation checkout flight was flown, and several anomalies were corrected. That afternoon, for the first time, an aircraft was used to simulate a lunar approach to a notional lunar base. Sensor fusion was demonstrated on one of the monitors using the actual ambient conditions to provide Sun glare and haze challenges. These were not representative of actual lunar issues but were indicative of the benefit of sensor fusion in mitigating the postulated 1-degree Sun angles of the south lunar pole. The second monitor showed the SVS Meteor Crater digital terrain database simulating the lunar surface, perfectly matched to the Houston landscape over which the Challenger 604 was flying.[1195] This and two more flights demonstrated the technology to a dozen astronauts and to Constellation and Orion program managers.

Four flight test experiments and three trips to Meteor Crater were completed in a 3-year period to demonstrate the SVS sensor fusion technology. The United States military is using evolved versions of the original X-38 SVS and follow-on sensor fusion software with surveillance sensors on various platforms, and the FAA has contracted with RIS to develop an SVS for its flight-check fleet. Constellation managers have shown much interest in the technology, but as of 2009, no decision had been reached regarding its incorporation in NASA's space exploration plans.[1196]

NASA’S CONTRIBUTIONS TO AERONAUTICS

AS THIS BOOK GOES TO PRESS, the National Aeronautics and Space Administration (NASA) has passed beyond the half century mark, its longevity a tribute to how essential successive Presidential administrations—and the American people whom they serve—have come to regard its scientific and technological expertise. In that half century, flight has advanced from supersonic to orbital velocities, the jetliner has become the dominant means of intercontinental mobility, astronauts have landed on the Moon, and robotic spacecraft developed by the Agency have explored the remote corners of the solar system and even passed into interstellar space.

Born of a crisis—the chaotic aftermath of the Soviet Union's space triumph with Sputnik—NASA rose magnificently to the challenge of the emergent space age. Within a decade of NASA's establishment, teams of astronauts would be planning for the first lunar landings, accomplished with Neil Armstrong's "one small step" on July 20, 1969. Few events have been so emotionally charged, and none so publicly visible or fraught with import, as his cautious descent from the spindly little Lunar Module Eagle to leave his historic boot-print upon the dusty plain of Tranquillity Base.

In the wake of Apollo, NASA embarked on a series of space initiatives that, if they might have lacked the emotional and attention-getting impact of Apollo, were nevertheless remarkable for their accomplishment and daring. The Space Shuttle, the International Space Station, the Hubble Space Telescope, and various planetary probes, landers, rovers, and flybys speak to the creativity of the Agency, the excellence of its technical personnel, and its dedication to space science and exploration.

But there is another aspect to NASA, one that is too often hidden in an age when the Agency is popularly known as America's space agency and when its most visible employees are the astronauts who courageously rocket into space, continuing humanity's quest into the unknown. That hidden aspect is aeronautics: lift-borne flight within the atmosphere, as distinct from the ballistic flight of astronautics, out into space. It is the first "A" in the Agency's name, and the oldest-rooted of the Agency's technical competencies, dating to the formation, in 1915, of NASA's lineal predecessor, the National Advisory Committee for Aeronautics (NACA). It was the NACA that largely restored America's aeronautical primacy in the interwar years after 1918, deriving the airfoil profiles and configuration concepts that defined successive generations of ever-more-capable aircraft as America progressed from the subsonic piston era into the transonic and supersonic jet age. NASA, succeeding the NACA after the shock of Sputnik, took American aeronautics across the hypersonic frontier and onward into the era of composite structures, electronic flight controls, and energy-efficient flight.

As with the first in this series, this second volume traces contributions by NASA and the post-Second World War NACA to aeronautics. The surveys, cases, and biographical examinations presented in this work offer just a sampling of the rich legacy of aeronautics research produced by the NACA and NASA. These include:

• Atmospheric turbulence, wind shear, and gust research, subjects of crucial importance to air safety across the spectrum of flight, from the operations of light general-aviation aircraft through large commercial and supersonic vehicles.

• Research to understand and mitigate the danger of lightning strikes upon aerospace vehicles and facilities.

• The quest to make safer and more productive skyways via advances in technology, cross-disciplinary integration of developments, design innovation, and creation of new operational architectures to enhance air transportation.

• Contributions to the melding of human and machine, via the emergent science of human factors, to increase the safety, utility, efficiency, and comfort of flight.

• The refinement of free-flight model testing for aerodynamic research, the anticipation of aircraft behavior, and design validation and verification, complementing traditional wind tunnel and full-scale aircraft testing.

• The evolution of the wind tunnel and expansion of its capabilities, from the era of the slide rule and subsonic flight to hypersonic excursions into the transatmosphere in the computer and computational fluid dynamics era.

• The advent of composite structures, which, when coupled with computerized flight control systems, gave aircraft designers a previously unknown freedom enabling them to design aerospace vehicles with optimized aerodynamic and structural behavior.

• Contributions to improving the safety and efficiency of general-aviation aircraft via better understanding of their unique requirements and operational circumstances, and the application of new analytical and technological approaches.

• Undertaking comprehensive flight research on sustained supersonic cruise aircraft—with particular attention to their aerodynamic characteristics, airframe heating, use of integrated flying and propulsion controls, and evaluation of operational challenges such as inlet "unstart" and aircrew workload—and blending them into the predominant national subsonic and transonic air traffic network.

• Development and demonstration of Synthetic Vision Systems, enabling increased airport utilization, more efficient flight deck performance, and safer air and ground aircraft operations.

• Confronting the persistent challenge of atmospheric icing and its impact on aircraft operations and safety.

• Analyzing the performance of aircraft at high angles of attack and conducting often high-risk flight-testing to study their behavior characteristics and assess the value of developments in aircraft design and flight control technologies to reduce their tendency to depart from controlled flight.

• Undertaking pathbreaking flight research on VTOL and V/STOL aircraft systems to advance their ability to enter the mainstream of aeronautical development.

• Conducting a cooperative international flight-test program to mutually benefit understanding of the potential, behavior, and performance of large supersonic cruise aircraft.

As this sampling—far from a complete range—of NASA work in aeronautics indicates, the Agency and its aeronautics staff spread across the Nation maintain a lively interest in the future of flight, benefitting NASA's reputation earned in the years since 1958 as a national repository of aerospace excellence and its legacy of accomplishment in the 43-year history of the National Advisory Committee for Aeronautics, from 1915 to 1958.

As America enters the second decade of the second century of winged flight, it is again fitting that this work, like the volume that precedes it, be dedicated, with affection and respect, to the men and women of NASA, and the NACA from whence it sprang.

Dr. Richard P. Hallion

August 25, 2010
