Ikhana: Awareness in the National Airspace
Military UAVs are easily adapted for civilian research missions. In November 2006, NASA Dryden obtained a civilian version of the General Atomics MQ-9 Reaper that was subsequently modified and instrumented for research. Proposed missions included supporting Earth science research, demonstrating advanced aeronautical technology, and developing capabilities for improving the utility of unmanned aerial systems.
The project team named the aircraft Ikhana, a Native American Choctaw word meaning intelligent, conscious, or aware. The choice was considered descriptive of research goals NASA had established for the aircraft and its related systems, including collecting data to better understand and model environmental conditions and climate and increasing the ability of unpiloted aircraft to perform advanced missions.
The Ikhana was 36 feet long with a 66-foot wingspan and capable of carrying more than 400 pounds of sensors internally and over 2,000 pounds in external pods. Driven by a 950-horsepower turboprop engine, the aircraft had a maximum speed of 220 knots and was capable of reaching
altitudes above 40,000 feet with limited endurance.[1038] Initial experiments included the use of fiber optics for wing shape and temperature sensing, as well as control and structural loads measurements. Six hairlike fibers on the upper surfaces of the Ikhana’s wings provided 2,000 strain measurements in real time, allowing researchers to study changes in the shape of the wings during flight. Such sensors have numerous applications for future generations of aircraft and spacecraft. They could be used, for example, to enable adaptive wing-shape control to make an aircraft more aerodynamically efficient for specific flight regimes.[1039] To fly the Ikhana, NASA purchased a Ground Control Station and satellite communication system for uplinking flight commands and downlinking aircraft and mission data. The GCS was installed in a mobile trailer and, in addition to the pilot’s remote cockpit, included computer workstations for scientists and engineers. The ground pilot was linked to the aircraft through a C-band line-of-sight (LOS) data link at ranges up to 150 nautical miles. A Ku-band satellite link allowed for over-the-horizon control. A remote video terminal provided real-time imagery from
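The wing-shape measurement described above can be illustrated with a simplified sketch. Treating the wing as a cantilever beam, surface strain from the fibers is converted to bending curvature and integrated twice along the span to estimate deflection. The station spacing, section half-depth, and function names below are illustrative assumptions, not the actual Ikhana processing.

```python
# Illustrative sketch: estimating wing deflection from distributed
# surface-strain readings, as a fiber-optic sensing system might.
# Assumes simple beam theory (cantilever wing, known section depth);
# this is not NASA's actual algorithm.

def deflection_from_strain(x, strain, half_depth):
    """Estimate vertical deflection at each spanwise station x (m).

    Uses curvature = strain / half_depth, then double trapezoidal
    integration with the wing root clamped (w = w' = 0).
    """
    curvature = [eps / half_depth for eps in strain]
    slope = [0.0]
    for i in range(1, len(x)):
        dx = x[i] - x[i - 1]
        slope.append(slope[-1] + 0.5 * (curvature[i] + curvature[i - 1]) * dx)
    defl = [0.0]
    for i in range(1, len(x)):
        dx = x[i] - x[i - 1]
        defl.append(defl[-1] + 0.5 * (slope[i] + slope[i - 1]) * dx)
    return defl

# Example: 11 stations over a hypothetical 10 m half-span under a
# uniform 500-microstrain bending load
stations = [i * 1.0 for i in range(11)]
strains = [500e-6] * 11
tip = deflection_from_strain(stations, strains, half_depth=0.15)[-1]
```

With many more measurement stations, as on the Ikhana's 2,000-point fiber system, the same integration idea resolves the full spanwise shape rather than a single tip value.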
the aircraft, giving the pilot limited visual input.[1040] Two NASA pilots, Hernan Posada and Mark Pestana, were initially trained to fly the Ikhana. Posada had 10 years of experience flying Predator vehicles for General Atomics before joining NASA as an Ikhana pilot. Pestana, with over
4,000 flight hours in numerous aircraft types, had never flown a UAS prior to his assignment to the Ikhana project. He found the experience an exciting challenge because the lack of vestibular cues and peripheral vision hinders situational awareness and eliminates the pilot's ability to experience such sensations as motion and sink rate.[1041]
Building on experience with the Altair unpiloted aircraft, NASA developed plans to use the Ikhana for a series of Western States Fire Mission flights. The Autonomous Modular Sensor (AMS), developed by Ames, was key to their success. The AMS is a line scanner with a 12-band spectrometer covering the spectral range from visible to the near infrared for fire detection and mapping. Digitized data are combined with navigational and inertial sensor data to determine the location and orientation of the sensor. In addition, the data are autonomously processed with geo-rectified topographical information to create a fire intensity map.
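The geo-location step described above, combining digitized scanner data with navigational and inertial measurements, can be sketched in simplified form: each cross-track scan angle is projected from the aircraft's position to a point on the ground. The flat-earth geometry, parameter names, and numbers below are hypothetical simplifications, not the AMS implementation.

```python
# Illustrative sketch: geo-locating one line-scanner pixel from
# aircraft navigation data. Flat-terrain, level-flight assumptions;
# a real system also corrects for roll, pitch, and terrain elevation.
import math

EARTH_R = 6_371_000.0  # mean Earth radius, meters

def pixel_ground_position(lat, lon, alt_agl, heading_deg, scan_angle_deg):
    """Project a cross-track scan angle to a ground lat/lon.

    scan_angle_deg is measured from nadir, positive to the right
    of the aircraft's track.
    """
    # Ground distance from the nadir point, perpendicular to track
    offset = alt_agl * math.tan(math.radians(scan_angle_deg))
    # Offset direction: 90 degrees to the right of the heading
    bearing = math.radians(heading_deg + 90.0)
    dnorth = offset * math.cos(bearing)
    deast = offset * math.sin(bearing)
    dlat = math.degrees(dnorth / EARTH_R)
    dlon = math.degrees(deast / (EARTH_R * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# Aircraft at 7,000 m above ground, heading due north; pixel viewed
# 30 degrees right of nadir lands a few kilometers to the east
plat, plon = pixel_ground_position(37.0, -118.0, 7000.0, 0.0, 30.0)
```

Repeating this projection for every pixel in every scan line, then resampling onto a map grid, yields the geo-rectified fire intensity map the text describes.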
Data collected with AMS are processed onboard the aircraft to provide a finished product formatted according to a geographical information systems standard, which makes it accessible with commonly available programs, such as Google Earth. Data telemetry is downlinked via a Ku-band satellite communications system. After quality-control assessment by scientific personnel in the GCS, the information is transferred to NASA Ames and then made available to remote users via the Internet.
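One widely used GIS-standard format that Google Earth reads directly is KML, so the onboard product described above can be illustrated by packaging geo-located fire detections as KML placemarks. The schema fields below are a minimal hypothetical example, not the actual WSFM product format.

```python
# Illustrative sketch: writing geo-located fire detections as KML so
# they open directly in Google Earth. Minimal, hypothetical schema.

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <name>Fire detections</name>
{placemarks}
  </Document>
</kml>
"""

PLACEMARK = """    <Placemark>
      <name>{name}</name>
      <description>relative intensity={intensity}</description>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>"""

def detections_to_kml(detections):
    """detections: iterable of (name, lat, lon, intensity) tuples.
    Note KML orders coordinates longitude-first."""
    marks = "\n".join(
        PLACEMARK.format(name=n, lat=lat, lon=lon, intensity=i)
        for n, lat, lon, i in detections
    )
    return KML_TEMPLATE.format(placemarks=marks)

kml = detections_to_kml([("hotspot-1", 37.12, -118.45, 0.83)])
```

Because the output is an open standard rather than a proprietary format, any downstream user with Google Earth or a GIS package can display it, which is the accessibility point the text makes.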
After the Ikhana was modified to carry the AMS sensor pod on a wing pylon, technicians integrated and tested all associated hardware and systems. Management personnel at Dryden performed a flight readiness review to ensure that all necessary operational and safety concerns had been addressed. Finally, planners had to obtain permission from the FAA to allow the Ikhana to operate in the national airspace.[1042]
The first four Ikhana flights established a benchmark for future science operations. During these missions, the aircraft traversed eight western U.S. States, collecting critical fire information and relaying data in near real time to fire incident command teams on the ground as well as to the National Interagency Fire Center in Boise, ID. Sensor data were downlinked to the GCS, transferred to a server at Ames, and autonomously redistributed to a Google Earth-based data visualization capability, the Collaborative Decision Environment (CDE), which served as a Decision Support System (DSS) for fire-data integration and information sharing. This system allowed users to see and use data as little as 10 minutes after collection.
The Google Earth-based CDE also supplied other real-time fire-related information, including satellite weather data, satellite-based fire data, Remote Automated Weather Station readings, lightning-strike detection data, and other critical fire-database source information. Google Earth imagery layers allowed users to see the locations of manmade structures and population centers in the same display as the fire information. Shareable data and information layers, combined in the CDE, allowed incident commanders and others to make real-time strategy decisions on fire management. Personnel throughout the U.S. who were involved in the mission and imaging efforts also accessed the CDE data. Fire incident commanders used the thermal imagery to develop management strategies, redeploy resources, and direct operations to critical areas such as neighborhoods.[1043] The Western States UAS Fire Missions, carried out by team members from NASA, the U.S. Department of Agriculture Forest Service, the National Interagency Fire Center, NOAA, the FAA, and General Atomics Aeronautical Systems, Inc., were a resounding success and a historic achievement in the field of unpiloted aircraft technology.
In the first milestone of the project, NASA scientists developed improved imaging and communications processes for delivering near-real-time information to firefighters. NASA's Applied Sciences and Airborne Science programs and the Earth Science Technology Office developed the Autonomous Modular Sensor with the intent of demonstrating its capabilities during the WSFM and later transitioning those capabilities to operational agencies.[1044] The WSFM project team repeatedly demonstrated the utility and flexibility of using a UAS as a tool to aid disaster response personnel through the employment of various platform, sensor, and data-dissemination technologies related to improving near-real-time wildfire observations and intelligence-gathering techniques. Each successive flight expanded on the capabilities of previous missions in platform endurance and range, number of observations made, and flexibility of mission and sensing reconfiguration.
Team members worked with the FAA to safely and efficiently integrate the unmanned aircraft system into the national airspace. NASA pilots flew the Ikhana in close coordination with FAA air traffic controllers, allowing it to maintain safe separation from other aircraft.
WSFM project personnel developed extensive contingency management plans to minimize the risk to the aircraft and the public, including the negotiation of emergency landing rights agreements at three Government airfields and the identification and documentation of over 300 potential emergency landing sites.
The missions included coverage of more than 60 wildfires throughout eight western States. All missions originated and terminated at Edwards Air Force Base and were operated by NASA crews with support from General Atomics. During the mission series, near-real-time data were provided to Incident Command Teams and the National Interagency Fire Center.[1045] Many fires were revisited during some missions to provide data on fire progression over time. Whenever possible, long-duration fire events were imaged on multiple missions to provide long-term fire-monitoring capabilities. Postfire burn-assessment imagery was also collected over various fires to aid teams in fire ecosystem rehabilitation. The project Flight Operations team built relationships with other agencies, which enabled real-time flight plan changes necessary to avoid hazardous weather, to adapt to fire priorities, and to avoid conflicts with multiple planned military GPS testing/jamming activities.
Critical, near-real-time fire information allowed Incident Command Teams to redeploy fire-fighting resources, assess effectiveness of containment operations, and move critical resources, personnel, and equipment from hazardous fire conditions. During instances in which blinding smoke obscured normal observations, geo-rectified thermal-infrared data enabled the use of Geographic Information Systems or data visualization packages such as Google Earth. The images were collected and fully processed onboard the Ikhana and transmitted via a communications satellite to NASA Ames, where the imagery was served on a NASA Web site and provided in the Google Earth-based CDE for quick and easy access by incident commanders.
The Western States UAS Fire Mission series also gathered critical, coincident data with satellite sensor systems orbiting overhead, allowing for comparison and calibration of those resources with the more sensitive instruments on the Ikhana. The Ikhana UAS proved a versatile platform for carrying research payloads. Since the sensor pod could be reconfigured, the Ikhana was adaptable for a variety of research projects.[1046]