The Federal Aviation Administration (FAA) uses MetaVR Virtual Reality Scene Generator (VRSG) in two out-the-window general aviation flight simulators to support human factors research related to synthetic vision systems at the FAA Civil Aerospace Medical Institute (CAMI) within the Mike Monroney Aeronautical Center, Oklahoma City.
Since 2010 the FAA has purchased 10 VRSG licenses for use in its Advanced General Aviation Research simulator (with a Piper Malibu configuration) and its Very Light Jet simulator (with a Cessna 510 Citation Mustang configuration). These simulators are used to study and test landings on runways when the pilot's out-the-window visibility is limited, with the aid of synthetic vision.
Synthetic vision is a computer-generated image that provides situational awareness of the external scene topography from the perspective of the cockpit. This image is derived from aircraft attitude, a high-precision navigation solution, and a terrain database with buildings, towers, and other relevant cultural features. NASA’s definition of a synthetic vision system is “an aircraft cockpit display technology that presents the visual environment external to the aircraft using computer-generated imagery in a manner analogous to how it would appear to the pilot if forward visibility were not restricted”.
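The core of that derivation is a view transform: given the aircraft's position and attitude, each point in the terrain database is re-expressed in the cockpit camera's frame before rendering. The following is a minimal illustrative sketch of that transform, not MetaVR's or any certified system's implementation; the function names, the NED (north-east-down) axes, and the Z-Y-X Euler convention are assumptions made for the example.

```python
import math

def attitude_to_rotation(yaw_deg, pitch_deg, roll_deg):
    """Build a 3x3 body-to-world rotation matrix from aircraft Euler
    angles (Z-Y-X convention: yaw, then pitch, then roll), in degrees.
    Axes are NED: x north, y east, z down."""
    y, p, r = (math.radians(a) for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def world_to_camera(point_world, aircraft_pos, yaw_deg, pitch_deg, roll_deg):
    """Express a world-space terrain point in the cockpit camera frame:
    translate by the aircraft position, then apply the inverse of the
    body-to-world rotation (its transpose, since R is orthonormal)."""
    R = attitude_to_rotation(yaw_deg, pitch_deg, roll_deg)
    d = [point_world[i] - aircraft_pos[i] for i in range(3)]
    return [sum(R[j][i] * d[j] for j in range(3)) for i in range(3)]
```

For example, with the aircraft yawed 90 degrees (facing east), a terrain point 10 m to the east comes out directly ahead of the camera (positive x). A real synthetic vision system would feed such camera-frame points through a perspective projection and render the terrain mesh, but the attitude-driven transform above is the step the definition refers to.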
The FAA’s human factors research department conducts its research at two laboratories with five active simulators, two of which are the Very Light Jet (VLJ) simulator and the Advanced General Aviation Research Simulator (AGARS). Research focuses on improving individual system effectiveness, efficiency, and safety, with a major emphasis on improving human performance through enhanced equipment design. In these two general aviation flight simulators, the experimenters typically run pilots through pre-designed scenarios and record the pilots’ interactions and behavioral responses; the recorded video, audio, and aircraft computer data are then analyzed by the experimenters. For this research, the scenarios simulate landing in adverse weather conditions, when visibility of the airfield is obscured.
The visual system for the FAA’s Cessna Citation Mustang VLJ simulator uses MetaVR VRSG with five projectors and a 225-degree fixed dome. The simulated cockpit replicates a Cessna Mustang VLJ cockpit’s physical equipment and furnishings, controls, and interphone and air/ground communications equipment. Having the simulated systems function and perform like those of the actual aircraft increases realism and helps pilot trainees gain experience with the tools they would use in the real world, such as the Garmin G1000 integrated flight instrument system.
The VRSG licenses were acquired concurrently with the FAA’s purchase of nine virtual airfields built by Simthetiq and delivered in MetaVR’s round-earth Metadesic terrain format for visualization in VRSG. The airfields are fully integrated within MetaVR’s high-resolution (1 meter per pixel) terrain of the area.
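A round-earth terrain format places terrain on the curved Earth ellipsoid rather than on a flat map projection, which keeps long approaches and distant horizons geometrically correct. As a hedged sketch of the idea (the standard WGS84 geodetic-to-ECEF conversion, not Metadesic's actual internal representation; the function name is illustrative):

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0              # semi-major axis, meters
F = 1 / 298.257223563      # flattening
E2 = F * (2 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic latitude/longitude/altitude to Earth-centered,
    Earth-fixed (ECEF) Cartesian coordinates in meters, so terrain
    vertices lie on the curved ellipsoid rather than a flat plane."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime-vertical radius of curvature at this latitude.
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z
```

At the equator this yields a point one semi-major axis from the Earth's center, and at the pole one semi-minor axis; a flat-earth format would instead accumulate error with distance from its projection origin.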
Most recently, Simthetiq delivered to the FAA two virtual Colorado airfields: Aspen-Pitkin County Airport/Sardy Field and Gunnison-Crested Butte Regional Airport. Simthetiq’s synthetic environments include all major airport infrastructure, navigational aids, approach lights, VFR reference points, runways, aprons, and taxiways, forming a cohesive, realistic out-the-window scene. Built from the latest airport diagrams and elevation data, these airfield databases conform to the FAA’s strict standards: the terrain elevation matches FAA data, and the virtual runways closely match their real-world counterparts, enabling trainees to experience the challenges of changes in runway slope and elevation.
Synthetic vision research and simulation form an emerging area of research and technology focused on reducing approach-and-landing accidents caused by the pilot’s inability to see the runway in adverse weather conditions. Limited visibility remains the single most critical factor affecting safety and capacity in worldwide aviation operations. NASA and the FAA are among the organizations seeking to maximize situational awareness in the cockpit through synthetic vision, with the goal of reducing such accidents.
MetaVR, founded in 1997, develops commercial PC-based software for the military simulation and training markets, featuring high-speed 3D visualization content and rapid creation of networked virtual worlds using real-world data. MetaVR’s real-time visual systems provide the fidelity of geospecific simulation with game-quality graphics. Users can build high-fidelity virtual worlds from real-world photographic imagery, elevation data, and feature data with our terrain generation tools, and render the resulting virtual world in real time at a 60 Hz frame rate with our real-time 3D visualization application, Virtual Reality Scene Generator. MetaVR systems are used for applications such as unmanned aerial systems trainers, manned flight simulators, mission planning and rehearsal, JTAC simulation training, urban operations training, and disaster management training. For more information, visit www.metavr.com.