U.S. Navy STTR N22A-T001
Helicopter pilots landing in degraded visual environments (DVEs) - specifically brownout and whiteout conditions - lose the visual cues that provide the spatial awareness needed for a successful landing. These conditions occur when a helicopter's rotor downwash kicks loose dirt, snow, and debris into the air, obscuring the pilot's view outside the aircraft. Without these visual cues, pilots in such environments are often effectively flying blind.
As the university partner for the Small Business Technology Transfer (STTR) grant, the Applied Perception and Performance Laboratory at Embry-Riddle was tasked with identifying the visual and vestibular cues used by helicopter pilots during final approach and landing. The findings informed the design of augmented reality stimuli intended to reintroduce these visual cues for helicopter pilots in a virtual environment.
As a research assistant on this project, my contributions are summarized below.
I am the first author of our conference paper, published in the proceedings of the 14th International Conference on Applied Human Factors and Ergonomics (AHFE 2023), and I traveled to San Francisco, CA to present the research during a technical session at the conference. In the paper, we explore the visual cues that helicopter pilots use to (1) regulate speed, heading, and altitude, (2) maintain position in space, and (3) detect potential collisions with environmental objects. We then discuss how an understanding of the cues pilots rely on for spatial awareness may inform the design of new synthetic displays to aid pilots landing in DVEs. Finally, we describe current interventions that can help prevent DVE-related spatial disorientation accidents. In the presentation, I went beyond the paper's content to offer recommendations for practitioners.