Vertical Flight Library & Store
Multimodal Cueing and Assessment of Pilot Engagement During Low Level Flight

E. Bachelder, U.S. Army Aviation Development Directorate; M. Godfroy-Cooper, NASA Ames Research; A. Kahana, Haifa University; J.D. Miller, NASA Ames Research; M. Rottem-Hovev, Mafat HFE


  • Presented at Forum 74
  • 19 pages
  • SKU # : 74-2018-1389
  • Your Price : $30.00
  • Member Price : $15.00




Abstract
This paper presents the work of a collaborative project whose objectives are to develop a multimodal cueing environment for near-earth helicopter operations and to develop methods for assessing pilot workload for real-time and post-mission applications. Considerable research has been devoted to investigating the visual requirements of out-the-window scenery that allow pilots to effectively conduct near-earth helicopter flight. The primary motivation for that body of research has been twofold: 1) operations, where real-time sensor data and stored digital terrain data are employed to operationally recover a visual environment that has been degraded, and 2) simulation, where limited computational resources force visual system design to down-select from a burgeoning list of visual effects to produce functional realism. In simulation, the originating environment being rendered in a given fashion for an intended effect (e.g., Night Vision Goggle simulation) is already known, whereas in degraded conditions the original environment must be inferred from sensor data. Because these data are obtained from sources outside the range of human visual sensing, they must undergo some level of post-processing to be usable. Sensor information can be presented to the pilot in relatively raw form (e.g., a LIDAR point cloud) or conditioned to reduce cognitive demand on the pilot, using visualization (or pre-interpretive) techniques such as height color-coding, mesh overlay, digital terrain overlay/substitution, and iconic replacement. As the level of pre-interpretive processing increases, the possibility of the pilot generating alternative interpretations decreases. The present research explores the potential benefits of providing the pilot with "perceptual prosthetics" that allow precise local structuring of the environment for contour flight in the presence of the visual uncertainties inherent in low-level sensor rendering.
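The pre-interpretive techniques listed above can be illustrated with a toy version of height color-coding, in which each LIDAR return is tinted by its altitude so terrain relief stands out at a glance. This is a minimal sketch, not the study's rendering pipeline: the function name and the simple blue-to-red ramp are assumptions, and an operational display would use a calibrated, perceptually uniform colormap.

```python
import numpy as np

def height_color_code(points, z_min, z_max):
    """Tint each 3-D point by height: blue = low terrain, red = high.

    Toy stand-in for height color-coding of a LIDAR point cloud
    (illustrative only, not the paper's implementation).
    """
    # Normalize heights to [0, 1] over the expected terrain band.
    z = np.clip((points[:, 2] - z_min) / (z_max - z_min), 0.0, 1.0)
    # Linear blue-to-red ramp: low points blue, high points red.
    return np.stack([z, np.zeros_like(z), 1.0 - z], axis=1)

# Three synthetic LIDAR returns at 0 m, 5 m, and 10 m height.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 5.0], [2.0, 0.0, 10.0]])
colors = height_color_code(pts, z_min=0.0, z_max=10.0)
```

The same per-point mapping extends naturally to the other techniques named above, e.g. substituting a mesh or digital terrain tile once enough returns accumulate in a cell.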
A first simulation experiment examined 1) visual cueing depicting both terrain slope and aircraft height-above-ground, and 2) spatial (3D) auditory cueing depicting deviation from desired height-above-ground and impending collision with terrain. Initial results indicate that synergistic visual and auditory cueing can enhance performance and could therefore be used to reduce pilot workload while sustaining performance. A second simulation experiment was designed to assess the convergence of direct and indirect measures of performance during low-level flight under different workload levels, associated with visibility level, the presence or absence of obstacles and/or targets, and terrain difficulty. Direct measures included control stick entropy, head motion entropy, and eye fixation duration. Indirect measures comprised the Index of Cognitive Activity (ICA) and visual search behavior. Preliminary results show promise for using some of these measures as indicators of pilot workload and engagement. Altogether, the results of these two simulations will provide a framework for the development and evaluation of future advanced multimodal display concepts for helicopter operations during low-level flight.
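Of the direct measures named above, control stick entropy is the simplest to sketch: bin the stick-deflection samples and take the Shannon entropy of the resulting distribution, so that busier, less predictable control activity yields a higher value. The abstract does not specify the authors' exact formulation; the fixed-range histogram, bin count, and synthetic signals below are all assumptions for illustration.

```python
import numpy as np

def signal_entropy(samples, n_bins=16, value_range=(-1.5, 1.5)):
    """Shannon entropy (bits) of a signal's amplitude distribution.

    Rough workload proxy: active, unpredictable control spreads
    deflections over more histogram bins, raising the entropy.
    """
    counts, _ = np.histogram(samples, bins=n_bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
# Steady hold: tiny corrections around center stick.
steady = 0.02 * rng.standard_normal(1000)
# Active maneuvering: large oscillations plus noise.
active = np.sin(np.linspace(0.0, 40.0, 1000)) + 0.3 * rng.standard_normal(1000)
print(signal_entropy(steady), signal_entropy(active))
```

Head motion entropy could be computed the same way over angular-rate samples; fixation duration and the ICA require eye-tracker data and are not sketched here.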

Copyright © 2022 The Vertical Flight Society. All rights reserved.