Improving judgment performance through integrated task feedback
Gattie, Gordon J.
The increasingly complex nature of current and evolving human-machine systems requires operators, maintainers, trainers, and other support personnel who are effectively trained for their roles in system development and operation. Depending on technology readiness levels and available resources, training efforts may even become an afterthought to system deployment, with user manuals or online help not developed until after a system has been fielded. In addition, training systems may only be able to help a novice attain a certain level of expertise. However, one technique for developing effective training is adaptive training, in which easier task conditions are presented to trainees before more challenging scenarios are attempted. In laboratory experiments, adaptive training systems have been developed in two ways: by increasing time lag and by increasing task complexity. To determine training effectiveness, this research investigated various approaches for modeling dynamic decision making in order to quantitatively describe judgment performance. The major goal of this research was to identify feedback information that would be useful to trainees in a dynamic decision-making task. The experimental task was a decision task in the baseball domain, in which participants made pitch selections throughout a simulated ballgame based on the information and feedback displayed. The task investigated the effects of varying feedback content and feedback frequency throughout the experiment. This research used the Judgment Analysis framework, a paradigm based on the work of Egon Brunswik, as the basis for analysis. Some evidence for the effectiveness of using relative weights as feedback information elements was found.
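The relative weights mentioned above come from the Judgment Analysis tradition, where a judge's policy is commonly captured by regressing judgments on cue values and expressing each cue's contribution as a share of the explained variance. The sketch below is not the dissertation's analysis code; it is a minimal, hypothetical illustration of one common way such weights can be computed, using simulated cue and judgment data and squared standardized regression coefficients as the weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cue values for 50 pitch-selection trials
# (e.g., count, runner state, batter tendency -- invented for illustration).
cues = rng.normal(size=(50, 3))
true_weights = np.array([0.6, 0.3, 0.1])

# Simulated judgments: a linear policy over the cues plus a little noise.
judgments = cues @ true_weights + rng.normal(scale=0.1, size=50)

# Standardize cues and judgments, then fit ordinary least squares.
X = (cues - cues.mean(axis=0)) / cues.std(axis=0)
y = (judgments - judgments.mean()) / judgments.std()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Relative weight of each cue: its squared standardized coefficient
# as a proportion of the sum of squared coefficients.
relative = beta**2 / np.sum(beta**2)
print(np.round(relative, 2))
```

Feedback of this kind would show a trainee how much emphasis their pitch selections placed on each cue, which could then be compared against an expert or environmental model.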