In this contribution, we describe our initial conceptual thoughts on how to determine pilot activity in real time within a multitasking environment. The presented concept extends our previous research on determining pilot activity, which analyses manual and gaze interactions by evidential reasoning. That approach resulted in a fragmented pattern of activities over time due to the high frequency of gaze shifts. Our concept suggests concatenating the activities and using the resulting sequence as a feature set for classification. We hypothesize that this representation of activities reflects the complex nature of concurrent and serial multitasking more appropriately. For probabilistic inference, we aim to use Conditional Random Fields. We are currently developing an activity recognition prototype supported by pilot interaction datasets from experiments. Our application is the management of a team of unmanned vehicles guided from the cockpit of a fast jet. We aim to use activity determination for adaptive assistance purposes.
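The concatenation step described above can be sketched as a run-length merge of a fragmented activity stream into contiguous segments, which could then serve as an input sequence for a sequence classifier such as a Conditional Random Field. This is a minimal illustrative sketch, not the authors' implementation; the function name, the `(timestamp, activity_label)` event format, and the example labels are all assumptions made for illustration.

```python
from itertools import groupby


def concatenate_activities(events):
    """Collapse a fragmented, time-ordered stream of activity labels into
    run-length segments (label, start_time, end_time).

    events: list of (timestamp, activity_label) tuples, sorted by timestamp.
    Frequent gaze shifts produce many short runs of the same label; merging
    consecutive identical labels yields the concatenated activity sequence.
    """
    segments = []
    # groupby merges consecutive events that share the same activity label
    for label, run in groupby(events, key=lambda e: e[1]):
        run = list(run)
        segments.append((label, run[0][0], run[-1][0]))
    return segments


# Hypothetical fragmented pattern caused by frequent gaze shifts:
stream = [(0.0, "monitor_uav"), (0.2, "monitor_uav"),
          (0.4, "radio"), (0.5, "monitor_uav"),
          (0.7, "monitor_uav"), (0.9, "radio")]

print(concatenate_activities(stream))
# → [('monitor_uav', 0.0, 0.2), ('radio', 0.4, 0.4),
#    ('monitor_uav', 0.5, 0.7), ('radio', 0.9, 0.9)]
```

The resulting label sequence (here `monitor_uav, radio, monitor_uav, radio`) is the kind of representation that, per the concept above, could be fed to a CRF instead of the raw fragmented samples.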