Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario
Document Type
Conference Proceeding
Publication Date
2013
Abstract
Given recent technological advancements in mobile devices, military research initiatives are investigating these devices as a means of supporting multimodal cooperative interactions. Military components execute dynamic combat and humanitarian missions while dismounted and on the move. Paramount to their success is timely, effective information sharing and mission planning. In this paper, we describe a prototype multimodal collaborative Android application designed to support real-time battlefield perspective sharing and the acquisition and dissemination of information among distributed operators. The prototype was demonstrated in a scenario in which teammates used different features of the software to collaboratively identify and deploy a virtual tracker-type device on hostile entities. Results showed significantly faster completion times when users visually shared their perspectives than when they relied on verbal descriptions alone. The use of shared video also significantly reduced the number of utterances required to complete the task.
Repository Citation
Burnett, G. M., Wischgoll, T., Finomore, V., & Calvo, A. (2013). Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario. 4th International Conference on Mobile Computing, Applications, and Services, 115-128.
https://corescholar.libraries.wright.edu/cse/329
DOI
10.1007/978-3-642-36632-1_7
Comments
Presented at the 4th International Conference on Mobile Computing, Applications, and Services, Seattle, WA, October 11-12, 2012.