Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario
Given recent technological advancements in mobile devices, military research initiatives are investigating these devices as a means of supporting multimodal cooperative interactions. Military components execute dynamic combat and humanitarian missions while dismounted and on the move. Paramount to their success are timely and effective information sharing and mission planning, which enable more effective action. In this paper, we describe a prototype multimodal collaborative Android application. The mobile application was designed to support real-time acquisition and dissemination of battlefield perspectives and information among distributed operators. The prototype application was demonstrated in a scenario in which teammates used different features of the software to collaboratively identify and deploy a virtual tracker-type device on hostile entities. Results showed significant improvements in completion times when users visually shared their perspectives rather than relying on verbal descriptions. Additionally, the use of shared video significantly reduced the number of utterances required to complete the task.
Burnett, G. M., & Calvo, A. (2013). Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario. 4th International Conference on Mobile Computing, Applications, and Services, 115-128.