Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario

Document Type

Conference Proceeding

Publication Date

2013

Abstract

Given recent technological advancements in mobile devices, military research initiatives are investigating these devices as a means to support multimodal cooperative interactions. Military components execute dynamic combat and humanitarian missions while dismounted and on the move, and timely, effective information sharing and mission planning are paramount to their success. In this paper, we describe a prototype multimodal collaborative Android application designed to support real-time sharing of battlefield perspective and the acquisition and dissemination of information among distributed operators. The prototype was demonstrated in a scenario in which teammates used different features of the software to collaboratively identify and deploy a virtual tracker-type device on hostile entities. Results showed significant improvements in completion times when users visually shared their perspectives rather than relying on verbal descriptors. Additionally, the use of shared video significantly reduced the number of utterances required to complete the task.
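
The paper itself does not include code; as a rough illustration only, the sketch below shows one way a "tag" event might be serialized and broadcast to distributed teammates over UDP multicast on Android-era Java. The message format, multicast group, port, and class names are assumptions for illustration, not details of the prototype described in the abstract.

    // Hypothetical sketch (not from the paper): disseminating a tag event to
    // teammates over a simple UDP multicast channel. All identifiers below
    // (group address, port, message fields) are illustrative assumptions.
    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.InetAddress;
    import java.net.MulticastSocket;
    import java.nio.charset.StandardCharsets;

    public class TagBroadcaster {
        private static final String GROUP = "230.0.0.1"; // assumed multicast group
        private static final int PORT = 4446;            // assumed port

        /** Serialize a tagged entity as a compact, pipe-delimited text message. */
        static String encodeTag(String operatorId, String entityId,
                                double lat, double lon, long timestampMillis) {
            return String.join("|", operatorId, entityId,
                    Double.toString(lat), Double.toString(lon),
                    Long.toString(timestampMillis));
        }

        /** Send the tag message to all teammates listening on the group. */
        static void sendTag(String message) throws IOException {
            try (MulticastSocket socket = new MulticastSocket()) {
                byte[] payload = message.getBytes(StandardCharsets.UTF_8);
                DatagramPacket packet = new DatagramPacket(
                        payload, payload.length, InetAddress.getByName(GROUP), PORT);
                socket.send(packet);
            }
        }

        public static void main(String[] args) throws IOException {
            // Example: operator "alpha-2" tags entity "veh-17" at a given location.
            String msg = encodeTag("alpha-2", "veh-17", 47.6062, -122.3321,
                    System.currentTimeMillis());
            sendTag(msg);
            System.out.println("Broadcast tag: " + msg);
        }
    }

In a deployed system, such a message would more likely travel over an authenticated tactical network rather than open multicast; the sketch only illustrates the serialize-and-disseminate pattern implied by the abstract.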

Comments

Presented at the 4th International Conference on Mobile Computing, Applications, and Services, Seattle, WA, October 11-12, 2012.

DOI

10.1007/978-3-642-36632-1_7
