SITIS Archives - Topic Details
Program:  SBIR
Topic Num:  A10-073 (Army)
Title:  Multisensory Navigation and Communications System
Research & Technical Areas:  Information Systems, Human Systems

Acquisition Program:  
 The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), which controls the export and import of defense-related material and services. Offerors must disclose any proposed use of foreign nationals, their country of origin, and what tasks each would accomplish in the statement of work in accordance with section 3.5.b.(7) of the solicitation.
  Objective:  Design and build a proof-of-concept system that enables investigation of multisensory navigation and communications among dismounted Soldiers. The system would enable dismounted Soldiers to quickly navigate to or away from specified waypoints or areas while maintaining radio silence and light security. The system should use real-time visual and tactile cues to enable hands-free navigation and critical communications when silence must be maintained. In addition, the system will log user actions and present queries to enable detailed and automatic assessment of performance accuracy, time, and situation awareness.
  Description:  Prototypical torso-mounted tactile displays have proven effective for navigation and communication in field evaluations. When integrated with GPS, these displays enable Soldiers to navigate at night hands-free (allowing the Soldier to hold his or her weapon) and eyes-free (allowing focused attention on the surroundings rather than on a visual display). Torso-mounted displays have also proven effective for Soldier communications and, in fact, proved better than arm-and-hand signals for critical communications, even during strenuous combat movements. However, tactile systems must be integrated with visual and command systems to enable map-based situation awareness and easy input of waypoints. The system must be able to receive communications regarding waypoints, off-limits areas, and rally points in real time. This system would augment battlefield visualization techniques now common to command and control by enabling commanders to quickly relay critical communications on where to go or where to shoot in a manner that is immediately and intuitively understood. The integration of a visual command center with distributed tactile communications enables dynamic battle maneuvers with intuitively understood signals. A critical aspect of this system is its contribution as a research platform to investigate the effectiveness of alternative tactile patterns and multisensory arrays. Flexible multisensory cues and data-logging capabilities will enable in-depth research in multisensory perception and decision making.
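The following minimal sketch (in Python, offered only as an illustration) shows one way a torso-belt system might translate a GPS fix and compass heading into a directional tactile cue toward the active waypoint. The eight-tactor belt, the function names, and the clockwise tactor numbering are assumptions made for this sketch, not requirements of the topic.

    import math

    # Illustrative sketch only: the topic does not specify tactor count or layout.

    def bearing_to(lat, lon, wp_lat, wp_lon):
        # Initial great-circle bearing in degrees from (lat, lon) to the waypoint.
        phi1, phi2 = math.radians(lat), math.radians(wp_lat)
        dlon = math.radians(wp_lon - lon)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return math.degrees(math.atan2(y, x)) % 360.0

    def select_tactor(bearing_deg, heading_deg, num_tactors=8):
        # Pick the belt tactor closest to the waypoint direction relative to the
        # wearer. Tactor 0 is assumed to sit at the navel (straight ahead),
        # with tactors numbered clockwise around the torso.
        relative = (bearing_deg - heading_deg) % 360.0
        sector = 360.0 / num_tactors
        return int((relative + sector / 2) // sector) % num_tactors

    # Example: wearer at (34.0, -84.0) facing due east (90 deg), waypoint due north.
    tactor = select_tactor(bearing_to(34.0, -84.0, 34.01, -84.0), 90.0)
    print(tactor)  # 6 -> the tactor on the wearer's left side, since north is to the left

In a fielded system the heading would come from a magnetometer or inertial sensor rather than being passed in directly, and the selected tactor would drive a vibration pattern instead of a print statement.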

  PHASE I: Research and develop an overall networked system design and a proof-of-concept prototype that integrates GPS-driven tactile cues with a visual map display. The proof of concept should enable stand-alone navigation by a single person, while the system design should specify the characteristics and procedures needed to build a networked system.
  PHASE II: Develop and demonstrate a networked prototype system in a realistic environment. Conduct human-interface, task cost/benefit, and ergonomic evaluations across several different mission scenarios using both novice and experienced Soldiers.

  PHASE III: Demonstrate capabilities to military POCs. This capability can be integrated into current Soldier and command and control systems. In addition, there are many commercial applications, from enabling navigation by first responders (firefighters, rescue personnel, etc.) to activities such as hiking, skiing, or touring an unfamiliar city.

  References:
1. Duistermaat, M., Elliott, L., van Erp, J., & Redden, E. (2007). Tactile land navigation of dismount soldiers. In D. de Waard, G. Hockey, P. Nickel, & K. Brookhuis (Eds.), Human factors issues in complex environment performance. European Chapter of the Human Factors and Ergonomics Society: Shaker Publishing.
2. Van Erp, J. B. F. (2002). Guidelines for the use of vibro-tactile displays in human-computer interaction. In S. A. Wall, B. Riedel, A. Crossan, & M. R. McGee (Eds.), Proceedings of Eurohaptics 2002 (pp. 18-22). Edinburgh: University of Edinburgh.
3. Van Erp, J. B. F. (2005). Presenting directions with a vibrotactile torso display. Ergonomics, 48(3), 302-313.
4. Elliott, L. R., Duistermaat, M., Redden, E., & van Erp, J. (2007). Multimodal guidance for land navigation (Technical Report No. ARL-TR-4295). Aberdeen Proving Ground: Army Research Laboratory.
5. Wickens, C. D., & Hollands, J. G. (2000). Attention, time-sharing, and workload. Chapter 11 in Engineering psychology and human performance. NY: Prentice Hall.

Keywords:  Land navigation; Soldier performance; multisensory display; multimodal display; command and control

Questions and Answers:
Q: 1. For the Phase I proof-of-concept, who should be able to enter destinations -- remote command, the soldier, or both?

2. Should the prototype enable navigation via solely tactile cues (with the visual display as an augment), or can the use of both be required for successful navigation?
A: A1: Ideally both; however, it is up to the proposer to determine the reasonable limits of any solution put forward.

A2: In the ideal case, once a map with waypoints, etc. has been entered into the system, navigation should be possible solely via the tactile cues, with the visual system as a back-up/alternative. Risk reduction and graceful failure modes will be important items to be reviewed in any proposed solution.
Q: Could the visual system back-up/alternative and data entry take place on an Android platform mobile device, with tactile navigation produced by proprietary connected systems?
A: I am not sure I understand the question. Could you elaborate (through another posting in the SITIS system) and provide further information on what is meant by the Android platform? Generally speaking, the visual display should allow data logging of performance, such as timestamped GPS tracking of actual movements, that can be downloaded at the end of performance. It should also serve as an easy-to-use controller for the tactile display (e.g., for selecting waypoints). The proprietary nature of a particular tactile system would have to be considered in terms of its impact on interactions with other systems and overall flexibility.
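As an illustration of the data-logging behavior described in this answer, the sketch below appends timestamped GPS fixes and operator actions to a simple CSV file that can be retrieved at the end of performance. The file name and field set are illustrative assumptions, not part of the topic specification.

    import csv
    import time

    LOG_PATH = "performance_log.csv"  # illustrative file name, an assumption

    def log_event(event, lat=None, lon=None, detail=""):
        # Append one timestamped record: a GPS fix or an operator action.
        with open(LOG_PATH, "a", newline="") as f:
            csv.writer(f).writerow([time.time(), event, lat, lon, detail])

    # Usage: a position fix from the GPS receiver, then a waypoint selection
    # made on the visual display.
    log_event("gps_fix", 34.0012, -84.0458)
    log_event("waypoint_selected", detail="WP-3")

Reading the file back after an exercise yields the timestamped track and action history needed for the performance assessments described in the Objective.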
Q: Does the topic sponsor prefer hands free navigation that is also ear and audio free?
A: The system should be capable of communicating direction and various alerts without relying on audio. If the system has audio, it should be able to be turned off and demonstrate capabilities without it.
Q: To clarify, would it be acceptable to use a mobile device powered by the Android operating system as the primary device to receive, enter, and select waypoints, while using other pieces of hardware to provide the necessary tactile navigation cues?
A: If I understand you correctly, you are suggesting the use/integration of commercial off-the-shelf products. We do not have any constraints on the proposed technological approaches or platforms for the Phase I effort; however, the proposer will have to consider the issues and justify the approach taken in terms of its eventual Phase II and Phase III integration with standard military equipment.
Q: What is meant by "radio silence"?
Is "radio silence" the absence of voice communication by radio or the absence of all RF transmissions from a soldier?
A: By "radio silence" we were referring to lack of acoustic noise.
