SITIS Archives - Topic Details
Program:  SBIR
Topic Num:  A10-091 (Army)
Title:  Adversarial Reasoning for Combined Unmanned Aerial Systems (UASs) and Unmanned Ground Vehicles (UGVs)
Research & Technical Areas:  Air Platform, Information Systems, Ground/Sea Vehicles

Acquisition Program:  PEO Ground Combat Systems
  Objective:  Design, develop, and demonstrate a mission execution framework and software application to enable tracking and surveillance of uncooperative targets by combined teams of UASs and UGVs.
  Description:  This research effort will focus on the development of a Battle Command framework to support collaborative teaming of Unmanned Aerial Systems (UASs) and Unmanned Ground Vehicles (UGVs) to enable autonomous target tracking despite the inherent challenges of clutter and occlusion in urban and complex environments. To add to this challenge, adversarial targets will attempt to evade detection through complex maneuvers and utilization of natural and manmade terrain features. This topic is not intended to address automatic target recognition or target identification. Rather, the focus shall be the development of algorithms to characterize and predict elusive target behavior and of the mission execution framework to direct the collaborative actions between multiple, heterogeneous unmanned air and ground vehicles. Additionally, this execution framework shall be extensible to other collaborative missions beyond target tracking. Dismounted operators at the tactical edge will input high-level tasks into the application via a handheld mobile device, while the mission execution framework would decompose missions into their underlying tasks and represent these actions in a common descriptive language. Coupled with adversarial reasoning algorithms to characterize and predict target behavior, the software would optimally sequence and allocate tasks across the team of UASs and UGVs to maintain target tracking. This application is to be run on a handheld device running an operating system such as the Apple iPhone OS, Android, Windows Mobile, or a similar environment. Furthermore, this requires the development of a novel user interface to manage the mission throughout execution. Such an interface would leverage recent advances in multimodal inputs, including multi-touch, gesture, and voice recognition, to mitigate the cognitive load placed on the Warfighter.
The execution of the mission by the unmanned vehicle team is to be primarily autonomous; however, the operator shall have the ability to monitor mission status, input new tasks, and update mission goals. Examples of interaction may include free-hand path planning/drawing, image and video management, or map manipulation.
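As an illustrative sketch of the task sequencing and allocation the framework would perform, the following assumes a simple cost model (vehicle-to-task distance, discounted when the platform type suits the task). All class names, task names, and the 0.5 suitability factor are assumptions for illustration, not part of the solicitation:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Vehicle:
    name: str
    kind: str          # "UAS" or "UGV"
    x: float
    y: float

@dataclass
class Task:
    name: str
    x: float
    y: float
    preferred: str     # platform type assumed best suited to the task

def cost(v: Vehicle, t: Task) -> float:
    """Distance-based cost, discounted when the platform type matches."""
    d = hypot(v.x - t.x, v.y - t.y)
    return d * (0.5 if v.kind == t.preferred else 1.0)

def allocate(vehicles: list[Vehicle], tasks: list[Task]) -> dict[str, str]:
    """Greedy one-task-per-vehicle assignment by lowest cost."""
    assignment: dict[str, str] = {}
    free = list(vehicles)
    for t in sorted(tasks, key=lambda t: t.name):
        if not free:
            break
        best = min(free, key=lambda v: cost(v, t))
        assignment[t.name] = best.name
        free.remove(best)
    return assignment

team = [Vehicle("uas-1", "UAS", 0, 0), Vehicle("ugv-1", "UGV", 5, 0)]
jobs = [Task("overwatch", 1, 1, "UAS"), Task("pursue-alley", 6, 0, "UGV")]
print(allocate(jobs and team, jobs))  # -> {'overwatch': 'uas-1', 'pursue-alley': 'ugv-1'}
```

A fielded framework would replace this greedy pass with an optimal assignment (e.g., Hungarian-algorithm style matching) and a cost model driven by the adversarial reasoning component's prediction of target motion.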

  PHASE I: Analyze how the Army currently uses UASs and UGVs to perform visual target tracking missions and develop a conceptual mission execution framework for collaborative missions. Research shall include development of algorithms to characterize and predict elusive behavior by the target. Additionally, the offeror will provide a plan for practical deployment of the proposed solution to the Warfighter in an operational environment.
  PHASE II: Build and demonstrate a prototype software application, based on the Phase I study, that implements the mission execution framework to deploy a team of UASs and UGVs to autonomously track an evasive target. Phase II research shall include design and development of a multi-modal user interface. The application shall be demonstrated on an applicable handheld device running an operating system such as the Apple iPhone OS, Android, or Windows Mobile. The level of autonomy shall be demonstrated along with an assessment of operator workload during mission execution.

  PHASE III: Finalize development of the UAS and UGV visual target tracking application. Identify opportunities for insertion into current and future force C2 systems such as FCS Battle Command and the Joint Battle Command Platform (JBC-P). Additional commercial applications of the mission execution framework and target tracking application may include Homeland Security, Border Patrol, and local law enforcement organizations.

  References:
  1. M. Moseley, B. P. Grocholsky, C. Cheung, S. Singh. "Integrated Long-range UAV/UGV Collaborative Target Tracking," Proceedings of the SPIE, Vol. 7332, pp. 733204-733204-11, 2009.
  2. A. Kott. "Battle of Cognition," Praeger Security International, 2008.
  3. S. Russell and P. Norvig. "Artificial Intelligence: A Modern Approach," Second Edition, Prentice Hall, 2003.
  4. US Army Field Manual FM 3-04.155, Army Unmanned System Operations; US Army Field Manual FM 5.0, Military Decision Making.

Keywords:  Command and Control, UAS, UGV, Multi-touch, Gesture

Additional Information, Corrections, References, etc:
Ref #4: available at:

Questions and Answers:
Q: Is "development of a Battle Command framework" meant generically - a battle command framework - or does it mean the framework must incorporate / comply with / be compatible with a specific entity called Battle Command?
A: The Battle Command framework may be interpreted more generally as the services which coordinate and allocate tasks across the unmanned vehicle team.
Q: What kinds of target are we talking about; dismount, ground vehicle, water vehicle, aerial vehicle, all of the above?
A: Both dismounted and ground vehicle targets may be considered.

Q: Is there a way to access the following references:
4. US Army Field Manual FM 3-04.155 Army Unmanned System Operations
US Army Field Manual FM 5.0 Military Decision Making
A: Link is:
Q: Your link for the field manual website prompts me for a user name and password to download field manuals. Is there a user name and password available?

"The server at AKOCOMM.US.ARMY.MIL - Enter AKO Username and Password [11:08:25:1089] requires a username and password"
A: Try These Links Instead:

Q: We have a virtual tool to simulate real-time scenarios in which UASs and UGVs team up to track evading targets. Adversarial target behaviors can be applied within the tool to maneuver targets in the virtual world. Complex actions can be easily staged, for example hiding in high-density traffic, an affinity for hiding behind obstacles, variations in speed and direction, or collusion between multiple targets meeting under bridges. Variations in the target's activity can be quickly adjusted. The tool can also accept real-time input for multiple human-controlled adversaries. Is there interest in using this kind of simulation tool as the basis for a Phase I investigation?
A: The tool may be discussed in detail in the proposal and will be considered in the evaluation process.
