SITIS Archives - Topic Details
Program:  SBIR
Topic Num:  AF071-021 (Air Force)
Title:  Team performance measurement and tracking in collaborative environments
Research & Technical Areas:  Information Systems, Human Systems

  Objective:  Conduct research to develop and validate methods for identifying, tracking, analyzing and reporting team performance in collaborative environments.
  Description:  Much of the Air Force's current Intelligence, Surveillance, and Reconnaissance (ISR) team work and command and control (C2) decision making is accomplished in distributed, networked environments. Teams operate remotely and often asynchronously with unknown teammates to carry out complex missions, coordinate C2, or make high-tempo decisions. These activities are conducted by the team as a unit; therefore, assessment must also be conducted at the team level. ISR data integration and C2 decision making used to be constrained by limitations in information flow. Today's environment, in contrast, requires systems and operators to manage large amounts of information efficiently and effectively and to provide specific information to others for their decision making. When team interactions are highly interdependent, system-wide error can occur as the result of a single weak link in the system. Without the ability to manage information, coordination becomes error prone and mission success could suffer.

  However, there is only a limited research base on how to routinely and in real time assess and track coordination, communication, collaboration, and information sharing activities and performance at the team and team-of-teams levels of analysis. Having such a capability would help to identify critical points of potential failure and flag them for attention. Typical solutions to this problem involve time-intensive techniques that are based on observation but are one step removed from actual data streams. How can we deduce the cognitive state, or the current level of expertise, of the warfighter directly from operational or training data? Some of the best data for assessing situation awareness, decision making, and leadership in team settings are also the least used. Communications, in a variety of forms, are ubiquitous and easy to capture. Research in team communication analysis has produced valuable insights into the structure of expert knowledge concerning military operations and into patterns of discourse in successful teams. However, it is unclear how the approaches used in these studies relate to other performance metrics in the contexts of interest (e.g., observer-based or simulation-derived metrics) or how they might 'scale up' to larger and widely distributed teams and teams of teams. Assessment of large-team performance and of vulnerabilities in social networks is needed in real time, or even in a predictive mode, in order to monitor and intervene to prevent such failures.

  The goal of this effort is to develop, demonstrate, and validate a construct-oriented system for measuring team collaboration performance that includes both embedded objective and observer-based subjective approaches to tracking, analysis, warehousing, and reporting. The system would also support 'alarming' of decision dropouts and data integration and display suitable for after action review and analysis.
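  As one concrete illustration of what an embedded objective measure drawn directly from a data stream might look like, the sketch below computes per-team message rates and reply latencies from a timestamped chat log. This is a minimal sketch under assumed conditions: the ChatEvent fields, the time window, and the metric definitions are illustrative assumptions of the editor, not requirements of this topic.

from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class ChatEvent:
    t: float        # seconds since mission start
    sender: str     # operator id
    team: str       # team id
    addressee: str  # operator id, or "ALL" for a broadcast

def team_metrics(events, lo=0.0, hi=300.0):
    """Per-team message rate and mean reply latency within [lo, hi)."""
    by_team = defaultdict(list)
    for e in events:
        if lo <= e.t < hi:
            by_team[e.team].append(e)
    metrics = {}
    for team, evs in by_team.items():
        evs.sort(key=lambda e: e.t)
        # Reply latency: gap between a directed message and the next
        # message its addressee sends (a crude proxy for responsiveness).
        latencies = []
        for i, e in enumerate(evs):
            if e.addressee == "ALL":
                continue
            reply = next((f for f in evs[i + 1:] if f.sender == e.addressee), None)
            if reply is not None:
                latencies.append(reply.t - e.t)
        metrics[team] = {
            "msgs_per_min": len(evs) / ((hi - lo) / 60.0),
            "mean_reply_latency_s": mean(latencies) if latencies else None,
        }
    return metrics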

  PHASE I: Investigate existing and candidate performance measurement tools in a specific ISR/C2 environment. Develop and demonstrate measurement techniques/applications/methodologies that will help track, assess, and possibly diagnose operator performance based on coordination, communication, collaboration, and information sharing analyses. Demonstrate the validity of this approach in a bounded context.
  PHASE II: Develop and demonstrate a functional prototype of a team collaboration performance assessment, tracking, and after action review capability/system that integrates the validated metrics. Conduct validation studies of the measures, the functions of the system, and the utility of the data for diagnosis and after action review in a C2/ISR team collaboration environment.
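  Because Phase II combines tracking, warehousing, and after action review in one system, raw events and derived metric snapshots would need a common persisted form. The record layout below is a minimal sketch under that assumption; every field name is hypothetical and offered only for discussion, not as a specification.

from dataclasses import dataclass, field

@dataclass
class CollaborationRecord:
    """One warehoused row: a raw collaboration event plus the metric
    snapshot in effect when it occurred, so an after action review tool
    can replay the mission timeline and query out-of-range episodes."""
    mission_id: str
    t: float                  # seconds since mission start
    event_type: str           # e.g., "chat", "data_request", "alarm"
    source: str               # operator or team id
    payload: dict = field(default_factory=dict)  # raw event content
    metrics: dict = field(default_factory=dict)  # derived metrics at time t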

  PHASE III DUAL USE APPLICATIONS: Performance measures that effectively utilize information about communications flow and content would have wide application in dynamic and distributed environments throughout the military services. While the focus of this effort is on training applications, the resulting measurement techniques will be applicable in operational environments as well. Private sector organizations are continuing to move toward more distributed teamwork activities. However, there has historically been no systematic approach to monitoring the workflow in these distributed teams. This effort, if successful, will provide a unique and validated methodology for addressing a significant gap in distributed organizations.

  References:
  1. Bower, M. J. (2003). Distributed Mission Training. Military Training Technology, Vol. 8, Issue 4. http://www.military-training-technology.com/article.cfm?DocID=272
  2. Phister, P. W., & Cherry, J. C. (2005). Command and Control Implications of Network-Centric Warfare. AFRL Horizons, February 2005, Document # IF-04-09. http://www.afrlhorizons.com/Briefs/Feb05/IF0409.html

Keywords:  Team Collaboration, Command and Control (C2), Intelligence, Surveillance, & Reconnaissance (ISR), training, performance measurement, data warehousing, performance-based after action review

Additional Information, Corrections, References, etc:
AF071_021 Q n A.doc

Questions and Answers:
Q: I assume that certain existing technologies, as well as the data and information repositories of DMO initiatives, will be accessible. Further investigation and evaluation of existing measurement tools in a specific ISR/C2 environment requires access to, and coordination with, related organizations, such as DMT and DMO centers, to ensure the success of the R&D if my proposal is awarded. Are these assumptions valid?
A: Any access to data will be provided to the successful offeror. It is important to note that the vast majority of data for this effort will be classified secret. Proposers need to consider this in their proposal.
Q: If a proposer does not have a secret clearance on hand, can they do some work without touching the secret data and information? Will a secret clearance be required?
A: The content domain for this topic is secret. It will be very hard for an offeror who does not have someone cleared to at least secret to complete the work envisioned. This is one of those topics that has sponsorship and relevance in several research/operational communities and working unclassified will make phase II work difficult if not impossible to complete.
Q: 1. Should the effort be focused more on performance in TST or TCT situations, or attempt a broader investigation of both levels?
2. To what extent should the project focus on the development of novel algorithms for assessing the team performance state, versus the development of interfaces for presenting the task performance to an observer?
A: 1. I am not sure you have the focus for this topic correct. I am looking for much broader teamwork activities than might be evidenced in the domain you ask about. With this in mind, please review the examples provided in the Q&A file that is posted to SITIS:

A context for this topic and some ideas to help frame the discussion in terms of relevant scenarios. There are two examples:
The first has to do with coordination and information exchange among teams in a command and control situation. In this situation, individuals and small teams have to gather, evaluate, request, and pass along information for decisions and for mission success. What data are important to examine? For example, things like chat and where someone goes to seek data for a solution. How often should we be looking at critical metrics? What do we do when things are "out of range," and what does out of range mean in this context (see the sketch at the end of this answer for one illustrative framing)? Do we set off collaboration alarms to tell someone that the process has broken down or is likely to break down?

The second example has to do with air combat, where several fighter aircraft (four, to be exact) are on a mission and are coordinating their activities with airborne battle managers on an AWACS aircraft and with command and control and targeting personnel on the ground. In this situation, what are the key collaboration events and data associated with planning the mission, executing the mission, and after action reviewing the mission? Are there critical collaboration events or triggers that tell us something about the likelihood of mission success, given that things did or did not happen as expected? What do we do with the information: simply save it for the AAR, or do we try to target an intervention in real time? What key metrics and measures need to be tracked, warehoused, and accessed for diagnosis/analysis after the fact?

2. I want defensible (theoretical and practical) approaches to the problem as stated. However, if you read the topic as placing its emphasis on the latter, I'd reconsider. To your question on the former, it's more than algorithms. I am looking for research and implementations related to identifying what is important to look at, how best to look at it, and what to put together as metrics and measures. See the last part of the scenario descriptions above.
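On the "out of range" question in the first scenario, one illustrative framing is a baseline-referenced check, sketched below. The 3-sigma rule, the chat-rate metric, and the numbers are assumptions chosen for discussion, not validated criteria from this topic.

from statistics import mean, stdev

def collaboration_alarm(history, current, k=3.0):
    """Flag a metric value more than k standard deviations from its
    baseline, e.g., a sudden drop in chat rate before a decision point."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return False
    return abs(current - mu) > k * sigma

# Hypothetical baseline: messages per minute over comparable past segments.
baseline = [12.0, 14.5, 13.2, 12.8, 15.1]
if collaboration_alarm(baseline, current=3.0):
    print("ALARM: chat rate far below baseline; possible coordination breakdown")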
