Acquisition Program: Program Management Test, Measurement, and Diagnostic Equipment (PMTMDE)
RESTRICTION ON PERFORMANCE BY FOREIGN CITIZENS (i.e., those holding non-U.S. passports): This topic is "ITAR Restricted." The information and materials provided pursuant to or resulting from this topic are restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export of defense-related material and services, including the export of sensitive technical data. Foreign citizens may perform work under an award resulting from this topic only if they hold a Permanent Resident Card or are designated as "Protected Individuals" as defined by 8 U.S.C. 1324b(a)(3). If a proposal for this topic contains participation by a foreign citizen who is not in one of the above two categories, the proposal will be rejected.

Objective: Develop an advanced equipment maintenance capability, based upon revolutionary Augmented Reality (AR) technology, that provides Marine Corps equipment maintainers with rapid and intuitive access to complete weapon systems data.
Description: For many years, the Marine Corps and DoD have used paper technical manuals and a variety of Interactive Electronic Technical Manuals (IETMs) to guide equipment maintainers through complex maintenance procedures on equipment such as land vehicles, radar systems, radio communications systems, and ground-based weapon systems. These manuals often require the maintainer to flip, scroll, or hyperlink through a long series of pages full of text, drawings, and computer programs in order to locate the information he needs to debug maintenance issues. Their use tends to be time consuming, tedious, and limited to desktop or laptop viewing, and extracting the appropriate data can require considerable time and searching.
In light of this, the Marine Corps currently needs to provide its equipment maintainers with a cutting-edge, graphics-intensive, interactive, and highly intuitive Augmented Reality (AR) based equipment maintenance capability that gives them rapid access to complete weapon systems data. It must equip them with the intelligence and data pertaining to any given Marine Corps weapon system for the purpose of quickly and accurately assessing its true condition and restoring that system to full operation. This includes maximal automation and auto-documentation. The end state of this capability will allow an average maintainer to be equipped with all of the technical knowledge he needs to perform maintenance on any given piece of Marine Corps equipment, even while he is still being trained in the field (post schoolhouse). When used by the maintainer, this capability will give him access to the exact data he needs at the time he needs it.
Such technology can rapidly expand a Marine Corps maintainer's awareness by providing timely and valuable information and instructions as he attempts to repair a malfunctioning piece of equipment. As the maintainer nears the asset, a small display system would generate augmented reality overlays on his visual field through what appear to be low-profile, safety-style eyewear. Using the latest visual processing methods, microminiaturized, high-resolution cameras embedded within the frame of the glasses would track the true visual field as the maintainer moves about the asset, using it as the reference against which any visual augmentation is registered. The augmented information would be clear and stark, would not present any safety hazard, and would be viewable in any lighting or dust conditions.
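To make the registration idea above concrete, the sketch below estimates a 2D similarity transform (rotation, scale, translation) from two tracked reference features and uses it to map an overlay anchor point into the current camera frame. This is purely illustrative and not part of the topic requirements; the function names, coordinates, and two-point tracking scheme are hypothetical simplifications of real camera-based pose estimation.

```python
import math

def similarity_transform(ref_pts, cur_pts):
    """Estimate the rotation, scale, and translation that map two reference
    feature points (from the stored asset model) onto their currently
    tracked positions in the camera frame, and return a warp function."""
    (x1, y1), (x2, y2) = ref_pts
    (u1, v1), (u2, v2) = cur_pts
    # Vector between the two features in each frame.
    rx, ry = x2 - x1, y2 - y1
    cx, cy = u2 - u1, v2 - v1
    scale = math.hypot(cx, cy) / math.hypot(rx, ry)
    rot = math.atan2(cy, cx) - math.atan2(ry, rx)

    def warp(p):
        # Rotate and scale about the first feature, then translate.
        px, py = p[0] - x1, p[1] - y1
        qx = scale * (px * math.cos(rot) - py * math.sin(rot)) + u1
        qy = scale * (px * math.sin(rot) + py * math.cos(rot)) + v1
        return (qx, qy)

    return warp

# Overlay anchored midway between two features on the asset model; the
# tracked frame shows the asset shifted and viewed at twice the scale.
warp = similarity_transform([(0, 0), (10, 0)], [(100, 100), (120, 100)])
print(warp((5, 0)))  # -> (110.0, 100.0): the overlay tracks the moved asset
```

A fielded system would estimate a full 3D pose from many features per frame, but the principle is the same: overlay coordinates are expressed relative to tracked geometry, not fixed screen positions.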
The low-profile safety glasses would also come equipped with integral earpieces that provide noise-reduction circuitry as well as audio to the maintainer. The AR based system will even know if the maintainer is performing the wrong action, since the same visual capability that drives the augmented reality will be able to derive the position and placement of the maintainer as he moves upon or within the asset. The key here is that the detailed physical layouts and topology of the weapon systems themselves rarely change except within a certain range of motion (e.g., Line Replaceable Unit (LRU) swaps, chassis and cabinet modifications, cable removals). This constancy enhances the AR based system's ability to detect incorrect actions.
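The incorrect-action detection described above can be sketched as a comparison between the maintainer's tracked hand or tool position and the target zone of the current procedure step. This is a hypothetical illustration only; the step record, coordinates, and tolerance are invented for the example.

```python
def check_action(step, observed_xyz, tolerance=0.05):
    """Return True when the maintainer's tracked hand/tool position (meters,
    asset-relative coordinates) falls inside the current procedure step's
    target zone, False when the action appears to be in the wrong place."""
    tx, ty, tz = step["target_xyz"]
    ox, oy, oz = observed_xyz
    dist = ((ox - tx) ** 2 + (oy - ty) ** 2 + (oz - tz) ** 2) ** 0.5
    return dist <= tolerance

# Hypothetical step from a stored procedure for an LRU removal.
step = {"name": "remove LRU retaining bolt", "target_xyz": (0.42, 1.10, 0.75)}
print(check_action(step, (0.43, 1.11, 0.75)))  # within tolerance -> True
print(check_action(step, (0.80, 1.10, 0.75)))  # wrong location -> False
```

Because the asset's topology rarely changes, the target zones can be authored once per procedure step and reused, which is what makes this kind of check tractable.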
For example, consider a situation where a maintainer approaches a Light Armored Vehicle (LAV)-25 to perform diagnostics. As he approaches the vehicle, augmented visual overlays, driven by validated prior maintenance records, remote sensing, and behavioral input, indicate that the LAV-25 is not currently due for any planned maintenance. However, sensor data and behavioral trending indicate that unusual electrical noise was detected on several analog and digital data lines within the hull during the last mission. As the maintainer enters the powered-down LAV-25, he sees three-dimensional renditions of the suspected noise signal locations and is able to view exact signal routing layouts and measurement points without opening a single module.
At this point he is ready to energize the LAV-25. He is not a seasoned veteran but a young Marine still acquiring experience, and the AR based system knows the skill level of the maintainer using it (the system would rate his skill level based on training records and behavior tracking each time he logs in). The system prompts him to confirm that he is ready to start up the LAV, and as he proceeds, it instructs him audibly as if a seasoned maintainer were standing next to him, showing him exactly where to go and what to do for a safe startup. The instructions are seamlessly overlaid upon the actual areas with which the maintainer must interact.
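The skill-rating behavior described in this scenario could be prototyped as a simple scoring rule that combines formal training with tracked on-equipment behavior. Everything here (the weights, tiers, and record fields) is a hypothetical assumption for illustration, not a specification.

```python
def rate_skill(training_courses, tracked_sessions):
    """Combine completed training with tracked behavior into a coarse skill
    tier that controls how much instruction the AR system overlays."""
    score = 10 * len(training_courses)
    for s in tracked_sessions:
        # Reward completed tasks; penalize detected incorrect actions.
        score += 5 * s["tasks_completed"] - 3 * s["incorrect_actions"]
    if score >= 60:
        return "seasoned"      # minimal prompting
    if score >= 30:
        return "intermediate"  # standard guidance
    return "novice"            # full step-by-step audio and overlays

# Hypothetical history for the young Marine in the scenario.
sessions = [{"tasks_completed": 4, "incorrect_actions": 1},
            {"tasks_completed": 6, "incorrect_actions": 0}]
print(rate_skill(["LAV-25 electrical"], sessions))  # -> intermediate
```

The point of the sketch is the interface, not the weights: each login retrieves a rating, and the rating selects how enriched the overlaid guidance should be.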
No Augmented Reality (AR) based system in existence today offers all of these capabilities. Most AR based maintenance aids simply use predefined procedures to guide an equipment maintainer through maintenance tasks, with very little intelligence in the system (see reference 2 below). A tremendous amount of research and development would be needed to build an AR based capability with all of the intelligence this SBIR topic describes; consequently, its development would involve a great deal of risk.
PHASE I: Design a cutting-edge, graphics-intensive, interactive, and highly intuitive Augmented Reality (AR) based equipment maintenance capability. This would provide Marine Corps equipment maintainers with rapid, intuitive, hands-free access to complete weapon systems data while fully viewing the area or platform in need of maintenance.
This includes detailed, fully functional, three-dimensional, mechanical, electrical signal-level, and block diagrammatic information that can be accurately overlaid or manipulated in context within the maintainer's area of visual interest while he works on a weapons system. This information must be available on demand, as the situation requires, in an intuitive fashion, but not so as to distract the maintainer. The system must be capable of data integration with external test equipment in order to verify displayed parametric signal data against AR represented signal data and to show matching or mismatching data within the same field of view. The system must also be capable of interacting with and recognizing both seasoned maintainers, who require little intervention, and less experienced technicians, who will require more enriched data. Active learning and auto-documentation of completed and verified work taskings can also occur within this environment.

The capability must be designed to augment a Marine Corps maintainer's view of the maintenance environment by providing timely and valuable information and instructions as he attempts to repair a malfunctioning piece of equipment. The AR system must be capable of overlaying information within the maintainer's normal field of view in real time. The overlaid, full-color data must adjust for lighting and track correctly to the equipment of interest even as the maintainer moves about. All methods to eliminate vertigo, eye strain, or disorientation must be utilized to maximize utility and ensure safety. Interaction between the user and the system must be as intuitive and seamless as possible. The system must also be designed to know if the maintainer is performing a wrong action, via real-time detection of his position and placement as he moves upon or within the test asset.
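The test-equipment integration requirement above, verifying measured parametric signal data against the AR model's expected values and flagging matches or mismatches in the same field of view, can be sketched as a per-signal tolerance comparison. The signal names, values, and tolerance below are hypothetical examples, not values from the topic.

```python
def verify_signals(measured, expected, rel_tol=0.05):
    """Compare parametric readings from external test equipment against the
    AR model's expected values; return per-signal flags for display in the
    maintainer's field of view."""
    results = {}
    for name, exp in expected.items():
        meas = measured.get(name)
        if meas is None:
            results[name] = "NO DATA"
        elif abs(meas - exp) <= rel_tol * abs(exp):
            results[name] = "MATCH"
        else:
            results[name] = "MISMATCH"
    return results

# Hypothetical readings at two measurement points on the asset.
expected = {"BUS_A_5V": 5.0, "CLK_10MHz": 10.0e6}
measured = {"BUS_A_5V": 4.9, "CLK_10MHz": 9.2e6}
print(verify_signals(measured, expected))
# -> {'BUS_A_5V': 'MATCH', 'CLK_10MHz': 'MISMATCH'}
```

Each flag would be rendered next to the corresponding measurement point overlay, so the maintainer sees the expected value, the live reading, and the verdict at the physical location of the signal.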
Upon proper identification and verification of a failed part, the ability to order the item automatically, with full auto-documentation, will help maximize work throughput. The additional use of integral noise-canceling audio interfacing (listening and speaking) technology will accelerate data transfer. This type of interfacing will provide an unparalleled training and data transfer capability that does not currently exist within the DoD.
PHASE II: Develop a lightweight, hands-free prototype AR based equipment maintenance capability that fully implements the capabilities designed in Phase I.
PHASE III: Develop the AR based equipment maintenance capability prototype for field demonstration of equipment maintenance on specific DoD platform applications. Transition the capability to the fleet.
PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS:
The proposed novel technology would have broad civilian impact for equipment maintainers in diagnosing and repairing problems associated with a variety of electronic systems in numerous commercial applications.
References:
1. Feiner, S., MacIntyre, B., and Seligmann, D., "Knowledge-Based Augmented Reality," Communications of the ACM, July 1993, Vol. 36, No. 7, p. 53.
2. Boeing website – http://www.boeing.com/defense-space/support/training/instruct/augmented.htm
3. Columbia University Computer Graphics and User Interface Lab website – http://www1.cs.columbia.edu/graphics/top.html – June 2005
4. Flexwork website – http://www.flexwork.eu.com/members/iststor/starmate.pdf
Keywords: Maintenance; training; automation; augmented reality; equipment; data overlay