Developing intelligent and adaptive tools supports the Army’s modernization efforts and provides the Warfighter with additional situational awareness, keeping them safer and enabling them to make smarter and more informed decisions on the battlefield.
Collected data will be used to train the system to recognize and classify objects in the field, which improves its accuracy and usability for future operations. The images collected will be labeled as specific types of objects in order to further train the model to identify the same or similar objects of interest. Achieving greater shared awareness at the edge facilitates collaboration between sensors, systems, and Soldiers.
Recently, components of the Synthetic Training Environment, or STE, have taken shape and will consist of One World Terrain — which compiles realistic and accurate virtual maps of territory — as well as training simulation software, a training management tool and virtual collective trainers. All of this will make up the soldier/squad virtual trainer and the reconfigurable virtual collective trainer.
“Right now, if I’m an Army soldier and I want to train for seizing a building, I get some opposing forces, I get some pop-up targets and things like that, and I rush into the building with real weapons and shoot.” But in a mixed reality world, holograms could be used to simulate enemy forces.
An issue that has plagued VR and AR headsets is latency between user actions and the corresponding changes in the simulated environment, which reduces the effectiveness of simulations.
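To make the latency concern concrete, motion-to-photon delay is often budgeted as a sum of pipeline stages against a comfort threshold of roughly 20 milliseconds, a commonly cited figure for VR. The stage timings below are illustrative assumptions, not measured headset numbers:

```python
# Illustrative motion-to-photon latency budget. Stage timings are
# assumptions for this sketch, not measurements of any fielded headset.
stages_ms = {
    "head tracking": 2.0,        # sense the user's motion
    "simulation + render": 8.0,  # update and draw the virtual scene
    "display scan-out": 8.0,     # time for the panel to show the frame
}
total_ms = sum(stages_ms.values())
BUDGET_MS = 20.0  # a commonly cited comfort threshold for VR

print(f"motion-to-photon: {total_ms:.0f} ms "
      f"({'within' if total_ms <= BUDGET_MS else 'over'} the {BUDGET_MS:.0f} ms budget)")
```

When the sum exceeds the budget, users perceive the virtual world lagging behind their own motion, which is exactly the effect that undermines training value.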
VR technology is expected to keep improving because it is driven by the gaming community, and the gaming business is growing.
Augmented reality is a greater concern.
“That’s because you are trying to integrate the live and the virtual world, and there are significant challenges in that.

“For instance, if you don’t have your alignment of those two worlds absolute, then … you’re off a lot as you go down range, and we can’t have that.
“So the specific challenges that I’m going after in AR … is that alignment, it’s that tracking, it’s that dynamic occlusion piece.”
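The alignment problem described in the quote compounds with distance: a fixed angular registration error between the live and virtual worlds produces a positional offset that grows roughly linearly with range. A minimal sketch (the 2-milliradian error is an illustrative value, not a measured figure):

```python
import math

def downrange_offset(angular_error_mrad: float, range_m: float) -> float:
    """Positional offset in meters produced downrange by a fixed angular
    registration error between the live and virtual worlds."""
    return range_m * math.tan(angular_error_mrad / 1000.0)

# A misalignment too small to notice up close becomes significant downrange.
for rng_m in (10, 100, 300):
    print(f"{rng_m:4d} m -> {downrange_offset(2.0, rng_m):.2f} m off target")
```

A virtual target that appears perfectly registered at arm's length can be more than half a meter off at 300 meters, which is why the quote calls alignment and tracking the core AR challenges.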
The Army would like to have a single piece of kit that could transition from AR to VR, so the service could “get both out of that.”
It also wants goggles with more capable passive sensors attached to them.
“The Army doesn’t like active type of sensors because the enemy can see them. We want passive type of sensors. The problem today with passive sensors is you don’t get the distance, you don’t get the range capability … that we require to do unit training.”
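One reason passive sensors struggle at range can be seen in stereo depth estimation, where range uncertainty grows with the square of distance. A sketch under assumed camera parameters (a 700-pixel focal length and 10 cm baseline are illustrative values, not the specs of any fielded system):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Passive stereo ranging: depth Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float, depth_m: float,
                disparity_err_px: float = 0.5) -> float:
    """Depth uncertainty grows with the square of range: dZ ~ Z^2 / (f*B) * dd."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_err_px

# Assumed headset-like rig: 700 px focal length, 10 cm baseline.
for z in (5, 50, 300):
    print(f"{z:4d} m -> +/- {depth_error(700, 0.10, z):.1f} m uncertainty")
```

With half a pixel of disparity error, a rig like this resolves depth to fractions of a meter at close range but becomes effectively useless at the hundreds of meters that unit training requires, matching the quote's complaint.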
What’s the point of virtual training if it doesn’t feel real? The Army is tackling the problem, driving realism into virtual training to enhance effectiveness, but it’s not an easy task — even for the gaming industry.
One of the challenges the service is facing as it embarks on developing a Synthetic Training Environment is “providing a realistic and immersive virtual training experience” that portrays “computer-generated people and objects behind real things and doing so in real time from multiple perspectives as actors and objects move around in the environment.”
This problem is known as “dynamic occlusion,” and the service is working to solve it. The issue is one with which video gamers are well acquainted: “When virtual projections within a player’s view of the world are not layered appropriately with real-world objects, the experience feels unnatural,” which is an undesirable attribute for a realistic training experience.
To achieve realism, the system must be able to sense dynamic changes to the mission environment, updating 3D terrain pictures — or meshes — in real time.
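Per pixel, the layering decision described above reduces to a depth comparison: draw the hologram pixel only where the virtual content is closer to the viewer than the sensed real-world surface. A toy sketch with hand-made depth maps (a real system would supply these from the live 3D mesh):

```python
def occlusion_mask(real_depth, virtual_depth):
    """True where the virtual (hologram) pixel should be drawn, i.e. where
    it is closer to the viewer than the sensed real-world surface."""
    return [
        [v < r for v, r in zip(virt_row, real_row)]
        for virt_row, real_row in zip(virtual_depth, real_depth)
    ]

INF = float("inf")  # no real surface sensed at this pixel

# Sensed real-world depths in meters: a 1.5 m wall at the lower-left pixel.
real = [[3.0, 3.0, INF],
        [1.5, 3.0, INF]]
# A virtual soldier rendered 2 m from the viewer across the same pixels.
virt = [[2.0, 2.0, 2.0],
        [2.0, 2.0, 2.0]]

print(occlusion_mask(real, virt))
# The hologram is hidden only where the 1.5 m wall is in front of it.
```

The hard part the Army describes is not this comparison itself but keeping the real-world depth map accurate at long range and in real time as people and vehicles move.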
“In military scenarios, the problem can adversely affect the learning experience or lead to negative habit transfer if a soldier can’t realistically take cover or if a vehicle crew is hindered from accurately aiming and firing on an enemy position.”
The Army plans to mature and demonstrate augmented reality algorithms and techniques to occlude real or virtual dynamic objects in realistic, changing environments. “Occlusion of live, moving objects is challenging — doing so at long distances is even more so.”
The service’s augmented-reality, head-mounted displays for dismounted soldiers are limited to small, indoor environments because the hardware limits the ability to sense the world around them “at a meaningful distance.”
To make it work, sensors must register the live environment and a computer must “see and understand” it, including changes. This allows for realistic placement of computer-generated holographic content.
The service expects dynamic occlusion range and accuracy to improve over the next year using the newly developed techniques and algorithms.
Fundamental breakthroughs are still needed in methods for large-area tracking and in dynamic occlusion algorithms, particularly algorithm optimization for weapon tracking.
More advancement is also needed to achieve the simultaneous localization and mapping, or SLAM, capabilities required to construct and update maps of unknown environments while simultaneously keeping track of an agent’s location within them.
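The predict-and-correct cycle at the heart of SLAM can be sketched in one dimension. Full SLAM also estimates the landmark positions themselves; this toy shows only the localization half, with made-up motion bias and noise values:

```python
def slam_step(est_pos, control, measured_range, landmark_pos, gain=0.5):
    """One predict/correct cycle: predict the new position from the motion
    command, then correct it using a range measurement to a mapped landmark."""
    predicted = est_pos + control                  # motion model (dead reckoning)
    expected_range = landmark_pos - predicted      # range if the prediction were exact
    innovation = measured_range - expected_range   # measurement disagreement
    return predicted - gain * innovation           # blend prediction toward measurement

# Toy run: commanded motion has a 5% bias, range measurements have small noise.
landmark, true_pos, est_pos = 10.0, 0.0, 0.0
for control, noise in [(1.0, 0.1), (1.0, -0.2), (1.0, 0.0)]:
    true_pos += control * 1.05                     # actual motion overshoots the command
    measured = landmark - true_pos + noise         # noisy range to the mapped landmark
    est_pos = slam_step(est_pos, control, measured, landmark)

print(f"dead reckoning: 3.00, corrected: {est_pos:.2f}, truth: {true_pos:.2f}")
```

Even this simple correction keeps the estimate closer to truth than dead reckoning alone; the breakthroughs the Army needs extend this to large areas, many landmarks, and fast-moving agents.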
And while the Army works to bring more realism into its virtual constructive environments, achieving more realism in live training is challenging enough. The service is trying to find alternatives to its Instrumentable-Multiple Integrated Laser Engagement System, or I-MILES, a system that was developed for live force-on-force and force-on-target training at Army training locations around the world.
“Although I-MILES has seen enhancements over the years, laser-based systems inherently introduce artificialities into live exercises because of their limited ability to realistically represent lethal effects. A shrub or a cardboard box, for example, provides effective cover from a laser hit but would be useless in a firefight.”
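The shrub example in the quote can be made explicit: a laser system treats any line-of-sight blocker as cover, while an effects-based lethality model only credits material that would actually stop the round. The material table below is invented for illustration:

```python
# Toy engagement check contrasting a laser-based system (MILES-style) with
# an effects-based lethality model. Material properties are illustrative only.
COVER = {
    "shrub":    {"blocks_laser": True,  "stops_bullet": False},
    "concrete": {"blocks_laser": True,  "stops_bullet": True},
    "open":     {"blocks_laser": False, "stops_bullet": False},
}

def laser_hit(cover: str) -> bool:
    """Laser system: any line-of-sight blocker prevents a recorded hit."""
    return not COVER[cover]["blocks_laser"]

def lethality_hit(cover: str) -> bool:
    """Effects-based model: only material that stops the round prevents a hit."""
    return not COVER[cover]["stops_bullet"]

print(laser_hit("shrub"), lethality_hit("shrub"))  # False True
```

Behind a shrub, the laser system records a miss while the lethality model records a hit; that gap is the artificiality the Army wants to remove from live training.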
The Army also wants to more accurately depict the effects of direct and indirect fire and train on more emerging longer-range or more sophisticated weapons that are difficult and expensive to depict in live training.
“Our goal with Live is to better replicate the lethality, vulnerability and effects of actual live-fire engagements at all of our Army training centers.

“Simultaneously, the consequences of all the actions and the weapons systems in use must be accurately depicted in the virtual environment so soldiers training via simulation at other locations will have the same operational picture in real time.”
The Army is jumping to the next level in virtual training.
Augmented reality systems enable troops to do mission planning across a variety of changing environments and adversaries, and customized training scenarios reinforce how to think tactically, to make rapid decisions and to communicate effectively.
The components of the STE have taken shape and will consist of One World Terrain — which compiles realistic and accurate virtual maps of territory — training simulation software, a training management tool and virtual collective trainers. All of this will make up the soldier/squad virtual trainer and the reconfigurable virtual collective trainer.
The idea is to be able to click on any place on a virtual globe and go there. Soldiers can then train virtually in an exact environment in which they can expect to operate in reality.
The training simulation software will support training simultaneously across many locations and training platforms. The training management tool allows users to build training scenarios through simulation databases.
The virtual trainers are being designed for dismounted, air and ground formations to train from the squad level through battalion, and ultimately at higher echelons. The trainer for the soldier and squad will support individual and collective tasks at the smallest formations.
The reconfigurable virtual collective trainers, or RVCT, will represent Army and Marine Corps air and ground systems for training at the unit level and will be used for mission rehearsals at every echelon.
The new trainer takes what was a tethered system and — while it still uses projectors and screens — allows users to move around a base with more flexibility, which is more operationally realistic.
Ultimately, the projectors and screens could be replaced by a headset, which is in keeping with the service’s requirement to bring trainers to an operational unit in the field or at home station. This means the system must be easy to set up and transport.
The benefits of virtual training include:

1. Create a More Engaging Training Experience
2. Run Scenarios That Are Impossible With Static Training
3. Take a More Practical, Hands-On Approach
4. Make Serious Mistakes and Walk Away From Them
5. Encourage Exploration and Trial and Error
6. Boost Learning Retention
7. Appropriately Pace Learning
8. Improve Trainee Performance
9. Reduce Training Costs
10. Speed Time to Train