Engineers at the Johns Hopkins Applied Physics Laboratory (APL), in Laurel, Maryland, have designed and implemented what they're calling the Mixed Reality Collaboration Environment framework, or REACTOR. The system lets subject matter experts and analysts interact and collaborate globally without externally hosted servers, and it can be customized to each user's needs.
The extended reality (XR) technology brings virtual reality (a fully virtual environment), augmented reality (an overlay of virtual content that does not interact with the surroundings) and mixed reality (virtual objects that can interact with the actual environment) into a collaborative framework that supports remote, synchronous and cross-domain analysis and discussion.
“REACTOR is designed to bring people together to view the same information in a collaborative and immersive virtual space, and its modularity allows developers to customize it for their needs,” said Kevin Torgas, a developer in AMDS and engineer on the project. “It was designed without a specific application in mind, which means any AR/VR platform will have the capability to join into the immersive space.”
REACTOR's uses include mission planning, trajectory analysis, data visualization and the creation of teaching aids for complex objects or for objects with no tangible physical counterpart. It has grown into a stand-alone framework that supports multiple user platforms as well as real-time data-state alignment.