RealityMedia
Augmented Environments Lab, Georgia Institute of Technology
Research Project (2021-present)
Reality Media encompasses three related writing spaces, realised in three technologies for representing ideas: print, the 2D web, and immersive 3D VR and AR. We conducted empirical research using a mixed-methods approach to investigate users' mental models of hyperspatial navigation in VR.
My Role: Researcher, UX Designer
Work-in-progress papers: "RealityMedia: Immersive Technology and Narrative Space" and "Welcome to the Metaverse: Investigating Users' Mental Models behind Hyperspatial Navigation"
Motivation
With the adoption of VR, there has been heated discussion about how best to implement and use the technology to create information-rich spaces. A challenging problem for immersive interfaces, be they current AR/VR interfaces or future Metaverse interfaces, is how best to present traditional information formats (e.g., books, magazines, newspapers and other written or printed documents) in the virtual environment. Little attention has been paid to how best to integrate VR systems with the extensive body of information and legacy 2D content available on the World Wide Web (WWW), and how to create unified systems that can be accessed from both sorts of interfaces.
The typical solution is to embed the information directly in the immersive space, using billboards or other familiar metaphors such as virtual computer displays. However, this approach creates problems when designing VR experiences for presenting, viewing and reading text-heavy content, and for navigating information spaces that are densely connected by hyperlinks or distributed across hypertext systems such as the Web. It is especially problematic if we also want the content to remain accessible from traditional 2D computer interfaces.
Our question, then, is: how do we envision the future of information-rich immersive spaces in VR, where rich 2D content needs to be readily available to users?
System Design
We designed and implemented a testbed, RealityMedia, using a customised version of Mozilla Hubs, a WebXR platform. RealityMedia is a complement to the printed book "Reality Media", providing information about the history of reality media and how augmented reality and virtual reality are taking their places in contemporary media culture.
Our approach is to create interlinked 3D hypermedia and 2D hypertext systems, rather than requiring that all 2D content be somehow embedded in the immersive space. To implement our approach, we leverage the WebXR technology in modern web browsers, including those in standalone immersive devices such as the Meta Quest 2 and the Microsoft HoloLens 2. WebXR allows web apps to tap the power of the web along with a device's XR capabilities, bridging the gap between discursive and immersive environments for both desktop and standalone AR/VR headset users. Using web-based technology, the system can also naturally fall back to "3D in a window" on traditional 2D displays, allowing the immersive space to be navigated with keyboard and mouse. That is, the user can engage with different forms of information by traversing the hyperlinks between web "pages" that support both 2D and 3D display, on both 2D and 3D displays.
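The sketch below illustrates, in TypeScript, the kind of WebXR capability check that makes this dual-mode delivery possible. It is a minimal illustration, not the actual RealityMedia/Hubs code; the `enterImmersive` and `renderInline` renderer hooks are hypothetical names used only for this example.

```ts
// Minimal sketch: decide at load time whether to offer an immersive WebXR
// session or fall back to "3D in a window" on a traditional 2D display.
// `enterImmersive` and `renderInline` are placeholder hooks into the renderer.
async function chooseDisplayMode(renderer: {
  enterImmersive: (session: unknown) => void;
  renderInline: () => void;
}): Promise<void> {
  const xr = (navigator as unknown as { xr?: any }).xr;

  // Standalone headsets such as the Meta Quest 2 report support for an
  // 'immersive-vr' session; most desktop browsers do not.
  if (xr && (await xr.isSessionSupported('immersive-vr'))) {
    // Note: requestSession normally has to be triggered by a user gesture,
    // e.g. an "Enter VR" button click.
    const session = await xr.requestSession('immersive-vr', {
      optionalFeatures: ['local-floor'],
    });
    renderer.enterImmersive(session);
  } else {
    // Fall back to rendering the same scene inline, navigated with
    // keyboard and mouse.
    renderer.renderInline();
  }
}
```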
The information space is organised as a collection of galleries, many of which resemble the rooms of a conventional museum, although some of the galleries recall other built structures (e.g., the amphitheatre). The galleries present 2D content in the form of text, images and videos. All the galleries are accessed from a central rotunda via teleporting portals. In addition to the 2D content, RealityMedia also makes use of the affordances and conventions of VR to organise and express ideas in the immersive environment.
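As an illustration of how a teleporting portal can fit the same link model as the 2D web, the sketch below shows one possible proximity-triggered portal: entering it either teleports the user within the current gallery or follows a hyperlink to another room. This is an assumption-laden illustration, not Hubs' actual implementation; the `Portal` structure and `teleport` callback are hypothetical.

```ts
// Minimal sketch of a proximity-triggered portal. All names are illustrative.
interface Portal {
  position: { x: number; y: number; z: number };
  radius: number;
  target:
    | { kind: 'spawn-point'; point: { x: number; y: number; z: number } }
    | { kind: 'room-url'; url: string };
}

function checkPortals(
  avatar: { x: number; y: number; z: number },
  portals: Portal[],
  teleport: (p: { x: number; y: number; z: number }) => void,
): void {
  for (const portal of portals) {
    const dx = avatar.x - portal.position.x;
    const dy = avatar.y - portal.position.y;
    const dz = avatar.z - portal.position.z;
    if (Math.hypot(dx, dy, dz) > portal.radius) continue;

    if (portal.target.kind === 'spawn-point') {
      // Move within the current gallery.
      teleport(portal.target.point);
    } else {
      // Traverse a hyperlink to another room, just like following a web link.
      window.location.assign(portal.target.url);
    }
    return;
  }
}
```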
To allow users to get inside AR and VR, inhabiting these new reality media, we designed the spaces with less text and more audio-visual materials. For example, the relationships among the ideas presented in each gallery are portrayed as 3D force graphs with which the user can interact. Another example is the 360-degree panoramic spheres, which provide fully immersive experiences that are embedded in the galleries but visited as independent world spaces (e.g., a virtual trip to the surface of Mars, a virtual Acropolis, and a recreation of the well-known VR 'Pit' experiment conducted at the University of North Carolina in the 2000s).
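To show how such a panoramic sphere can be built on the web stack underlying Hubs, the three.js sketch below places the viewer inside an inverted, textured sphere. It is only an illustration under that assumption: the scenes themselves are authored in Hubs/Spoke, and the function name and texture path here are placeholders rather than the project's actual code or assets.

```ts
import * as THREE from 'three';

// Minimal sketch of a 360-degree panoramic sphere (e.g. a Mars panorama).
function createPanoramaSphere(textureUrl: string): THREE.Mesh {
  // A large sphere with inverted geometry so the equirectangular image is
  // rendered on the inside, placing the viewer at the centre of the panorama.
  const geometry = new THREE.SphereGeometry(500, 60, 40);
  geometry.scale(-1, 1, 1);

  const texture = new THREE.TextureLoader().load(textureUrl);
  const material = new THREE.MeshBasicMaterial({ map: texture });

  return new THREE.Mesh(geometry, material);
}

// Usage: scene.add(createPanoramaSphere('textures/mars_equirectangular.jpg'));
```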
User Study
Our study focused on understanding the participants' mental models, their spatial perception of hyperspatial navigation, and their experiences with RealityMedia. Additionally, we aimed to uncover the challenges and opportunities in integrating 2D and 3D spaces into a single information space.
To accomplish this, we used mixed methods combining user logging, surveys, notes, and semi-structured interviews. We focused on eliciting the participants' experiences and asked open-ended questions about their understanding of hyperspatial navigation and hypertextualised information spaces. We also collected and analysed user-generated drawings, with the primary goal of identifying and analysing key mental models.
GVU Posters '22