What Will It Take for Virtual Reality to Become More Immersive?

June 1, 2017

By Achin Bhowmik

The rapid advancement of perceptual computing over the past decade is providing remarkable insight into how we can interact with technology more naturally. Devices can now sense the environment around them, opening new possibilities for compelling virtual reality (VR) experiences.

For a VR device to create fully immersive experiences, it must blend real-world elements into the virtual world and enable natural interactions with the digital content. My keynote today at the Augmented World Expo centered on this necessary evolution. Today, we can transport someone to a live concert where they can have a front-row seat, which is an amazing feat, but we have yet to reach the stage where the virtual reality experience naturally merges the physical and virtual worlds. Rather than just watching the concert, imagine dancing with your friends in attendance and experiencing the same excitement from your living room through lifelike visuals, sound, touch, haptics and complete freedom of movement!

So what will it take from a technological standpoint for VR to become more immersive? Let's look at the key advances that will enable this exciting future.


A realistic virtual reality experience requires freedom of movement. Being tethered to an external computer with cables restricts a user's natural movement. The future of virtual reality must "cut the VR cord," allowing the user to move freely, fully immersed in the experience.

Intel® RealSense™ technology enables VR headsets to sense depth and to scan, map and navigate the 3D environment. Additionally, as we announced at Computex, Intel and HTC are working together to create a VR accessory that gives Vive customers high-fidelity, low-latency, immersive VR experiences without the wire.
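
As a simplified illustration of what depth sensing looks like in practice, here is a minimal sketch using pyrealsense2, the Python bindings of the open-source librealsense SDK. It assumes a standalone RealSense depth camera is attached over USB and simply reads the distance at the center of one depth frame; it is a general depth-sensing example, not the headset or Vive-accessory integration described above.

```python
# Minimal depth-sensing sketch with pyrealsense2 (librealsense Python bindings).
# Assumes a RealSense depth camera is connected; illustrative only.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Request a 640x480 depth stream at 30 frames per second.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    if depth_frame:
        # Distance in meters to whatever the camera sees at the image center.
        distance_m = depth_frame.get_distance(320, 240)
        print(f"Nearest surface at frame center: {distance_m:.2f} m")
finally:
    pipeline.stop()
```

A headset would of course consume full depth frames continuously to map and navigate the room, but the same per-pixel distance data is the starting point.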

Freedom of mobility, or movement with six degrees of freedom (6DOF), means not only an untethered experience, but also one that avoids collisions with real-world objects while in a virtual environment. At Intel, we have showcased this capability by using Intel RealSense technology to accurately track head movement without the need for external sensors.

Today, many VR platforms rely on sensors set up around a room to track the movement of the headset and controllers and to render the corresponding images of the virtual environment in the headset. These external sensors and emitters must be installed and calibrated before the system can detect head movement and render images accordingly. For VR to become mainstream and enter new verticals, the user experience must be simple, without requiring an external tracking setup.

Building upon the capabilities of Intel RealSense technologies, we leverage the Intel® Movidius™ vision processing unit (VPU) to accelerate inside-out 6DOF tracking algorithms. This reduces the need for elaborate and costly sets of external sensors and results in greater ease of use.
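
To make the idea concrete, the purely illustrative sketch below models what an inside-out tracker reports, a 3D position plus an orientation, and how a runtime might warn the user before a real-world collision. The HeadPose class, room bounds and warning margin are assumptions for illustration, not part of the RealSense or Movidius SDKs.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """A 6DOF pose: translation in meters plus an orientation quaternion."""
    x: float
    y: float
    z: float
    qw: float = 1.0
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

# Hypothetical play-space bounds (meters) measured during room setup.
ROOM_MIN = (-1.5, 0.0, -1.5)
ROOM_MAX = ( 1.5, 2.5,  1.5)
WARN_MARGIN = 0.3  # start warning 30 cm before a wall

def near_boundary(pose: HeadPose) -> bool:
    """Return True if the tracked head is within WARN_MARGIN of the room bounds."""
    position = (pose.x, pose.y, pose.z)
    return any(
        p - lo < WARN_MARGIN or hi - p < WARN_MARGIN
        for p, lo, hi in zip(position, ROOM_MIN, ROOM_MAX)
    )

# Example: a head position 1.3 m toward one wall triggers a proximity warning.
print(near_boundary(HeadPose(x=1.3, y=1.6, z=0.0)))  # True
```

In a real system the pose would be updated every frame by the tracking algorithm running on the VPU, and the warning would appear as an overlay in the headset rather than a printed message.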

In addition, integrating real hands into the virtual experience provides a much more natural method of interaction, one that mirrors how we interact with the real world. Intel RealSense depth-sensing camera technology, which senses and detects real-world objects, allows your real hands to grasp virtual-world objects for natural interactions. Instead of pressing a button on a controller to grab a virtual object, you can simply pick it up with your hands.

These natural interactions are enabled by a depth sensor paired with a high-resolution color camera that lets users see their own hands. The embedded camera also provides a natural perspective on real people or objects as they enter the user's virtual field of view.
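
A rough sketch of the grab interaction described above: a closed hand close enough to a virtual object picks it up. The hand and object positions, the hand-closed flag, the GRAB_RADIUS and the can_grab helper are all placeholders for illustration; a real system would obtain the hand data from the depth-based hand tracker every frame, and these are not RealSense SDK calls.

```python
import math

# Hypothetical tracked positions in headset coordinates (meters).
hand_position = (0.10, -0.20, 0.45)
hand_is_closed = True          # e.g. inferred from finger joint angles
virtual_object_position = (0.12, -0.22, 0.47)

GRAB_RADIUS = 0.05  # hand must be within 5 cm of the object to grab it

def can_grab(hand, obj, closed, radius=GRAB_RADIUS):
    """A grab happens when a closed hand is close enough to the object."""
    distance = math.dist(hand, obj)
    return closed and distance <= radius

if can_grab(hand_position, virtual_object_position, hand_is_closed):
    print("Object attached to hand; it now follows the hand each frame.")
```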

Finally, the content. The future of virtual reality will go beyond computer-generated content by bringing real-world objects into the digital environment in real time. Virtual and physical worlds should be able to interface with each other directly. We are already seeing early prototypes of this integration, such as Intel's work with Surgical Theater, which provides surgeons with ways to rehearse a difficult procedure with their own surgical tools within a virtual rendering of the operating environment. This allows not only for better medical training, but also for the patient to be brought along on the journey.

These technological advances culminate in Project Alloy, a first-generation, performance-based, all-in-one VR headset designed around the sensory experiences described above. It is an example of merging virtual and physical realities today, redefining what is possible in an all-in-one VR platform.

The future of immersive VR will merge real and virtual experiences until they are indistinguishable, driven by compelling sensory-based content. This journey will certainly take time and a lot of hard work, but we are excited to be part of creating that future.

Achin Bhowmik is vice president and general manager of the Perceptual Computing Group at Intel Corporation.



