Facebook’s late-September Oculus Connect 6 event covered a lot of ground on the future of augmented and virtual reality. We’ve written before about how VR can change the healthcare landscape, but VR has come a very long way in just the past year. It may take time to realize Mark Zuckerberg’s vision for the social network, but the implications for how we interact with our surroundings are already enormous.
This recap will highlight Oculus’s biggest announcements from the AR and VR event, cover the practical applications for patients, and address some ramifications for pharma.
Perhaps the feature with the greatest impact, and it’s coming soon, is the ability to use hands as controllers in a virtual world. Last year, VR enthusiasts wanting to run intensive games and environments had to buy a powerful PC, headset, controllers, and sometimes external sensors.
Earlier this year, Oculus introduced the Quest – an all-in-one headset that uses a powerful mobile processor and built-in motion tracking. Paired with touch controllers, the Quest does not require an expensive computer or external sensors to run high-quality experiences.
By removing the need for controllers, Oculus is making VR even more accessible, expressive and mobile.
Beginning in early 2020, people will be able to use their hands to control objects in the digital world intuitively, without needing to charge controllers or learn how to use them. Some games and experiences that rely on the controllers’ gyroscope or accelerometer may still require them, but this is a game-changer because it lowers the barrier to entry for new VR users.
Mind Control? Sort Of
No, we don’t need to be concerned about Facebook controlling our every move just yet. However, Oculus is working on technology that will allow us to use our thoughts to control our VR interactions.
In his keynote, Zuckerberg talked about Facebook’s acquisition of CTRL-labs, “the team working on neural interfaces.” CTRL-labs develops technology known as brain-machine interfaces or brain-computer interfaces. Facebook has already developed a research project that uses these non-invasive tools to allow people to “type” words by thinking about them, using the speech center of the brain.
Oculus is also working on a wristband that “picks up electrical impulses sent to your nervous system and turns them into digital inputs you can use in VR. It will give the sensation of being able to interact with digital objects.”
This technology is still several years away, but it isn’t a leap to start thinking about how we might control our actions with our thoughts and receive haptic feedback in VR.
For example, patients with limited motor function may be able to control their actions and environments better in the virtual world than in the real world thanks to a neural interface.
It was odd that the world’s largest social network initially left native social features off of its most popular VR headset, the Quest.
Facebook, known for connecting people and creating communities, is finally introducing social elements by launching Facebook Horizon. The company has tried to add social features to previous headsets with Facebook Spaces and Oculus Rooms, but those projects appear likely to be sunsetted in favor of Horizon.
Horizon is Facebook’s vision for the future: a platform where people can gather from all over the world to play games, chat, watch videos, and hang out with friends. Horizon will act as a framework where developers can build games and apps into this world, though it has yet to be seen how open this world will be to content creators.
For healthcare, it’s easy to imagine safe places for patients and caregivers to gather and interact with others in their community who share similar experiences. Patients with limited mobility can join virtual conferences; reps can show interactive aids to HCPs; and key opinion leaders can meet with patients.
The beta for Horizon will go live for developers in early 2020, with a target launch date of mid- to late 2020. We’re excited to see how Horizon can help bring worlds like the OASIS to life and connect people more than ever before.
Facebook Reality Labs
The mission for Facebook Reality Labs is to blend our physical and virtual worlds. In order to create more immersive experiences, we need more realistic avatars and environments.
Facebook Reality Labs is building LiveMaps, a “core infrastructure” that uses machine vision with localization and mapping technology to capture environments and convert them into virtual spaces. The technology generates a shared virtual map that mirrors the physical world using crowdsourced data.
Facebook’s AR headsets and glasses can tap into these maps to offer better accessibility, provide information about objects and locations, and allow users to virtually teleport to almost anywhere in the world.
This application is especially useful when paired with the more realistic avatars Facebook is developing. With humanlike avatars in intricately mapped environments, we’ll be able to build ultra-realistic recreations of our surroundings. In healthcare, use cases like telemedicine could take leaps forward.
Seen here, two life-like avatars (middle) are generated in real-time based on input from humans with VR headsets (corners).
While virtual presence will give patients access to experiences they might not otherwise be able to have physically, there are emotional aspects of virtual teleportation worth considering. No one has yet worn a VR headset all day, every day, over long stretches of time, so we don’t understand the impact on the psyche, and it remains to be seen how the lack of physical proximity and intimacy with others affects our emotional state. As we work at the forefront of technology in the pharma and healthcare landscape, we have to ensure this world will be safe for patients and doctors. Still, many of the benefits of increased exploration and access to others should outweigh these concerns.
Virtual teleportation is still several years away from mainstream use, but developer work is beginning now, and it is important to start thinking about future use cases for the tech.
Life-Changing, in Ways We Can’t Yet Imagine
The announcements at Oculus Connect 6 have far-reaching implications, some of which remain unknown. More advanced AR and VR technologies create new avenues of exploration and connection.
Technology (e.g., computers, the internet, mobile devices) has already changed our lives in fundamental ways. Improvements in AR and VR tech will continue to change the way we perceive reality and our place in the world in ways that we could only dream about before.
We may not yet know the full impact that VR and AR will have on society as a whole, but it’s exciting to be a part of this journey and help shape the future of the technology. For healthcare, applications should become more patient-focused, helping communities become more connected than ever before.
Andrew Grojean is Senior Manager of Innovation at Intouch Solutions; you can reach him at firstname.lastname@example.org