The release of Apple’s new mixed-reality headset, the Vision Pro, could cause a seismic shift in how users experience the metaverse, with developers potentially moving away from the absolute isolation of virtual reality.
Unlike today's virtual reality headsets, which center on full immersion, Apple’s Vision Pro — unveiled on June 5 — can also superimpose applications onto the real world, letting users "interact with digital content in a way that feels like it is physically present in their space.”
Speaking to Cointelegraph, KPMG’s Head of Metaverse Alyse Su believes the Vision Pro will shift developer focus away from purely immersive virtual worlds.
The headset introduces a new technology it calls "EyeSight," which uses lens trickery to make the user’s facial expressions look natural to outsiders. EyeSight also allows the display to switch between a transparent and opaque view, depending on whether a user is consuming immersive content or interacting with people in the real world.
“With the traditional or other headsets, there's this barrier between people who are wearing it and people who aren't. It feels like you're in two different worlds,” she said. “Now there's very few barriers between people, so you can have relatively seamless interactions.”
Su said there is also a lot of potential in its eye-tracking technology, which can be used to help create personalized experiences.
Apple’s pupil-tracking technology works by detecting the mental state of users based on data from their eye movements and the response of their pupils to stimuli.