Extended Reality: Will It Be More Widespread in 2024?
December 12, 2023
Introduction
Extended Reality (XR), also known as Mixed Reality or Spatial Computing, is a technology that seamlessly blends digital layers with the physical world to create a natural, immersive experience. It relies on devices equipped with sensors, cameras, and physical interfaces that integrate digital content with the real environment through spatial mapping algorithms, computer vision, and 3D spatial tracking.
In contrast to conventional Augmented Reality, where digital layers overlay the real image without integration into the three-dimensional environment, emerging immersive technologies enable companies to craft realistic and interactive experiences. These experiences harmoniously combine digital layers with the real scene, providing collaborative interaction spaces with diverse applications for both employees and customers.
Despite the continual advancements in these technologies, their adoption in industries remains relatively low.
Traditional Extended Reality Devices
Until now, the most widely used professional XR or Mixed Reality devices have been Microsoft's two generations of HoloLens. These devices feature a transparent visor that lets the real-world image pass through, with "holographic lenses" on its surface that channel the projected digital images to the user's eyes. A notable limitation, however, is the restricted Field of View (FOV) for digital layers, which causes elements to disappear from view when the user turns their head. The second version of HoloLens addressed some of these issues and integrated with new Azure cloud services designed for Mixed Reality, such as Dynamics 365 Remote Assist, which lets technicians using HoloLens 2 connect with remote engineers who provide real-time video guidance for collaborative problem-solving.
Recent releases of XR devices, following the translucent screen approach, have been somewhat disappointing. Magic Leap, for example, initially promised retinal projection to address FOV limitations but ultimately adopted a system similar to Microsoft's. With Microsoft canceling its third-generation HoloLens (project Calypso) in 2021, the direction of XR hardware seems to be shifting away from translucent glasses towards Virtual Reality headsets with Mixed Reality capabilities. This shift is evident in new devices from Meta, Pico, Oppo, Varjo, and the anticipated Apple Vision Pro.
The New Wave of Extended Reality
Virtual Reality headsets have evolved into devices with Extended or Mixed Reality capabilities. The surge in VR headset sales, driven by the gaming market, has improved and lowered the cost of their components, making the sensors and additional computing power required for a good Mixed Reality experience increasingly affordable. Unlike glasses with translucent screens, these headsets completely block external vision, allowing full immersion in Virtual Reality; for Mixed Reality mode, they capture the exterior through cameras and display that image on the device screens together with digital objects or layers, a process known as passthrough. This approach dramatically increases the FOV and seems to be the form Mixed Reality will take in the near future.
The launch of Meta Quest 3 is a clear bet by the company on improved Mixed Reality capabilities: enhanced color passthrough, a FOV almost triple that of HoloLens 2, and a depth sensor that automatically generates a 3D mesh of the real scene for integration with digital objects. The adoption of pancake lenses, together with the new Snapdragon XR2 Gen 2 chipset, reduces the headset's size and brings its center of gravity closer to the neck, although its ergonomics still need work. This is expected to improve as all components are progressively downsized.
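The automatic 3D meshing described above starts from a simple geometric step: each pixel of the depth sensor's image is back-projected into a 3D point using the pinhole camera model. A minimal NumPy sketch of that step follows; the intrinsics (fx, fy, cx, cy) and the tiny depth map are illustrative values, not those of any real headset.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into a 3D point cloud
    using the pinhole camera model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

# Toy 2x2 depth map with hypothetical intrinsics
depth = np.array([[1.0, 1.0],
                  [2.0, 2.0]])
points = depth_to_points(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

A real pipeline would then triangulate neighboring points into a mesh and fuse it over time, but the back-projection above is the common starting point.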
The Arrival of Apple Vision Pro
The imminent release of the Apple Vision Pro (AVP) seems set to further boost the adoption of Extended Reality across its various applications. Its impressive technical specifications suggest it will enhance the key elements that define a good Mixed Reality experience. It incorporates the Apple M2 chip, used in some of the most powerful computers in Apple's range, complemented by a second chip, the R1, dedicated to processing signals from the headset's 12 cameras and 5 sensors. It features Micro OLED screens with more than 4K resolution per eye, significantly improving sharpness and contrast (blacks will no longer be gray), and a spatial audio system enriched by sensors that map the environment in 3D.
The AVP's passthrough overcomes the distortions and deformations of the real image seen in earlier devices, delivering the image to the screens in just 12 milliseconds and enabling previously unthinkable applications, such as working with real physical monitors for hours without removing the headset. It also includes an advanced eye-tracking system based on LEDs and infrared cameras, a key element for gaze-based interface control that additionally enables fill-rate optimizations such as foveated rendering, which renders at full resolution only where the gaze is directed and at lower resolution in the periphery. High-resolution cameras and a LiDAR scanner map the environment in real time, generating a 3D mesh of the scene and providing precise hand tracking for interface control, which frees operators' hands.
Another notable feature is its software, always a strong point for Apple. The company has developed a completely new operating system, visionOS, featuring a fully three-dimensional interface that responds dynamically to natural light and casts shadows over the real environment, helping users perceive digital elements as truly integrated.
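The idea behind foveated rendering is easy to sketch: pick a shading resolution for each screen region from its angular distance (eccentricity) to the tracked gaze point, rendering the fovea at full detail and the periphery coarser. The sketch below is a conceptual illustration with made-up thresholds, not Apple's actual algorithm or values.

```python
def shading_scale(region_angle_deg, gaze_angle_deg=0.0):
    """Choose a resolution scale for a screen region based on its
    angular distance from the gaze point. The 5° and 20° thresholds
    and the scale factors are illustrative, not real device values."""
    eccentricity = abs(region_angle_deg - gaze_angle_deg)
    if eccentricity < 5.0:     # foveal region: full resolution
        return 1.0
    elif eccentricity < 20.0:  # mid-periphery: half resolution
        return 0.5
    else:                      # far periphery: quarter resolution
        return 0.25

# A region 2° from the gaze renders at full detail,
# one 30° away at a quarter of the resolution.
print(shading_scale(2.0), shading_scale(30.0))  # → 1.0 0.25
```

Because the eye's acuity drops sharply outside the fovea, this trades pixels the user cannot resolve for a large reduction in GPU fill-rate cost.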
Finally, its ergonomics, though yet to be tested, seem to be another area where it clearly surpasses current MR headsets, which is highly relevant for widespread adoption in any industry or professional application.
New Applications of Extended Reality
One of the areas where these technologies could have the greatest impact is product lifecycle management. Through immersive simulations, designers and engineers can attach virtual modules to real physical machines; physical sensors then feed data to a digital system that evaluates how the digital and physical parts work together, reducing testing, optimization, and prototype manufacturing times.
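The sensor-to-digital-system loop described above can be sketched as a minimal consistency check: stream physical sensor readings, compare them against the values the virtual module predicts, and flag where they diverge. Everything here is hypothetical for illustration, including the function names and the 5% tolerance.

```python
def twin_deviation(sensor_readings, simulated, tolerance=0.05):
    """Compare streamed physical sensor readings against the values
    predicted by the virtual module; return the indices where the
    relative deviation exceeds the tolerance (5% is an illustrative
    default, not a standard)."""
    flagged = []
    for i, (real, sim) in enumerate(zip(sensor_readings, simulated)):
        if abs(real - sim) > tolerance * abs(sim):
            flagged.append(i)
    return flagged

# e.g. motor temperatures (°C): measured vs. predicted by the twin
print(twin_deviation([70.2, 71.0, 80.5], [70.0, 71.2, 72.0]))  # → [2]
```

In a real deployment this comparison would run continuously against the simulation model, but the principle of flagging physical/digital divergence is the same.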
In a business environment where teams are increasingly decentralized, collaborative design scenarios are common. These tools will allow geographically dispersed teams to work together in a shared virtual environment, manipulating 3D models and exchanging feedback in real time. This not only accelerates the design process but also ensures that all stakeholders stay on the same page and have simultaneous access to the same information.
Another interesting application area is immersive training. Mixed Reality and Virtual Reality tools can be used to develop digital twins for learning how to operate any physical device. They are particularly well suited to teaching techniques that require manipulating physical objects in space, whether in biomedicine, engineering, telecommunications, or any other industrial sector. The result is improved safety, lower costs compared with in-person training, agile and unified progress tracking, and a fully dynamic system for updating training modules.
The Future of the Metaverse and Virtual Reality
The metaverse is a persistent, shared virtual space where users interact with each other and with digital objects in real time, usually through avatars, and where an economy based on virtual currencies is often adopted, allowing the exchange of digital goods. Traditionally, metaverses were closely related to video games. Indeed, those of us who have been involved in this industry for years watch with some astonishment as the term bursts into virtually every professional sector as if it were a recent and revolutionary discovery. Since the launch of Ultima Online in 1997, widely considered the first MMO, many video games have met the criteria that should unequivocally define a metaverse.
It is well known that when a term becomes a buzzword, every company wants to jump on the bandwagon, and as a result much vaguer meanings of the concept become generalized. Social networks such as X, Instagram, or Facebook could ultimately be interpreted as metaverses, insofar as they are persistent digital universes where people "live" and interact with each other. However, these spaces fail a fundamental premise: offering a parallel and immersive virtual universe, something their small 2D screens can hardly provide.
On the other hand, the emergence of technologies such as cryptocurrencies, and related elements such as NFTs, which quickly attached themselves to the concept of the metaverse, has caused even more confusion around the term and created an additional entry barrier that should not be necessary in every scenario. These imprecise definitions, along with poorly conceived or rushed implementations that resulted in disappointing experiences, have generated confusion among companies and have not helped the concept gain adoption across its many areas of application.
However, those who have been building metaverses for twenty years, the major players in the video game industry, have continued to evolve this formula very successfully. Take World of Warcraft by Blizzard, with two decades maintaining peaks of 13M monthly users, or Fortnite by Epic, with monthly averages of active players exceeding 250M. These metaverses have become not only spaces for entertainment but also social networks where our children "meet" daily. The major video game studios have clearly understood this potential, leveraging these spaces for massively attended virtual events that boost monetization and user retention, such as the live concerts of Travis Scott and Marshmello in Fortnite, with more than 12 and 10 million attendees respectively.
Leaving aside the metaverses of the video game industry, it seems clear that we must reflect on the criteria that best define them for profitable development in the business environment. They will surely end up focusing on precise functional objectives in each case and industry, rather than serving as spaces where users congregate en masse, except in solutions directly related to communication between people, such as those of Meta or Microsoft. No matter how attractive the virtual environments we build are, users will only come and stay to the extent that they clearly provide value. It is in this orientation toward functional objectives that Virtual Reality and Extended Reality, given their new technological capabilities, can help drive the adoption of metaverses in the business realm. The new stand-alone wireless devices, with controller-free hand tracking, Mixed Reality capabilities, lighter weight, greater power, and much-improved ergonomics that allow extended hours of use, will deliver VR and MR solutions that solve real-world problems in collaborative, immersive professional virtual environments. Will we start to see this in 2024? We hope so.