Meta has provided a glimpse into the upcoming Meta Quest 3 headset, showcasing its enhanced spatial understanding capabilities. The official support page, accidentally published ahead of schedule, reveals that Meta Quest 3 will be officially unveiled at Meta Connect 2023 on September 27.
The Meta Quest 3 is optimized for mixed reality, allowing the physical environment to blend seamlessly with digital elements. This is made possible through high-quality color passthrough and an integrated depth sensor. By collecting spatial data, the device builds an understanding of the room's layout and the objects within it.
Spatial data refers to information about the size, shape, and location of walls, surfaces, and objects in a physical space. Meta explains that applications that combine virtual and real-world environments rely on this spatial data to comprehend the space around the user and their position within it.
The Meta Quest 3 creates a digital model of the environment, recognizing and labeling objects and estimating their size, shape, and distance from each other and from the headset. This spatial data enables a range of applications, such as anchoring digital objects to physical ones, realistic interactions with the physical environment, and occlusion effects.
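To make the anchoring idea concrete, here is a minimal sketch of how an app might pin a virtual object to a physical surface detected in such a spatial model. The types and function names are entirely hypothetical illustrations, not Meta's actual SDK:

```python
from dataclasses import dataclass

# Hypothetical types for illustration; Meta's runtime exposes its own scene/anchor APIs.

@dataclass
class Vec3:
    x: float
    y: float
    z: float

@dataclass
class DetectedSurface:
    """A physical surface the headset has recognized, e.g. a table top."""
    label: str     # semantic label assigned by the headset ("TABLE", "WALL", ...)
    center: Vec3   # center of the surface in world coordinates (meters)
    normal: Vec3   # unit vector pointing away from the surface

@dataclass
class VirtualObject:
    name: str
    position: Vec3

def anchor_to_surface(obj: VirtualObject, surface: DetectedSurface,
                      offset: float = 0.0) -> VirtualObject:
    """Pin a virtual object to a physical surface by placing it at the
    surface center, pushed out along the surface normal by `offset` meters."""
    obj.position = Vec3(
        surface.center.x + surface.normal.x * offset,
        surface.center.y + surface.normal.y * offset,
        surface.center.z + surface.normal.z * offset,
    )
    return obj

# Example: put a virtual chess board on a detected table.
table = DetectedSurface(label="TABLE", center=Vec3(0.4, 0.75, -1.2), normal=Vec3(0, 1, 0))
board = anchor_to_surface(VirtualObject("chess_board", Vec3(0, 0, 0)), table, offset=0.01)
```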
The Meta Quest 3 supports three types of spatial data: Scene data, Mesh data, and Depth data. Scene data provides a simplified model of the room, improving the user's awareness of their physical surroundings. Mesh data describes the shape and structure of physical objects, enabling realistic interactions between digital and physical elements. Depth data captures how far away surfaces and objects are from the headset, allowing virtual objects to be rendered at the correct depth and occluded by closer physical objects.
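The sketch below illustrates, in rough form, how an application might represent these three kinds of spatial data and use a depth map for occlusion. All class and function names here are assumptions made for illustration, not Meta's actual data formats:

```python
import numpy as np

# Hypothetical structures mirroring the three kinds of spatial data described above.

class SceneData:
    """Simplified room model: labeled planes such as floor, ceiling, and walls."""
    def __init__(self, planes: dict):
        self.planes = planes  # e.g. {"floor_height_m": 0.0, "ceiling_height_m": 2.6}

class MeshData:
    """Triangle mesh approximating the shapes of physical objects."""
    def __init__(self, vertices: np.ndarray, triangles: np.ndarray):
        self.vertices = vertices    # (N, 3) array of 3D points in meters
        self.triangles = triangles  # (M, 3) array of vertex indices

class DepthData:
    """Per-pixel distance from the headset to the nearest physical surface."""
    def __init__(self, depth_map: np.ndarray):
        self.depth_map = depth_map  # (H, W) array of distances in meters

def is_occluded(depth: DepthData, pixel: tuple, virtual_distance_m: float) -> bool:
    """A virtual object rendered at `virtual_distance_m` should be hidden
    at this pixel if a physical surface is closer to the headset there."""
    row, col = pixel
    return bool(depth.depth_map[row, col] < virtual_distance_m)

# Example: a virtual pet placed 2.0 m away is hidden behind a couch at 1.5 m.
depth = DepthData(np.full((480, 640), 1.5))
print(is_occluded(depth, (240, 320), 2.0))  # True -> physical object is in front
```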
The integration of a depth sensor in the Meta Quest 3 is a notable step forward, as it enables the device to support all three types of spatial data and promises a more immersive, seamless mixed reality experience.
Sources:
– Meta’s official support page on Meta Quest 3’s spatial data
– Twitter user Luna’s discovery of the support page