With the visionOS DevKit available for developers to create Apple Vision Pro apps, we are discovering even more capabilities that this spatial computer will bring once it’s released in early 2024. Developer Steve Troughton-Smith is sharing some of these capabilities, including the ability to place a visionOS window flat on a desk.
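For developers who want to experiment with this themselves, here is a minimal sketch of a visionOS scene that opts into a volumetric window, using the standard SwiftUI WindowGroup, windowStyle(.volumetric), and defaultSize(_:_:_:in:) APIs from the visionOS SDK. The app and view names are hypothetical, and in the current beta it's the user, not the app, who drags the window down onto the desk:

```swift
import SwiftUI

// A minimal, hypothetical visionOS app whose window has real depth,
// so the user can reposition it onto a surface like a desk.
@main
struct DeskWindowApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // .volumetric gives the window 3D depth instead of a flat 2D plane.
        .windowStyle(.volumetric)
        // Suggest an initial physical size for the volume, in meters,
        // roughly the footprint of a keyboard lying on a desk.
        .defaultSize(width: 0.6, height: 0.05, depth: 0.4, in: .meters)
    }
}

struct ContentView: View {
    var body: some View {
        // Placeholder content; a real app might render media controls
        // or a typing surface here.
        Text("Hello, desk!")
            .padding()
    }
}
```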
With windows that can sit flat on a real surface like this, Apple Vision Pro apps could look and feel more natural for tasks such as controlling music playback or typing on a virtual keyboard. Interestingly, Apple filed a patent for this kind of technology in 2016, which was granted in 2020 and now appears to be in use with the Apple Vision Pro.
Although the link for the patent itself is now broken, AppleInsider covered it at the time. As reported by the publication, the patent states that “a natural way for humans to interact with (real) objects is to touch them with their hands. [Screens] that allow detecting and localizing touches on their surface are commonly known as touch screens and are nowadays common part of, e.g., smartphones and tablet computers.”
While the best option would be to “physically equip the object or the human body with a sensor capable of sensing touch,” Apple’s researchers concluded that cameras could track the user’s fingers accurately enough to do the job, which is precisely the approach Vision Pro takes.
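To get a sense of how that camera-based tracking is exposed to apps, here is a hedged sketch using ARKit's HandTrackingProvider from the visionOS SDK. It only illustrates reading fingertip positions, not Apple's internal tracking pipeline, and running it requires an open immersive space plus the hand-tracking usage permission:

```swift
import ARKit
import simd

// Sketch: stream the world-space position of a tracked index fingertip
// on visionOS. Requires an ImmersiveSpace to be open and
// NSHandsTrackingUsageDescription in the app's Info.plist.
func trackIndexFingertip() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked,
              let tip = anchor.handSkeleton?.joint(.indexFingerTip),
              tip.isTracked else { continue }

        // Compose the hand's world transform with the joint's local
        // transform to get the fingertip position in world space.
        let worldTransform = anchor.originFromAnchorTransform * tip.anchorFromJointTransform
        let position = worldTransform.columns.3
        print("Fingertip at x: \(position.x), y: \(position.y), z: \(position.z)")
    }
}
```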
Combining the patent’s description with Troughton-Smith’s visionOS preview suggests that interacting with virtual objects can feel remarkably realistic when a physical surface is involved.
Not only does the window stay anchored to the desk, but reaching for it becomes as natural as picking up a book or a keyboard lying on a table.
While we’ll still have to wait a few more months for the release of Apple Vision Pro, we’ll surely learn a lot more about Apple’s first spatial computer now that the SDK and visionOS beta are out.