Since the announcement of the original Oculus Rift DK1 Kickstarter, there have been questions about whether, or when, Apple might fully delve into VR as a platform. Over the last few months, I believe that the signs are not only pointing to ‘Yes!’, but that Apple is hoping to leapfrog all of the other players in the VR space right now: Facebook (Oculus), Sony (PlayStation), and HTC (Vive), specifically, and even Samsung (GearVR, powered by Oculus) to some extent.
So, where would this project have begun? First, I’d wager that sometime in 2015, once it was certain that the Rift and the Vive would be shipping in early 2016, Apple began taking VR as a platform seriously. Let’s be clear about this for a second: Apple is a platform company, first and foremost, with macOS as the desktop platform, iOS as the mobile platform, tvOS as the entertainment/living room platform, and watchOS as the ultra-light mobile platform. A few years ago, when it seemed near certain that Apple would be building some sort of automobile, it seemed just as certain that a carOS platform would be coming as well, and like the rest of the lighter-weight platforms, carOS would have been a fork of iOS.
While working on carOS, Apple was most likely also working on computer vision software: the ability for a digital system to intelligently recognize objects in space, identify what they are, and understand their context within that space. I believe that most of this research eventually ended up as part of the new ARKit software that Apple began pushing over summer 2017. The basics would be similar: understanding planes (flat surfaces), understanding objects in space, etc. Roads, walls, barriers, and other vehicles are all just objects and planes, so ARKit can already do much of what a computer vision system for autonomous vehicles is designed to do. And now there are potentially thousands of developers out there getting their toes wet with this technology, which just helps Apple refine the software even faster as it moves forward.
Second, sometime in 2014–2015, Apple realized that its bet on parallel processing across CPUs was a flawed approach, especially since the industry had begun pushing more and more of this heavy math onto ever more powerful GPUs. This is evident in the announcement of a new Mac Pro, slated to be released sometime in 2018, in the announcement of external GPU support for macOS, and in the upcoming release of the new iMac Pro. Probably most telling in the iMac Pro presentation was the inclusion of a SteamVR / HTC Vive demonstration, to show that, indeed, Apple was interested in building tools for virtual reality. Couple this with the ever-increasing leaps Apple has been making in its own silicon designs, and the future ARM chips and GPUs that Apple will be making are likely to be as fast and capable as any desktop GPUs on the market today.
Third, and this is where I start reading the tea leaves, Apple knows how to design and manufacture small, lightweight devices. Any sort of AppleVR hardware (yes, Apple will design and manufacture its own VR HMD, and vrOS will be a fork of iOS) will take advantage of a lot of the new hardware that is shipping with the new iPhone X, such as the super-high-pixel-density OLED display and the FaceID hardware system (specifically, the flood illuminator, the infrared camera, and the dot projector). Once the iPhone X is being manufactured at scale (50M devices annually), Apple will have no problem designing similar parts to fit a new dedicated VR HMD. Additionally, it’s highly likely that Apple will create some sort of accessory akin to GearVR, allowing iPhone users to have a taste of vrOS.
So, what’s the dedicated Apple HMD going to look like? I think that the Oculus Rift Santa Cruz prototypes are a good indicator of where the whole market will be going over the next few years: an all-in-one device, with CPU, GPU, audio, gyroscopes, accelerometers, etc., all in the HMD, plus a series of external sensors (cameras) at strategic points around the outside, each with a very wide-angle lens, to allow near-360° visibility. In the case of the Apple HMD, there would probably be sensors akin to the FaceID sensors mentioned above. The dot projector, flood illuminator, and infrared camera would allow the device to actively map whatever space the user was in, and would also be able to see the user’s arms and hands, similar to the Leap Motion devices. As mentioned above, the software to do all of this is currently being worked out as part of ARKit. I’m not aware of anyone using the FaceID sensors to help map spaces, but I doubt it would be that much of a stretch. Additionally, a few more sensors inside the mask and under the nose could allow real-life expressions to be conveyed through an in-game avatar. Just think: VR emoji that show moods.
Next, would vrOS need controllers? Answer: Maybe? Probably? An AR system that can see arms and hands will go a long way toward a sense of presence, but having something physical to manipulate would be helpful in a lot of cases. If Apple allowed the Siri Remote to be paired via Bluetooth, that could serve as a most rudimentary controller, but Apple could also design and manufacture controllers similar to the Oculus Touch controllers, which seem much easier to use than the competing wands.
Add all of these things up, especially the announcement during WWDC ’17 that Apple was building out systems to support the raw power required for virtual reality, coupled with some of the really amazing pieces of hardware the company has recently released, and it becomes clear that Apple already has much of what would be necessary to create a new, ground-breaking platform, in both hardware and software. Now the only thing left to do is wait and see if they have the courage to delve into this brave, new world.