Apple Pushes Forward into VR & AR Development

This year’s gathering at Apple’s Worldwide Developers Conference marks the company’s first foray into virtual and augmented reality. Are the tech behemoth’s efforts enough to keep up with its competitors? In this blog post, I review Apple’s latest announcements and whether the company can flourish next to Samsung and Google.

In a surprising move at the 2017 Worldwide Developers Conference (WWDC), Apple finally offered a peek at what it plans to do in the virtual reality (VR) and augmented reality (AR) space. For the past couple of years, this market has been dominated by companies such as Oculus, Valve, Google, Samsung, and Microsoft.

Prior to the conference, Apple CEO Tim Cook teased everyone by continually saying that “AR is a big deal like the smartphone.” The time has finally come for Apple to show what they’ve been working on and how far they’ve come in this area.

What was announced?

  • Hardware support for external Thunderbolt 3 graphics cards and new pro hardware

Oculus founder Palmer Luckey previously charged that the state of VR on the Mac is nonexistent because Apple refuses to put high-end graphics cards in its Pro machines. Apple finally answered this claim by introducing a Thunderbolt 3 enclosure from Sonnet, which supports the high-end graphics cards required to drive existing VR headsets like the Oculus Rift and HTC Vive. With support for external graphics cards, any Mac with Thunderbolt 3 can now become a VR development workstation.

Additionally, Apple teased the iMac Pro, a sleek new black all-in-one desktop that will support 8 to 18 cores and the brand-new AMD Vega graphics chips. This is impressive, as the AMD Vega chip was released only two weeks prior. The iMac Pro’s performance should be enough to hold us over until a true Mac Pro successor arrives. Apple maintains that a new Mac Pro is still in the works for next year, and we should see even greater performance from that machine when it is released.

  • Advanced Developer Frameworks for AR, Machine Learning, and Vision

Apple also went so far as to work directly with Unity and Unreal to build in Mac support. Apple added AR API support to its easy-to-use rendering engine, SceneKit. In a sample iPhone application, Apple demoed dragging and dropping virtual objects onto a table. The impressive part of the demo was that the virtual objects were sized relative to the real environment, and a virtual lamp could cast its light and shadows correctly over real objects. This is an impressive first step, and it should lead the way toward more immersive AR applications in the future.
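The workflow behind that demo can be sketched in a few lines of ARKit and SceneKit. This is a minimal illustration against the iOS 11 APIs, not Apple’s actual demo code; the class name and the cube placed on the plane are placeholders.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch of the WWDC-style demo: detect a horizontal surface
// and drop a virtual object onto it. Names here are illustrative.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Track the device's position and detect horizontal planes.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a node for a new anchor, e.g. a table top.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        // Place a 10 cm cube on the detected plane. ARKit units are meters,
        // which is why virtual objects appear correctly sized relative to
        // the real environment.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0.05, 0)
        node.addChildNode(cube)
    }
}
```

Because SceneKit handles the rendering, lighting effects like the demo’s virtual lamp come from ordinary `SCNLight` nodes added to the same scene.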

Object detection using the iPhone’s camera became a lot smarter with the introduction of the Core ML (machine learning) and Vision frameworks. Developers can now incorporate trained machine learning models into their applications and run live inference against an image feed from the camera. Using the Vision framework, applications can track objects and detect things like faces, text, barcodes, and the horizon. Many applications will be able to take advantage of the Core ML and Vision frameworks built into iOS 11.
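The Vision framework follows a simple request/handler pattern. As a rough sketch, face detection on a single image looks like the following; the function name is a placeholder, and a live camera feed would hand each frame to a new request handler in the same way.

```swift
import Vision
import CoreGraphics

// Sketch of the Vision framework's request/handler pattern:
// run face detection on one CGImage and report the results.
func detectFaces(in image: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized image coordinates (0...1).
            print("Face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Core ML models plug into the same pipeline: wrapping a trained model in a `VNCoreMLRequest` lets Vision run custom classifiers against the camera feed alongside its built-in detectors.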

What wasn’t announced?

  • A dedicated VR and AR Headset

While the iPhone and iPad can be used for AR tracking, there are still several limitations to using these devices. Microsoft’s AR headset, HoloLens, projects an IR grid to measure the depth of objects in the room, whereas Apple’s solution relies solely on the camera to identify objects. This means Apple’s AR solution cannot yet project AR objects onto a flat, featureless surface, especially if the device detects no motion.

The iPhone is not a good candidate for VR, since its display’s resolution and refresh rate are not high enough. At minimum, the next iPhone will need a high-resolution OLED screen and the ability to render images at a fast enough pace, 90 frames per second (roughly 11 ms per frame), to prevent motion sickness. Samsung’s S7 and S8 series and Google’s Pixel phones already have the required hardware. We can expect Apple to follow suit if it wants to make the iPhone the next big VR headset.

  • The Next Big Thing

True to its habit of not focusing on consumer products at WWDC, Apple only announced tools for developers to build AR and VR applications. There were no hints about what Apple plans to do in this space. Sure, building these capabilities into the next iPhone will certainly help sell more iPhones, but there has to be more than that if Apple sees VR and AR as a market equal to or bigger than the smartphone itself. Everything announced at the conference gets Apple caught up with the rest of the competition, allowing the ideas to start rolling. We will have to wait for Apple’s next step to see if the company can lead the pack as it has in every other market it has entered.

Where do we go from here?

This year’s WWDC announcements have certainly turned Apple’s iOS and macOS strategy around. Many people felt that Apple had been ignoring the needs and desires of its pro customers. As we speak, Apple is positioning itself to become the largest AR platform by building ARKit into iOS 11; hundreds of millions of iPhones will be able to take advantage of it in upcoming applications. From there, Apple is setting up the Mac as a full VR development platform by adding support for powerful hardware and for today’s leading 3D engines. For now, let’s hope that Apple continues to push forward at a rapid pace to advance the VR and AR technology space.

For full details and videos of the conference sessions, head over to Apple’s developer pages.
