Analyst: Apple’s AR glasses will run Mac chips

Apple is now expected to introduce the first generation of its AR/VR glasses in Q4 2022, and a report from analyst Ming-Chi Kuo tells us a little more about what to expect from these things. If true, you’ll one day be wearing your Mac like sunglasses.

To be fair, a lot of this we’ve heard already. We’ve known for years, for example, that Apple has been working with Sony on the lenses for these devices and we’ve waxed lyrical ad infinitum concerning what to expect once they ship.

We know they will be expensive, and that Apple has a development road map of software and incremental hardware enhancements to carry this new product family forward.

So, what’s new, and how might it relate to your business?

Kuo tells us this about the product:

  • Expect two processors, one a Mac-equivalent chip, another to handle sensor input.
  • It will be a standalone device, not an accessory – you won’t need a Mac or iPhone to use these things.
  • The headset will make use of two Sony 4K micro-OLED displays.
  • It supports a range of apps, rather than just specific applications.

What’s really critical is the expected division of labor; Kuo says the headset will have one processor with the “same computing power level as the Mac,” while another chip will handle “sensor-related computing.”

The additional chip is required because the sensors gather so much information that it must be managed in real time. Kuo says the headset holds “at least” six to eight optical modules to provide “continuous video see-through AR services.”

Is the market ready?

Apple is hitting the market at a positive point. Recent research commissioned by Wool & Water found that more than half of US consumers think AR/VR has improved in recent months and will play more of a role in the future.

A recent Morgan Stanley report said: “Apple’s entry into the eyewear market will be the game changer for all participants as the technology gets normalized and popularized. Apple has a long history of disrupting new markets and ultimately growing the addressable market size well beyond initial expectations.

“The enormity of the technical challenge — compressing daylong battery, 5G, compute, cameras, LiDAR, projectors and wave guide lenses into a lightweight, attractive pair of glasses — is hard to overstate,” they said. “But we are approaching lift-off.”

A Deloitte report says something similar, but adds real-world use data:

  • Enterprise VR programs improve productivity by about 32%;
  • AR outperforms video in workplace settings: errors decline and problem-solving improves;
  • AR training yields a 75% learning retention rate, more than doubling learning outcomes;
  • For learning in dangerous environments (such as for firefighters), VR is safer.

Apple also has a major advantage in the form of its complete control over Apple silicon. Competitors will be left with the challenge of either developing their own processors or purchasing chips from the likes of Qualcomm and MediaTek, which means most in the space will be unable to deliver the kind of “full widget” experience inherent to Apple’s design. We’ve seen that game play out before.

So, what’s the long game?

Kuo thinks Apple’s long-term plan is to replace iPhones with these things by selling a billion units of them during the next decade. I think that’s too sweeping. Instead, I imagine Apple’s aim is less about replacement and more about coexistence. Given the power of the processors, it seems inevitable that Universal Control, for example, will let you switch the active display between Mac, iPad, and Apple Glass.

You can expect a range of AR representations of computing experiences in the space. With that in mind, it seems equally inevitable that enabling that seamless switch in display view will be controllable by glance, as well as via keyboard, touch, a pointer, swipe, gesture, or voice.

AirPlay support conceivably means anything you see in your Apple Glass display can also be beamed to a nearby TV or Mac, meaning the computing experience becomes even more independent of the hardware you use to access it. The emergence of “Mac in the cloud” services brings in a whole new use case.

What are the business implications?

I see this as a Mac you have with you wherever you are.

Field service engineers, sales teams, C-class execs, and remote workers will all have access to pervasive computing experiences. You might walk into an office, sit in front of a computer, and in seconds be accessing the same computing environment you use everywhere else.

While travelling, you may use voice, gesture, and eye-tracking to play a game or send an email, and no one will know you’re wearing these glasses when your Memoji fronts a FaceTime call. (The latter feature will probably be an API you can also access in WebEx, Teams, or Zoom.)

The data on using these technologies in training also seems compelling, opening up opportunities in HR, education, and staff training systems.

For and against

There will be resistance. I think most enterprises will be deeply suspicious at the thought of employees wearing glasses equipped with always-on video and audio sensors.

Apple’s commitment to privacy and to processing on the edge device, rather than in the cloud, will help here, but the enterprise impact will be limited until business people are fully confident in the privacy and security of corporate processes and data.

Facebook/Meta knows this too, which is why it is attempting to lobby governments to regulate Apple privacy down to its level. I hope it fails in that attempt.

[Also read: Why Apple’s AR glasses will transform your enterprise]

There will also be acceptance. Retail, medical, emergency services, and law enforcement will be keen to explore the potential of such devices.

There are plenty of logical-seeming deployments in remote product support, exploration, even agriculture. For the latter, think how useful it might be for a farmer to maintain a visual sense of the position of their cattle or the health of their land through data held and processed on these devices.

An iPhone, a Mac, an internet communicator

In fairness, we’ll see business use cases begin to emerge once the products do. But the predicted price of these things means Apple will have to convince consumers, cash-strapped after years of pandemic, to make the magic leap.

For me, that means the most likely pitch will be around high-end experiences, sophisticated applications, accessibility, gaming, and cutting-edge use cases, particularly around health.

It will be interesting to see how Apple exploits its mass-market brand to convince consumers to invest in a brand-new but reportedly expensive product line.

The phrase “an iPod, a phone, and an Internet communicator” comes to mind. Apple’s new devices will be an iPhone, a Mac, and a tool that unlocks your experience of the entire virtual and physical world, crossing language and geographical barriers while potentially also enabling yet another revolution in accessible computing, à la Geordi La Forge.

Bottom line, one day you’ll wear your Mac like sunglasses.

All Kuo has done is make that reality sound more official. And don’t forget: the processing power required to handle all the data picked up by the many sensors in these glasses is similar to the extensive processing required to manage the incoming data when building an autonomous car.

All of this now runs on Apple silicon, which gives Apple an immediate advantage in any industry, so long as it stays ahead on chip development.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2021 IDG Communications, Inc.