By BRIAN X. CHEN, NYTimes News Service

I got a sneak peek into Apple’s vision for the future of computing Monday. For about a half-hour, I wore the $3,500 Vision Pro, the company’s first high-tech goggles, which will be released next year.

I walked away with mixed feelings, including a nagging sense of skepticism.

On one hand, I was impressed with the quality of the headset, which Apple bills as the beginning of an era of “spatial computing,” where digital data blends with the physical world to unlock new capabilities. Imagine wearing a headset to assemble furniture while the instructions are digitally projected onto the parts, for instance, or cooking a meal while a recipe is displayed in the corner of your eye.

Apple’s device had high-resolution video, intuitive controls and a comfortable fit, and it felt superior to the headsets I’ve tried over the past decade from Meta, Magic Leap, Sony and others.

But after wearing the new headset to view photos and interact with a virtual dinosaur, I also felt there wasn’t much new to see here. And the experience elicited an “ick” factor I’ve never had before with an Apple product. More on this later.

Fit and control

Let me start from the beginning. After Apple unveiled the headset Monday, its first major new release since the Apple Watch in 2015, I was permitted to try a preproduction model of the Vision Pro. Apple staff led me to a private room at the company’s Silicon Valley headquarters and sat me on a couch for a demo.

The Vision Pro, which resembles a pair of ski goggles, has a white USB cable that plugs into a silver battery pack that I slipped into the pocket of my jeans. To put it on my face, I turned a knob on the side of the headset to adjust the snugness and secured a Velcro strap above my head.

I pressed down on a metal button toward the front of the device to turn it on. Then I ran through a setup process, which involved looking at a moving dot so the headset could lock in on my eye movements. The Vision Pro has an array of sensors to track eye movements, hand gestures and voice commands, which are the primary ways to control it. Looking at an icon is equivalent to hovering over it with a mouse cursor; to press a button, you tap your thumb and index finger together in a quick pinch, the equivalent of clicking a mouse.

All the many uses?

Then came time for the app demos to show how the headset might enrich our everyday lives and help us stay connected with one another.

Apple first walked me through looking at photos and a video of a birthday party on the headset. I could turn a dial near the front of the Vision Pro counterclockwise to make the photo backgrounds more transparent and see the real world, including the Apple employees around me, or turn it clockwise to make the image more opaque and immerse myself in it.

Apple also had me open a meditation app in the headset that showed 3D animations while soothing music played and a voice instructed me to breathe. But the meditation couldn’t prepare me for what was coming next: a video call.

A small window popped up — a notification of a FaceTime call coming from another Apple employee wearing the headset. I stared at the answer button and pinched to take the call.

The Apple employee in the video call was using a “persona,” an animated 3D avatar of herself that the headset created using a scan of her face. Apple portrays videoconferencing through the personas as a more intimate way for people to communicate and even collaborate in virtual space.

© 2023 The New York Times Company