I briefly got my hands on Apple’s new high-tech goggles,
which impressed and creeped me out and raised a question: Why do we need these?
I got a sneak peek into Apple’s vision for the future of
computing on Monday. For about half an hour, I wore the $3,500 Vision Pro, the
company’s first high-tech goggles, which will be released next year.
I walked away with mixed feelings, including a nagging sense
of skepticism.
On one hand, I was impressed with the quality of the
headset, which Apple bills as the beginning of an era of “spatial computing,”
where digital data blends with the physical world to unlock new capabilities.
Imagine wearing a headset to assemble furniture while the instructions are
digitally projected onto the parts, for instance, or cooking a meal while a
recipe is displayed in the corner of your eye.
Apple’s device had high-resolution video, intuitive controls
and a comfortable fit, which felt superior to my experiences with headsets made
in the last decade by Meta, Magic Leap, Sony and others.
But after wearing the new headset to view photos and
interact with a virtual dinosaur, I also felt there wasn’t much new to see
here. And the experience elicited an “ick” factor I had never had before with
an Apple product. More on this later.
Fit and control
Let me start from the beginning. After Apple unveiled the
headset on Monday, its first major new release since the Apple Watch in 2015, I
was permitted to try a preproduction model of the Vision Pro. Apple staff led
me to a private room at the company’s Silicon Valley headquarters and sat me on
a couch for a demo.
The Vision Pro, which resembles a pair of ski goggles, has a
white USB cable that plugs into a silver battery pack that I slipped into the
pocket of my jeans. To put it on my face, I turned a knob on the side of the
headset to adjust the snugness and secured a Velcro strap above my head.
I pressed down on a metal button toward the front of the
device to turn it on. Then I ran through a setup process, which involved
looking at a moving dot so the headset could lock in on my eye movements. The
Vision Pro has an array of sensors to track eye movements, hand gestures and
voice commands, which are the primary ways to control it. Looking at an icon is
equivalent to hovering over it with a mouse cursor; to press a button, you tap
your thumb and index finger together, making a quick pinch that is equivalent
to clicking a mouse.
The pinch gesture was also used for grabbing and moving
around apps on the screen. It was intuitive and felt less clunky than waving
around the motion controllers that typically come with competing headsets.
But it raised questions. What other hand gestures would the
headset recognize for playing games? And how good will voice controls be, given
that Siri’s voice transcription on iPhones is still unreliable? Apple isn’t sure yet
what other gestures will be supported, and it didn’t let me try voice controls.
All the many uses?
Then came time for the app demos to show how the headset
might enrich our everyday lives and help us stay connected with one another.
Apple first walked me through looking at photos and a video
of a birthday party on the headset. I could turn a dial near the front of the
Vision Pro counterclockwise to make the photo backgrounds more transparent and
see the real world, including the Apple employees around me, or turn it
clockwise to make the photo more opaque to immerse myself.
Apple also had me open a meditation app in the headset that
showed 3-D animations while soothing music played and a voice instructed me to
breathe. But the meditation couldn’t prepare me for what was coming next: a
video call.
A small window popped up — a notification of a FaceTime call
from another Apple employee wearing the headset. I stared at the answer button
and pinched to take the call.
The Apple employee in the video call was using a “persona,”
an animated 3-D avatar of herself that the headset created using a scan of her
face. Apple portrays videoconferencing through the personas as a more intimate
way for people to communicate and even collaborate in virtual space.
The Apple employee’s facial expressions looked lifelike, and
her mouth movements synchronized with her speech. But because of how her avatar
was digitally rendered, with the uniform texture of her face and the lack of
shadows, I could tell it was fake. It resembled a video hologram I had seen in
sci-fi movies like “Minority Report.”
In the FaceTime session, the Apple employee and I were
supposed to collaborate on making a 3-D model in an app called Freeform. But I
stared at it blankly, thinking about what I was seeing. After three years of
being mostly isolated during the pandemic, Apple wanted me to engage with what
was essentially a deepfake video of a real person. I could feel myself shutting
down. My “ick” sensation was probably what technologists have long described as
the uncanny valley, a feeling of unease when a human sees a machine creation that
looks too human.
A technological feat? Yes. A feature I would want to use
with others every day? Probably not anytime soon.
To wrap the demonstration with something fun, Apple showed a
simulation of a dinosaur that moved toward me when I reached my hand out. I
have seen more than my fair share of digital dinosaurs in virtual reality
(almost every headset maker that’s given me a VR demo has shown a Jurassic Park
simulation in the last seven years), and I was not excited about this.
Real people
After the demo, I drove home and processed the experience
during rush hour.
Over dinner, I talked to my wife about the Vision Pro. The
Apple goggles, I said, looked and felt better than the competing headsets. But
I wasn’t sure that mattered.
Other headsets from Meta and Sony PlayStation were much
cheaper and already quite powerful and entertaining, especially for playing
video games. But whenever we had guests over for dinner and they tried the
goggles on, they lost interest after less than half an hour because the
experience was exhausting and they felt socially disconnected from the group.
Would it matter if they could twist the dial on the front of
the headset to see into the real world while wearing it? I suspect it would
still feel isolating, because they would probably be the only person in a room
wearing one.
But more important to me was the idea of connecting with
others, including family members and colleagues, through Apple headsets.
“Your mom is getting old,” I said to my wife. “When you’re
FaceTiming with her, would you rather see her deepfake digital avatar, or a
crummier video call where she’s holding the phone camera up to her face at an
unflattering angle?”
“The latter,” she said without hesitation. “That’s real.
Although, I’d much rather see her in person.”