Something you can see - "Over the last decade, AR hardware designers have laid the groundwork for a new generation of mass-market products, even as technical hangups still limit the technology's viability.
Now imagine the same world — but your glasses scan every conversation to personalize a barrage of advertising. Some locations are replete with helpful holographic instructions, while in other places, neglect and poor connectivity make them few and far between. A sophisticated facial recognition system tracks every stranger you encounter... and, in turn, lets those strangers track your every move.
These are a few of the best- and worst-case scenarios for augmented reality, a technology that some of the world’s biggest tech companies are spending billions to promote as the future of computing.
> Facebook founder Mark Zuckerberg predicted in 2016 that televisions and phones would be replaced by holographic glasses.
> Apple CEO Tim Cook called AR “a big idea, like the smartphone.”
> Microsoft envisioned people watching the Super Bowl in its HoloLens headset.
> Google launched its ambitious Glass platform as a potential successor to phones, then helped propel the AR startup Magic Leap toward billions of dollars in investments.
> More recently, telecoms have partnered with AR companies like the Chinese startup Nreal, hoping high-bandwidth holograms will create a demand for 5G networks.
These companies’ products — as well as those of other major players, including Snap, Vuzix, and Niantic — often look very different. But most of them promise a uniquely powerful combination of three features.
- Their hardware is wearable, hands-free, and potentially always on — you don’t have to grab a device and put it away when you’re done using it
- Their images and audio can blend with or compensate for normal sensory perception of the world, rather than being confined to a discrete, self-contained screen
- Their sensors and software can collect and analyze huge amounts of information about their surroundings — through geolocation and depth sensing, computer vision programs, or intimate biometric technology like eye-tracking cameras
Over the past decade, nobody has managed to merge these capabilities into a mainstream consumer device. Most glasses are bulky, and the images they produce are shaky, transparent, or cut off by a limited field of view. Nobody has developed a surefire way to interact with them either, despite experiments with voice controls, finger tracking, and handheld hardware... Even so, we've gotten hints of the medium's power and challenges — and even skeptics of the tech should pay attention to them.
[. . .]
Writer and researcher Erica Neely says that laws and social norms aren’t prepared for how AR could affect physical space. “I think we’re kind of frantically running behind the technology,” she tells The Verge. In 2019, Neely wrote about the issues that Pokémon Go had exposed around augmented locations. Those issues mostly haven’t been settled, she says. And dedicated AR hardware will only intensify them.
Smartphone cameras — along with digital touchup apps like FaceTune and sophisticated image searches like Snap Scan and Google Lens — have already complicated our relationships with the offline world. But AR glasses could add an ease and ubiquity that our phones can’t manage. “A phone-based app you have to actually go to,” says Neely. “You are making a conscious choice to engage with it.” Glasses remove even the light friction of unlocking your screen and deliberately looking through a camera lens.
Augmentation doesn't just mean adding things to a wearer's surroundings. It also means letting a computing platform capture and analyze them without other people's consent...
Take facial recognition — a looming crisis at the heart of AR. Smartphone apps have used facial recognition for years to tag and sort people, and one of the most intuitive AR glasses applications is simply getting reminded of people’s names (as well as other background information like where you met them). It’s also a potential privacy disaster.
[. . .] But the EFF's concern wasn't premature. Andrew Bosworth, an executive at Facebook and Meta, reportedly told employees the company is weighing the costs and benefits of facial recognition for its Project Aria glasses, calling it possibly "the thorniest issue" in AR. And outside AR, some people are pushing for a near-total ban on the technology. Researcher Luke Stark has likened facial recognition to "the plutonium of AI," saying any potential upsides are far eclipsed by its social harms. AR is a ready-made testbed for the widespread public use of facial recognition, and by the time any potential harms are obvious, it might be too late to fix them.
[. . .]
AR technology also isn’t going to develop in a vacuum. Despite talking up AR glasses’ novelty, figures like Zuckerberg and Cook still describe people using them almost exactly the way they use smartphones: as devices they carry around casually all the time. But ubiquitous, short-lived electronics like smartphones have imposed a steep environmental cost on the planet, and rolling out AR could add billions more devices that are replaced as readily as phones and powered by vast amounts of cloud computing infrastructure. “What are the environmental implications? And do those make sense?” Friedman asks. “If not, I think really the question for us is, what are the really critical areas in which some kind of augmented reality technology really does bring a substantial benefit to people?”
Either way, the past 10 years of tech have been a long struggle to manage crises once they’re already at a boiling point. Meanwhile, AR glasses’ status as a long-awaited, not-quite-there dream can buy us time to figure out what they can do for the world — and whether we actually want them."