Matrix effects, routes on your specs: what next for augmented reality?

  • 12/12/2021

Trying on clothes without stepping into a changing room, seeing your menu choices in 3D, viewing an art gallery’s contents outdoors and, of course, catching a Pokémon. This is the world of augmented reality, and one of its key players announced further additions last week. The owner of Snapchat, the app that offers those quirky animal-face selfies, will give developers the ability to transform any local landscape or building. A user could scan Big Ben so it can be turned into a wobbly landmark when seen through a phone, or even put the Matrix in their living room. While Mark Zuckerberg talks about an alternative world in the metaverse, the likes of Snapchat are getting on with transforming this one.

“I think we’re starting to see some real fun around being able to play with the power to manipulate, change and edit our physical environments,” says Bobby Murphy, a co-founder and the chief technology officer of Snap.

For all the excited talk about the metaverse – the concept of a virtual world populated by digital representations, or avatars, of ourselves – augmented reality (AR) has been in our lives for years. AR is where a digital layer is put over reality, normally via your phone – although Snap recently unveiled whizzy spectacles that allow a similar experience. Think Pokémon Go, where players leave their living rooms to capture the characters outdoors, or Snapchat’s cartoon cat selfies.

The next step for AR, and where it starts to bump into woolly notions of the metaverse, is glasses that put a digital overlay on the world – as well as allowing users to film what they see (while igniting a whole load of privacy concerns). This year Snap unveiled spectacles with a proper AR overlay. However, these glasses are for a select few developers only and not for public sale, and have clear shortcomings such as a 30-minute battery life. Glasses are where analysts see AR really taking off.
Leo Gebbie, an AR expert at the research firm CCS Insight, sees a future where AR glasses screen street directions straight on to a pedestrian’s lenses. “Rather than looking down at your smartphone … you’ll have a nice display in front of you that simply says ‘you are going to turn left again and then get on the bus that comes around the corner’. That’ll all be provided to you in real time rather than requiring you to have to interrupt your phone to do that,” says Gebbie. He has tried the prototype Snap spectacles and says they have an “impressive” AR display but are “fairly heavy”, don’t look like normal glasses, and the battery life is too short. For AR glasses to really work, he says, they need to be lightweight, fashionable and clear various technical hurdles.

CCS Insight forecasts that 71m virtual and augmented reality devices will be sold in 2025, with a market value of $22bn (£16.7bn), compared with 11m devices this year. As soon as the right glasses come along, a wave of products will arrive, in the same way that the advent of the iPhone in 2007 unleashed the app economy, says Giorgio Tarraf, the technology intelligence director at the trend forecasting business L’Atelier BNP Paribas. “Once we have the right device that can instil trust and comfort, that’s when the entire ecosystem of applications is going to emerge, just like the smartphone when the iPhone came out,” he says.

Snapchat has 306 million daily users, who between them play with the company’s “lenses” – the layers of augmented reality that they can view through their phones – more than 6bn times a day. Snapchat, via its 250,000 lens creators, now offers 2.5m different AR lenses. New features unveiled at Snap’s annual lens fest last week – a virtual gathering of lens developers, or “creators” – included “world mesh” software that allows developers to scan the geometry of a room and put sophisticated AR overlays, such as the scrolling green characters from the Matrix, on to the room.
It is also offering its “landmarker” product – which allows lens creators to impose images on buildings and landscapes – to more developers. Previously, Snap had to do the initial 3D scan that a developer then built upon, as it did with Big Ben, but now any developer can do the initial scan with an iPhone.

The climate crisis is also becoming a feature of AR. Other lenses released by Snap this year include viewing how London’s Design Museum will be affected by extreme weather, and peering into the intricate ecosystem of the Great Barrier Reef.

As with many technological advances, Murphy says the aim is for people not to notice that they are taking part in an AR environment. “The way that we try to build our technology and design our experiences is almost in a way that we don’t want any members of our community to really even be aware of the depths of the technology that they’re using,” he says. “It’s a much more embedded part of our culture than people realise.”
