This week, a look at the developments in smart glasses and early spatial computing, and the ways the industry could benefit.
Google Glass is remembered by many fashion fans as a sci-fi-like accessory that made a gimmicky appearance on a 2012 Diane von Furstenberg runway. But fashion shouldn’t be as quick to dismiss the next iteration of smart glasses. That’s because multimodal AI is taking them to the next level.
On September 27, alongside the Quest 3 mixed-reality headset, Meta debuted Ray-Ban Meta glasses embedded with an AI assistant. The style was created in partnership with Ray-Ban owner EssilorLuxottica. Days prior, on September 20, Amazon announced fashion-forward Echo Frames in partnership with Italian eyewear giant Safilo and its Carrera brand. Both sets of glasses are currently available for pre-order, with the Echo Frames expected to go on sale in October.
The Meta glasses will allow people to capture hands-free images and videos, play music, and make phone calls using Bluetooth — the call feature is called Meta View. Compared to Meta’s prior model, the glasses also have better video quality, more advanced Instagram and Facebook livestreaming options, and an improved battery and speakers.
“The content opportunities are very exciting, especially in the post-Covid shift to brands wanting to do more experiences. [They’re doing more] experiential events, trips, and immersive AR experiences related to their products and services,” said Permele Doyle, president of influencer agency Billion Dollar Boy. The agency was tapped for the influencer campaign for Ray-Ban’s first smart glasses, dubbed Ray-Ban Stories, launched in partnership with Meta in the summer of 2022.
The Stories launch came with a record amount of exclusive content for smart glasses, according to Billion Dollar Boy. However, it’s been reported that Meta sold only one-third of its 300,000-unit sales goal in the glasses’ first seven months on sale.
And even with the upgraded AI assistant integration, it’s unclear whether sales of the Ray-Ban Meta glasses will be higher. With the new technology, users will be able to edit photos, translate text and interact with their layered physical-digital environment. Creators are an obvious target for the glasses, thanks to the camera and streaming integration. For TikTok-forward influencers like Tube Girl Sabrina Bahsoon, the glasses provide an opportunity to take followers to fashion shows, for example. And arguably, such creators stand to market the product better than earlier smart glasses campaigns did. In 2022, Kendall Jenner was enlisted by Ray-Ban to promote the Stories style on Instagram.
“There is an opportunity for brands to utilize these smart glasses to show a full experience and have it be more interactive,” said Doyle. “We do quite a few influencer trips to fashion week, so we see potential in an influencer recording the experience live.”
The launch from Meta is the company’s latest move in its push toward multimodal AI, following its introduction of consumer AI chatbots and its large language model, Llama 2. Multimodal AI is a type of artificial intelligence that can process, understand and generate responses across more than one type of data.
“By going multimodal next year, they are really starting to get into spatial computing,” said Cathy Hackl, chief futurist at advisory company Journey. “This is where these devices are able to see, understand and mesh the world around you, which is what the Meta Quest 3 is doing already.”
Because cameras are integrated into the Ray-Ban Meta glasses, the AI will be able to 3D-scan and understand the wearer’s surroundings and location. That opens up new possibilities for brands to understand their customers and for customers to get a more personalized experience. “With location scanning, brands can look at those locations and offer promotions and personalized recommendations when customers are near a shop, [for example]. Or they can offer styling advice,” said Doyle.
As with prior iterations, AI capabilities seem to be the biggest selling point for this next generation of smart glasses, but fashion’s design influence is gaining traction in the space. For Amazon, the latest update to its Echo Frames is focused on the partnership with Carrera, as well as new audio advancements, improved access to the Alexa assistant and an extended battery life. Carrera’s owner, Safilo, also holds the eyewear licenses for Tommy Hilfiger, Etro, Kate Spade and Stuart Weitzman.
“People wanted to continue to use the [glasses] for their entire day,” said Jean Wang, the director of Amazon’s smart eyewear division, pointing to early customer feedback about style and battery life. “People want the glasses to feel like they are part of the world and not necessarily blocking the world, especially in critical moments.”
According to Amazon’s data on its smart glasses users, Echo Frames owners have used the glasses more, on average, over the last year than ever before, outpacing usage of Meta’s Stories. Amazon declined to share how many glasses have been sold. The wearability of the AI smart glasses increases as more big fashion names collaborate on styles, according to Wang.
“It’s important for people to feel comfortable, as well as feel empowered with technology,” said Wang. “When we consider adding functions and features, it’s about making sure that we maintain that high bar of social acceptability, comfort and having people wear [a style] — not just for the looks, but also for what it does.”
For fashion brands, a partnership is often needed to integrate their product with a useful AI tool. Working with Safilo, Amazon provided the glasses’ technology, the initial design ideas and the necessary distribution services. Safilo, meanwhile, finalized the design based on existing styles, handled production and helped expand the physical and digital distribution.
“You’re going to continue to see fashion houses play in the [smart glasses] space. They’ve already been playing in it for a long time, but it’s been behind the scenes,” said Hackl. “Eventually, it’s going to be a device that we’re all going to want to wear — like our mobile phones, watches or Birkin bags — that signals something about us.”
Smart glasses also provide brands with an opportunity for more inclusivity, allowing for clearer communication with people with hearing or sight impairments. For example, thanks to the glasses’ design, listening to music or shopping via Alexa can easily be done while wearing a hearing aid. Audio commerce is a growing area for companies, although, so far, Amazon’s Alexa integrations are limited to reordering items.
As the technology develops, Amazon could integrate optical technology for multimodal AI into its glasses, allowing them to recognize and analyze their surroundings and facilitate in-store purchases through Amazon. That’s especially plausible since, back in January, Amazon Fashion partnered with Snap on AR shopping for traditional eyewear. However, Snap closed its AR enterprise services just six months after the announcement.
But the brand opportunity is not all about the hardware. “Fashion brands shouldn’t [just focus] on partnering with a hardware provider like Meta,” said Hackl. “They should also start to understand what the behaviors of consumers and wearers, especially younger Gen Z and Gen Alpha, are going to be [around smart glasses], in terms of how they consume content and how they engage with the world via these new devices.”
As technologies have become more readily available, the fashion industry has grown increasingly focused on data and its importance for acquiring and understanding customers. Although privacy considerations remain around environment scanning and AI, new data-gathering opportunities are likely. The wave of new smart glasses could mark a move toward a better understanding of the customer, while also providing new content and product opportunities.
“For any brand, whether it’s virtual or physical, understanding these early stages of customer behavior and computer-human interaction is going to give tons of data, depending on the data collection practices, about how consumers’ use of wearables is going to develop as we move forward,” said Hackl. And at this point, having seen the early stages of the internet and e-commerce, many brands know the importance of an early-mover advantage.
Inside our coverage
The Melissa playbook for reaching new shoppers with product collaborations
With Coperni’s new CD-Player Swipe bag, the dual-function fashion trend continues to thrive
With 95% of NFTs worthless, luxury fashion is finding other uses for web3