This story is part of Glossy’s series breaking down the big conversations at Shoptalk.
Soon, if you can dream it, you can wear it.
That’s according to executives from Google and ThredUp, who spoke with Glossy at Shoptalk about their companies’ pending, AI-enabled site search updates.
Building on Monday morning’s on-stage discussion featuring Google’s global head of commerce, Maria Renz, a Google representative said the company sees great opportunity in its new AI image-generation tool, internally dubbed “Dreamer.” Limited only by a shopper’s imagination, the tool has the potential to change the way people shop in categories including apparel and home goods, they said.
To use Dreamer, which is currently live but will become more accessible as the year progresses, shoppers can search for any item they’ve decided they want, whether or not they’ve seen it or know it exists. For example, they might search for a “light-colored, cropped, quilted spring jacket.” AI will then generate three different images based on that description, and shoppable options visually similar to those AI images will populate under each.
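Google has not detailed how Dreamer works under the hood, but the flow it describes (generate images from a text prompt, then surface visually similar products) can be approximated with open-source building blocks. The sketch below is purely illustrative: it uses Stable Diffusion as a stand-in for the image-generation step and CLIP embeddings for the similarity matching, and the product catalog is a hypothetical placeholder.

```python
# Illustrative sketch only: Google has not published Dreamer's internals.
# Open-source stand-ins approximate the described flow: a text-to-image model
# generates candidate images, then each image is matched against a
# (hypothetical) catalog of product embeddings in a shared image-text space.

import numpy as np
import torch
from diffusers import StableDiffusionPipeline
from sentence_transformers import SentenceTransformer

prompt = "light-colored, cropped, quilted spring jacket"

# 1. Generate three candidate images from the shopper's description.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
candidates = pipe(prompt, num_images_per_prompt=3).images  # list of PIL images

# 2. Embed the generated images with a joint image-text model (CLIP).
clip = SentenceTransformer("clip-ViT-B-32")
candidate_vecs = clip.encode(candidates, normalize_embeddings=True)

# 3. Match candidates against catalog embeddings. The catalog below is a toy
#    stand-in; in practice every product photo in the feed would be embedded.
catalog = ["cream quilted cropped jacket", "beige puffer vest", "navy trench coat"]
catalog_vecs = clip.encode(catalog, normalize_embeddings=True)

for i, vec in enumerate(candidate_vecs):
    scores = catalog_vecs @ vec  # cosine similarity (vectors are normalized)
    top = np.argsort(-scores)[:3]
    print(f"Candidate image {i + 1}:", [catalog[j] for j in top])
```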
The Google representative noted that the Dreamer tool is especially handy when someone is attempting to build an outfit based on a single piece they bought — they imagined how they’d complete the look, but never carved out the time to do so.
Because such large language models are only as good as the conversation the consumer is willing to have, the consumer will shape their own shopping experience: The more detailed the inquiry, the more accurate the product portrayal of their fashion dream.
Likewise, according to co-founder and CEO James Reinhart, ThredUp aims to provide customers with outfit-building assistance based on their inspirations and ideas.
“We’re working on a generative AI tool that will allow you to plan outfits,” he said. Someone could search for a look that is “Las Vegas conference-chic,” for example, and ThredUp’s AI would generate an image of a fitting outfit. At the same time, ThredUp products that are visually similar to that outfit’s pieces would be served up. Shoppers could then refine their search to achieve more desirable, shoppable results. For example, if the original search results were too drab, the shopper could update their prompt to: “Las Vegas conference-chic with a pop of color.” They could also use the tool to find styling inspiration for a look they already own, with a prompt such as, “Gray pinstripe suit with trendy shoes and a modern bag.”
Already, as of early this month, ThredUp has updated its search functionality to encompass visual language. “It’s 100 times better than the search we had before,” Reinhart said. For example, one can now search “St. Patrick’s Day,” which formerly turned up zero results, to find “thousands” of results encompassing items featuring leprechauns, shamrocks and shades of green. Likewise, searches like “Gwyneth Paltrow style” and even “Dora the Explorer” will now spit out results fitting the bill. For its part, ThredUp is using the new capability to develop curations of super-timely trends, which it features on its homepage as a “service” to its shoppers, Reinhart said.
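ThredUp has not described how the upgraded search is built, but the behavior Reinhart describes, where “St. Patrick’s Day” surfaces shamrocks and shades of green despite sharing no keywords with the listings, is characteristic of embedding-based semantic retrieval. A minimal, hypothetical sketch of that pattern:

```python
# Hypothetical sketch of embedding-based semantic search. ThredUp has not
# described its implementation; this only illustrates how a query like
# "St. Patrick's Day" can match listings with no keyword overlap.

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# Toy catalog; in production this would be millions of listings, embedded
# from titles, descriptions and/or product photos via an image-text model.
listings = [
    "green shamrock print mini dress",
    "emerald knit cardigan with clover buttons",
    "gray pinstripe blazer",
    "black leather ankle boots",
]
listing_vecs = model.encode(listings, normalize_embeddings=True)

query_vec = model.encode("St. Patrick's Day outfit", normalize_embeddings=True)
scores = listing_vecs @ query_vec  # cosine similarity (vectors are normalized)

# Rank listings by semantic relevance to the query.
for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {listings[idx]}")
```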
“It makes us much more relevant to the trend-seeking shopper than we were just a month ago,” he said, adding, “I actually think AI is under-hyped. It has profound implications for a lot of businesses — but especially for ours, because we’ve got millions of unique products.”
In general, bringing inspiration beyond the store and into the realm of e-commerce is game-changing, the Google representative said. In short, AI now allows online search to be a top-of-funnel tool, whereas it was formerly confined to the mid and lower funnel.
There are several ways Google is already tapping into the new opportunity, including with its Google Lens visual search tool, which currently handles 12 billion visual searches per month. The representative called Google Lens a shopper’s superpower, in that it allows them to not only find and shop a product featured in an image, but also compare prices across available options in full-price and resale channels. An extension of Google Lens called Circle to Search, which launched for select phone models in January, allows consumers to circle an online image they like in any app, discover what it is and buy it without leaving the app.
Among its other shopping advancements, Google is now using diffusion-based AI to enable a realistic version of virtual try-on. The technology breaks a pictured item down to the pixel level to show details like how a fabric folds on the body.
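Google’s production try-on model is not publicly available, but the same general idea can be sketched with open diffusion tools: mask the garment region of a photo and let an inpainting model redraw it from a text description at the pixel level. Everything below, including the file paths and prompt, is a hypothetical illustration rather than Google’s actual method.

```python
# Illustrative only: generic diffusion inpainting, not Google's try-on model.
# It shows the general idea of regenerating the clothing region of a photo
# at the pixel level with a diffusion model.

import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Hypothetical inputs: a photo of a person and a mask covering the garment area.
person = Image.open("model_photo.png").convert("RGB")
garment_mask = Image.open("garment_mask.png").convert("RGB")  # white = redraw

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

# The diffusion model fills in the masked region, rendering details such as
# how the quilted fabric drapes against the body.
result = pipe(
    prompt="light-colored, cropped, quilted spring jacket, natural drape",
    image=person,
    mask_image=garment_mask,
).images[0]

result.save("try_on_preview.png")
```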
Moving forward, a product’s search results could incorporate Google Maps, showing all nearby stores where the item can readily be shopped, the Google representative said. However, they noted, that would require next-level sophistication in terms of retailers’ inventory management.