Software and development tools
15.07.2023 07:40

Is this the future of matching clothes? Google says it is

Google, which relies on generative artificial intelligence whenever possible, is rolling out a new shopping feature that shows clothes on real-life models.
Part of a broader set of Google Shopping updates rolling out in the coming weeks, Google's virtual try-on tool takes an image of a garment and tries to predict how it will drape, fold, fit, stretch and form creases and shadows on a set of real human models in different poses.

Virtual try-on is powered by a new diffusion-based artificial intelligence model that Google developed in-house. Diffusion models, the best-known examples being the text-to-image generators Stable Diffusion and DALL-E 2, learn to gradually denoise an image that starts out as pure noise, moving it step by step closer to the target.
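The denoising idea can be sketched in a few lines. This is a toy illustration, not Google's model: the "denoiser" here simply nudges a noisy image a fraction of the way toward a known target at each step, whereas a real diffusion model uses a trained neural network to predict and remove noise.

```python
import numpy as np

def denoise_step(image, target, step, total_steps):
    """One toy reverse-diffusion step: move the noisy image a fraction
    of the way toward the (here, known) clean target. In a real model
    this correction comes from a trained noise-prediction network."""
    alpha = 1.0 / (total_steps - step)  # corrections grow larger near the end
    return image + alpha * (target - image)

def sample(target, total_steps=50, seed=0):
    """Run the full reverse process, starting from pure Gaussian noise."""
    rng = np.random.default_rng(seed)
    image = rng.standard_normal(target.shape)  # start: all noise
    for step in range(total_steps):
        image = denoise_step(image, target, step, total_steps)
    return image

target = np.full((8, 8), 0.5)  # stand-in for a clean garment image
result = sample(target)
print(np.abs(result - target).max())  # after all steps, essentially zero
```

The final step uses alpha = 1, so the toy process lands exactly on the target; a real model only approximates the data distribution it was trained on.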

Google trained the model on pairs of images, each showing a person wearing a garment in two different poses, for example, one image of someone wearing a shirt seen from the side and another of the same person seen from the front. To make the model more robust (i.e., resistant to visual errors such as wrinkles that look deformed or unnatural), the process was repeated with randomly paired garment and person images.
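The data setup described above can be sketched as follows. The file names and structure are illustrative assumptions, not Google's actual pipeline: each aligned pair shows the same person and garment in two poses, and an augmentation pass re-pairs garments and people at random, as the article describes.

```python
import random

# Hypothetical aligned training pairs: (side-view image, front-view image)
# of the same person wearing the same garment. Names are made up.
aligned_pairs = [
    ("person1_shirtA_side.jpg", "person1_shirtA_front.jpg"),
    ("person2_shirtB_side.jpg", "person2_shirtB_front.jpg"),
    ("person3_shirtC_side.jpg", "person3_shirtC_front.jpg"),
]

def random_repairing(pairs, seed=0):
    """Augmentation pass: shuffle one side of the pairing so the model
    also sees random garment/person combinations, which the article says
    was done to reduce artifacts like deformed-looking wrinkles."""
    rng = random.Random(seed)
    garments = [a for a, _ in pairs]
    people = [b for _, b in pairs]
    rng.shuffle(people)
    return list(zip(garments, people))

augmented = random_repairing(aligned_pairs)
print(augmented)
```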

Over the coming month, US shoppers using Google Shopping will be able to virtually try on women's tops from brands such as Anthropologie, Everlane, H&M and LOFT; a new "Try On" badge marks eligible items in Google Search. Men's shirts will follow later this year.

"When you try on clothes in a store, you can immediately tell if they are right for you," he said Lilian Rincon, senior director of consumer shopping products at Google, wrote in a blog post. It cites research showing that 42 % online shoppers believe that models in online stores do not represent the real picture, while 59 % feel dissatisfied with a product they bought online because it looked different on them than they expected.

Virtually trying on clothes is nothing new. Amazon and Adobe have been experimenting with generative clothing modeling for some time, as has Walmart, which since last year has offered an online feature that uses customers' own photos to model clothing.

Google has experimented with virtual try-on before, partnering with L'Oréal, Estée Lauder, MAC Cosmetics, Black Opal and Charlotte Tilbury to let users preview makeup shades on models with a range of skin tones. More broadly, generative artificial intelligence is making ever deeper inroads into the fashion industry, and it has been met with opposition from models who say it further exacerbates inequalities that have long existed in the industry.

In a blog post, Rincon pointed out that Google chose to use real models, and a diverse selection at that, spanning sizes XXS to 4XL and representing a variety of ethnicities, skin tones, body shapes and hair types. However, she did not answer a question about whether the new try-on feature will mean fewer photo shoots for models in the future. Alongside the virtual try-on feature, Google is also introducing filtering options for clothing searches. Yes, you guessed it, these too are powered by AI and visual matching algorithms.

