The Google Search Labs platform has introduced a new virtual try-on feature. This function allows shoppers to use their own full-length photo as a “mannequin” to see how a selected outfit might look on them. The tool is currently available in test mode and is accessible through the Shopping tab in Google Search — a specialized search format that displays results as a product catalog.
When searching for clothes, users will see options available for purchase along with the ability to upload a photo. When the user selects the Try On button, the uploaded image is digitally dressed in the chosen outfit. The system currently supports clothing categories such as dresses, shirts, trousers, and skirts.
How It Works and Current Limitations
The try-on feature is powered by artificial intelligence models that account for human body features and clothing behavior — including how different fabrics stretch, wrinkle, or drape depending on the body's pose. Innovative as it is, the feature does not yet support all brands: it works only with sellers and manufacturers who have agreed to participate in the program.
Eric Schwartz, a journalist at TechRadar, tested the new feature using a jumpsuit reminiscent of Elvis Presley’s style. The resulting image contained visible signs of AI processing, such as small visual artifacts, but these did not significantly affect the overall impression of the outfit.
What to Expect Next
At this stage, there is no confirmed date for when the virtual try-on feature will move beyond the test phase, notes NIX Solutions. However, the technology marks a step forward in making online shopping more interactive and personalized. As this feature continues to develop, we'll keep you updated on any changes or broader availability.
For now, it serves as a helpful preview tool, offering users a general sense of how different items might look when worn — a convenience that may become an essential part of online fashion retail in the future.