How Google’s “Try It On” changes shopping yet again

At Google I/O 2025, search officially became a stylist.
With AI-powered overviews, personalized product suggestions, and a new “Try It On” feature that lets shoppers see clothing on their own body with a simple upload, Google has rearranged the traditional customer journey once again.
Discovery, comparison, and decision-making can now happen in a single AI-generated moment. Less browsing. Less site hopping. Just a styled suggestion and a swipe.
For brands, this isn’t a feature drop. It’s a fundamental shift in how people shop—and how your brand needs to show up.
The era of curated commerce
Historically, brands fought to rank, get found, win the click. But that era is fading fast. In this new AI-mediated experience, the question isn’t “Will they find us?” It’s “Will we be recommended?”
Google’s AI is no longer just indexing product pages. It’s interpreting taste and styling products. It’s surfacing what it believes users will like in the moment. And it’s doing that using your brand’s assets (such as images, copy, and metadata) as fuel.
The battleground has shifted from SEO to GEO to suggestions. And the assets you produce today will train the model that decides tomorrow.
From storefront to endless interface
Where does your storefront start, and where does it end?
Increasingly, the answer is: wherever the shopper is.
With features like Google’s “Buy for Me,” which tracks prices, compares products, and can even complete checkouts on your behalf, the storefront is dissolving into a network of agent-driven moments. What used to be a linear journey (click an ad, land on a product detail page, or PDP, and convert) is now a distributed choreography orchestrated by AI.
At DEPT®, we refer to this shift as “Endless Interfaces”: a new commerce paradigm where transactions are embedded everywhere, and the brand experience is reconstructed in places you don’t control. TikTok. WhatsApp. Google Search. Your storefront is being transformed, and your product data, content, and user experience (UX) need to keep pace.
In this landscape, the role of the PDP is extended, not erased. It must act like an API for your brand: modular, machine-readable, and expressive enough to fuel recommendations and autonomous agents. Because, increasingly, shoppers won’t browse—they’ll ask an assistant to decide for them.
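What a “machine-readable PDP” might look like in practice: a minimal sketch of schema.org-style JSON-LD product markup, built in Python. The product, image URLs, and brand name are illustrative placeholders, not a real feed.

```python
import json

# A minimal sketch: exposing a product detail page as machine-readable,
# schema.org-style JSON-LD so AI agents can parse name, price, and
# availability. All values below are illustrative placeholders.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Relaxed-Fit Linen Shirt",
    "image": [
        "https://example.com/img/shirt-front.jpg",
        "https://example.com/img/shirt-back.jpg",
    ],
    "description": "Breathable linen shirt with a relaxed drape, cut for layering.",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "79.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Serialized, this is the payload a crawler or shopping agent would consume.
print(json.dumps(product_jsonld, indent=2))
```

Embedded in a `<script type="application/ld+json">` tag, markup like this is what lets an agent answer “is it in stock, and for how much?” without rendering your page.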
If you’re only optimizing for your own domain, you’re already behind the curve.
Designing for the style, not the scroll
In this new landscape, brands are being styled into someone’s world.
A well-shot campaign photo is both a creative win and a strategic input. A lookbook becomes machine-readable proof of how your product fits into a lifestyle. Flat lays, 360s, UGC, and model diversity suddenly matter more, not because they impress a shopper directly, but because they feed the model a richer understanding of who your product is for.
It’s no longer enough for your content to be beautiful. It has to be describable by a machine.

What does this mean for brands?
When Google sets a new standard for shopping UX, it both wins user trust and establishes a benchmark for everyone else.
Soon, shoppers won’t just appreciate immersive try-on and personalized suggestions; they’ll expect them. And if Google can deliver that upstream, they’ll wonder why your site can’t deliver it downstream. That raises important questions: Should you partner with try-on technology providers? Can Google’s tech be extended into your owned experiences? Are your current content and data infrastructures ready?
There’s still hesitation in the industry, and rightly so: fit, fabric nuance, and brand identity all matter. Some e-commerce leaders have questioned whether virtual try-on is just a gimmick.
But the conversation has changed, because this isn’t about jumping on a trend. We’re entering a time when styling, fitting, and buying are no longer distinct phases. They’re simultaneous, and often mediated by machines.
So yes, fidelity matters. But so does being present in the experience at all. The brands that win won’t be the ones with the most perfect simulation. They’ll be the ones who show up early, learn fast, and make their content work across both human and AI touchpoints.
How to leverage Google’s “Try It On” feature
Invest in accurate, detailed 3D product data. The foundation of any virtual try-on experience is a high-fidelity 3D model of your products. Creating a pretty digital copy isn’t enough. You need to capture:
- Precise dimensions (length, width, circumference)
- Fabric properties such as stretch, weight, and drape
- Construction details that affect fit (seams, cuts, closures)
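The attributes listed above can be captured as a structured record. A hedged sketch follows; the field names, units, and the `GarmentSpec` type are assumptions for illustration, not an established try-on schema.

```python
from dataclasses import dataclass, asdict

# Sketch of the garment attributes a try-on pipeline needs: precise
# dimensions, fabric behavior, and fit-affecting construction details.
# Field names and units are illustrative assumptions.
@dataclass
class GarmentSpec:
    sku: str
    length_cm: float
    chest_width_cm: float
    sleeve_circumference_cm: float
    fabric_stretch_pct: float    # elasticity under a standard pull test
    fabric_weight_gsm: float     # grams per square meter
    drape_coefficient: float     # 0.0 = stiff, 1.0 = fully fluid
    closures: list[str]          # seams/cuts/closures that affect fit

shirt = GarmentSpec(
    sku="SHIRT-001",
    length_cm=74.0,
    chest_width_cm=58.0,
    sleeve_circumference_cm=38.0,
    fabric_stretch_pct=4.0,
    fabric_weight_gsm=185.0,
    drape_coefficient=0.62,
    closures=["button placket", "button cuffs"],
)

# asdict() makes the record trivially serializable for a product feed.
print(asdict(shirt))
```

The point of the dataclass is less the code than the discipline: every field here is something a render engine needs and a flat product photo cannot supply.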
Standardize & enhance your size and fit metadata. Every brand’s sizing is unique. A medium on one label might feel like a small or large on another. AI “Try It On” features rely heavily on metadata to understand these nuances. You should:
- Maintain detailed, accurate size charts and garment measurements
- Aggregate and expose “true-to-size” ratings and customer fit feedback
- Use standardized terminology and tagging so AI can interpret data consistently
Leverage customer fit feedback as AI training data. User reviews and returns data hold a treasure trove of insights into fit accuracy. Feeding this data back into AI models helps improve size recommendations and virtual try-on accuracy.
- Collect structured feedback about fit and comfort (e.g., “Runs small,” “Perfect fit”)
- Monitor return reasons related to sizing and fit
- Partner with AI providers to anonymize and utilize this data for training algorithms
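The structured feedback described above can be rolled up into a simple fit signal. A sketch, assuming a three-label taxonomy and a naive averaging rule; both are illustrative choices, not a production scoring method:

```python
from collections import Counter

# Sketch: aggregating structured fit feedback into a true-to-size signal.
# The label set and the -1/0/+1 scoring rule are illustrative assumptions.
FIT_SCORE = {"runs small": -1, "true to size": 0, "runs large": 1}

def fit_summary(reviews: list[str]) -> dict:
    """Count recognized fit labels and compute an average fit bias."""
    counts = Counter(r.lower() for r in reviews if r.lower() in FIT_SCORE)
    total = sum(counts.values())
    bias = (
        sum(FIT_SCORE[label] * n for label, n in counts.items()) / total
        if total else 0.0
    )
    # bias < 0 => product tends to run small; bias > 0 => tends to run large
    return {"counts": dict(counts), "bias": bias}

reviews = ["Runs small", "True to size", "True to size", "Runs small", "Runs large"]
print(fit_summary(reviews))  # bias of -0.2: leans "runs small"
```

Fed back to a size-recommendation model (suitably anonymized), a per-SKU bias like this is exactly the correction that turns a static size chart into a learned one.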
Prepare for new creative workflows. Producing assets for AI styling and virtual try-on requires new workflows and collaboration between product design, marketing, and technology teams. This includes:
- Creating modular, machine-readable content assets
- Regularly updating 3D product scans as inventory or designs evolve
- Training teams on metadata standards and AI capabilities
AI is no longer simply powering search; it’s powering style. The brands that adapt their commerce experiences won’t just be found. They’ll be favored.