
Enhancing shopping experience on Myntra & enabling users to make confident decisions using AR and Smart Suggest: a UI/UX Case study.

Disclaimer: This is a personal project and I’m not associated with Myntra in any way.

Context:

Myntra is India’s largest fashion e-commerce platform, serving over 50 million people across metro and tier 2–3 cities.

I love buying clothes and shoes and sometimes just looking at them 😅, both online and offline, and when it came to choosing a design project to make a case study on, this is where I found my calling ❤.

Though Myntra already has a really smooth user experience, it’s our duty as designers to see the glass half empty.
To explore the scope of a better shopping experience, I went on a small journey.

User Research:

I interviewed three groups of users over phone calls, 11 users in total, of which

  • 4 were frequent users
  • 4 were occasional users
  • 3 were new users

Pain points:

  1. Products look different on users than they expected when they ordered.
  2. It’s hard to judge products just by looking at the models.
  3. It’s great to see so many options at first, but choosing gets harder with so many of them (choice overload).

Problem Statement: Find a solution to help users get an idea of how apparel would look on them, and a way to lessen users’ cognitive load while shopping.

How are users working around these problems today?

  • Order > Try > Keep/Return is the go-to way to solve issues of size or likeability.
    Some users order multiple items they like, keep the ones that work, and return the rest.
    Either way, the user has to place a return.
  • In the case of choice overload, users either take a friend’s help or simply order items similar to what they always wear to avoid confusion.
    But this way they cannot explore what else could look good on them, and they keep repeating the same set of colors and styles.

These behaviors likely drive a huge chunk of return orders, which of course come with a hefty logistics cost for Myntra.

So if we can solve these, we’d not only improve user experience and satisfaction, but could also significantly reduce return orders.

Now, what can we do to solve these?
Let’s peek into how our friends are helping their users decide better. 😉

Competitive Research:

After competitive research, I zeroed in on two solutions likely to address these pain points, so we’re designing two features:

  1. Let users try items virtually and see how they look on them. Let’s call it “Try On”.
  2. Introduce a more personalized browsing experience that suggests products they’re more likely to buy.
    Let’s call this one “Smart Suggest”.

1. Try-On 🥼

At the end of the day, the biggest barrier to buying a product is wondering what it will look like on us.

‘Try On’ is a feature that lets users try clothing, shoes, and accessories directly on themselves using AR technology. It can give users a shopping experience close to a physical one.

But how?

There can be two possible ways to execute this.

We’re going ahead with AR-based technology because it gives users an experience much closer to the real thing, and its interactivity helps them choose faster and more easily.

Still, I had a small doubt.
Trying on spectacles, shoes, and furniture is fine and has been tried and tested by many e-commerce platforms, but trying on clothes can be a little challenging. The user might have to place the phone at a certain spot and height to try on a shirt or a pair of jeans. Would users make that effort?

Well, seems like there are good chances!

We’ve seen the adoption of TikTok, Reels, and Snapchat’s AR filters. Users are getting familiar with the concept of placing the phone at a distance to record or photograph themselves.
Users put in the effort either when a feature provides enough value or when they’re having fun,
and we’ll try to give them both here. 🤞

Let’s talk about the screens now!

The AR screen needs to hold information like:

1. Name and brand of the product
2. Price & discounted price (to indicate that they’re getting it at a discounted rate)
3. Colors (some products come with different color options that users can access on a single screen)
4. Sizes (to figure out which size fits them well)
5. CTA

Guidelines: Following Apple’s Human Interface Guidelines for Augmented Reality, I kept some minor constraints: minimizing text, helping users engage with AR, and leaving most of the screen empty so the user can engage better with the AR view.

Let’s see in how many ways we can design a screen like this!

As we know, no screen becomes what it is on the first go; it has to go through many iterations, rounds of feedback, and phases where you say to yourself
‘feel nahi aa rahi’ 😂. Let’s go through some iterations I did before the finalized version.


But the name of the color still isn’t visible, and it’s very important for the accessibility of the feature, so we’ve added it as a little interaction:
whenever the user taps a color to change the item’s color, we show the color’s name in the swatch itself for a second, and then it goes back to the default state. See below.
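The interaction above can be sketched as a tiny state model. This is a minimal, hypothetical sketch; names like `Swatch` and `onSwatchTap` are illustrative, not Myntra’s actual code.

```typescript
// Hypothetical model of the transient colour-name label described above.

interface Swatch {
  id: string;
  name: string; // e.g. "Olive Green" — shown briefly for accessibility
  hex: string;
}

interface LabelState {
  swatchId: string | null; // which swatch currently shows its name
  hideAfterMs: number;     // how long the label stays visible
}

// Tapping a swatch both selects the colour and flashes its name for ~1 second.
function onSwatchTap(swatch: Swatch, flashMs = 1000): LabelState {
  return { swatchId: swatch.id, hideAfterMs: flashMs };
}

// After the timeout fires, the UI reverts to the default (no label) state.
function onFlashTimeout(): LabelState {
  return { swatchId: null, hideAfterMs: 0 };
}

const olive: Swatch = { id: "c1", name: "Olive Green", hex: "#708238" };
const shown = onSwatchTap(olive);
console.log(shown.swatchId);            // "c1" — label visible briefly
console.log(onFlashTimeout().swatchId); // null — back to default
```

In a real app the revert would be scheduled with a timer after the tap; keeping the transition as pure state makes the behavior easy to test.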

Let’s see the flow of screens in their final glow up 🎀

You might be wondering why that capture button is there.

While trying out clothes in a trial room, many of us take and save pictures so we can look at them later to decide, or compare them with another set of clothes. It might be useful for our users as well if they want to compare different clothes.

Now, what happens after we tap the capture button?

We’ve added a timer to give users some time to get ready. While trying on virtually, the phone might be placed on a surface away from the user so they can see the full body (for jeans, shirts, etc.), so users may need a moment to get into position before the picture is taken.
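The capture timer can be sketched as follows. The 3-second delay is my assumption for illustration, not a specified value.

```typescript
// Sketch of the capture countdown described above (assumed 3-second delay).

// Returns the sequence of values the on-screen timer would display
// before the photo is taken, e.g. [3, 2, 1].
function countdownSteps(seconds: number): number[] {
  const steps: number[] = [];
  for (let s = seconds; s >= 1; s--) steps.push(s);
  return steps;
}

// The shutter fires only after the countdown completes, giving the user
// time to step back into frame for a full-body try-on.
function shouldCapture(elapsedMs: number, totalMs: number): boolean {
  return elapsedMs >= totalMs;
}

console.log(countdownSteps(3));         // [3, 2, 1]
console.log(shouldCapture(2000, 3000)); // false — user still getting ready
```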

Share button:

In the research, I found that a lot of users seek someone’s advice while shopping online as well, by sending them a screenshot or link of the product.
Hence we’ve also given a share option so that users can send the captured picture directly to a frequent chat without having to drop out of the app.


Instead of sharing just the captured picture to our friends’ chats, we can include a link to the product with a nudge for them to try the feature. This can help create awareness of the feature and expand the user base.
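A share payload along those lines might look like this. The URL, query parameter, and message copy are all hypothetical placeholders, not Myntra’s real deep-link scheme.

```typescript
// Hypothetical share payload: captured image + product deep link + try-on nudge.
// The "myntra.example" domain and "src=tryon" parameter are illustrative only.

interface SharePayload {
  imageUri: string;
  productLink: string;
  message: string;
}

function buildSharePayload(imageUri: string, productId: string): SharePayload {
  const productLink = `https://myntra.example/product/${productId}?src=tryon`;
  return {
    imageUri,
    productLink,
    // The nudge invites the recipient to try the feature themselves.
    message: `How does this look on me? Try it on yourself: ${productLink}`,
  };
}

const payload = buildSharePayload("file:///captures/look1.jpg", "12345");
console.log(payload.message);
```

Bundling the link with the image means every shared picture doubles as an entry point into the feature.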

Let’s see how users would view different products in AR mode.

For the particular use case of cosmetics, the color name is shown permanently on screen rather than as an interaction. The name is critical for these products: hundreds of shades exist for a single color, and the name changes with slight shifts of tint that can be difficult for our eyes to notice. So it is crucial to keep the selected shade visible on screen.

Size Suggest in AR:

AR technology has improved a lot, and with LiDAR coming into play it can measure surfaces and objects and check fit, so it could tell whether a particular size of a product fits the user or not.

If implemented, it would benefit users even more, as it could suggest sizes by measuring the user’s body through the AR camera.

Just like this.
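The size-suggestion idea could work roughly like this: compare an AR-measured body dimension against a brand’s size chart. The chart values and the ease allowance below are made-up assumptions for illustration.

```typescript
// Sketch: pick a size from a (hypothetical) brand size chart using an
// AR-measured chest dimension, e.g. derived from LiDAR depth data.

interface SizeSpec {
  label: string;   // "S", "M", "L"…
  chestCm: number; // garment chest measurement
}

// Suggest the smallest size whose chest measurement comfortably fits the
// measured body, within a fixed ease allowance (assumed 4 cm here).
function suggestSize(measuredChestCm: number, chart: SizeSpec[], easeCm = 4): string | null {
  const sorted = [...chart].sort((a, b) => a.chestCm - b.chestCm);
  for (const size of sorted) {
    if (size.chestCm >= measuredChestCm + easeCm) return size.label;
  }
  return null; // nothing in the chart fits
}

const chart: SizeSpec[] = [
  { label: "S", chestCm: 92 },
  { label: "M", chestCm: 98 },
  { label: "L", chestCm: 104 },
];

console.log(suggestSize(92, chart)); // "M" — S would be too tight
```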

Let’s design our second solution.

Smart Suggest 🧠 :

We’re designing this feature for users who have difficulty choosing the right products and face a lot of choice overload (pain point no. 3).

This feature can also decrease context change for users. Context change is when a user starts browsing with a product and ends up buying another product from a different category.

In 41% of cases, the final product a person ends up purchasing is from a different category than the one they started the session with. This further establishes that context changes are common.


A small story about how I reached this idea:

While solving this problem, I was brainstorming and taking inspiration from different platforms, when a thought came to my (over)thinking mind: how would these users feel in a mall?

‘Lost’, right?

How do they help themselves there?

They either take a friend along with them, or take the advice or service of a stylist or assistant available there.

And voilà! I got my answer 😌

Having someone guide us through the process of shopping can be really helpful because of the small nudges, validation, and a second opinion.

It saves you time and the hassle of deciding alone, and helps you get the best possible product for you.

Our goal is to provide an experience similar to having a personal fashion stylist in-store who suggests the pieces of clothing that best match the user.

But how are we going to do it?

  • By personalizing and prioritizing the shopping experience
  • By suggesting top products according to the user’s needs and appearance
  • By telling them how well a particular product suits them, via a rating system

But we should not apply this across the entire app, because that can harm user control and freedom. Not every user faces this problem, so we’d build it as a separate feature that users can access whenever they need it.

And this rating can be based on the user’s appearance, past purchases & browsing history.
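One way to combine those three signals into a rating is a simple weighted score. The weights and the 0–1 signal scores below are my assumptions for illustration, not Myntra’s actual model.

```typescript
// Illustrative weighted match score over the three signals mentioned above:
// appearance fit, past purchases, and browsing history. Weights are assumed.

interface MatchSignals {
  appearanceFit: number;    // 0–1: skin tone / height / body-shape match
  purchaseAffinity: number; // 0–1: similarity to past purchases
  browsingAffinity: number; // 0–1: similarity to browsing history
}

// Produce a 1–5 star-style rating (one decimal place) from the signals.
function matchRating(s: MatchSignals): number {
  const score = 0.5 * s.appearanceFit + 0.3 * s.purchaseAffinity + 0.2 * s.browsingAffinity;
  return Math.round(score * 5 * 10) / 10;
}

console.log(
  matchRating({ appearanceFit: 0.8, purchaseAffinity: 0.8, browsingAffinity: 0.5 })
); // 3.7
```

Weighting appearance highest reflects the stylists’ pointers (skin tone, height, body shape) called out below; the exact split would need tuning with real data.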

According to stylists and merchandisers, although fashion is ultimately what a person chooses to wear, a few pointers play an important role in looking one’s best while shopping for apparel. They are:

  • Skin tone
  • Height
  • & Body Shape

But it seemed weird to ask users about their physical appearance, until one day I saw a little onboarding on Myntra asking about my body type, the fit I prefer, and the type of clothes I wear most, in order to improve my experience. And I was more than okay with answering.

To further explore this user behavior, I called up a few of my research participants again. They told me they had also seen this little onboarding and were, in fact, happy to fill in all the details, because they knew it would be used to improve their shopping experience and they would reap the benefits in the long run.

After having all the questions answered I jumped on the designs.

Let’s see where users would start interacting with this feature:

The home screen is what users see as soon as they open the app, which makes it the appropriate place for the users this feature is most relevant to, i.e., new users and users with a high drop-off rate: basically, all the users prone to pain point 3 (choice overload).
We’ve also given the feature touchpoints in the Explore and Categories sections, so that users can access it from there once they’ve used it and become familiar with it, because there’s no point showing it on the home screen forever.
They can also change the details they filled in during the feature onboarding from here.

Now comes the onboarding:

After the onboarding is done, users are taken to the screen where they’ll be asked about their shopping preferences so that a tailored browsing experience can be provided.

Some before & after scenes (iterations) for this feature:

Where it all started:

I’m thankful that one day I drew these messy-looking wireframes 😅 to get out the ideas that had been running through my head for quite some time, and validated them by discussing them with my friends and mentors. Otherwise I would’ve dropped this idea and would be working on a whole different problem.

Reflections and Takeaways 🧵

  • Crazy to see how studying mental models and relating problems to real-life examples can spark such ideas and inspiration.
  • I learned to empathize with users. After talking to users and designing this experience I realized how it can help to make the experience seamless.
  • Maintaining the app’s design language was both a challenge and a limitation, but it was fun learning about Myntra’s design by recreating their screens; I learned so much simply by doing this.
  • In the end, iterations were my biggest teacher. Not settling for the first design I made was initially a bit challenging, but as I went further I questioned my decisions, and as a result my designs automatically got better.

That’s all folks! 🎯

I hope you had fun reading it, please feel free to criticize or give feedback, and don’t forget to leave a few claps to encourage. ❤

Also, I’m looking for opportunities as a product designer, please reach out if you’re hiring or share with someone who might be hiring, thank you! 🙂

I love to talk about Fitness, cars, travel, and everything interesting, let’s connect on Twitter or Linkedin.

Enhancing shopping experience on Myntra & enabling users to make confident decisions using AR and… was originally published in Muzli – Design Inspiration on Medium.