
User Research Case Study for Aarogya Setu




Aarogya Setu is a Government of India initiative, designed and developed by the National Informatics Centre team in a public-private partnership. The app is aimed at augmenting the initiatives of the Department of Health by proactively reaching out to users and informing them of risks, best practices, and relevant advisories pertaining to the containment of COVID-19. The app launched in early April, and the end of the month was the right time to connect with users and run a quick evaluation for the team.


Project Overview:


Aim: To do a quick UX Evaluation for Aarogya Setu


Methods: Mixed-method approach: qualitative interviews, remote usability testing, content analysis, first-click testing

Skills: Interviewing, synthesizing, facilitating, presenting, design critique

Deliverables: Insights report

Team: Me (Lead Researcher), Gaurav (Intern), Shipra (Independent Researcher)

Duration: 1 week


Design and Research Team + Jay Dutta (Design Lead)

Challenges:

  • The broad user base: For an app downloaded by 50+ million people within the first 3 days of its launch, we needed to be smart about defining scalable user groups for the study.

  • Lack of data: Although everyone was a user here, access to the actual user groups at higher risk was restricted, and unlike other projects we did not have current user data, so we relied on snowball sampling for those groups.

  • Limited timeframe: The time and scale of this project were unlike other research projects. With such high stakes, things were happening at the speed of light, and our research had to match that speed while still providing the product teams with quick, quality evaluations.

  • Agile updates during the study: Designing for a global crisis meant staying up to date through uncertain, changing times and needs. This brought a lot of agile interface changes during the course of the study.


Takeaway 1: As the app kept updating throughout the study, so did our research protocol.

Process:

Objectives:

To understand users’ expectations, concerns, and understanding of the app, and to identify major usability issues across it.

Methodology: With a strict lockdown in place and social distancing being diligently practised, the only option was to conduct remote sessions and leverage technology at its best. Considering the objectives, each session was planned for 30–45 minutes.

Identifying the right user groups: Given the broad user base, we listed all the key considerations that needed to be incorporated into the user profiles, then prioritised the most crucial ones. We combined a few other considerations that were not essential but made for good-to-have use cases, and ensured enough representation across all categories.



Brainstorming on the user considerations to finalise user groups

Recruitment strategy and screener: We contacted friends and family and used our social media to recruit users. A screener was prepared and used to categorise and recruit people from the 4 defined user groups. Despite there being no incentive, a sense of goodwill among the participants made our job easy.
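As a rough illustration of how a screener maps responses to defined user groups, here is a minimal sketch in Python. The group names and screening criteria below are entirely hypothetical (the actual screener and groups are covered by the NDA); the point is only the rule-based categorisation step.

```python
# Hypothetical screener logic: assign each response to one of 4 user groups.
# All group names and criteria are invented for illustration only.

def assign_group(answers: dict) -> str:
    """Return the user group for a single screener response."""
    if not answers.get("has_installed_app"):
        return "first_time_user"
    if answers.get("tech_savviness") == "low":
        return "low_tech_user"
    if answers.get("in_containment_zone"):
        return "at_risk_user"
    return "regular_user"

# Example responses gathered via the screener form.
responses = [
    {"has_installed_app": False, "tech_savviness": "high", "in_containment_zone": False},
    {"has_installed_app": True, "tech_savviness": "low", "in_containment_zone": False},
    {"has_installed_app": True, "tech_savviness": "high", "in_containment_zone": True},
]
groups = [assign_group(r) for r in responses]
print(groups)  # ['first_time_user', 'low_tech_user', 'at_risk_user']
```

A simple filter like this makes it easy to check that each of the defined groups has enough representation before scheduling sessions.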


Note-taking template: Having pre-defined categories in the template while taking notes helped us speed up the process and match the product team’s pace.


Takeaway 2: Based on our objectives, we created pre-defined categories in the note-taking template, took notes for all sessions in it, and kept updating it as new, interesting findings came up.

Conducting the session: Each participant was asked qualitative questions and then given the task scenarios: Onboarding (first-time users), Checking my status, Self-assessment, Precautionary steps, Checking COVID updates, Checking e-pass. Along the way, they were asked to think aloud and share their expectations and feedback on the interfaces and features.


Takeaway 3: It was important to be even more vigilant in finding the right saturation point and not waste time conducting more sessions.

Synthesising and analysing insights: For the analysis, we used content analysis, a method in which you quantify and analyse the presence of themes in qualitative data. Following this process, we quickly glanced through the notes and prepared tally sheets to quantify the findings, which gave us an idea of the emerging themes and helped us identify the major pointers. Other relevant qualitative findings that did not emerge as themes were also included for the team. The sense of urgency meant we had to set aside our “FOMO” about missing interesting patterns; having a process to rely on and stick to kept us from getting lost in the sea of data.
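The tally-sheet step above can be sketched in a few lines of Python. The session notes and theme names here are invented for illustration (the real codebook is under the NDA); the sketch only shows the core mechanic of content analysis: counting how many sessions each theme appears in, then surfacing the themes that recur.

```python
from collections import Counter

# Hypothetical session notes: each note lists the themes coded for that session.
session_notes = [
    {"participant": "P1", "themes": ["icon_confusion", "bluetooth_doubts"]},
    {"participant": "P2", "themes": ["icon_confusion", "language_switch"]},
    {"participant": "P3", "themes": ["bluetooth_doubts", "icon_confusion"]},
    {"participant": "P4", "themes": ["language_switch"]},
]

def tally_themes(notes):
    """Build a tally sheet: number of sessions in which each theme appeared."""
    counts = Counter()
    for note in notes:
        # Count each theme once per session, even if mentioned repeatedly.
        counts.update(set(note["themes"]))
    return counts

def emerging_themes(counts, min_sessions=2):
    """Themes seen in at least `min_sessions` sessions become the major pointers."""
    return [theme for theme, n in counts.most_common() if n >= min_sessions]

tallies = tally_themes(session_notes)
print(tallies)
print(emerging_themes(tallies))
```

In practice the same tally lived in a spreadsheet; the value of the method is that a fixed counting rule, rather than gut feel, decides which themes are “major”.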


Takeaway 4: Content analysis comes to the rescue in a fast-paced, agile environment.

This is what our content analysis sheet looked like. Seems crisp, right? :P

The Outcome

An evaluation report, synthesised and compiled in 1 day, with actionable insights backed by user quotes.


Our 'quick and dirty' report instead of fancy decks, considering the need of the hour

The report communicated the general user feedback on the concepts and flagged the interfaces/flows that required revision because of usability errors. One of the most interesting things that emerged from the study was how people from diverse backgrounds, with varying levels of tech-savviness, read symbols, icons, and terminology. We realised that every icon, symbol, and word in the app had to be aligned with their mental models, so that the communication was comprehended the same way by everyone and was consistent in the language of their choice. The report also reflected on users’ understanding, expectations, and preconceived notions, as the team lacked this knowledge due to the app’s recent launch.


Impact: This clarity helped the product team decide which user needs and pain points to solve. Changes required due to unmet user expectations were reflected in the app immediately. Because things were happening at lightning speed, we found some of our insights were already being implemented before we had even communicated them. Ultimately, implementation is what matters, and advocating those insights validated the changes the team had just made.


While presenting, I made sure to put emphasis on the users’ major concerns. It was interesting to find that some obvious hunches had not held up and the results differed from what we had expected. In the end, we also proposed suggestions and strategies for moving ahead, based on our secondary desk research into other solutions by governments and organisations around the world that could be more effective or desirable. When there are so many unknowns, it’s advisable to look at and cross-pollinate multiple solutions instead of focusing on just one. The presentation was also shared with other important stakeholders and government leaders.


Impact: It persuaded the team to solve that concern and broke their stereotypes. It was one of our proudest moments when the work was acknowledged and picked up by the team. When you think about it, we had to be accountable: even the slightest change would affect 60M lives.


What Next?

One of the biggest limitations of this study was our restricted access to the actual at-risk user groups, especially in the orange and red zones. As the stakeholders found the study valuable, we expect to extend it to include more of those user groups.

* Explicit information has been hidden to comply with the NDA

To know more feel free to reach out to me!