
Teaching digital skills: progress on usability testing through peer training

April 16, 2020

Last year, our term-limited UX designer taught us, a group of career government program analysts, how to conduct user research. We want to tell you how it’s been going since then.

We can’t believe it has already been a year since we wrote about learning user research because we haven’t taken a break since we started learning. The past year has just flown by. We conducted several real studies for revenuedata.doi.gov, learned new software, brought in virtual UX interns, spoke about our work at conferences, and recently took on another Office of Natural Resources Revenue (ONRR) website for which we have started user research. Writing this blog post gives us a moment to pause and evaluate our work over the past year and, hopefully, gain some insight into what we’ve learned and how we can do even better going forward.

At the close of our last blog post, we mentioned that our next steps would be to conduct real studies, try using spreadsheets to take notes to cut down on the time we spent consolidating findings, and perform outreach to help other teams in our organization adopt our approach.

Progress

We conducted seven, yes seven, distinct rounds of user research for our website over the past year. This included a massive undertaking in which we catalogued and analyzed every task or question a user has about the site’s data to make sure our product framing accurately reflected the needs of our users and that we were staying within scope as we moved forward with new designs and incorporated more data. It involved going through previous interview notes, requests to our data retrieval team, and Freedom of Information Act requests. In the end, we used what we learned from the study to make minor changes to our problem statement and bigger changes to the key user scenarios and product goals in our product framing documentation.

Certain areas of user research have become easier. We follow a standard process: write a plan, conduct the interviews, analyze the findings, and present our research. We are becoming more comfortable doing user interviews. We are learning to look at the site and our data from a user-focused perspective and can provide more critical feedback. We can now also make simple changes to prototypes within our design software and help communicate those changes to our developers.

This year, we were lucky to have two virtual interns to help us with our design and research process. We quickly trained them in the way we conduct user research, and they shared their academic training with us. Both interns brought more design and technical experience to the team, and we helped them become contributing team members right away. We learned a lot about user research and the design process from them throughout the year.

We have also recently taken on a new product, ONRR.gov, and have worked with the interns to audit the current site, create user profiles (personas), complete two rounds of user research, and build a simplified homepage prototype.

Outreach

We are still excited about everything we have learned and have been sharing it as much as possible. We trained another internal ONRR team and shared our work process so that they could conduct their own user research on a product in development. We’ve also spoken within the Department of the Interior about how we work and why design thinking is important. We spoke externally at several conferences as well, including the United States Association for Energy Economics (USAEE) conference and the Government UX Summit.

Lessons Learned

Since we have been actively participating in the user research process, we have learned a few lessons at each step: planning, conducting the study, and analyzing the results.

In the planning process, we have learned that defining clear goals from the beginning makes the rest of the planning much easier. We also learned that involving more team members in planning helps us make better decisions about the specific goals of each study and has led to a more effective and useful testing process.

An area that remains difficult is recruiting users for interviews. We interview real users for feedback and don’t pay participants, so recruiting is genuinely hard. Sourcing users and sending recruiting emails takes a lot of time, and our response rate is very low. This remains an area we need to improve.

One of the main lessons we learned from conducting studies is how much being organized affects the participant. For example, since we conduct all of our interviews virtually, we need to be prepared for technical issues: we keep instructions on hand for sharing a screen and have a conference line ready in case of audio problems.

We learned that it’s very important to start the interview by introducing the test environment and letting the participant know that there are no right or wrong answers; we are not testing them, but testing the product and observing how they use it. Being prepared during the first few minutes of the interview is crucial for keeping the participant calm and has a major impact on how the rest of the study progresses.

We also learned to actively observe during the study, and we have started having more than one observer so that one can take notes while the other summarizes. This has helped our debrief sessions after the interviews go smoothly, since a summary is already written. The summary matters because we can’t record the interviews and we use it to consolidate our findings, so we don’t want to miss any important information. We did try using spreadsheets to take notes but found that it didn’t work very well: it was easy to get lost in the spreadsheet and hard to keep up with the pace of the interview.

In the analysis process, we learned that the best way to summarize common findings is to organize each finding by category and then by importance, after looking at all the problems across participants. To do that, we look at the scope and severity of each problem: for scope, we identify how widespread the problem is, and for severity, we identify how critical it is. We can then rank all the issues found in the study, propose solutions to the team, and work through the solutions in order of importance.
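For readers who like to see the idea concretely, here is a minimal sketch of that kind of ranking in Python. The findings, scales, and weighting below are hypothetical examples for illustration only, not our actual data or tooling:

```python
# A minimal sketch of ranking usability findings by scope and severity.
# The findings, scales, and weighting here are hypothetical examples.

findings = [
    # (finding, scope: fraction of participants affected, severity: 1=minor .. 4=blocker)
    ("Couldn't locate the data download link", 0.8, 3),
    ("Confused by an unfamiliar revenue term", 0.5, 2),
    ("Chart tooltip overlaps the cursor", 0.3, 1),
]

# Rank by a simple product of scope and severity, so a widespread,
# critical problem sorts above a rare cosmetic one.
ranked = sorted(findings, key=lambda f: f[1] * f[2], reverse=True)

for name, scope, severity in ranked:
    print(f"{name}: scope={scope:.0%}, severity={severity}, priority={scope * severity:.1f}")
```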

Learning usability testing through peer training and on-the-job experience has been very valuable for our entire team, and we are still learning. We learn more with every study we complete and every iteration of our products. We are working smarter and more quickly with each study. There have been times when our expert user experience designer was not available to conduct an interview, and we were able to carry on with the scheduled interview without hesitation.

We now have the capacity to concurrently plan a study for one product while conducting a study for a separate product. We could also probably run a study on our own without our term-limited UX designer—but we love working with her, so we won’t until we have to!


Note: Reference in this blog to any specific commercial product, process, or service is for the information and convenience of the public and does not constitute endorsement, recommendation, or favoring by the Department of the Interior.


Maroya Faied: Product Manager at the Office of Natural Resources Revenue.


Lindsay Goldstein: Digital Services Specialist at the Office of Natural Resources Revenue.