The Paperwork Reduction Act (PRA) was designed to decrease the burden that government would put on the average citizen. The idea was to minimize the amount of time we made people fill out official forms and surveys.
Fast forward to the 21st century. This law now applies to anything that tries to collect information (surveys, interviews, usability tests), even through email or social media. No, it doesn’t make sense that red tape blocks you from using these super-efficient tools.
Yes! In fact, it’s crucial that you do. Companies and startups are constantly tweaking and testing their digital products, and they rarely have as many users as you do (311 million Americans!). If their interfaces suck, they get beaten by the competition. You don’t have that competition built in, so you have to make sure you’re serving those hundreds of millions of people with something that works!
Because of the Paperwork Reduction Act, there are some constraints, but you can work within them. Here are the rules for different types of participants:
| Participant type | What you can do |
| --- | --- |
| Government employees | You can test as many as you’d like. |
| Normal people | You’re fine as long as you have fewer than 10 participants.* If you want to test more, you’ll need to get special clearance. |
You *can*, however, ask as many people as you’d like questions that are not identical, for example, if you’re conducting interviews to create personas.
Each agency should have a person called an “Information Collection Officer,” or some similar title. Go find her and she should help you through the process.
A minimum of four weeks, and possibly six months or more. If people in your agency have gone through this process before, and they were smart, they may have laid the foundation to let you do a “Fast Track Clearance,” which takes 3-4 weeks. If not, you’ll need to start from scratch, which takes longer.
Hopefully, you’ve been able to convince the executive/legal reviewer/other colleagues to attend the tests in person. If not, use this reporting format instead of a long report:
For each participant (P1 = first participant, etc.) have observers take notes separately about where the user has trouble.
At the end of the first test, after the participant leaves, open an Excel doc, make a big list of all their issues, and put an X in the P1 column for each. For the next test, put an X under P2 wherever the second user had the same issue as the first user, and add any new issues in a new row, with just P2 X’d.
Continue throughout testing all of your participants. At the end of the testing, you should be able to skim and see the biggest issues with the system. Before leaving for the day, brainstorm with the other observers what you can do to improve the biggest problems you saw. Circulate the report, with suggested solutions, that night for any comments/questions, and implement the suggested solutions.
Do not write a big, long report about usability testing. People will not read it, and it doesn’t include an “ask.” Your ask should be “let me know if you have questions or comments about the suggested solutions,” with the Excel doc attached.
| Issue | P1 | P2 | Suggested solution |
| --- | --- | --- | --- |
| Cannot figure out how to get “home” | X | X | Add “home” button to page |
| Doesn’t understand what the copy says | X | X | Rewrite in plain language |
| Was distracted by color | X | | n/a |
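If your team prefers a script to a spreadsheet, the tally process above can be sketched in a few lines of Python. This is a minimal illustration, not an official tool; the participant labels and issue wording are made up to mirror the example table.

```python
# Hypothetical issue log: one list of observed issues per participant,
# recorded in testing order (P1 first). Wording is illustrative only.
sessions = {
    "P1": ["Cannot figure out how to get home",
           "Doesn't understand what the copy says",
           "Was distracted by color"],
    "P2": ["Cannot figure out how to get home",
           "Doesn't understand what the copy says"],
}

# Build the issue-by-participant matrix described above:
# each row is an issue, mapped to the set of participants who hit it.
matrix = {}
for participant, issues in sessions.items():
    for issue in issues:
        matrix.setdefault(issue, set()).add(participant)

# Rank issues by how many participants hit them, biggest problems first.
ranked = sorted(matrix.items(), key=lambda kv: len(kv[1]), reverse=True)

# Print one row per issue, with an X under each participant who hit it.
for issue, hit_by in ranked:
    marks = ["X" if p in hit_by else " " for p in sessions]
    print(f"{issue:<40} {'  '.join(marks)}")
```

Skimming the ranked output gives you the same at-a-glance view as the Excel doc: the issues the most participants hit float to the top.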
Include “Real users (number)” in the list of people who have pre-cleared the document/tool/form before review. When/if people push back on your recommendations for political or “legal”* reasons, cite the user feedback to make your case and stick to what you found in your research. You are the only person in this conversation representing the 311 million people who are counting on you. Don’t give up. Also, use precedent: show them examples of other agencies that have benefited from user research (see usability.gov and the Federal Government Mobile Apps Directory).
Be respectful, but investigate fully. Often, lawyers will make suggestions that are not actually legal guidance, but do make things much harder to use. Ask to see the legal citations they’re concerned about. Consider a “no” to a usability improvement a “maybe” and work to get to a “yes.”