DANIELLE HULIGANGA
USABILITY TESTING
The usability tests conducted to revise my infographics.
Participants
My intended audience was Cal Poly's Pilipino Cultural Exchange (PCE) club. I managed to get a total of 3 participants for my usability testing.
- Participant #1 (P#1) was an expert: this person is currently a coordinator of PCE's folk-dancing team, Kasayahan.
- Participant #2 (P#2) was a member of Kasayahan.
- Participant #3 (P#3) was not a member of the folk-dancing team but is a general member of the club.
Process
I was not able to meet with any of the participants in person, so I conducted my testing over FaceTime. Each test took around an hour to complete.

I briefed each of my participants on what to expect and left room for initial questions. I then gave each participant a moment to look over and read through the infographic before asking the following questions:
- What was your first impression?
- What works well? Why?
- What can be improved? Why?
- What do you think the message is?
- Is the information presented clearly? Is anything distracting you or taking away from your understanding of the information? Why or why not?
- Is the information accurate and authentic? Why?
- If not, what can be done to help improve its accuracy and authenticity?
- Do the images/icons/visual representations help portray the information?
- What can be improved? Are there other visual representations you would recommend instead?
These questions were repeated for each infographic (5 infographics = 5 sets of 9 questions; 45 questions total). The questions were presented in a Google Document, which each participant was given access to and instructed to fill out while testing the infographics.
Results
After conducting my tests, I realized that I should have encouraged the participants to talk through their thinking. Some of the participants spoke during their testing, but mostly to ask a question or to clarify a piece of information. While the Google Document allowed for an organized record of their answers, it definitely inhibited the "stream of consciousness" thinking I initially aimed for. Additionally, some of my questions may have been too leading; in other words, I believe they structured the participants' thoughts for them. As a result, I fear the testing may have been somewhat compromised.

Having said that, each participant produced valuable feedback; all of it was critical and detailed. Interestingly, P#3's feedback was more detailed than that of P#1, the expert. I expected P#1 to give the most detailed review, but P#3's lack of knowledge in the subject area led to more questions, which in turn yielded more thoughtful revisions.

When revising my infographics, I took into account both my professor's comments and the feedback I received from my usability testing. The revisions are detailed on each infographic's page.