Presenting university COVID-19 guidelines clearly and intuitively.
During COVID-19, the University of Washington Bothell created a webpage to inform students of campus health guidelines. This redesign communicates that information visually, reducing the confusion and stress caused by the lack of clarity in the original design.
We Were Asked To...
Redesign the University of Washington Bothell COVID Guideline webpage to make it clearer and easier to reference, and help reduce student stress about the pandemic. Our designs would be considered for the UWB website overhaul, which is currently in development as of 2022.
Overhaul the webpage's unintuitive information architecture
Differentiate the look of the site's navigation bar from the rest of its content
Replace or break up large paragraphs of text with images and diagrams
Take embedded links out of paragraphs and instead use easily noticeable buttons
Add an entry page that makes clear what kind of information is available on the site
UWB's COVID guidelines are much easier to access and understand, which reduces students' stress over returning to school following the pandemic. Additionally, the UWB webmaster has a better idea of what kind of layout conventions make text-heavy pages easier to parse.
Jan - Mar 2022
CSS 478 - Usability and
User Centered Design
Our team of seven split the work of research amongst ourselves, and the analysis of our findings largely fell to me. Once we performed usability studies on the existing site and our prototype, I analyzed the data and organized it into a list of key insights that we could clearly and easily present to our stakeholder. I also screened all of our study scripts and interview questions for wording that otherwise could have introduced bias into our results.
I led the group discussion on how to overhaul the site's information architecture in a way that would be clear and stay true to our card sorting exercises. After we had a more solid foundation, we split the prototyping work among the team. I was responsible for three of the pages, but I also rigged much of the interactivity and helped keep the style and formatting consistent throughout.
Determining our Focus
Evaluating the Website
As we were tasked with improving an existing webpage, we needed to evaluate its current design. The University of Washington Bothell (UWB)'s COVID-19 informational page was created as a resource for students looking to learn about UWB's pandemic-related health policies and on-campus testing resources. However, our evaluation and heuristic testing quickly determined that the site's organization could be made much more intuitive to navigate. Many parts of the site were redundant or easy to miss, which undermined its purpose as an informational resource.
The home page of the UWB-COVID website as it was at the start of this project.
According to the UWB Webmaster, the UWB-COVID site was primarily viewed by current and prospective university students, so we knew we would be focusing on those demographics. In order to keep the entire team on the same page about our target users, we created two personas.
In order to provide context for the situations our users are in when navigating the UWB COVID site, we wrote out four main scenarios to focus on while collecting research on user satisfaction and the functionality of the site. These were as follows:
We created a journey map of our primary persona, Paul Hawk, a current UWB student, to provide context for his actions and motivations as he navigates the UWB COVID page, and to brainstorm improvement opportunities at each step of the process.
To stay focused while planning out this project, we wrote a list of the most important questions to keep in mind throughout the research process. We used this list to narrow the focus of our research and make sure we were exploring data that would directly help us uncover issues with the current site and create solutions for them.
What pieces of information are our users looking for most often?
Is it easy to understand where to find topics at a glance?
Is it easy to navigate to more specific pieces of information?
Is the information on the site easy to read and parse?
How much of the available information is actually being read and remembered?
Are there any pages or sections on the site that get little to no attention at all?
How easy is the site to use, overall?
We obtained data about our users through a survey distributed in UWB's online social spaces via Google Forms. The purpose of the survey was to collect analytics about the UWB COVID information website and its usage, and the questions were intended to validate our personas and scenarios, as well as reinforce what we would later learn in our usability studies.
We asked a series of ten questions, which used a combination of open text responses, closed multiple-choice questions, and Likert scale ratings, to provide us with both quantitative and qualitative data. The questions first determined which user demographic the respondent belonged to, then presented the respondent with various pages from the website and asked them a series of questions about their impressions and ability to find certain pieces of information.
A few of the results from our survey.
The survey received only a small number of responses (10), so we decided not to lean on its statistics in our analysis; as a source of supplemental data, however, it was sufficient to confirm some of the findings from our later usability studies. The great majority of respondents were current UWB students, and 50% of them had visited the webpage before. Most were only somewhat confident that they would be able to find the pieces of information we asked about - not atrocious, but not especially good either.
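To illustrate the kind of quick tally we leaned on, here is a minimal sketch that summarizes Likert-scale confidence ratings. The response values below are hypothetical stand-ins, not our actual survey data, which lived in Google Forms.

```python
from collections import Counter

# Hypothetical Likert responses (1 = not at all confident, 5 = very confident)
# standing in for 10 survey respondents; illustrative only.
responses = [2, 3, 3, 2, 4, 3, 2, 5, 3, 4]

counts = Counter(responses)
mean = sum(responses) / len(responses)

# A simple text histogram makes the spread easy to eyeball.
for rating in range(1, 6):
    print(f"{rating}: {'#' * counts[rating]} ({counts[rating]})")
print(f"mean confidence: {mean:.1f} / 5")
```

With this sample data the mean lands in the "somewhat confident" middle of the scale, which mirrors the pattern we saw in the real responses.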
Usability Testing - Scenario-Based Tasks
To find out which parts of the current website work and which ones don't, we conducted 6 task-based usability tests. We invited 6 test users who represented our primary user demographics and gave each of them a series of 5 tasks to perform on the website, followed by a post-testing interview to clarify their impressions of the site and the difficulty of performing the given tasks. The testing took place mainly online using screen-sharing over Discord or Zoom.
We wrote out 5 tasks that covered the functions of the site that users would be most likely to use. Each of the task scenarios included enough context that any test user could understand the reason behind the task. Once the user was given the task, they would do their best to find the relevant information, speaking their thought process aloud as they did. The task would end either when the user felt that they had found the information they needed, or when they gave up because they couldn't find it at all.
Usability testing revealed several insights into which parts of the site work, and which parts need improvement to offer a smoother user experience. These insights were as follows.
Users consistently struggled to find certain pages
Users misunderstood the purpose of pages due to unclear wording in titles
Information the user needed was hidden behind other less relevant information
Users rarely noticed the navigation sidebar or index
Inconsistent visual language – the navigation didn't look interactive to users
Its font is smaller and harder to read than the rest of the page's text
The index looks exactly the same as all other links on the site
Users had to read through large chunks of text to find important information
Important links are hidden within paragraphs, and hard to spot even for seasoned users
Unclear section headings require the user to read all the text in order to know if they are even looking in the right place
Redundancy leaves users confused and uncertain
Inconsistency – some links use different wording and go to the same page, while others use the same wording and go to different pages
Directions to designated eating spaces are difficult to understand for users who have not been to campus
No map, and directions rely on landmarks that new students have no experience with
Information Architecture Overhaul
Since many of the site's issues centered around navigation, our first order of business was to reorganize all the information on the site. We did not edit the content beyond removing especially redundant sections; we simply rearranged it into sections that users could more easily understand. To figure out how to do this, we performed card sorting exercises both as a team and with a couple of users who fit our primary persona.
A flow chart of our information architecture and how topics could be interconnected
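One common way to make sense of card sorting results like ours is to count, for each pair of cards, how many participants grouped them together; pairs with high counts suggest topics that belong in the same section. Below is a minimal sketch of that analysis with hypothetical card names (the site's actual topics differed):

```python
from collections import Counter
from itertools import combinations

# Hypothetical card sorts: each participant groups topic cards into piles.
# Card names here are illustrative, not the site's actual topics.
sorts = [
    [{"Testing", "Vaccination"}, {"Masking", "Eating spaces"}],
    [{"Testing", "Vaccination", "Masking"}, {"Eating spaces"}],
    [{"Testing", "Vaccination"}, {"Masking"}, {"Eating spaces"}],
]

# Count how many participants placed each pair of cards in the same pile.
pair_counts = Counter()
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            pair_counts[(a, b)] += 1

for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n}/{len(sorts)} participants")
```

In this toy data, "Testing" and "Vaccination" are grouped by every participant, so they would clearly share a section in the new architecture.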
After conducting our usability testing, we already had a few ideas for how to alleviate some of the issues we uncovered. Beyond basic readability and layout improvements, we wanted to test out adding an entry page to help orient users before they need to start reading text, and a series of quick links to commonly looked-for topics. Finally, we wanted to add a map to visualize the locations of designated eating spaces. We outlined how we would lay out common page components, decided on a color scheme and aesthetic that would be in line with UW's existing branding, and got straight to work.
Normally, we might have started with a low-fidelity prototype and tested that before putting in extra work, but since we already knew exactly what content and visual aesthetic the site would need to have, we felt comfortable launching right into making a higher fidelity prototype. Of course, we still solicited feedback at every stage to make sure we were on the right track, but overall this expedited method worked for the small scope of this particular project.
A couple pages showcasing some of the improvements we made.
The original site for comparison.
The prototype of our improved layout for the website.
Usability Testing - Round 2 of Scenario-Based Tasks
To test whether our changes had really improved the website, we repeated the exact testing procedure from the first round. Once again, we used the same list of tasks, asked a set of 6 test users to perform them, and then interviewed them about their impressions. The script and content of the tests changed very little, mostly to account for wording changes and added or deleted sections between the two versions.
Once again, we received some great insights into what about our prototype worked and what didn't. Compared to the original website, we had improved on all of the issues we were targeting! Overall, users were able to navigate through all the tasks much faster than they were able to before. Where users had previously voiced feeling overwhelmed and confused, the overall impression of the prototype was that it was very straightforward.
Of course, there were still some things that could use further improvement. If we were to continue tweaking this project, we would likely arrange the entry page more efficiently, so that all of its elements could be visible without scrolling. Some of the text content could also use some more editing and cutting, but we chose not to tackle this since the text would likely be rewritten by the webmaster's team in the future regardless.
Based on our research and prototype, we were able to confidently recommend our improvements to the webmaster. We focused on the following central improvements, but emphasized that our design works best as a cohesive whole.
Overall, this project was a great opportunity for me and my team to practice usability testing and high-fidelity prototyping in Figma. I worked on this project early in my education as a designer, so many of these methods were relatively new to me, and it was a great way to get to know the ins and outs of the process.