Redesigning and developing the UMD IRPA website
As the Lead UX Researcher & Designer, I redesigned and developed a university-wide website serving a user base of at least 50,000. The original website did not meet the Web Content Accessibility Guidelines (66% accessibility score) and was not responsive. The redesign is responsive across all platforms and cross-browser compatible, raised the accessibility score from 66% to 98%, and reduced the number of clicks needed to reach key information by 75%. I also led the project and conducted extensive qualitative and quantitative user research to identify requirements, shape the content strategy, and drive design decisions.
My Role: User Research, User Interface Design, Prototyping and Evaluation, Front-end Website Development
Duration: 1 year
Tools Used: Axure RP, Google Analytics, UserZoom
Techniques Used: User Interviews, Stakeholder Interviews, Competitor Analysis, Usability Testing, Card Sorting, Statistical Analysis using UserZoom, Quantitative Research using Google Analytics, Heuristic Evaluation, A/B Testing, Empathy Mapping, Persona Creation, and User Journey Mapping.
The Who and The Why?
The Office of Institutional Research, Planning and Assessment (IRPA) website did not meet the Web Content Accessibility Guidelines and needed to be rebuilt to provide a modern look and feel, the best possible user experience, and compliance with the accessibility guidelines.
The primary users of the website are Deans, Assistants to Deans, the Vice-President, the Assistant to the President, and many other university staff members who do not have much time to spend on the website and want quick access to important information. To identify the areas needing improvement, I led and conducted extensive user research. The data gathered through this research drove my design decisions.
The design process began with researching the problem and analyzing the strengths and weaknesses of the current website, achieved through competitor analysis, quantitative research using Google Analytics, and heuristic evaluation. The next step involved interviewing users and stakeholders to empathize with them and better understand their needs, followed by creating empathy maps, personas, and user journey maps. Once I had identified the issues to resolve, I conducted card sorting and formative usability testing to gain more insight into the website's problems and learn how users interact with it. These methods showed that although the stakeholders wanted a modern look and feel, the users wanted better structure and organization of the content. I realized from the stakeholder and user interviews that the improvements important to the stakeholders are not necessarily as important to the users. Thus, my efforts leaned toward improving the information architecture of the website while simultaneously giving it an elegant visual appeal.
Phase 1: Research
I started by researching the problem, learning more about the target audience, and identifying the strengths and weaknesses of the current design. To accomplish this, I conducted competitor analysis, quantitative research using Google Analytics, and heuristic evaluation.
I analyzed 30 Institutional Research websites across different universities to find strengths that could potentially be applied to this project. They were assessed on 17 parameters, such as whether they provided a staff directory, data-usage information, or the Common Data Set; whether they contained visualizations, interactive reports, or fact sheets; whether they were mobile-friendly; and what technology stack they used. Some notable cases are discussed below.
Google Analytics was used to analyze website traffic for the period Oct 1, 2017–Jul 26, 2018: the first page people visit, their clicks and paths, navigation patterns, drop-off rate, and engagement rate. The average time spent on the website was only 3 minutes, and most people accessed it in Chrome on a Windows desktop. Additionally, the most common mobile screen resolution was 375x667 (iPhone 6–8). This information was helpful while developing the website and making it responsive across all platforms. 82.7% of the users were new visitors, and 33.26% were from College Park. Screenshots of the web analytics are given below.
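The kind of traffic breakdown described above can be reproduced over an exported session table. Below is a minimal sketch with pandas, using hypothetical column names and toy values (the real Google Analytics export schema differs):

```python
# Sketch: summarizing exported analytics data with pandas.
# Column names and values are illustrative assumptions, not the GA schema.
import pandas as pd

sessions = pd.DataFrame({
    "visitor_type": ["New", "New", "Returning", "New", "Returning"],
    "duration_min": [2.5, 3.0, 4.0, 1.5, 3.5],
    "device_resolution": ["375x667", "1920x1080", "375x667", "375x667", "1366x768"],
})

new_visitor_pct = (sessions["visitor_type"] == "New").mean() * 100
avg_duration = sessions["duration_min"].mean()
top_resolution = sessions["device_resolution"].mode()[0]

print(f"New visitors: {new_visitor_pct:.1f}%")         # share of first-time visits
print(f"Avg. session length: {avg_duration:.1f} min")  # engagement proxy
print(f"Most common resolution: {top_resolution}")     # target for responsive work
```

Metrics like the most common device resolution feed directly into responsive-design decisions, as they did in this project.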
Heuristic evaluation helped me identify usability problems in the design. The website was analyzed against 90 parameters, such as visibility of system status, match between system and the real world, consistency and standards, aesthetic and minimalist design, and recognition rather than recall. Some of the issues identified are discussed below.
Three different sections on the same website follow different styles (shown below).
The terminology on the website is difficult to comprehend (shown below).
It is hard for the user to identify what’s clickable and what’s not. The text-heaviness also makes it difficult for people to access the information quickly (shown below).
The website contains misleading and redundant terminology (shown below).
Phase 2: Empathize
I conducted 14 semi-structured, in-person stakeholder interviews. 53% of the interviewees use the website frequently to look something up, 57% wanted the look and feel of the website improved, and 64% wanted fewer clicks to access the content.
Semi-structured, in-person interviews were conducted with 8 primary users. 75% of them had been using the website for more than 5 years. All of them use it to look up campus-related facts and want quick access to them. 62.5% felt the website needed not just a modern look and feel but also better structure and organization of its content.
Support Staff Interviews
Semi-structured interviews with 2 members of the support team revealed that for 85% of the requests they receive, the information is already present on the website; the remaining 15% concern data that the department does not collect or that is publicly unavailable. 80% of the requests they handle come from professors looking for data to report in class or for course evaluation information, and from deans and associate deans inquiring about reporting and meetings.
I identified the different types of users and created empathy maps for them (as shown and discussed below).
Primary users (who interact directly with the website)- Employees at UMD who have been using the website for a long time and are accustomed to its interface. They don’t have much time to spend on the website and always want quick access to information. They call the IRPA staff members whenever they are unable to find the information.
Secondary users (who interact with the website indirectly)-
Employees at UMD who use the website indirectly and are still affected by it. They are the occasional users.
Support staff representatives at IRPA who provide technical support to the users and handle data requests. They use the website only to supply information or support to the users.
Tertiary users (members of the public, who interact with the website even more indirectly than the secondary users)- New visitors to the website (i.e., people unfamiliar with the interface) who are not UMD employees.
The project focuses on the primary and tertiary users because the needs of the primary users matter most, and it is their expectations this project aims to meet. However, it was also important to engage tertiary users (new visitors), as their opinions and perceptions could determine the success or failure of the project. Therefore, the following personas were created, and the project focused on designing for these two user groups.
User Journey Maps
The two personas created (as shown above) helped me in producing the journey maps for these two user groups.
Phase 3: Define
Remote, unmoderated usability tests were conducted with 5 participants using UserZoom, recruited through screener questions. They were asked to perform 5 tasks with different scenarios and varied levels of difficulty, followed by a questionnaire about their opinions of the website and their comfort level in performing the tasks. The participants came from varied demographics and were asked to follow the think-aloud protocol, which helped surface their pain points. 4 of the 5 participants had used the website before, and 1 had not. 2 of the 5 participants were primary users, and 3 were tertiary users. Task 5, “Find out the essential elements of an undergraduate education as stated by Middle States Standard 12,” was the most difficult, and I learned that when people are asked to find a piece of information they are not used to finding on the website, they fail to find it. This implies that users have adapted to their everyday navigation patterns and that the website is not intuitive.
Task Details and Results
Task 1: If you need to contact a staff member, how would you do that?
Result: All the participants completed it successfully.
Task 2: If you need to find the number of students in the freshmen class for Fall 2017, how would you look for that?
Result: All the participants completed it successfully.
Task 3: If you wanted to know how does the university define a student credit hour, how would you find that?
Result: 40% of the participants failed to complete it.
Task 4: If you wanted to find out the number of students with a minor in Spanish in Fall 2017, how would you go about doing so?
Result: 40% of the participants failed to complete it.
Task 5: If you were to find the essential elements of an undergraduate education as stated by Middle States Standard 12, how would you do that?
Result: 80% of the participants failed to complete it.
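With only 5 participants, each raw completion rate carries wide uncertainty; the adjusted-Wald interval is a standard way to bound small-sample task success. The sketch below applies it to the task results above (the interval math is standard, but running this analysis is my addition, not part of the original study):

```python
# Sketch: 95% adjusted-Wald confidence intervals for task completion rates.
# Standard small-sample usability statistics; applying it here is an assumption.
import math

def adjusted_wald(successes, n, z=1.96):
    """95% adjusted-Wald confidence interval for a completion rate."""
    p = (successes + z**2 / 2) / (n + z**2)        # adjusted point estimate
    half = z * math.sqrt(p * (1 - p) / (n + z**2))
    return max(0.0, p - half), min(1.0, p + half)

# Successes out of 5 participants, taken from the results above.
results = {"Task 1": 5, "Task 2": 5, "Task 3": 3, "Task 4": 3, "Task 5": 1}
for task, ok in results.items():
    lo, hi = adjusted_wald(ok, 5)
    print(f"{task}: {ok}/5 completed, 95% CI [{lo:.0%}, {hi:.0%}]")
```

Even the fully successful tasks have lower bounds around 50%, which is why qualitative observation mattered alongside the pass/fail counts.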
Given below is a summary of the results obtained from the usability test.
Content Inventory and Audit
The usability tests showed that one of the major issues to tackle was poor information architecture. To improve it, I ran a card sorting exercise, which began with creating a content inventory of everything on the website and auditing it to assess its usefulness.
After gathering data from the initial user research, I defined the areas of the website that needed improvement, which included the information architecture. To address this, I conducted a remote, open card sorting exercise with 60 users and 29 cards using UserZoom. Participants were asked to group the content according to their best understanding and label each group with the term they felt fit best. This revealed how users perceive the content and how they would prefer it to be organized and structured on the website. The data collected was analyzed using dendrograms (hierarchical clustering).
Dendrograms (Hierarchical Clustering)
Dendrograms helped me understand how participants tend to group the information. Since there were 29 cards, it was essential to find a balance between the number of clusters and their size: a high minimum agreement yields fewer but larger clusters, and vice-versa. I therefore chose a cut-off point of 68% agreement, which yielded 6 clusters with the largest containing 9 items.
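The clustering behind such a dendrogram can be sketched with SciPy: convert pairwise card agreement to distance, build an average-linkage tree, and cut it at the chosen agreement threshold. The cards and agreement values below are illustrative, not the study's 29-card data:

```python
# Sketch: hierarchical clustering of card-sort agreement, cut at a threshold.
# Card names and agreement values are illustrative assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Staff Directory", "Contact Us", "Fact Book", "Common Data Set", "Data Request"]
# Pairwise agreement: fraction of participants who grouped two cards together.
agreement = np.array([
    [1.00, 0.90, 0.10, 0.15, 0.30],
    [0.90, 1.00, 0.05, 0.10, 0.35],
    [0.10, 0.05, 1.00, 0.80, 0.20],
    [0.15, 0.10, 0.80, 1.00, 0.25],
    [0.30, 0.35, 0.20, 0.25, 1.00],
])

distance = 1.0 - agreement                    # high agreement -> small distance
tree = linkage(squareform(distance, checks=False), method="average")

# Cut where cards agree at >= 68%, i.e. distance <= 0.32.
labels = fcluster(tree, t=0.32, criterion="distance")
for card, cluster in zip(cards, labels):
    print(f"{card}: cluster {cluster}")
```

At this cut-off, strongly co-sorted cards (e.g. "Staff Directory" and "Contact Us") merge into one cluster while weakly associated cards stay separate, mirroring the balance between cluster count and cluster size described above.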
After concluding the card sorting exercise, I analyzed and synthesized the results to create the website sitemap, laying out the content hierarchy while taking both user and business needs into account.
Phase 4: Design
Given below are sketches of the website homepage and the various options I explored for its different sections to arrive at the best solution. To give the primary users quick access to important university information, I considered including a “Fast Facts” or a “Top 5 Reports” section on the homepage itself, and following the three-click rule for navigation so that users can find any information in no more than three mouse clicks.
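The three-click rule can also be verified mechanically: a breadth-first search over the sitemap yields every page's click depth from the homepage. The sitemap below is a simplified, hypothetical slice of the redesign, not the full information architecture:

```python
# Sketch: checking the three-click rule with BFS over a sitemap.
# Page names below are illustrative assumptions.
from collections import deque

sitemap = {
    "Home": ["Fast Facts", "Reports", "About"],
    "Fast Facts": ["Enrollment"],
    "Reports": ["Top 5 Reports", "Common Data Set"],
    "About": ["Staff Directory"],
    "Top 5 Reports": ["New Freshmen Report"],
    "Enrollment": [], "Common Data Set": [], "Staff Directory": [],
    "New Freshmen Report": [],
}

def click_depths(sitemap, start="Home"):
    """Return the minimum number of clicks from the homepage to every page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in sitemap.get(page, []):
            if link not in depth:
                depth[link] = depth[page] + 1
                queue.append(link)
    return depth

depths = click_depths(sitemap)
too_deep = [page for page, d in depths.items() if d > 3]
print("Pages beyond three clicks:", too_deep or "none")
```

Running a check like this against each sitemap candidate makes the three-click constraint a testable property rather than a design aspiration.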
Phase 5: Prototype
I converted these sketches into paper prototypes and conducted A/B Tests with them to evaluate my designs and assess which design variation performs better.
Two different design ideas were tested with the users. Design Variant A used Carousel 1 (as shown in Sketch 2/4) and Design Variant B used Carousel 2 (as shown in Sketch 3/4) to present the “Top 5 Reports” section as a carousel. Through this test, I wanted to learn which design better provides users with quick access to important facts about the university. The users were asked to perform the following 5 tasks. The results showed that Design Variant A performed better than Design Variant B.
Task 1: If you wanted to know the number of undergraduate students in the university, how would you find that?
Task 2: If you were to access a report about new freshmen undergraduate enrollments, how would you do that?
Task 3: If you wanted to learn more about Instructional Workload and Outside Professional Activities, how would you do it?
Task 4: If you needed to request data from IRPA, how would you do it?
Task 5: How would you find contact information of an IRPA staff member, if you had to?
Design Variant A
Design Variant B
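At paper-prototype sample sizes it is worth checking whether one variant's advantage exceeds noise; Fisher's exact test suits small 2x2 comparisons like this. The counts below are illustrative assumptions, not the study's raw results:

```python
# Sketch: Fisher's exact test on A/B task outcomes.
# The success/failure counts are illustrative, not the study's data.
from scipy.stats import fisher_exact

# 2x2 contingency table: rows are variants, columns are (completed, failed).
table = [[9, 1],   # Variant A
         [5, 5]]   # Variant B

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")
```

With samples this small, even a large observed odds ratio may not reach conventional significance, which is one reason the qualitative feedback gathered alongside the A/B test mattered as much as the counts.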
Along with the results obtained through A/B testing, a common theme that emerged across users was the need for quick access not just to facts about the university but also to the pages they visit most frequently. The high-fidelity prototype therefore incorporated these suggestions in addition to following the three-click rule.
I used Axure RP for rapid prototyping, converting my paper prototype into a high-fidelity prototype. Here are images of the designs of a few of the important screens.
Here is a video of the walkthrough of the prototype.
Phase 6 & 7: Testing & Implementation
I conducted moderated, in-person usability tests with 6 users to assess the effectiveness of my designs. Testing the prototype showed that my design reduced the number of clicks required to access the most important report on the website by 75%, giving users quick access to the content. The qualitative feedback received was also helpful while developing the website. I then tested the website with Siteimprove, a site accessibility testing tool: the accessibility score improved from 66% to 98%, well above the industry benchmark of 70.6% for educational institutions. The website is now also responsive across all platforms.
Some of the challenges I encountered during the process are:
Although the stakeholders wanted the website to have a modern look and feel, the users emphasized improving the structure and organization of its content. To overcome this, I had to find the right balance between user needs and stakeholder expectations and incorporate both into my design decisions.
Deciding on the right number of clusters for the card sorting exercise was a challenge because it was my first time using the technique. I found that a cut-off point of 68%, yielding 6 clusters with the largest containing 9 items, worked best for this project.
Through user interviews, I learned that users do not want to invest time in learning a new interface and would rather abandon the website. Therefore, I had to ensure that the website is user-centered and does not force users to learn a new interface, but is instead so intuitive that they don’t have to think while using it.
I learned to use a variety of techniques and how to synthesize their findings to design the website.
Through this project, I came to understand that website improvements important to the stakeholders are not necessarily as important to the users.
I also learned how to negotiate with the stakeholders and handle situations when there isn’t an agreement.
I gained an in-depth understanding of how to make an accessible and responsive website.
I got an amazing opportunity to tackle a real-life challenge and work on it end-to-end to solve it.
I would like to test the website with secondary users in the future, if time permits.
I am working on deploying the website on AWS and the website is expected to be launched soon.
I am also working on integrating the website with other dependent platforms.