To achieve the usability goals we set for our new service catalogue, we knew we needed to answer a key question: "How does our audience understand the services we provide?"
To find out, we decided to conduct a series of exercises that included:
- Benchmarking analysis
- Focus group workshops
- Prototype testing
What is ServiceNow?
Last summer, IT Services introduced a new tool called ServiceNow to manage customer support requests, incidents and changes to the services we provide and maintain. The current phase of our multi-year IT Service Management (ITSM) project involves redesigning our service catalogue to better align with the needs of the university community and with changes in the technology landscape.
Benchmarking analysis
We began by reviewing our existing service catalogue, displayed on the homepage of our website, and comparing it with other universities' IT service catalogues, focusing primarily on universities that were using ServiceNow. We also spoke to representatives at each of these universities to learn about the process they had followed when implementing ITSM and the successes and challenges they had encountered.
Outcomes:
- Identified strengths and weaknesses of our existing catalogue structure and those of other universities
- Gained an understanding of emerging standards for structuring and organizing content in ServiceNow
- Identified successes and challenges encountered by other universities in implementing ServiceNow
- Identified labels to include in our focus group tests
Focus group workshops
We were very pleased by the response we received from the university community when recruiting participants for our focus group workshops: 24 students, faculty and staff members attended the four workshops we held in early 2018.
Outcomes of our focus group workshops
In each of our workshops, participants were divided into small groups and asked to identify their preferred names for our services and service categories. For each exercise, we provided name suggestions based on the outcome of our benchmarking analysis. Participants were also invited to propose new names if they felt the ones provided were inadequate.
At the end of each workshop, participants presented diagrams they had created with their preferred services and category names. They provided explanations for the choices their group had made, including comments about difficulties they had experienced when making decisions.
Data collected from our focus group workshops was used to inform the structure, category labels and service names for two navigation prototypes.
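As a rough illustration of how input like this can be consolidated, the short sketch below tallies label preferences across workshop groups. The categories, labels and counts in it are invented for the example; they are not our actual workshop data.

```python
from collections import Counter

# Hypothetical label preferences: each workshop group recorded its preferred
# name for the same set of service categories (names and counts are invented).
category_votes = {
    "email and calendaring": ["Email & Calendars", "Communication",
                              "Email & Calendars", "Email & Calendars"],
    "accounts and passwords": ["Accounts & Access", "Passwords",
                               "Accounts & Access", "Accounts & Access"],
}

# For each category, find the label most groups preferred.
for category, votes in category_votes.items():
    label, count = Counter(votes).most_common(1)[0]
    print(f"{category}: '{label}' preferred by {count} of {len(votes)} groups")
```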
It's complicated...
Recurring comments made during the workshops also helped us identify or confirm challenges our clients face in understanding and consuming our services.
While we realized that a well-constructed navigation wouldn't solve these issues on its own, we felt it was important to alleviate them as much as possible:
- Overwhelming number of services: Most of our audience members are familiar with only a small fraction of the 150+ services we provide, usually those essential to their day-to-day work. The sheer volume makes it difficult for clients to discover additional services they might find useful.
- Misconceptions: Clients can have misconceptions about our services, even the ones they are familiar with; for example, some believe that email and Outlook are the same service.
- Unfamiliar, technical names: Some of our services have names that are not intuitive, for example EZ Proxy, Banner and Yammer.
- IT versus other university services: Clients sometimes have difficulty distinguishing between the services we provide and the services offered by the departments that use the systems we support. For example, we support HR systems, but we have nothing to do with HR policies.
Prototype testing
For our prototype tests, each participant was presented with one of the two navigation prototypes we had created and given 12 tasks to complete. For each task, they were asked to find the answer to an IT services-related question, for example, "Where can you find out how to reset your university password?"
​Prototype differences
One of our key reasons for testing two different prototypes was to determine how many categories we should have (a sketch contrasting the two structures follows the lists below).
Prototype A
- 12 top-level categories
- Few second-level categories
- Little repetition of services across categories
Prototype B
- 8 top-level categories
- More second-, third- and fourth-level categories
- More repetition of services across categories
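To make the structural difference concrete, here is a minimal sketch contrasting a shallow and a deep navigation tree. The category and service names in it are placeholders, not the labels we actually tested, and the depth helper is just one way to describe how many levels each structure has.

```python
# Placeholder navigation trees: a dict maps a category to its children,
# and a list marks the final level containing services.
prototype_a = {  # shallow: many top-level categories, little nesting
    "Email & Calendars": ["Exchange email", "Shared mailboxes"],
    "Teaching & Learning": ["Course management", "Classroom technology"],
    # ... 10 more top-level categories in the real prototype
}

prototype_b = {  # deep: fewer top-level categories, more nesting
    "Communication & Collaboration": {
        "Email": {"Mailboxes": ["Exchange email", "Shared mailboxes"]},
        "Messaging": ["Yammer"],
    },
    # ... 7 more top-level categories in the real prototype
}

def depth(tree) -> int:
    """Return the number of levels in a navigation tree."""
    if isinstance(tree, list):
        return 1
    return 1 + max(depth(child) for child in tree.values())

print(depth(prototype_a))  # 2 levels
print(depth(prototype_b))  # 4 levels
```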
Analysis
When analyzing the results, we looked at the navigation paths participants took to complete each task. We identified recurring issues that needed to be addressed and compared the results for the two prototypes to determine which was more successful.
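As a simplified sketch of this kind of comparison, the snippet below tallies a task-success rate per prototype. The participant numbers are invented, and our real analysis also looked at the navigation paths taken, not just success counts.

```python
# Invented results: for each participant, which prototype they used and how
# many of the 12 tasks they completed successfully.
results = [
    {"prototype": "A", "successes": 10},
    {"prototype": "A", "successes": 8},
    {"prototype": "B", "successes": 11},
    {"prototype": "B", "successes": 9},
]
TASKS_PER_PARTICIPANT = 12

# Aggregate completed and attempted task counts per prototype.
totals: dict[str, list[int]] = {}
for r in results:
    done, given = totals.setdefault(r["prototype"], [0, 0])
    totals[r["prototype"]] = [done + r["successes"], given + TASKS_PER_PARTICIPANT]

for prototype, (done, given) in sorted(totals.items()):
    print(f"Prototype {prototype}: {done}/{given} tasks completed ({done / given:.0%})")
```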
"Final" recommendations
Our recommended navigation structure ended up being a mix of the two prototypes we tested: a shallow structure with 12 top-level categories and repetition of services across the categories.
We were also able to provide recommendations for changes to service labels, service category labels and descriptions, as well as guidelines for additional ways to browse services (see the sketch after this list), including:
- By audience
- By popular services
- Alphabetical
- Search suggestions
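As a rough sketch of how these browse paths could be supported, the example below tags each catalogue entry with audience, popularity and keyword metadata and derives the different views from it. The Service structure, field names and example values are assumptions for illustration only; they are not how ServiceNow actually models a service catalogue.

```python
from dataclasses import dataclass, field

@dataclass
class Service:
    """Hypothetical catalogue entry with metadata to support alternate browse paths."""
    name: str
    audiences: list[str] = field(default_factory=list)
    popularity: int = 0                                 # e.g. request or page-view volume
    keywords: list[str] = field(default_factory=list)   # could feed search suggestions

# Invented example entries (service names taken from this post, metadata assumed).
catalogue = [
    Service("Email", audiences=["students", "faculty", "staff"], popularity=950,
            keywords=["outlook", "webmail"]),
    Service("EZ Proxy", audiences=["students", "faculty"], popularity=120,
            keywords=["off-campus access", "library databases"]),
    Service("Banner", audiences=["staff"], popularity=300,
            keywords=["student records", "administration"]),
]

# Derive the alternate views from the same metadata.
for_students = [s.name for s in catalogue if "students" in s.audiences]
most_popular = [s.name for s in sorted(catalogue, key=lambda s: s.popularity, reverse=True)]
alphabetical = [s.name for s in sorted(catalogue, key=lambda s: s.name.lower())]

print(for_students)   # ['Email', 'EZ Proxy']
print(most_popular)   # ['Email', 'Banner', 'EZ Proxy']
print(alphabetical)   # ['Banner', 'Email', 'EZ Proxy']
```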
Next steps
While we know our recommendations won't result in a perfect solution, we're hopeful they will deliver an improved user experience that is better aligned with our audience's needs and expectations. As our ServiceNow project continues (the full rollout is expected to wrap up in 2020), we're also anticipating the need for additional evaluation and user testing.
In advance of the launch of our new service catalogue in ServiceNow, some of the recommended updates will appear on the IT Services website in early 2019. Check back for updates!