
User experience year in review 2024

By sdbergqu
February 11, 2025

It's been another busy year for the UX office as we strive to work with our government partners to deliver digital services that are simple and fast for people to use. I thought I'd highlight some of the projects we've worked on over the past year.

Yukon.ca

By far the biggest research project was a user experience evaluation of Yukon.ca, the government's main website. Our goal was to better understand how well the site meets, or does not meet, user needs. Read more about what we learned on this blog. Our team lead, Lily, has been working with departments to create a roadmap focused on better meeting the user needs identified through this research. There will be more on this over the winter and spring.

Forms modernization

Another large project for the eServices team is forms modernization. I'm mentioning it here because we see it as an opportunity to deliver services to Yukoners in a way that meets their expectations. In many cases we are starting with a paper form, and in the process of making it an online form we are greatly improving the user experience.

In addition to Drupal web forms, we are now using DocuPhase and Adobe Experience Manager. The bulk of the work we've completed this year is applying the Yukon.ca Design System and related online design patterns to these platforms. This has been an excellent opportunity to revisit and evolve many of our design standards, and we've been able to incorporate many UX best practices into the templates and guidance so it's easier for teams to meet the government's Digital Service Standards.
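As a rough illustration of what baking a pattern into a reusable template can look like, here's a minimal TypeScript sketch. The interface and field names are mine for illustration only; they are not part of the actual Yukon.ca Design System or any of the platforms mentioned above.

```typescript
// Hypothetical sketch: defining a form-field pattern once so each
// platform's template can render fields consistently. None of these
// names come from the actual Yukon.ca Design System.

interface FieldPattern {
  label: string;        // plain-language label
  hintText?: string;    // optional help text shown with the input
  required: boolean;    // drives the "(required)" marker and validation
  errorMessage: string; // specific, actionable error message
}

// One definition, reusable across platform templates, so the UX
// guidance lives in one place instead of three.
const postalCode: FieldPattern = {
  label: "Postal code",
  hintText: "For example, Y1A 2C6",
  required: true,
  errorMessage: "Enter a postal code, like Y1A 2C6",
};

// A tiny renderer standing in for a platform-specific template.
function renderField(f: FieldPattern): string {
  const req = f.required ? " (required)" : "";
  const hint = f.hintText ? `\nHint: ${f.hintText}` : "";
  return `${f.label}${req}${hint}`;
}

console.log(renderField(postalCode));
```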

We selected the first batch of forms we're piloting because they represent a few use cases and they are relatively simple. They have allowed us to learn about the best ways to design and build, and the teams we've worked with appreciate the value we add to improve the user experience. In fact, when they see the improvements and what is possible, they want to do even more!

We are aiming to launch these simple forms within about a week of the initial design, and we are finding our way to that goal. One lesson we've learned is that even forms that seem simple on the surface hold many hidden surprises for us to tackle.

You will hear much more about these projects this year so keep an eye out!

Evolving online design patterns

This past year saw many COTS products procured to meet business needs. This was an opportunity to see how we could apply the Yukon.ca Design System to them to ensure the services function and look consistent with other digital services. We also spent time figuring out how to apply our standard online design patterns and forms standards to these services. In some instances we were able to evolve these patterns to incorporate a variety of new use cases.

This has been very interesting work and I expect to have updated design patterns in the Digital Service Delivery Guide over the winter.

See our existing online design patterns and forms standards.

Formalizing a process for capturing citizen satisfaction

Citizen satisfaction, or sentiment, is the public-sector equivalent of customer satisfaction. It is a measurement of how satisfied citizens are with the government's digital services and websites. It's an important metric because the government should be able to:

  • identify usability issues;
  • find solutions to address those issues; and
  • test and implement solutions to improve the user experience over time. 

At a minimum, when we launch digital services and websites to the public, the teams that manage the service track and report on 4 key performance indicators: number of completed transactions, completion rate, digital uptake and citizen satisfaction.
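To make the 4 indicators concrete, here's a small TypeScript sketch with made-up figures. The formulas are the common-sense definitions of these metrics, not the government's official methodology.

```typescript
// Illustrative only: how the 4 KPIs might be computed from raw counts.
// All figures below are invented for the example.

const started = 1200;           // transactions begun online
const completed = 1044;         // transactions finished online
const totalTransactions = 1500; // all transactions, any channel
const satisfactionScores = [5, 4, 4, 3, 5]; // e.g. 1-5 survey responses

const completionRate = completed / started;          // 0.87
const digitalUptake = completed / totalTransactions; // 0.696
const citizenSatisfaction =
  satisfactionScores.reduce((a, b) => a + b, 0) / satisfactionScores.length;

console.log(`Completed transactions: ${completed}`);
console.log(`Completion rate: ${(completionRate * 100).toFixed(1)}%`);
console.log(`Digital uptake: ${(digitalUptake * 100).toFixed(1)}%`);
console.log(`Citizen satisfaction: ${citizenSatisfaction.toFixed(1)} / 5`);
```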

Over the past year I've started to examine how we can create more capacity for staff who monitor user feedback so they can quickly identify issues and address them. As the person responsible for making sure all digital services and websites are performing as they should, I want to be able to strategically identify services that are underperforming so I can work with the teams to address usability issues.

You can read the blog post to learn all the details about this project and what to expect in the coming year.

Streamlining Service Maturity Assessments

Quite a bit of work this year went into evaluating our process for conducting service maturity assessments.

In pre-discovery, eServices determines which of these mandatory assessments a project must go through before it launches to the public. The purpose is to make sure teams understand the requirements and the service meets them, in alignment with the government's Digital Service Standards.

The first part of the work was to evaluate the existing assessments. To do this I interviewed past clients to get their feedback on the process and identify where we could improve it. I also met with other Canadian jurisdictions to understand their processes and level of rigour.

What I learned from the conversations with past clients and project managers

The majority of clients I spoke to found value in the assessments. They don't build digital services, and this is all new to them, so they liked the check-in and the opportunity to talk about the project. They also wanted to make sure they were "following the rules" so they didn't get to their launch date only to find out they had to complete a Security and Threat Risk Assessment, or had to complete usability testing, and could not launch the service.

These were often services where the requirements flagged in the assessment report were minimal and we were able to work with the team to find solutions. 

When I spoke to project managers they were split on the service maturity assessments. One echoed the sentiments of our department clients, saying "They are helpful" and "The requirements were not a surprise," which matched the project team's feedback. Another project manager, a vendor, did not like the process at all and found it repetitive and onerous: "I already know all of this and it seems like a waste of time" and "I don't understand why it's important for eServices to know this."

In the first case, the project manager had clearly built the required eServices assessments and reviews into the project plan. There was time and budget to address any resulting requirements. They were also open to reviewing the reports together and talking through solutions and how to implement them.

The second project manager was also engaged, but the conversations were more challenging and I had to justify the rationale for many of the questions. 

Overall, I found that project managers who attend the assessments alongside their department clients, to support them, see the value of the assessments. Those who attend without their department client, or in place of them, do not see the process as valuable, and they do not have as good an understanding of the process as they think they do.

It was very interesting and helpful to understand these perspectives. From these conversations I developed the questions that framed my next steps.

  1. Can we streamline the process more to make the best use of everyone's time?
  2. Can we simplify the report and make it clear if a project passes or fails an assessment?

Streamlining the process

I am very biased when I say this, but we have a service delivery process we follow for all public-facing digital services and websites. We have a well-established assessment and review process that aligns with the service delivery process. And when we meet with teams in pre-discovery - we communicate which service maturity assessments they have to complete and how many eServices reviews our team must complete. 

To streamline the service maturity assessments even more, this past year I created a combined Discovery and Alpha assessment. I have also streamlined the assessments by keeping the UX-related pieces: user research, accessibility, application of the Yukon.ca Design System and related online design patterns, and so on. I removed other parts of the assessments that are important but more related to privacy, security and processes like approvals. These will move into a separate checklist the project's service delivery manager will complete with the team. We will start to test this out in the spring to see if it reduces the time needed:

  • for teams to participate in the assessment; and
  • for the UX manager to conduct the assessment and write the report.

Simplifying the assessment report

Part of the work described above will simplify the report and keep it focused on the user experience. The other question I've been investigating is how we can standardize the assessments so there is a pass, pause or stop for each section and an overall pass or fail. The reason for this is that, when working with a variety of teams, the requirements we share are sometimes taken as suggestions and may not be implemented before the service launches. The main cause is the level of detail included in the reports.

Sometimes the instructions to meet a requirement are black and white. For example, you must include alt text on all images on the site. Other times the high-level user need must be addressed, but there are multiple ways to address it. Including a description of the issue, explaining why it does not comply and then laying out the options means the reports are long and can read as ambiguous. They aren't easy for the department client or project manager to cut and paste into a spreadsheet or ticketing system.
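To show the pass, pause, stop idea in concrete terms, here's a hypothetical TypeScript sketch of how section statuses could roll up into an overall result. The names and roll-up rule are my illustration of the concept, not the actual assessment template.

```typescript
// Hypothetical model: each assessment section gets a clear status, and
// the overall result is derived from them instead of buried in prose.

type SectionStatus = "pass" | "pause" | "stop";

interface AssessmentSection {
  name: string;
  status: SectionStatus;
  requirements: string[]; // short, actionable items a team can paste
                          // straight into a ticketing system
}

// Illustrative rule: any "stop" fails the assessment; a "pause" flags
// work to finish before launch but does not fail it outright.
function overallResult(sections: AssessmentSection[]): "pass" | "fail" {
  return sections.some((s) => s.status === "stop") ? "fail" : "pass";
}

const sections: AssessmentSection[] = [
  { name: "Accessibility", status: "stop",
    requirements: ["Add alt text to all images"] },
  { name: "User research", status: "pass", requirements: [] },
  { name: "Design system", status: "pause",
    requirements: ["Apply the standard error-message pattern"] },
];

console.log(`Overall: ${overallResult(sections)}`); // "fail"
```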

I am currently working on these new assessments and after my team's review we will start to roll them out.

Looking to the year ahead

As you can see from the highlights above, as a one-person user experience office my time is split between working on process, updating guidance, writing policies and standards, service design, facilitating and leading workshops, running assessments, reviewing new services and products, conducting research and anything else that relates to my area of expertise.

It is never dull and there is always work to be done!

What types of projects are you working on this year? Comment below to let me know, and if you have any ideas you think might help with any of these projects, share those too.
