Here we are again, discussing M-23-22. To refresh your memory, the Delivering a Digital-First Public Experience memo (M-23-22) was published in September 2023. It serves as guidance to federal agencies as they continue to implement the 21st Century Integrated Digital Experience Act (21st Century IDEA).
We had already integrated the requirements into our metrics, written about that work in the blog “Maximizing impact of federal websites: integrating metrics with annual goals and policies,” and addressed, or planned to address, the areas that needed improvement. We also created a customer experience (CX) plan to further guide our efforts. Our CX plan focuses on human-centered design, accessibility, multi-channel support, and feedback and continuous improvement. These focus areas reflect our team’s already established goals.
The self-assessment catalyst
We had an introductory meeting with the Department of the Interior’s (DOI’s) new Chief Digital Experience (DX) Officer, Andy Lewandowski. During our conversation, he asked how compliant we were with M-23-22. We took his question to mean: what percentage of the memo’s requirements do our websites comply with? Prior to the meeting, we had identified key areas we needed to improve, including accessibility, mobile responsiveness, and design consistency. We hadn’t quantified these key areas, though. In the moment, we committed to quantifying our level of compliance.
After the meeting, we started brainstorming how we would do the analysis. We decided to directly transfer the requirements into a spreadsheet. If you’re a regular reader, you know we love spreadsheets!
The spreadsheet and the scoring process
In our M-23-22 scoring spreadsheet, we created the following columns:
- Requirement number
- Requirement
- Subcategory
- Specifics
- NRRD*
- ONRR.gov*
- Blog-onrr.doi.gov*
- Blocker
- Progress
- Notes
The three columns marked with an asterisk (*) are named for our websites (NRRD, ONRR.gov, and Blog-onrr.doi.gov); they’re where we mark each site’s level of compliance. We decided to rate each site on each requirement as fully compliant, partially compliant, non-compliant, out of scope, or not applicable (N/A).
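To make the structure concrete, here’s a minimal sketch, in Python, of how one row of the scoring spreadsheet and the rating scale could be represented. The field names and the example row are hypothetical, for illustration only:

```python
from dataclasses import dataclass

# The five ratings we assign to each requirement, per site
RATINGS = [
    "fully compliant",
    "partially compliant",
    "non-compliant",
    "out of scope",
    "not applicable",
]

@dataclass
class RequirementRow:
    """One row of the M-23-22 scoring spreadsheet (hypothetical field names)."""
    number: str        # Requirement number
    requirement: str   # Requirement text from the memo
    subcategory: str
    specifics: str
    nrrd: str          # Rating for NRRD
    onrr_gov: str      # Rating for ONRR.gov
    blog: str          # Rating for Blog-onrr.doi.gov
    blocker: str = ""
    progress: str = ""
    notes: str = ""

# A made-up example row, not an actual requirement from the memo
row = RequirementRow(
    number="1.1",
    requirement="Test websites for accessibility",
    subcategory="Accessible to People of Diverse Abilities",
    specifics="Section 508 / WCAG conformance testing",
    nrrd="fully compliant",
    onrr_gov="fully compliant",
    blog="partially compliant",
)
```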
Alex (DX) and Lindsay (CX) filled out the spreadsheet together, then met individually with other team members to walk them through it and ask for help in each person’s area of expertise.
After we filled out the spreadsheet, we met as a team to go over it in detail. Reviewing it together ensured that we all had a grasp of each requirement and agreed on our level of compliance.
Our results and how we scored ourselves
Through this collaborative scoring process, we produced comprehensive documentation in the ONRR scoring spreadsheet. The spreadsheet helped us think through what each requirement meant and then compare our products side by side. Other agencies can use this template scoring spreadsheet if they want to do a similar exercise.
While there are over 90 individual requirements, we’ll walk through each overarching category, as defined in M-23-22, where we scored ourselves. We’ll discuss the general intent of each category and our team’s successes or areas for improvement related to it.
Nine overarching categories
Accessible to People of Diverse Abilities: This category focuses on the design of accessible experiences, following accessibility standards, testing for accessibility, conducting inclusive research, and promoting accessibility and user feedback. Over the past three years, the ODDD team has continuously worked on improving accessibility, especially as accessibility requirements and guidance change. The ODDD team wrote a blog about their work on 508 compliance and Trusted Tester training.
Consistent Visual Design and Agency Brand Identity: This category focuses on the use of the United States Web Design System (USWDS), the establishment and use of brand design, creation of an online resource for visual design and branding, the use of a government domain name, consideration of user perceptions, and reducing user friction or mistrust. During the redesigns of onrr.gov and NRRD, we weighed visual design and website branding heavily. Similarly, we designed the blog with careful attention to its branding. Since then, the team has considered changes that would align the visual design and branding across all of the websites. We will consider new design and brand changes using USWDS components, which aim to create more consistency across the federal government. The ODDD team wrote a blog on USWDS and onrr.gov.
Content That Is Authoritative and Easy to Understand: This category focuses on keeping content up to date and removing duplicates, using plain language, considering translation and localization, governing content, and running public awareness campaigns. Our websites’ content is currently a major focus for our team; the ODDD team’s content strategist consistently monitors and continually improves it. Several blogs have documented ODDD’s journey to improve content: content audit, plain language, and content management strategy.
Information and Services That Are Discoverable and Optimized for Search: This category focuses on providing users with a functional search, content that is optimized for search engines, content creation dates, and websites that allow automated web scraping. We have been improving onrr.gov’s search since 2022, when we researched and identified the accessibility limitations in its functionality. We resolved many of the onrr.gov search issues with guidance and help from Search.gov, which we discussed in a blog about the improvement of our search function. The team still aims to improve the metadata and other search engine optimization techniques for onrr.gov. NRRD’s search could also be improved, and the ODDD blog does not have a site search.
Secure by Design, Secure by Default: This category focuses on encryption, providing secure and usable authentication, designing secure experiences, conducting security assessments and testing, providing ways to report security issues, and avoiding unnecessary third-party resources. The ODDD team underwent a thorough security assessment this year that helped identify areas where security can be improved. We will complete this assessment each year to ensure our websites meet high security standards.
User-Centered and Data-Driven Design: This category focuses on user engagement throughout design and development, user testing, incentivizing participation, making data-driven decisions, and using web analytics. Human-centered design is a core principle of the ODDD team – we make design and development decisions based on user research and testing. The ODDD team wrote a blog about the modernization of our website and the use of human-centered design.
Customized and Dynamic User Experiences: This category focuses on designing customizable experiences, protecting user privacy, pre-populating forms with user data, and communicating with users through their preferred channels. This category either does not apply to most of the services our public websites provide or falls outside the scope of the ODDD team’s responsibilities.
Mobile-First Design That Scales Across Varying Device Sizes: This category focuses on designing mobile-friendly websites and digital services, testing on different mobile and tablet devices, applying a modern understanding of usage patterns and protocols, optimizing for performance, and avoiding unnecessary mobile apps. Our team flagged this category as an issue for our websites, mainly onrr.gov and NRRD. We have seen an increase in mobile use of the NRRD site and have several issues in our backlog that aim to improve the mobile user experience across our sites.
Other Digital Experience Requirements: This category is a catch-all for other important requirements related to digital experience, including privacy, software development principles, required links, digitization of forms and services, signatures, customer experience and digital service delivery, and standardization. While this miscellaneous category contains a variety of ways to improve digital experience, our team has already prioritized and improved some of them. Our team lists required links about legal compliance, points of contact, the agency’s mission, and more. However, other requirements, such as the digitization of forms and services, are much larger efforts that require coordination outside our team and support from higher management.
Overall scores for each ODDD product
We then developed an overall score for each ODDD product in each category, rolling up the scores for the individual requirements.
- NRRD: 76% fully compliant, 17% partially compliant, and 7% non-compliant.
- Onrr.gov: 77% fully compliant, 17% partially compliant, and 6% non-compliant.
- ODDD blog: 68% fully compliant, 17% partially compliant, and 15% non-compliant.
Approximately 20 requirements from M-23-22 were not applicable or out of scope for our three websites. Some requirements were not applicable because ONRR does not provide that specific service to the public. We considered other requirements “out of scope” because the ODDD team does not own the relevant product; implementing those requirements requires cross-team coordination and direction from high-level management.
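If you’re using our template spreadsheet for your own self-assessment, here’s a rough sketch of how the percentage breakdown could be tallied from a CSV export. The file name and column names are assumptions; the key point is that “out of scope” and “not applicable” rows drop out of the denominator:

```python
import csv
from collections import Counter

# Only these ratings count toward a site's compliance percentages;
# "out of scope" and "not applicable" rows are excluded entirely.
SCORED = {"fully compliant", "partially compliant", "non-compliant"}

def compliance_breakdown(csv_path: str, site_column: str) -> dict:
    """Tally one site's ratings from a CSV export of the scoring spreadsheet."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            rating = row[site_column].strip().lower()
            if rating in SCORED:
                counts[rating] += 1
    applicable = sum(counts.values())
    return {rating: round(100 * n / applicable) for rating, n in counts.items()}

# Hypothetical usage; with our data this would yield roughly
# {"fully compliant": 76, "partially compliant": 17, "non-compliant": 7}
# for the NRRD column.
print(compliance_breakdown("m-23-22-scores.csv", "NRRD"))
```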
While our team celebrated the parts of the websites that were fully or partially compliant, we found the non-compliant and out-of-scope parts to be of the greatest value and the most important for our growth. From here, we could create a plan to improve our products.
Next Steps
Now that we have a good sense of where our products could use improvement, we’ve developed a plan for moving forward.
- Creating a plan: We created an online, task-based project where we can track individual requirements for each product.
- Prioritizing: We prioritized the requirements that will have the greatest impact and that our team can take on in the near term and the long term.
- Acknowledging our limitations: Our team has limited time and resources, so we must be careful to set realistic goals for implementing some of these requirements.
- Discussing the vision: Our team aims to have larger conversations with high-level management. We want to discuss the intentions of this law and memo, and where they can be practically applied to ONRR’s public services. Some of these services reach beyond the ODDD team, so we need a champion from management to support coordination across other teams at ONRR.