user research + a/b testing + strategy
ROLE
Director of Information Architecture – Proposed, led, and completed all activities.
THE CHALLENGE
The University announced that the CMS would be replaced due to usability and performance issues. As the only Usability Expert on staff, I requested to be involved in evaluating the usability of the candidates. Eventually, this led to a UX Strategy that created a unified vision of what needed to be built and how it would be used to guide not only the selection of a CMS but also its implementation.
THE APPROACH
Design Thinking was used to evaluate the CMS candidates. The Discovery phase included identifying detailed, realistic needs for both stakeholders and users to inform business decisions, finding and designing solutions, gathering quantitative and qualitative feedback, and vetting and projecting success.
I applied UX methodology to define what a useful and usable CMS would be for GW from both a user and a stakeholder perspective. My plan was to identify user groups, their abilities, and their tasks; conduct multivariate testing on the replacement candidates; and define user requirements. I also expected to surface latent business requirements, as UX research often does. I was granted 6 weeks to do it. In that time I:
Conducted a User Profile Survey to identify and draft user groups.
Conducted contextual inquiries and stakeholder interviews to identify latent user groups, flesh out user groups, identify primary tasks, and finalize user and business requirements.
Created Personas and defined user requirements to communicate end-user needs.
Designed and conducted multivariate, scenario-based Usability Testing of the potential replacement products to evaluate their ability to support user and business needs. Testing metrics were defined and agreed upon by my team, and the standard was set high: Efficiency, Effectiveness, Ease of Learning, Error Tolerance, and User Satisfaction measurements were gathered from both my testing and user satisfaction surveys (weighted Likert scales).
Designed satisfaction surveys for user, stakeholder, technical, and product teams (weighted Likert scales, quantitative and qualitative) to focus each group on its own relationship to and use of the CMS, rather than having everyone weigh in on how usable and useful it would be for users and the organization overall.
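The weighted Likert scoring used in these surveys can be sketched roughly as follows. This is a minimal illustration, not the actual survey instrument: the question names, weights, and responses below are hypothetical assumptions.

```python
# Sketch of weighted Likert scoring for a satisfaction survey.
# Question names, weights, and responses are illustrative assumptions,
# not the actual GW survey data.

# Likert responses coded 1 (strongly disagree) .. 5 (strongly agree)
responses = {
    "easy_to_learn": [4, 5, 3, 4, 2],
    "error_recovery": [3, 2, 4, 3, 3],
    "overall_satisfaction": [5, 4, 4, 5, 3],
}

# Per-question weights reflecting how much each item matters to the group
weights = {
    "easy_to_learn": 0.3,
    "error_recovery": 0.3,
    "overall_satisfaction": 0.4,
}

def weighted_score(responses, weights):
    """Return a 0-100 weighted satisfaction score."""
    total = 0.0
    for question, answers in responses.items():
        mean = sum(answers) / len(answers)  # average Likert value (1-5)
        normalized = (mean - 1) / 4         # rescale to 0-1
        total += weights[question] * normalized
    return round(100 * total / sum(weights.values()), 1)

print(weighted_score(responses, weights))
```

Weighting lets each audience's survey emphasize the questions most relevant to its use of the CMS, so scores remain comparable across groups even when the questions differ.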
Provided regular updates to keep University teams informed of progress, insights, and opportunities for input, fostering ongoing support and gathering needed feedback.
Conducted Participatory Design Sessions for critical CMS screens (Dashboard, content creation/editing) to gather latent requirements and ideas while fostering buy-in and consensus.
One of the personas used to communicate the findings and implications of the user research. Stakeholders around the organization quickly accepted and used them because users saw themselves in the personas; they anchored all my reporting and presentations.
Using the personas and task flows, I performed a cognitive walkthrough/heuristic evaluation and rated each candidate's potential to support the university's needs and users.
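A rating roll-up of this kind can be sketched as follows. The candidate names, criteria values, and equal-weight averaging here are hypothetical placeholders, not the actual evaluation data or rubric:

```python
# Sketch: rolling up heuristic-evaluation ratings per CMS candidate.
# Candidate names and scores are hypothetical placeholders.

criteria = ["efficiency", "effectiveness", "ease_of_learning",
            "error_tolerance", "satisfaction"]

# 1 (poor) .. 5 (excellent) rating on each criterion, in order
ratings = {
    "Candidate A": [2, 3, 2, 2, 3],
    "Candidate B": [3, 3, 2, 3, 2],
    "Candidate C": [2, 2, 3, 2, 2],
}

def rank_candidates(ratings):
    """Return (candidate, mean rating) pairs sorted best-first."""
    means = {name: sum(scores) / len(scores)
             for name, scores in ratings.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

for name, mean in rank_candidates(ratings):
    print(f"{name}: {mean:.1f}")
```

Scoring every candidate against the same criteria makes the comparison defensible to stakeholders: a low roll-up, as in this hypothetical, signals that none of the candidates clears the bar.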
THE RESULT
The A/B Usability Testing revealed that the final 3 CMS candidates would not meet user needs for common tasks, and the usability of each was low for the CMS users. Additional candidates were sought with a greater understanding of the organization's needs, and Drupal was selected for its flexibility and the ability to customize and evolve its functionality. When asked which CMS I would recommend, I concluded that Drupal would offer an effective and efficient solution immediately, with the ability to evolve over time to support all the organizational and user needs revealed during this effort.
A Report of Findings was prepared for the university. The testing metrics were reported alongside the quantitative and qualitative feedback elicited from the satisfaction surveys to show the similarities in performance and user perception. I presented the results and asked the testing participants to chime in with their perspectives on the candidates.
MINIMIZING WORKFLOWS
Each organization had specific desires and wanted its own workflow. I identified patterns from the interviews, created a large workflow chart (5' x 3'), and conducted walk-throughs with stakeholders whom I judged to have similar workflows. The sessions flowed smoothly, with groups exchanging ideas on how to work more efficiently, and the participants agreed on a shared workflow. The chart also depicted what the CMS needed to support.
I invited representatives from teams with similar workflows to review a workflow I had sketched from the stakeholder and user interviews. They gave feedback and learned from each other how to work more efficiently and effectively. We went from every organization wanting its own workflow to agreement on 3.
One of the latent requirements this effort uncovered was the need for content sharing. Several university-level organizations had trouble managing content, such as policy, that was used by different groups within the organization. Inconsistent and outdated information carried legal and financial implications.
IMPACT
During this effort I introduced GW to Design Thinking: making business decisions with a realistic understanding of the organization's and users' needs, working collaboratively and iteratively. The established methodologies were used to identify detailed, realistic needs for both stakeholders and users, inform business decisions, find and design solutions, gather quantitative and qualitative feedback, and vet and project success. I established User Experience as a trusted methodology and my team as a collaborative partner in managing GW's digital properties.
DELIVERABLES
Personas
Report of Findings
Workflows & Interaction Models
CMS interface direction and functional requirements