Product Design

Aqua: Making assessment as easy as it gets


After proving that a simpler scoring experience was possible, the product team at Taskstream set out to develop a simpler way to manage assessment processes across campuses.


Project Context
A 3-month UCD consultancy to define the IA, UX strategy, and UX for the MVP.
After that, I joined as a full-time Sr. UX Lead and took on system navigation design as my first project.
My Role
As a UCD consultant, I brought user-research insights to inform the Product team's strategy and helped identify key workflows to support.
Synthesized research into design principles that could guide future development.
Defined foundational interaction patterns and the design language.
Skill Set Used
Heuristic Evaluation, Contextual Inquiry, Affinity Diagrams, Competitive Analysis, Artifact Analysis, Personas, Whiteboarding, Prototyping, User Testing.


Intro

Taskstream has been a leader in the assessment software space for the past 15 years. Its Learning Achievement Tool is a robust solution that supports a wide array of use cases; however, that flexibility comes at a cost: it can easily generate unnecessarily complex interaction paths for users.

After partnering with AAC&U to design a simpler scoring experience for the MSC project, Taskstream had a unique opportunity to bring that simplicity to campuses across the country, so the product team set out to create an easier way of managing assessment processes across campus.

Engagement Goals

  • Attract a new customer base with a streamlined user experience
  • Improve the assessment experience for current customers
  • Reduce implementation and customer-support costs
  • Validate the workflow defined for the VALUE Rubric scoring project and identify commonalities and differences with a regular campus assessment effort
  • Define a pattern library to guide all new product development efforts

Research


To create a truly simple solution for campus assessment, we first needed to map out the problem. We conducted exploratory user research to better identify our users' needs and to explore how they currently deal with the problem, both inside and outside the existing system.


UCD RESEARCH
Affinity Diagrams

I find affinity diagrams a great way to let the different kinds of data collected during research mix and blend on the wall, unveiling new relationships between them and allowing new insights to emerge.

RESEARCH
Cardsorting

We explored concept grouping with an open card sorting exercise in person and then tested the proposed IA with a closed card sorting exercise online.

RESEARCH
Flow Model

The flow models allowed us to better understand assessment as a system. They also helped us identify the different actors involved and what information passed hands between them at different institutions.

SYNTHESIS
Consolidated Flow Model

The consolidated flow model allowed us to find the common paths shared across multiple institutions and to surface truly divergent scenarios.

RESEARCH
Heuristic Evaluation

We used heuristic evaluations to learn from previous successes and pain points and determine a baseline for user and development team expectations.

SYNTHESIS
Personas

These personas evolved from our initial research and matured as we kept learning about our users, validating them at every chance we got.


We used the following exploratory methods:

  • Heuristic evaluation of legacy systems
  • Domain-expert-assisted heuristic evaluation
  • Listening in on training calls
  • Artifact analysis (evaluators' guides, glossary, etc.)
  • Think-aloud interviews with assessment coordinators and evaluators
  • Card sorting for IA and navigation

We synthesized our research using several different models:

  • Affinity diagram with the Product Owner
  • Flow models
  • Sequence models
  • Consolidated responsibilities model
  • User flows


Insight

Guiding principles that arose from our research

#1

Help build a better culture of assessment on campus: Meet them where they are

We realized that while many assessment coordinators were willing to deal with the inherent complexity of assessment, most faculty viewed the process as meaningless extra work. Helping assessment coordinators get faculty to buy into assessment and engage with it was therefore a key goal.

Proposed approach:

‘Meet them where they are’ in terms of engagement.

  • Make it really easy for faculty to complete their tasks.
  • Avoid constant decision-making. Define defaults.
  • Allow the assessment coordinator to delegate when she has buy-in, but also to take on the work herself when the culture of assessment in the organization is not quite there yet.
  • Avoid redundant tasks. Make sure contributors such as faculty and students don't have to submit their work to the system twice for different purposes.
“I’ll take whatever they [faculty] can give me”




#2

A guided experience: There’s gotta be a better way, the Taskstream way!
Of all the emerging patterns, one really struck us: several power users mentioned feeling unsure of whether they were using the system 'the right way'. The lack of positive feedback loops and the minimal sense of completion in the current system created an underlying feeling of distrust and lack of control in our users. Creating a guided experience that provides positive reassurance and boosts our users' confidence, while remaining flexible enough to keep them in control, became one of our main tenets.

Proposed approach:

  • Avoid rabbit holes: keep the users aware of the context of their current task
  • Clearly define main actions for each page. Not every user is an advanced user. Move advanced features down in the visual hierarchy.
  • Avoid complex workflows with unclear start and end points.
“There’s gotta be a better way…”


Proposed Solution

We proposed a solution based on three tenets: the concept of a 'project' as a wrapper for assessment activities, a navigation structure to support that concept, and a pattern library that would let us embody our guiding principles throughout the interface.


UX Strategy:

The concept of a project as the organizing narrative.


Navigation:

Making the project narrative the backbone of our UX.


Interaction Pattern library:

Patterns to support guidance


UX Strategy:
Assessment workflow as a project: what, where, how, who?

Our idea for encapsulating this workflow was to package it inside a "project" that could support a single phase or measure in a general education or program assessment plan. The project would let assessment coordinators configure the outcomes to be measured, along with the courses and evaluators that would participate.

We wanted to create a simple workflow that mapped to these key questions: "What are we measuring? Where are we collecting the evidence? How are we measuring it? Who is evaluating the evidence?"
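As a rough illustration of this wrapper idea (not the actual Aqua data model; every type and field name below is an assumption), the project can be thought of as a single object that answers each of those questions:

```typescript
// Illustrative sketch only: names are assumptions, not Aqua's real data model.
// A "project" wraps one phase or measure of an assessment plan and answers:
// what are we measuring, where is the evidence, how is it measured, who evaluates it.

interface Outcome {
  id: string;
  label: string;            // e.g. "Written Communication"
}

interface AssessmentProject {
  id: string;
  name: string;
  outcomes: Outcome[];      // WHAT we are measuring
  courses: string[];        // WHERE evidence is collected (course identifiers)
  rubricIds: string[];      // HOW the evidence is measured
  evaluatorIds: string[];   // WHO evaluates the evidence
  status: 'setup' | 'collecting' | 'scoring' | 'reporting';
}

// The coordinator's setup flow simply answers each question in turn.
function isReadyToLaunch(project: AssessmentProject): boolean {
  return (
    project.outcomes.length > 0 &&
    project.courses.length > 0 &&
    project.rubricIds.length > 0 &&
    project.evaluatorIds.length > 0
  );
}
```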


The work of managing evidence (collecting assignments and work artifacts) had to become not just simpler for assessment coordinators and faculty, but also more connected to the context of the project.

We had a great foundation for the scoring experience, but iterated on it to support additional use cases, like scoring an assignment with more than one rubric and scoring multiple files within a single assignment.

All of those phases – the project setup, the collection of work and the scoring – are critical points of engagement, but are ultimately means to an end: steps to get to meaningful data about student learning outcomes. So we sought to remove the friction at each of those steps and offer a payoff at the end, in the form of intuitive, interactive reports about student performance.




Navigation:
Organizing navigation around the concept of the project


Originally the MVP database was structured around capability modules, making each area of the product (manage submissions, scoring, and reporting) a separate feature that then needed to 'load' the proper project data.
Our research around navigation made it clear that users organized their mental model one project at a time, so we created a navigation strategy to support that narrative.
We explored concept grouping with an open card sorting exercise in person and then tested the proposed IA with a closed card sorting exercise online.


We used these insights to inform our layout: cards on the homepage surface key information for each project and let users track project status and scoring progress, while the 'project layout' navigation structure aligns with the assessment narrative: Plan – ID/Manage Evidence – Assess – Analyze.
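To make that structure concrete, here is a minimal sketch of what a project-centric navigation configuration could look like; the route paths, section ids, labels, and card fields are illustrative assumptions rather than the shipped implementation.

```typescript
// Hypothetical navigation config: every route hangs off a single project,
// mirroring the "one project at a time" mental model from our research.

type ProjectSection = 'plan' | 'evidence' | 'assess' | 'analyze';

interface NavItem {
  section: ProjectSection;
  label: string;
  path: (projectId: string) => string;
}

const projectNav: NavItem[] = [
  { section: 'plan',     label: 'Plan',            path: id => `/projects/${id}/plan` },
  { section: 'evidence', label: 'Manage Evidence', path: id => `/projects/${id}/evidence` },
  { section: 'assess',   label: 'Assess',          path: id => `/projects/${id}/assess` },
  { section: 'analyze',  label: 'Analyze',         path: id => `/projects/${id}/analyze` },
];

// Homepage cards surface the same model: one card per project,
// with status and scoring progress visible at a glance.
interface ProjectCard {
  projectId: string;
  name: string;
  status: string;          // e.g. "Scoring in progress"
  scoringProgress: number; // 0 to 1, drives the progress indicator on the card
}
```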

Interaction Pattern library:
Defining consistent patterns and good practices

Here are some examples from the library of patterns and best practices we've built.

 
Sticky navs
Sticky navs allow us to re-prioritize and adjust the content on the page as the user scrolls. As part of creating a 'clear-path' design language, our sticky navs for immersive layouts (full-screen and sliding panels) always include the page's main actions and a way to navigate out of the current screen.
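One possible way to wire this up, sketched with standard browser APIs (the element ids are hypothetical, not Aqua's actual markup): a sticky bar carrying the page's main actions and an exit control appears whenever the page header scrolls out of view.

```typescript
// Sketch only: show the sticky action bar while the full page header is off-screen.
const header = document.getElementById('page-header');
const stickyBar = document.getElementById('sticky-action-bar');

if (header && stickyBar) {
  const observer = new IntersectionObserver(
    ([entry]) => {
      // Toggle visibility based on whether the header is currently in the viewport.
      stickyBar.classList.toggle('is-visible', !entry.isIntersecting);
    },
    { threshold: 0 }
  );
  observer.observe(header);
}
```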


Immersiveness
We proposed immersiveness as a way of supporting our principle of a 'guided experience'. We use immersiveness in two ways:

Full-screen mode:
To keep the user focused on the current most important task (e.g., scoring student work)

Sliding Panel:
To allow the user to open and complete secondary tasks and then easily return to the point where they left off in their primary workflow.
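A minimal sketch of how a sliding panel could preserve the user's place in the primary workflow; the function and field names are assumptions used for illustration only.

```typescript
// Illustrative sketch: opening a panel records where the user left the primary
// workflow, so closing it restores that exact context.

interface PanelContext {
  panelId: string;
  returnScrollY: number;   // scroll position of the primary page when the panel opened
  returnFocusId?: string;  // element to re-focus when the panel closes
}

const panelStack: PanelContext[] = [];

function openPanel(panelId: string, focusedElementId?: string): void {
  panelStack.push({
    panelId,
    returnScrollY: window.scrollY,
    returnFocusId: focusedElementId,
  });
  // ...render the sliding panel over the current page...
}

function closePanel(): void {
  const ctx = panelStack.pop();
  if (!ctx) return;
  // Restore the primary workflow exactly where the user left off.
  window.scrollTo({ top: ctx.returnScrollY });
  if (ctx.returnFocusId) {
    document.getElementById(ctx.returnFocusId)?.focus();
  }
}
```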


From zero-state to desired state
In order to create a guided experience we make sure each workflow clearly communicates an initial state and a desired end point.

Inviting empty states
Every element requiring user input has a clear empty state (never an error message, as was common practice in the legacy system) that invites the user to provide such input.
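Sketched as render logic (all names are assumptions), the rule is simply that a missing value maps to an inviting empty state, while error messaging is reserved for genuine failures:

```typescript
// Illustrative sketch of the empty-state rule: no input yet means invite, not error.

type FieldState =
  | { kind: 'empty' }                    // no input yet: invite the user to add it
  | { kind: 'filled'; value: string }
  | { kind: 'error'; message: string };  // only for actual failures (e.g. an upload failed)

function renderHint(state: FieldState, inviteText: string): string {
  switch (state.kind) {
    case 'empty':
      return inviteText;  // e.g. "Add the courses you want to collect evidence from"
    case 'filled':
      return '';
    case 'error':
      return state.message;
  }
}
```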


 Clear primary and secondary page-level actions
Every page and modal requiring user input has a clear 'finish-line' call to action. For example, if a content type can be saved as a draft but ultimately requires the user to publish it, the 'Publish' button should be treated as primary and placed in the top right corner of the page (for left-to-right languages), since it is the page's final step and moves the user forward and away from the current page.