DESIGN FOR LEARNING

Empowering Teachers With Data

Leveraging a data-driven approach for pedagogical decision-making.

BACKGROUND

Renaissance Learning is a leading educational technology company whose products are used by K-12 schools nationwide. A few years ago, the company built the Goal-Setting Wizard, a tool inside their product designed to help teachers use data and research to set appropriate goals for students. However, this tool is rarely used.

Our CMU capstone team partnered with Renaissance to redefine the problem and create a solution that empowers teachers to use data-driven approaches when setting student goals. Over the six-month project, the solution was designed, prototyped, tested, and shipped.

MY ROLE

  2015.05 -
  • UX Research Lead
  • UX Designer
  • Graphic Designer

SKILLS

  • Interviews and Contextual Inquiry
  • Usability Testing
  • Interaction Design and Prototyping
  • Graphic Design
  • Data Visualization

PROJECT SUMMARY

The Problem, Solutions, and Impact
"Star’s Goal-Setting Wizard is a great tool for teachers. We built this product with solid scientific research and years of students’ data. However, only 5% of teachers use this tool, and even fewer people use it correctly."
- Eric, Renaissance Learning

The "5%" Problem

A few years ago, Renaissance Learning designed a built-in tool, the Goal-Setting Wizard, in their education product to help teachers use data and research to set appropriate goals for students. However, only 5% of their users use this tool, and even fewer use it correctly.

Setting goals is a fundamental aspect of assessing a student’s progress. By setting a goal for a student, a teacher creates a criterion against which to measure student growth. With this criterion, that teacher can measure the effectiveness of their interventions, decide how to allocate resources for students in need of support, and consider the types of interventions that the student needs. In addition to providing feedback about intervention efficacy, challenging yet reasonable goals can engage students and promote growth. Renaissance Learning’s psychometric research and years of students’ data are extremely valuable in helping teachers make these decisions. However, the tool’s low usage undermines its value for both teachers and the company.

Learning From the Users

What do users want, and how do they perceive the value of the product?

To direct this project, we used a human-centered research approach including contextual inquiry, interviews, and cognitive task analysis. We approached classroom teachers, administrators, and intervention specialists to understand goal-setting workflows and gain insights about goal-setting processes within a Multi-Tiered System of Supports (MTSS). These research results helped us understand users’ pain points and frustrations, and the insights gave us a whole new angle from which to view and reframe the problem.

Our Solution

Users want the product to be trustworthy (grounded in science) and to provide sufficient student information and pedagogical suggestions in an intuitive way.
The client wants the design to be integrated into their current product and to have an enduring impact in future generations of products. They want it to leverage the power of their research and database to help users make decisions.

We designed a product that empowers teachers to use data while goal-setting for students in their classes. Vittore’s solution pairs a tutorial on goal-setting best practices with an “in-vivo” pedagogical agent that guides users through their goal-setting processes.

THE TUTORIAL

The tutorial trains teachers on goal-setting best practices, how to use the goal-setting interface, and how the pedagogical agent can aid their decisions. This tutorial is designed to provide easily understood, actionable information about goal-setting with data.

THE PEDAGOGICAL AGENT
(Chrome Extension)

The pedagogical agent provides alerts when users deviate from best practices and visualizes student data to ease interpretation of these metrics. The alerts appear as unobtrusive recommendations at relevant times in teachers' workflows, suggesting improvements to their practices and providing easily accessed reference materials.
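As an illustration only, the kind of best-practice check the agent performs can be sketched as a small rule. This is a hypothetical sketch, not Renaissance's actual logic: the function name and thresholds are assumptions, with the 35–65 SGP range borrowed from an administrator quote later in this case study.

```python
from typing import Optional

# Illustrative sketch only: goal_alert and its thresholds are hypothetical,
# not the product's real rules. The 35-65 SGP range comes from an
# administrator's remark that average student growth falls in that band.

def goal_alert(proposed_sgp: float, low: float = 35.0, high: float = 65.0) -> Optional[str]:
    """Return an advisory message if a proposed growth goal (in Student
    Growth Percentile) falls outside the typical range, else None."""
    if proposed_sgp > high:
        return (f"An SGP goal of {proposed_sgp:.0f} is above typical growth "
                f"({low:.0f}-{high:.0f} SGP); consider whether it is realistic.")
    if proposed_sgp < low:
        return (f"An SGP goal of {proposed_sgp:.0f} is below typical growth "
                f"({low:.0f}-{high:.0f} SGP); consider a more ambitious goal.")
    return None  # within the best-practice range: no alert shown

print(goal_alert(80))  # goal above typical growth: an alert message
print(goal_alert(50))  # goal within range: None, no alert
```

The point of a rule like this is that the alert is advisory, not blocking: the teacher keeps control of the final goal, in line with the design decision to provide appropriate user control.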

Impact

Our solution reduced common usage errors by 60%, increased the rate of correct decisions from 10% to 80% in the final product evaluation, and received high ratings on user experience.

Because users' data literacy varies, some may not fully understand the data metrics used in the product. Nevertheless, almost all users could apply them to make appropriate decisions.


RESEARCH PROCESS

Understanding Our Users

Field Study

The majority of the research that informed the direction of this project stemmed from two human-centered research methods: contextual inquiries and think-alouds. We interviewed educators with different roles (administrators, classroom teachers, and intervention specialists), levels of teaching experience, geographic areas, and experience with data. More specifically, we used semi-structured interviews about the subjects’ work and work tools, conducted in the context in which the subjects complete that work. This method is well suited to understanding the work environment and culture that may influence the subject.

We used coding and affinity diagramming to analyze the data from the contextual inquiries.

Coding is a process of categorizing quotes and notes from raw interview data into relevant and informative codes determined by the protocol and guiding research questions. Some examples of our codes include: GS (Goal Setting), IoR (Interpretation of Reports), and CwS (Communication with Students).
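To illustrate what coding amounts to, the mapping from quotes to codes can be sketched as a tiny keyword tagger. This is purely illustrative: coding in the project was done by hand against the protocol, and the keyword lists below are invented examples, not our actual codebook.

```python
# Illustrative sketch only: real qualitative coding was done manually by
# researchers. The keyword lists here are hypothetical, not our codebook.

CODEBOOK = {
    "GS":  ["goal", "target", "benchmark"],                   # Goal Setting
    "IoR": ["report", "percentile", "score"],                 # Interpretation of Reports
    "CwS": ["tell the student", "conference", "talk with"],   # Communication with Students
}

def code_quote(quote: str) -> list:
    """Return the codes whose keywords appear in the quote (case-insensitive)."""
    text = quote.lower()
    return [code for code, keywords in CODEBOOK.items()
            if any(kw in text for kw in keywords)]

# "goal" matches GS and "percentile" matches IoR
print(code_quote("We want them at the 60th percentile goal."))  # ['GS', 'IoR']
```

Once quotes carry codes like these, they can be sorted and clustered, which is exactly what the affinity diagramming step does at a larger scale.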

Affinity diagramming involves grouping raw data from contextual inquiries into summative, hierarchical categories of themes and insights. This is a bottom-up process in which categories emerge organically from the interview notes. It allowed us to consider how the findings fit into categories of instructional design, including educational goals, instruction, and assessment. We also marked frustrations and possible design ideas in the affinity diagram.

Alongside the affinity diagram, we used graphic models to organize interview findings and uncover user behavior patterns. In this project, we used a cultural (value) model and sequence flow models. The value model helped identify value points, expectations, and influences between stakeholders. Sequence flow models provided a detailed view of an intervention specialist’s workflow.

Understanding the Workflow

The sequence flow model shows the intents of, and interactions between, people and tools along their workflow in great detail. In this model, we narrowed our scope and analyzed the whole intervention and progress-monitoring process from an intervention teacher’s perspective.

Understanding the Value Points

A cultural model reveals the value points, expectations, and information flows between administrators, teachers, students, and other roles in school education. For this project, expectations of tools were also considered. In our cultural model, we found that school culture impacts the following aspects of progress monitoring and intervention.

Insights

Insights are behavior patterns, breakdowns, and pain points drawn from the affinity diagrams and models, used to guide product development. We compared the models to best-practice workflows to identify breakdowns and clustered user interview notes to find pain points.

While the team generated ten key insights from the spring research, the four insights that most guided the product are highlighted in this section.

1
DATA VISUALIZATION
Teachers find visual representations that display student requirements very helpful.
"It’s nice for the teachers to see this - when you color code it, you can see if better than just having numbers in there."
- Elementary School Administrator
2
BENCHMARK AS GOALS
Goals reflect desired, not expected, performance. Teachers set goals according to state standards (benchmarks), which may be beyond students’ ability.
"The goal we set is because that’s where we truly want them, we want them to be at the 60th percentile."
- Middle School Administrator
3
MEASURABLE GROWTH
Teachers want growth, but they don't measure it against a reference.
"We want to see constant and steady progress, and set goals to make sure this progress continues to happen."
- Middle school principal
4
DATA MISCONCEPTIONS
Teachers don't use some important data metrics due to distrust, difficulty in understanding, and mismatch between desired and realistic goals.
"Average growth for a student is between 35 SGP and 65 SGP. That’s nuts, that’s just too big."
- Middle School Administrator

Ideation and Pitch

Personas are compiled from the behaviors, motivations, and needs of the many users whom we interviewed. Based on the work completed this spring, we found four main personas that represent the key stakeholders’ use and views of data: Mr. Wisdom, Miss Rainey, Ms. Mitchell, and Mrs. Horan. Mr. Wisdom, our “data expert,” represents the teacher who understands data well, sees its value in classroom practices, and enjoys helping other teachers to use data. Ms. Mitchell, our “data enthusiast,” sees data as useful for predicting her students’ performance but feels pressured to get everyone to proficient levels. Miss Rainey, our “data skeptic,” prefers observational data about her students and believes numbers miss important information about students. Mrs. Horan, our “data advocate,” relies on data to demonstrate school performance and growth each year and establishes decision rules to use data.

We brainstormed product features based on those personas and tested those ideas with other users.


DESIGN PROCESS

The Problem, Solutions, and Impact

Prototyping and Iterations

Rapid prototyping cycles were used to assess the learning outcomes of the product, evaluate usability, and collect user preferences through user-centered design methods. We conducted seven iterations of prototyping, focusing on content, data visualizations, actionability, usability, and visual preferences, to make research-based decisions for the final product. We categorized these seven runs into three phases.


Design Decisions

How We Applied Human-Centered Design and Learning Sciences
Show the value of the product: Optimize users' workflow
Design for the Pain Point: Teach math vs. Help users achieve best practice
Design for Novice Users: Pre-training & Correct misconceptions
Provide Appropriate User Control