Time on Task
Mixed-Methods Case Study
First phase of a mixed-methods research project to design a quantitative time on task model and validate it against users' perceptions of time spent grading and effort.
The outcome of this work was a beta time on task component informed by user interviews, as well as updated legal language supporting its rollout.
Challenge
Customers have expressed a desire for analytics that estimate how much time teaching assistants (TAs) spend grading in Labflow
Time spent grading is a valuable metric for science departments to keep track of whether TAs are staying within their contracted appointment
Estimation techniques, once developed, could also be extended to measuring other user actions like how long students spend engaged in learning activities
Time on task makes Labflow more competitive as the first entrant in the market to offer such analytics
Dane thinking about this work within the broader social and political context
As a former graduate student in a union, I was tracking the waves of collective action happening on university campuses.
A number of prominent universities using Labflow had TAs or faculty striking, or threatening to do so.
I wanted to be sure that we at the company level were fully aware of our legal responsibilities under state and federal law for providing data that could potentially be misused in labor relations.
Objective
Work across management, product, and engineering to define the goals and risks of developing time on task measurement techniques
Understand the common use cases for time on task data by interviewing current instructors and faculty
Test evolving time on task model & viz. with early adopter universities to refine and hone the quantitative approaches to modeling the data and validate its outputs
Develop a research & development plan for rolling out the functionality and measuring user sentiment (planned)
Project Outline
This is an ongoing project that started in Fall/Winter 2022. The project is currently focused on developing and validating a time on task model for estimating grading time.
Faculty interviews: Jan. 2023
Quantitative model development & validation: Jan. – Mar. 2023
Legal compliance review: Mar. 2023
Visualization design & development: Sept. 2023 – Present
📏 Scope
Design, iterate, validate time on task model
Identify how graders and instructors define and perceive active time on task
Compare operational definition of time on task with users' perception of effort (planned)
📦 Deliverables
Production-ready quantitative model to estimate grading time
Report establishing the ecological validity of the estimation technique
👥 Role
Data Scientist
Mixed-Methods UX Researcher
How can we estimate user time spent grading, and to what extent does that align with users' own self-reports and perceptions of effort?
Research Objectives
Develop a fast, reliable, and accurate computational model to quantify user time on task
Validate assumptions about grading time against users' expectations
Explore relationships between users' perceptions of effort, time estimates, and product satisfaction (planned)
Research Methods & Findings
Methods: In-Depth Interview, Self-report / Member checking, Survey (planned)
Findings:
Instructors feel pressure from administration to make sure graduate students are working within the bounds of their university contracts (often 10–20 hrs/week)
Data can also provide objective basis to facilitate 1:1 coaching for graduate students
Data should be both aggregated as a mean/range and presented by individual grader
Most instructors had not thought of the legal implications of how they use time on task data, but many were sensitive to its potential for misuse
Grading time on task could be modeled with high-resolution event capture plus a 20-minute sliding-window Kernel Density Estimation (KDE)
Estimates align with self-reports; discrepancies tend to be <10%
1. Faculty Interviews
"We often get admin [department] pushback. They want to know if we're assigning a balanced workload. I also want to be able to identify my inexperienced TAs."
— Dr. Jackie Powell, University of Pittsburgh
"I want to be able to see summaries of grading time like averages [for activities], but also have it broken down by individual TAs."
— Dr. Angela Bischof, Penn State University
Notes from a one-on-one interview with a faculty member interested in grading time data
Conversations with 5 faculty revealed two major use cases for grading time data:
Departmental compliance
TA professional development
Use of the data at the departmental level presents the clearest legal risk.
Most instructors had not even considered the legal implications of this data and were not aware of their university or state regulations.
2. Develop & Validate Quantitative Model
Grading events are emitted for discrete actions performed by TAs like:
Assigning points
Typing personalized feedback
Selecting from pre-defined rubrics
User events also augment this to fill in gaps in time not caught by grading actions.
"Quick Grade" interface in Labflow. Assigning point values, typing feedback, and selecting rubric items all emit individual grading events.
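One way to picture the event stream (the schema here is hypothetical — Labflow's actual event shapes are not shown in this case study): discrete grading events and general user events are merged into a single ordered timeline before any estimation runs.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical event shape -- Labflow's real schema is not public.
@dataclass
class Event:
    ts: datetime
    source: str  # "grading" (points, feedback, rubric) or "user" (navigation, etc.)
    action: str

def merged_timeline(grading_events, user_events):
    """Interleave grading events with other user events into one ordered
    stream, so gaps between grading actions can be filled by other activity."""
    return sorted(grading_events + user_events, key=lambda e: e.ts)

grading = [Event(datetime(2023, 1, 9, 10, 0), "grading", "assign_points"),
           Event(datetime(2023, 1, 9, 10, 6), "grading", "select_rubric")]
user = [Event(datetime(2023, 1, 9, 10, 3), "user", "open_submission")]
timeline = merged_timeline(grading, user)
print([e.action for e in timeline])  # → ['assign_points', 'open_submission', 'select_rubric']
```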
Discrete grading events are captured in BigQuery. I then developed a few computational techniques (Kernel Density Estimation depicted)
Dane considering his model options for estimating time on task
I needed to identify performant computational models for estimating the time on task.
I iterated on a few possible models and settled on a KDE model for how it handles clusters of events distributed over time.
Initial comparisons with self-report indicated the method was accurate for distributed grading actions.
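As a rough illustration of the KDE idea (a sketch only, not the production model — the bandwidth, threshold, and unnormalized-density trick here are my assumptions): place a Gaussian kernel on each event timestamp, then count the minutes where the summed density stays above a small threshold. Long idle gaps between grading bursts fall below the threshold and are excluded.

```python
import numpy as np

def estimate_active_minutes(event_times_min, bandwidth_min=20.0, threshold=0.05):
    """Estimate active time from discrete grading-event timestamps (minutes).

    A Gaussian kernel of width `bandwidth_min` is centered on each event;
    any minute where the summed (unnormalized) density exceeds `threshold`
    counts as active. Parameters are illustrative, not production values.
    """
    events = np.asarray(event_times_min, dtype=float)
    if events.size == 0:
        return 0.0
    # One-minute evaluation grid, padded so kernel tails are not clipped.
    pad = 3 * bandwidth_min
    grid = np.arange(events.min() - pad, events.max() + pad + 1.0)
    # Unnormalized KDE: sum of Gaussian kernels centered on each event.
    density = np.exp(
        -0.5 * ((grid[:, None] - events[None, :]) / bandwidth_min) ** 2
    ).sum(axis=1)
    return float((density > threshold).sum())

# Two bursts of grading separated by a long idle gap: the gap is not counted.
print(estimate_active_minutes([0, 2, 5, 8, 300, 303]))
```

Thresholding the density, rather than naively subtracting the first timestamp from the last, is what keeps a long break between grading sessions from inflating the estimate.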
Clustered bar chart comparing the self-report grading times for three reports against two estimation methods (histogramming, Kernel Density Estimation)
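The validation check behind that comparison can be sketched with hypothetical numbers (the actual self-report and estimate values are not published here):

```python
# Hypothetical self-reported vs. KDE-estimated grading minutes for three reports.
self_report = [95, 140, 60]
kde_estimate = [99, 131, 62]

# Relative discrepancy per report; the finding was that these stay under ~10%.
discrepancy = [abs(e - s) / s for s, e in zip(self_report, kde_estimate)]
print([round(d, 3) for d in discrepancy])  # → [0.042, 0.064, 0.033]
```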
3. Legal Compliance Review
The company contacted an employment lawyer on retainer to inquire about laws in the states where we operate. We explored the most restrictive jurisdiction of California to understand what needs to be in place to meet our legal obligations under the California Consumer Privacy Act (CCPA).
4. Visualization Design & Development
Concepts for possible ways to represent time on task for specific activities as well as individual graders
Research Impact
Beta candidate of the time on task visualization component in Data Insights
Grading time estimates are being used by universities to save money and reinvest in students
Updated legal language in both Terms of Service and Privacy Policy.
Alpha release candidate of time on task visualization demoing UI functionality
Grading time estimation is empowering science departments to make data-driven decisions on recruiting teaching staff.
The University of North Texas (UNT) was able to rebalance course loads, saving the department $150k!
UNT then 🙌 reinvested this money in students to lower course costs. 🎉
Case study of how grading time estimates saved UNT money
LIMITATION OF LIABILITY
To the extent allowed by Texas law and the U.S. Constitution, in no event shall Catalyst Education LLC, nor its directors, employees, partners, agents, suppliers, or affiliates, be liable for any indirect, incidental, special, consequential, or punitive damages, including without limitation, loss of profits, data, use, goodwill, or other intangible losses, resulting from (i) your access to or use of or inability to access or use the Service; (ii) any conduct or content of any third party on the Service; (iii) any content obtained from the Service; and (iv) unauthorized access, use or alteration of your transmissions or content, whether based on warranty, contract, tort (including negligence) or any other legal theory, whether or not we have been informed of the possibility of such damage, and even if a remedy set forth herein is found to have failed of its essential purpose.
The ToS and Privacy Policy were updated to clearly explain the legal basis underpinning our data practices.
These changes were included to cover:
Indemnification
Limitation of liability
Lawful Basis for Processing
We also intend to make time on task an opt-in service.
Dane planning to gather feedback on the design concept
The beta design is now ready for release to a handful of early adopter universities who have expressed interest in reports on this data.
The next step is to sit down with these early adopters and evaluate areas for improvement.
Stay tuned...