Standards-based grading

Jan 2025 - May 2025

1 designer, 5 engineers

Toddle

Project overview

Toddle’s existing grading method was rigid - admins couldn’t link assessments to learning standards, teachers couldn’t track progress over time, and students saw grades as arbitrary numbers.

How might we design a system that connects every assignment to measurable standards, while keeping it simple for time-strapped educators?

We spent three months understanding how competency-based grading differs from the traditional score-based grading Toddle already offered, and how teachers actually grade against standards. By leveraging key user insights, competitive analysis, and agile design methods, we identified the main pain points in the existing structure and brainstormed ways to add a layer of standards-based grading to the system. Adding this to the platform helped us capture schools in the US public school system and the Australian school system, improving Toddle’s ratings and driving significant engagement and new user onboarding.

My role

Product design, user research, competitive analysis, stakeholder management, user testing, product management, and project management.

The team

1 designer, 20 engineers, 1 engineering manager

Timeline

Jan 2025 - May 2025

What was the impact?

  • Onboarded 20+ new schools within the first 4 weeks of demoing the product, with an expected 18% increase in revenue by the end of the release cycle.

  • 92% of schools we piloted the demo with switched to Toddle from competitors, with full rollout pending final release.

  • Automated rubric and descriptor generation using AI, reducing creation time by 70%, helping us stay true to our goal of giving teachers back 10+ hours each week (validated through internal testing against previous processes).

The Problem

  • Toddle’s existing rubrics were static and disconnected from learning standards.

  • Teachers couldn’t track student mastery over time.

  • Assessments couldn’t be aligned to school- or district-wide standards.

  • Feedback couldn’t be personalized based on skill gaps.
Traditional score-based grading

“You got 78/100”

Final grade = average of all scores / most recent grade / best fit

Focus: Points, tests, percentages

High score ≠ full understanding

Standards-based grading

“You’re developing in structure, proficient in evidence”

Final grade = mastery of specific standards


Focus: Skills, concepts, growth

Progress over time is key

Understanding standards-based grading

USER INSIGHTS

Digging into feature requests, talking to schools and experts

PERSONA

Stakeholders involved and their roles

Admin

Age Range: 35–45

Role: Oversees curriculum setup and manages school-wide systems and roles

Responsibilities: Admins manage the school’s tech tools and academic framework. They assign roles and permissions, define course structures, and decide which standards should be used in each course.

Context: Often part of a team that includes the school principal and IT or tech office staff, admins balance control with flexibility, creating guardrails while allowing teachers autonomy in how they teach.

Needs: They need systems that allow them to configure standards, grading methods, and course templates, while also giving teachers the agency to adapt within those boundaries.

Course teacher

Age Range: 35–45

Role: Leads the planning and delivery of academic courses across grades

Responsibilities: Course teachers design unit plans and decide how the course is taught. They select relevant standards from admin-imported lists and structure content around them. In schools without a strict course structure, they often take on the role of class teachers.

Context: They play a central role in ensuring curriculum coherence. Depending on the school’s setup, they may work independently or with a team to align teaching with standards.

Needs: They need flexibility to structure courses, tools to track how standards are being taught, and visibility into instructional alignment across units and assessments.

Class teacher

Age Range: 28 to 35

Role: Manages instruction, grading, and student progress at the class level

Responsibilities: Class teachers create assignments, worksheets, and quizzes for the students in their class. They design unit plans, grade student work, share it with families, and track progress throughout the year.

Context: While course teachers define subject level planning across grades, class teachers are focused on day to day teaching and learning within their own class. They work closely with students to support both academic and social emotional growth.

Needs: They need simple tools to build assignments, monitor student understanding, and help students feel confident and capable as they progress toward mastery.

Student and family

Age Range: 6 to 17

Role: Learns through teacher guided instruction across subjects

Responsibilities: Students attend classes, complete assignments and tests, and apply their learning across different topics. They work toward skill mastery and academic progress.

Context: Students want to understand what they are learning and how they are doing. They aim to do well in exams and feel confident in their growth. Families support this journey and want visibility into their child’s performance.

Needs: They need clarity on progress, next steps, and where they stand. Families need simple ways to stay informed and involved.


MAPPING USER INSIGHTS TO THESE ROLES AND SUBSETS

Lifecycle of standards-based grading

We mapped the end-to-end lifecycle of standards across key user roles (admin, coordinator, teacher, student, and parent) to surface actionable insights. We also synthesized user needs into clear problem statements to inform product direction and design priorities. This helped define touchpoints, data requirements, and ownership across the ecosystem, forming the foundation for upcoming feature development.

HOW DO THESE STAKEHOLDERS INTERACT WITH THE PRODUCT?

User flow

We tried to understand how these different roles - admin, course teacher, class teacher, and student and family - interact with the product and what they do on the platform.

WHY ARE USERS LEANING TOWARD OTHER APPLICATIONS? HOW CAN WE CLOSE THE GAP?

Understanding offerings from existing solutions

To understand the landscape better, I conducted a detailed analysis of leading platforms that offer standards-based and score-based grading. While I’ve withheld specific competitor names to maintain confidentiality, the research preview showcases key insights. My process involved exploring help center documentation, setting up test accounts on freemium platforms, and leveraging language models to validate and deepen my findings.

WHAT WAS THE CURRENT FLOW, AND HOW DO WE ADD THIS LAYER TO THE SYSTEM?

Impact on the system

We looked at the existing flow of score-based assessment and grading and explored how we could add a layer of standards on top of it, and what changes that would require in the current setup. This work ran alongside a revamp of the entire look and feel, which we called Toddle 2.0. In the diagram, the pink boxes show what already existed, and the yellow lines below mark the new pieces to be added on top of improvements to the existing systems.

CREATING PHASE-WISE ROADMAPS TO ALIGN STAKEHOLDERS

Aligning with product vision and goals

Given the scope of changes needed and dependencies across the system, we created a roadmap that spanned 3 to 5 months. This work ran in parallel with major Toddle 2.0 upgrades, including a completely new course structure, refreshed navigation, and the introduction of roles and permissions. To manage complexity, we broke the work into phases and used a feasibility, desirability, and usability lens to prioritise decisions. Coordinating across teams was essential, as each stream was contributing to different parts of the puzzle.

Defining interactions and prototyping

Throughout the course of this project, we made several decisions to improve the workflow and experience for each user type. These were shaped by ongoing conversations with demo schools and partner organisations. For clarity and confidentiality, only a selection of key interaction highlights and changes are shared here.

Turned insights into actions: Mapped user needs and benchmarked competitor workflows to translate findings into early prototypes.

A/B testing with users: Gathered detailed requirements directly from partner schools and built interactive demos to validate direction and gather feedback.

Collaborated with developers: Worked closely with engineering teams to evaluate feasibility, identify edge cases, and co-create a realistic development roadmap.

Iteration and refinement: Improved designs through continuous feedback loops, running regular cadence calls to resolve dependencies, address developer questions, and adapt to evolving constraints.

A quick recap of the solution

Product interactions

Please note that for brevity and confidentiality, I have not included decision-making points and iterations, and have omitted some details. Feel free to reach out to me personally to learn more about the project!

1.

Creating a standard set

Based on user insights, we enabled standards to be created directly on the platform, imported from official sets, or added via Excel sheets.

Admins could draft and configure their own sets using a flexible data grid. We designed a reusable component for this, allowing copy paste, drag to reorder, and quick editing.

Each standard could include custom codes or IDs for easy reference, optional tags, subject mapping, and vertical or horizontal rollup classification (explained later). Admins could also archive or activate specific standards and fully configure the set as needed.

Checkbox: Select rows; shows row number on hover.

List Parent: Defines parent-child standard hierarchy.

Power Standards: Flags high-priority standards.

Grade Levels: Maps standards to grade levels.

Tags: Custom labels for filtering and grouping.

Grading: Shows if graded individually or by rollup.

Description: Optional context for the standard.

Subject: Links standard to a subject area.

Last Updated: Shows who last edited and when.
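To make the data model behind the grid more concrete, here is a minimal TypeScript sketch of how a single row could be shaped. The field names and types are illustrative assumptions inferred from the column legend above, not Toddle’s actual schema.

```typescript
// Illustrative sketch only: field names are assumptions inferred from the
// column legend above, not Toddle's real data model.
type RollupType = "individual" | "vertical" | "horizontal";

interface StandardRow {
  code: string;               // custom code or ID for easy reference, e.g. "MATH.6.NS.1"
  description?: string;       // optional context for the standard
  parentCode?: string;        // defines the parent-child standard hierarchy
  isPowerStandard: boolean;   // flags high-priority standards
  gradeLevels: string[];      // grade levels the standard maps to
  tags: string[];             // custom labels for filtering and grouping
  grading: RollupType;        // graded individually or via vertical/horizontal rollup
  subject?: string;           // links the standard to a subject area
  status: "active" | "archived";
  lastUpdated: { by: string; at: string }; // who last edited and when
}
```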

2.

Creating grade scales and adding rubric descriptors

 Grade scales can be created with alphabetic or numeric values, with color options by level. A mastery value defines when a student has met enough standards to be considered proficient, and is used in calculating final grades using mapped cutoff and point values.

Admins can set up rubric descriptors as templates for teachers. These can be task specific and editable using a flexible datagrid with select, copy paste, and AI suggestions - making configuration faster than competitors, who lack this feature.

Rubric data can also be exported and imported via Excel. Many schools already store this in spreadsheets, and we built support for that workflow knowing competitors do not offer it.

Grade scales can be assigned a name, a colour, and a mastery column that defines which grade level counts as mastered, along with cutoff values and grade values used by the standard calculation methods.
We also created a flow for mapping rubric descriptors to standards according to their grade scales, with AI assistance to speed up the mapping.
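For illustration, a grade scale with its mastery level, cutoff values, and grade (point) values could look roughly like the sketch below. All labels, colours, and numbers here are hypothetical examples, not the shipped configuration.

```typescript
// Hypothetical example of a grade scale; labels, colours, cutoffs and point
// values are illustrative, not Toddle's defaults.
interface GradeScaleLevel {
  label: string;      // alphabetic or numeric value shown to users
  color: string;      // colour option for the level
  points: number;     // grade value used by the calculation methods
  cutoff: number;     // minimum averaged value that maps back to this level
  isMastery: boolean; // marks the level that counts as mastered
}

const exampleScale: GradeScaleLevel[] = [
  { label: "Beginning",     color: "#e57373", points: 1, cutoff: 0,   isMastery: false },
  { label: "Developing",    color: "#ffb74d", points: 2, cutoff: 1.5, isMastery: false },
  { label: "Proficient",    color: "#81c784", points: 3, cutoff: 2.5, isMastery: true  },
  { label: "Distinguished", color: "#4db6ac", points: 4, cutoff: 3.5, isMastery: false },
];
```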

3.

Using Calculation Methods

Admins can map each standard set to a calculation method that defines how student mastery is determined. For example, with the mean method for both individual and rollup grading, the final mastery score for a standard is calculated by summing the grade values it received, dividing by the number of assessments it was graded on, and converting the result back to the grade scale - yielding a final level like “Distinguished” or “Proficient”.

After reviewing competitor platforms and gathering user insights, we defined a set of calculation methods to support different grading preferences. These methods vary depending on whether the grade scale is numeric or alphabet based.
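As a rough sketch of the mean method described above (reusing the hypothetical GradeScaleLevel shape from the previous snippet, and simplified relative to the real implementation):

```typescript
// Simplified sketch of the "mean" calculation method: average the grade
// values a standard received across assessments, then convert the average
// back to a level on the grade scale via the cutoffs.
function meanMastery(
  gradesForStandard: number[],   // point values the standard received across assessments
  scale: GradeScaleLevel[]       // ordered from lowest to highest cutoff
): string {
  if (gradesForStandard.length === 0) return "Not graded";

  const avg =
    gradesForStandard.reduce((sum, g) => sum + g, 0) / gradesForStandard.length;

  // Pick the highest level whose cutoff the average meets or exceeds
  let level = scale[0].label;
  for (const l of scale) {
    if (avg >= l.cutoff) level = l.label;
  }
  return level;
}

// Example: grades of 3, 4 and 2 average 3.0, which maps to "Proficient"
// on the example scale above.
```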

4.

Creating a rubric with AI in an assignment

While creating an assignment, teachers can generate a rubric with AI, drawing on the standards attached to the task and the rubric descriptors and grade scales configured by admins. As noted in the impact section, this automation reduced rubric and descriptor creation time by roughly 70% in internal testing.

5.

Using the Mastery gradebook

A new mastery gradebook was added alongside the assessment gradebook and the learning goals gradebook. It appears for schools using standards-based grading.

Standards are mapped against each student with a mastery summary (the percentage of times the student has achieved mastery; in this case, Distinguished, as that was the selected mastery level in the grade scale).
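As a rough sketch of that summary calculation (a hypothetical helper that counts exact matches with the selected mastery level, not the shipped logic):

```typescript
// Hypothetical helper: percentage of graded assessments on which the student
// reached the selected mastery level (e.g. "Distinguished").
function masterySummary(levelsAchieved: string[], masteryLevel: string): number {
  if (levelsAchieved.length === 0) return 0;
  const hits = levelsAchieved.filter((l) => l === masteryLevel).length;
  return Math.round((hits / levelsAchieved.length) * 100); // e.g. 3 of 4 → 75%
}
```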

Teachers can filter standards by type, unit, graded/not graded, and more, and can also download the gradebook to share offline.

Teachers can also use the help widget (planned to roll out to all pages), where they can ask questions with AI and jump to help centre articles and videos.

A quick peek into the solution

For hiring managers, interviewers or anyone curious about the full process, send me an email. I would love to share more.

Shoot me an email

Failures and takeaways

Problems

  • A significant amount of time went into creating customized demo decks for each school, which slowed down progress.

  • Cross-team collaboration often took longer than expected. I spent considerable time realigning new team members or explaining project context from scratch.

  • I lacked a strong grasp of IB standards early on, which affected decision-making and slowed feature mapping.

What could I have done better?

  • I could have pre-identified a set of representative schools and created a reusable demo framework with minor tweakable components, instead of building from scratch each time. This would have saved time while still allowing for personalization where it mattered most.

  • Having a clear list of stakeholders and setting up a kickoff call early on across teams would have helped align expectations, reduce handover gaps, and avoid spending time repeatedly explaining the project scope.

  • Spending more time upfront to understand how IB schools interpret and apply standards would have helped me make faster, more informed decisions while mapping workflows or proposing solutions. Domain depth makes a difference, especially in education.

  • While I had a document that captured the project context, key assumptions, and open questions, I didn’t consistently update it. Keeping a document alive would have helped onboard collaborators faster and reduced repeated clarifications.

  • Lastly, I didn’t always protect time for deep, focused work. There were days I was constantly switching contexts or reacting to inputs. Better calendar discipline and clearer prioritization could have helped me create space for uninterrupted design and reflection.

Read what my peers and Head of Design say about me collaborating on the project!

Here’s what my peers and stakeholders have shared about my approach to product thinking, design execution, and team collaboration.

Like what you see? Let's connect!

I’ve built everything from grading platforms to games, always focused on making things that work well and feel right. Take a look at the ideas, the craft, and the outcomes.

made with love

and also hate, because perfectionism isn’t born out of love, it’s forged in frustration, obsession and an unrelenting pursuit of something better.

2025

Sulakshana