
Using Data from Online Assessments to Improve Instruction

Learn how to analyze assessment data, turn insights into actionable teaching decisions, and implement a sustainable data-review routine that improves student learning.

Pen on paper with data and charts

Photo by Isaac Smith on Unsplash

Introduction

Every day, online assessment platforms collect massive amounts of data. Which questions students get wrong. How long they spend on each question. Which students score lowest. Performance trends over time.

Teachers can see all this data. Most don't know what to do with it.

A teacher logs into the platform, sees a dashboard with dozens of data points, and feels... overwhelmed. "This is interesting, but what do I actually do with it? How does this change my teaching?"

This article solves that problem. You'll learn:

  • What data is actually available
  • How to analyze it (no statistics degree required)
  • How to turn analysis into actionable decisions
  • Real examples of teachers using data to improve instruction
  • How to implement a sustainable data-review routine

By the end, you'll see assessment data not as numbers on a dashboard, but as a conversation with your students about what they understand and what they need.

What Data Is Available from Online Assessments?

Online platforms generate different types of data. Understanding what's available helps you know what to look for.

Question-Level Data

Available metrics:

  • Percentage of students who got the question correct
  • Average time spent on the question
  • Distribution of answers (how many chose A, B, C, D for multiple choice?)
  • Discrimination index (how well does this question separate high performers from low performers?)

What this tells you:

  • Which concepts the class struggles with (low % correct = concept needs reteaching)
  • Which questions are poorly written (if half the class gets it wrong AND high performers also get it wrong, maybe the question is unclear)
  • Which students need help (who consistently chose wrong answers?)

How to use it: If 30% of the class got question 3 wrong, the concept it tests needs more attention. If 80% got it wrong, reteach and reassess that concept before moving on.
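If your platform exports results, this check is easy to script. Here's a minimal sketch in Python (pandas), assuming a hypothetical export with one row per student and question columns (Q1, Q2, ...) scored 0 or 1; real column names will vary by platform:

```python
import pandas as pd

# Hypothetical export: one row per student, columns Q1, Q2, ... scored 0 or 1.
df = pd.read_csv("quiz_results.csv")
question_cols = [c for c in df.columns if c.startswith("Q")]

# Percent of students answering each question correctly.
pass_rates = df[question_cols].mean().mul(100).round(1)

# Flag questions below a reteach threshold (70% here is illustrative, not a standard).
needs_attention = pass_rates[pass_rates < 70].sort_values()
print(needs_attention)
```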

Student-Level Data

Available metrics:

  • Individual student scores on each assessment
  • Progress over time (how are scores trending?)
  • Mastery status (has this student demonstrated competency on this skill?)
  • Submission time (on-time vs. late)
  • Patterns (does this student consistently struggle on certain question types?)

What this tells you:

  • Who's mastered the content (ready for acceleration)
  • Who's still struggling (needs intervention)
  • Who improved most this term (celebrate growth)
  • Who hasn't submitted (follow up on missing work)
  • Any individual patterns ("This student always gets fill-in-the-blank questions wrong, but does well on multiple choice")

How to use it: Use student-level data to identify who needs intervention, extension, or just a check-in.

Class-Level Trends

Available metrics:

  • Class average on each assessment
  • Performance distribution (what % of students are at each proficiency level?)
  • Trend over time (is class performance improving?)
  • Comparison to prior years (how does this year's class compare?)

What this tells you:

  • Whether your instruction is working (improving trend = yes)
  • Whether the class is ready to move on (75%+ mastered = yes)
  • Equity concerns (are some demographic groups outperforming others?)
  • Effectiveness of specific instructional changes (changed approach + performance improved = effective)

How to use it: Class-level data informs pacing decisions. "Is it time to move on?" Look at class data. "Is our new strategy working?" Look at the trend, as in the sketch below.
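A trend check doesn't require a dashboard: line up assessment averages in date order and compare early to recent. A toy sketch with illustrative numbers:

```python
# Class average per assessment, oldest first (illustrative numbers).
assessment_averages = [72, 75, 74, 79, 83]

# Compare the first two assessments to the most recent two.
early = sum(assessment_averages[:2]) / 2
recent = sum(assessment_averages[-2:]) / 2
print(f"Early average: {early:.1f}, recent average: {recent:.1f}")
print("Trend: improving" if recent > early else "Trend: flat or declining")
```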

Disaggregated Data (If Your Platform Supports It)

Available metrics:

  • Performance by demographic group (achievement gap visible?)
  • Performance by prior achievement level (are all students progressing?)
  • Performance by question type (do students do better on multiple choice vs. short answer?)
  • Performance by time of day (do students perform better in morning vs. afternoon assessments?)

What this tells you:

  • Equity issues (do some groups consistently underperform?)
  • Effectiveness of different instructional methods for different learners
  • Whether specific accommodations or supports are working

How to use it: If data shows girls outperforming boys on verbal reasoning but boys outperforming girls on spatial reasoning, you might adjust your instruction to support both.

Time/Engagement Data

Available metrics:

  • Time spent on assessment overall
  • Time per question (which questions take longest?)
  • Submission timing (submitted on deadline vs. last minute?)
  • Attempt patterns (multiple attempts vs. single attempt?)

What this tells you:

  • Engagement level (did students rush or take time?)
  • Question difficulty (longer time on a question = harder question OR better thinking)
  • Procrastination patterns (submitted at deadline = student procrastinated)
  • Effort level (multiple attempts = student cared about getting it right)

How to use it: If most students submit at the last minute, consider adjusting the deadline or sending reminders. If students rush through (very quick overall time), consider increasing the time limit or encouraging them to slow down. A quick way to spot rushers is sketched below.
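Here's that rushing check as a sketch, assuming the export includes hypothetical 'student' and 'time_minutes' columns:

```python
import pandas as pd

# Hypothetical export with per-student completion times.
df = pd.read_csv("quiz_results.csv")  # columns include: student, time_minutes

median_time = df["time_minutes"].median()
# Flag students who finished in under half the median time (possible rushing).
rushers = df[df["time_minutes"] < median_time / 2]
print("Check in with:", rushers["student"].tolist())
```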

From Data to Decision-Making: The Framework

The framework: Look → Analyze → Decide → Act → Monitor

Step 1: LOOK - Examine the Data

After each assessment, spend 15-20 minutes reviewing:

Quick scan:

  • What's the class average? (Expected or surprising?)
  • What's the range? (20-95% or more compressed?)
  • Which 3 questions had lowest pass rate?
  • Which 3-5 students scored lowest?
  • Is this performance in line with what I expected?

Write down:

  • 3 biggest surprises
  • 2 questions where a significant portion of the class struggled
  • 2-3 students who might need intervention
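If you prefer a script to a dashboard, this quick scan takes only a few lines. A sketch using the same hypothetical export format as above (0/1 question columns plus a 'student' column):

```python
import pandas as pd

df = pd.read_csv("quiz_results.csv")  # hypothetical export, as above
question_cols = [c for c in df.columns if c.startswith("Q")]
df["score_pct"] = df[question_cols].mean(axis=1) * 100

# The Monday-morning quick scan: average, range, hardest questions, lowest scorers.
print(f"Class average: {df['score_pct'].mean():.0f}%")
print(f"Range: {df['score_pct'].min():.0f}%-{df['score_pct'].max():.0f}%")
print("3 hardest questions:")
print(df[question_cols].mean().nsmallest(3))
print("3 lowest scorers:", df.nsmallest(3, "score_pct")["student"].tolist())
```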

Step 2: ANALYZE - Ask Why

For each piece of data, dig deeper:

If most students got a question wrong:

  • Was the concept adequately taught? (Did I spend enough time on this?)
  • Was it well explained? (Did I use clear examples?)
  • Was the question clearly worded? (Or did students misunderstand what was being asked?)
  • Was it appropriately difficult? (Am I assessing the right cognitive level?)

If some students got it right but others didn't:

  • Are these the high performers vs. the low performers? (That's a normal, expected distribution)
  • Or is there a surprise? (A usually strong student bombed this? A usually weak student nailed it?)
  • What's the pattern? (Does it correlate with a specific skill or demographic group?)

If a student scored surprisingly:

  • Expected to do better: Why did they underperform? Didn't understand? Careless mistake? Didn't try?
  • Expected to do worse: Why did they outperform? Studied more? Finally understood? Got help?
  • Were there outside circumstances? (Sick that day? Family crisis? Technology issues?)

Step 3: DECIDE - What Will You Do?

Based on your analysis, choose from these options:

Option A: Reteach

When: Significant portion of class didn't master (typically 30%+ didn't get it)

How:

  • Different teaching approach (if you explained it one way, try a different way)
  • More concrete examples
  • Hands-on activity or demo
  • Targeted to students who struggled most

What: Address the specific misconception. If 60% of class thinks X is true but it's actually Y, spend time specifically correcting this.

Timing: In the next lesson or two (while memory is fresh), before moving to new content that builds on this

Example: "Exit tickets showed 50% of class misunderstood osmosis. Tomorrow I'll use the raisin-in-water demo (hands-on) to show the concept before reviewing."

Option B: Intervene with Struggling Students

When: Some students mastered the concept and others didn't (performance is split)

How:

  • Small group instruction (pull aside 3-4 struggling students, work with them specifically)
  • One-on-one tutoring
  • Peer tutoring (pair with student who mastered)
  • Additional practice with scaffolds

What: Targeted review of the concept using an approach that might work better for these learners

Timing: During class (while other students work on something else) or outside class

Example: "Quiz data shows 5 students are below 70%. I'll pull these students aside during independent work time and use manipulatives to teach fractions."

Option C: Extend for Advanced Students

When: Many students have mastered, some want more challenge

How:

  • Give extension problem (more complex version)
  • Independent research project
  • Allow them to help teach others (peer tutoring)
  • Let them design their own assessment

What: Deepen understanding in original topic, or apply to new context

Timing: While you're reteaching others or all students are working independently

Example: "Most students mastered equivalent fractions. I'll give the advanced group fraction problems requiring multiple steps and comparison."

Option D: Move Forward

When: Significant majority (75%+) has demonstrated mastery

How: Teach next concept, building on foundation

What: New content

Timing: After reteaching if needed, before too much time passes

Example: "80% of class got 18+ of 20 questions correct on fractions unit test. Time to move to decimals."

Option E: Adjust Assessment/Instruction Design

When: Data suggests problem is with how you taught or assessed, not with student learning

How:

  • Reflect: "Did I teach this well? Did students have enough practice?"
  • Document what didn't work
  • Plan different approach for next year
  • Possibly re-assess with different format

What: Change for future, document for next iteration

Timing: Plan for next year; possibly redo the assessment if the current one feels invalid

Example: "Quiz data shows students did poorly on this question, but when I explained it differently in class, they understood. The question was poorly worded. I'll rewrite it for next year."

Practical Data Review Routine

You don't have to spend hours analyzing data. A simple routine works:

Weekly Routine (15 minutes)

Monday morning:

  1. Open assessment data from last week
  2. Scan: What's the headline? (One sentence)
  3. Identify: One concept the class struggled with
  4. Note: One student who needs a check-in
  5. Plan: What's one small thing I'll do differently this week based on this data?

Example:

"Quiz data: Class did great overall (87% average). But 50% missed question on fractions with unlike denominators. Plan: Add 5 min drill on this Friday as refresher."

Monthly Routine (30 minutes)

End of month:

  1. Review all assessments from the month
  2. Trends: Which topics keep causing trouble?
  3. Students: Who consistently scored below class average?
  4. Trends: Is class performance improving over the month?
  5. Decision: What's the #1 instructional change based on this data?

Example:

"Data review: Fractions have been problematic all month. Three students consistently below 70%. Trend: improvement week-to-week but still not at mastery. Decision: Form small group for focused fractions practice."

End-of-Unit Routine (45 minutes)

After finishing a unit:

  1. Comprehensive review of all assessments for the unit (both formative and summative)
  2. Performance distribution: What % of students mastered? (75%+?) What % still need support? (below 70%?)
  3. Item analysis: Which specific concepts caused most difficulty?
  4. Disaggregate: Do all demographic groups perform similarly or are there gaps?
  5. Evaluate: Did this unit go well? What would I change for next time?
  6. Plan: What support do struggling students need moving forward?

Example:

"Unit on cell biology: 78% of class at mastery (A/B), 18% developing (C), 4% below expectations (D/F). Struggled most on photosynthesis vs. cellular respiration comparison. Three students need additional support. Gap noted: Girls 5% higher than boys on this unit average (investigate why). Plan: Two-week intervention group for struggling students before moving to genetics unit."

Using Data for Differentiation

What is Differentiation?

Differentiation means teaching to meet students where they are—not treating everyone identically, but tailoring instruction to individual needs while maintaining the same content standards.

How Assessment Data Enables Differentiation

Assessment data tells you exactly where each student is. Use this to inform how you group students and what they work on.

Example workflow:

Week 1: Administer quiz on fractions

  • Group A (5 students): 90%+ mastery
  • Group B (10 students): 70-89% (getting it but not fluent)
  • Group C (5 students): Below 70% (still struggling with foundation)

Week 2-3: Three different instruction tracks

  • Group A: Extension work. "Design a real-world problem involving fractions and solve it"
  • Group B: Guided practice. Scaffolded problem sets with some teacher support
  • Group C: Reteaching with manipulatives. Go back to foundation, use hands-on tools

After Week 3: Reassess

  • Some students from B move to A (now ready for extension)
  • Some from C move to B (foundation solid, ready for practice)
  • Groups are fluid (not permanent)

Key: Data-driven grouping. Not "these are our low students" (permanent, harmful). Instead: "Right now, these students need this support" (flexible, focused, temporary).
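The grouping step itself is a few lines of code once you have exported scores. A sketch using the thresholds from the workflow above (the cut points are a judgment call, not a standard), with hypothetical file and column names:

```python
import pandas as pd

df = pd.read_csv("fractions_quiz.csv")  # hypothetical columns: student, score_pct

def assign_group(score: float) -> str:
    """Temporary, data-driven group for this instructional cycle only."""
    if score >= 90:
        return "A: extension"
    if score >= 70:
        return "B: guided practice"
    return "C: reteach with manipulatives"

df["group"] = df["score_pct"].apply(assign_group)
print(df.groupby("group")["student"].apply(list))
```

Rerun it after each reassessment so group membership stays fluid.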

Creating Flexible Groups

Step 1: Review assessment data

  • Identify three performance levels (above grade level, at grade level, below grade level)
  • Assign students to groups based on current performance

Step 2: Design different instructional activities

  • Above: Extension, enrichment, new applications
  • At: Guided practice, feedback
  • Below: Reteaching, scaffolding, manipulatives

Step 3: Implement different activities (could be simultaneous)

  • Possible: Teacher works with the below group while the at group does guided practice and the above group works on an extension project
  • Rotate weekly so all students get teacher time

Step 4: Reassess frequently

  • Every 1-2 weeks, update based on new data
  • Move students between groups as they progress
  • Celebrate movement ("You're ready for extension now!")

Important Cautions

Don't:

  • Create permanent ability groups ("the smart kids," "the low kids")
  • Assume a student in lower group stays there
  • Use data to label or limit students
  • Track students into low-expectation paths

Do:

  • Use groups as flexible, temporary support
  • Reassess frequently and move students
  • Provide support to catch students up, not leave them behind
  • Maintain high expectations for all students
  • Track skills, not students (a student needs this support right now; that doesn't mean they always will)

Data Tools & Dashboards

What Good Assessment Analytics Show

Dashboard 1: Class Overview

  • Mean score (average)
  • Median score (middle score)
  • Range (high to low)
  • Distribution (how many A's? B's? C's? etc.)
  • Trend (improving over time?)
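If your platform only exposes raw scores, these overview numbers are quick to compute yourself. A sketch with illustrative scores and letter-band cutoffs:

```python
import statistics

scores = [95, 88, 82, 76, 71, 67, 58]  # illustrative percentage scores

print(f"Mean:   {statistics.mean(scores):.1f}")
print(f"Median: {statistics.median(scores)}")
print(f"Range:  {min(scores)}-{max(scores)}")

# Distribution by letter band (cutoffs are illustrative, not universal).
bands = {"A (90+)": 0, "B (80-89)": 0, "C (70-79)": 0, "D/F (<70)": 0}
for s in scores:
    if s >= 90:
        bands["A (90+)"] += 1
    elif s >= 80:
        bands["B (80-89)"] += 1
    elif s >= 70:
        bands["C (70-79)"] += 1
    else:
        bands["D/F (<70)"] += 1
print(bands)
```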

Dashboard 2: Question-Level Analysis

  • Which questions were hardest?
  • Which questions best predicted overall performance? (item discrimination)
  • Time spent per question
  • Common wrong answers (reveals misconceptions)

Dashboard 3: Student Tracking

  • Individual student scores over time (progress)
  • Mastery levels in each skill
  • Recommendations (what to work on next)

Dashboard 4: Equity Analysis (if available)

  • Performance by demographic group
  • Achievement gaps visible
  • Where support is needed

What to Look For in a Platform

Essential:

  • ✅ Easy-to-read dashboard (visualizations, not just raw numbers)
  • ✅ Question-level breakdown (which specific content is hard?)
  • ✅ Student-level tracking (individual progress)
  • ✅ Downloadable reports (can save over time)

Nice to have:

  • ✅ Alerts (platform flags unusual patterns)
  • ✅ Multiple filter options (can analyze different ways)
  • ✅ Trend visualizations (easy to see improvement over time)
  • ✅ Peer comparison (how does this class compare to prior years?)

Red flags:

  • ❌ Overwhelming analytics that don't tell a story
  • ❌ Data but no guidance on what it means
  • ❌ Hard to export (locks data in system)
  • ❌ Intimidating for non-data-savvy teachers

Using Data Without Advanced Dashboard

If your platform doesn't have sophisticated analytics, you can manually review:

Option 1: Export to spreadsheet

  • Download question-by-student data
  • Sort to see patterns
  • Manually count (% correct on each question)
  • Identify students scoring lowest
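One export analysis worth automating for multiple choice: the answer distribution per question, which surfaces shared misconceptions. A sketch assuming a hypothetical long-format export with 'student', 'question', and 'answer' columns:

```python
import pandas as pd

# Hypothetical long-format export: one row per student-question response.
df = pd.read_csv("export.csv")  # columns: student, question, answer

# Answer counts per question. A popular wrong answer usually marks a shared
# misconception worth addressing in class.
distribution = df.groupby(["question", "answer"]).size().unstack(fill_value=0)
print(distribution)
```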

Option 2: Manual review

  • Read through submissions
  • Note patterns ("Lots of students made this same error")
  • Check which questions had most wrong answers
  • Identify students needing support

Option 3: Qualitative review

  • Pay attention during assessment
  • Which students looked confused?
  • Which asked for clarification on specific questions?
  • Combine observations with scores

Real Examples: Data-Driven Decision Making

Example 1: Quiz Shows Pattern, Leads to Targeted Intervention

Assessment: 9th Grade Algebra Quiz on Solving Linear Equations

Data:

  • Question 1 (x + 5 = 12): 90% correct
  • Question 2 (2x = 10): 85% correct
  • Question 3 (3x + 2 = 11): 60% correct
  • Question 4 (2x + 5 = 3x + 1, variable on both sides): 40% correct

Analysis:

"Students can solve simple one-step equations. Problem emerges when steps increase (3 steps instead of 1-2) AND especially when variable appears on both sides. Pattern: Students lack 'collecting like terms' strategy and can't handle multi-step equations."

Decision:

  • Reteach "collecting like terms" with visual strategy
  • Use error analysis: show common mistakes and how to fix them
  • Practice with scaffolds (provide template, then remove)

Action:

  • Wednesday: 30-min reteach on collecting like terms
  • Wednesday-Thursday: Practice worksheets with scaffolds
  • Friday: Re-quiz on questions 3-4

Monitor:

  • Friday re-quiz results: 75% now correct on question 4 (up from 40%)
  • Next step: Continue practice with students still below 70%

Lesson learned for next year: Students need more practice with multi-step equations before variables appear on both sides. Will add one week of practice next year.

Example 2: Data Shows Equity Gap

Assessment: 7th Grade Reading Comprehension

Data (disaggregated by English proficiency):

  • English as primary language: 80% average
  • English language learners: 55% average
  • Gap: 25 percentage points

Analysis:

"ELL students are performing significantly lower. Could be: vocabulary too advanced? Background knowledge assumed? Reading speed barrier? Need different support?"

Decision:

  • Pre-teach vocabulary before assessments
  • Provide glossary during reading
  • Partner ELL students with strong readers for practice
  • Form ELL small group for 30-min focused instruction 3x/week

Action:

  • Monday-Tuesday: Pre-teach 10 key vocabulary words
  • Wednesday: Re-do assessment with glossary available
  • Thursday-next week: Small group instruction

Monitor:

  • Track gap weekly: Is it narrowing?
  • Does it narrow with scaffolds (glossary)?
  • What other supports help?

Lesson learned: Equity doesn't happen by accident; it requires intentional design and scaffolding.
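If your platform doesn't disaggregate automatically, a gap like the one in this example is a single groupby over the export. A sketch with hypothetical column names (treat any demographic field according to your school's FERPA policies):

```python
import pandas as pd

df = pd.read_csv("reading_results.csv")  # hypothetical: score_pct, ell_status

group_means = df.groupby("ell_status")["score_pct"].mean().round(1)
print(group_means)
print(f"Gap: {group_means.max() - group_means.min():.1f} percentage points")
```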

Example 3: Data Shows Assessment Issue

Assessment: Writing Prompt on Personal Narrative

Data:

  • 18/25 students scored "Proficient" or "Advanced"
  • 7/25 scored "Developing"
  • Surprise: Three usually high-achieving students in "Developing" group

Analysis:

"This doesn't match what I expected. These three students are strong writers. Either something was confusing about the prompt OR they struggled with the specific assignment. Let me look at their work..."

Review:

"Ah! Their writing WAS strong but didn't match the prompt. Prompt said 'Tell a meaningful personal story' but I didn't define 'meaningful.' Students interpreted this differently (told stories, but not the kind I was expecting). This is a prompt problem, not a student problem."

Decision:

  • Reword prompt to be clearer
  • Provide rubric and examples upfront
  • Give students 2-3 specific prompt options (narrower focus)
  • Re-assess with new prompt

Action:

  • Rewrite prompt: "Tell a story about a time you overcame a personal challenge. Minimum 2 pages."
  • Share rubric showing exemplars
  • Re-assess in 1 week

Result: 22/25 now Proficient or Advanced

Lesson learned: Ambiguous prompts undermine assessment validity. Be specific. Share the rubric upfront.

Example 4: Data Shows Need to Accelerate

Assessment: Series of Multiplication Facts Quizzes (3rd grade)

Data:

  • 15/25 students have 90%+ mastery on three consecutive quizzes
  • These students are asking for harder problems
  • Giving them more fact practice = boredom, behavior issues

Analysis:

"These students are ready. They don't need more practice on facts—they need application and challenge."

Decision:

  • Create extension group: Multi-step word problems using multiplication
  • Give them project: "Create your own word problems"
  • Let them tutor peers
  • Provide enrichment math activities

Action:

  • Form acceleration group
  • Different activities while others practice facts
  • Both groups learn multiplication, different level of challenge

Result:

  • Accelerated group: Engaged, challenged, developing deeper skills
  • Other group: Space to practice with teacher attention where needed
  • No boredom, no behavior issues

Lesson learned: Using data to accelerate is as important as using it to intervene.

Privacy & Ethical Data Use

FERPA Compliance

FERPA (Family Educational Rights and Privacy Act) protects student data.

What this means:

  • Student assessment data is protected information
  • Can share with: parents, authorized school staff, student (if age-appropriate)
  • Cannot share: publicly, with vendors without permission, on insecure systems
  • Parents have right to access student data

Your responsibility:

  • Use data to help students
  • Keep data secure
  • Share appropriately (not in public, not casually)
  • Get parent permission before sharing with third parties

Ethical Data Use

Do:

  • Use data to help individual students
  • Share results with families (private conversations or reports)
  • Aggregate data to inform instruction
  • Be transparent about how data is used
  • Involve students in data review ("Here's what your scores show. Let's make a plan together.")

Don't:

  • Share student scores publicly (embarrassing, violates privacy)
  • Use data to label students permanently ("She's a bad test-taker")
  • Data-shame students (don't use data to demoralize; use it to support)
  • Assume correlation is causation (students from poor neighborhoods score lower ≠ poverty causes low achievement; often due to fewer resources/support, not ability)
  • Share data with third parties without permission

Communicating Results to Families

Good practice:

  • Share individual student results with families
  • Explain what scores mean (in plain language, not jargon)
  • Highlight strengths AND areas for growth
  • Offer specific ways families can support
  • Invite questions

Example:

"Dear Parents,

[Student] recently completed our unit assessment on fractions. Overall, [student] showed strong understanding of identifying equivalent fractions (90% on this section). [Student] is still working on adding fractions with unlike denominators (65% on this section).

To support [student] at home: Practice adding fractions with items at home (pieces of pizza, chocolate bars, etc.). The hands-on practice really helps!

[Student] is on track and will continue practicing. Let me know if you have questions."

Key Takeaways

  1. Data exists to tell you what to teach differently. After each assessment, ask: "What does this data tell me about what students understand? What will I do differently tomorrow based on this?"
  2. Question-level data > Student-level data > Class-level data. Start with "which specific concepts are hard?" Then identify students struggling. Then look at class trends.
  3. Low-stakes frequent assessment generates the most useful data. High-stakes summative assessments show achievement but don't guide instruction as well as formative data.
  4. Most useful question: "Which students need what support right now?" Not "who's good/bad at math?" but "today, who needs help with fractions?"
  5. Act on data quickly. Data is only useful if it changes what you do tomorrow, not next year.
  6. Differentiation is the whole point. Data enables you to give different students different support based on where they actually are.
  7. Transparency builds trust. Share data with students and families. Invite them into interpretation.

How nopapertest.com Supports Data-Driven Instruction

  • Question-level analytics: See exactly which concepts students struggle with
  • Student-level tracking: Monitor individual progress, identify who needs intervention
  • Class-level trends: Know if class is ready to move on or needs reteaching
  • Actionable insights: Platform highlights patterns for you (not just raw numbers)
  • Rubric grading data: Analytics on rubric criteria show which skills are weak
  • Exportable reports: Download data to track over time
  • Privacy-secure: All data handling FERPA-compliant, secure storage

Ready to use assessment data to actually improve instruction?

Try nopapertest.com free to see how clear, actionable analytics can guide your daily decisions about differentiation, reteaching, and acceleration.

Get Started Free