How do I interpret student domain-based progress monitoring results?

Classworks Domain-Based Progress Monitoring helps teachers measure the effectiveness of an intervention by taking into account the student's:

  • Chronological grade level compared with the grade level of the CBM
  • How their actual scores compare with the trend line based on the Target Rate of Improvement (ROI)

Step-by-Step Guidance

Navigate to the Progress Monitoring screen, locate the student you want to view, and click "View Details" along the right side of the screen.

The student's Student Detail screen will open to the Progress Monitoring section.

Parts of the Graph

Student Rate of Improvement

A summary of the student's growth is shown at the top of the graph.

  • Target Rate of Improvement information is shown in the top right corner. In the sample below, a Moderate rate of improvement is in place. To stay on target to meet this goal, the student needs to increase their score by 0.909091% each week.
  • Actual Rate of Improvement information is shown in the top left corner. In the sample below, the student is On/Above target because they are increasing their scores faster than the target rate, gaining 1.64362% each week (see the arithmetic note after this list).
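
As a rough check on what these weekly rates imply, one reading (an assumption, since the student's baseline and goal scores are not shown in this article) is that the per-week rate multiplied across the 11 weekly intervals of the 12-week session gives the total growth expected:

  • Target: 0.909091% per week × 11 weeks ≈ 10 percentage points of total growth from baseline to goal
  • Actual: 1.64362% per week × 11 weeks ≈ 18 percentage points at the current pace, which is why the student is On/Above target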

Legend

Defines the different dots and lines found on the body of the graph

  • Blue dots are placed to represent each actual score earned
  • A yellow line shows the student's actual trend line
  • A green line shows a trend line from the student's baseline score to the goal set by the teacher

Labels for Each Axis

  • Y Axis: Possible score range from 0% to 100%
  • X Axis: Each week of the 12-week progress monitoring session


Reading the Graph

Look at the sample graph above:

  • Compare the student’s actual performance trend line (Yellow Line) to the rate of improvement target line (Green Line)
    • Over time, the student’s trend line should track closely with the target rate of improvement (ROI) line
    • If it slopes steeply above or below the target line, the student’s intervention might need an adaptation
  • Look at the graph above. Notice how close the two lines are. In this example, the student has struggled but now appears to be responding well to the interventions currently in place.

    The graph supports this interpretation because the yellow line showing the student's actual growth closely aligns with the green line, which shows the growth needed for the student to meet the goal.


Interpreting the Progress Monitoring Detail Narrative

Below the line graph is a detailed table that provides context about the intervention in place to support the student, and additional information about the student's weekly performance.

  • START (top left corner): The description entered by the teacher who started the progress monitoring session.
  • Date: The actual date when the student completed their probe each week.
  • Score: The actual score earned by the student on each probe.
  • Duration: The actual length of time the student spent completing the probe, shown in hours:minutes:seconds. Domain-based probes contain between 11 and 15 questions.
  • Note: The comment entered by the teacher. This provides context for skipped weeks or other important events that could impact a score.
  • PC: Indicates that the teacher made a change to the intervention or instructional plan being tracked by the progress monitoring session to evaluate its effectiveness.

In the sample below, notice:

  1. The student spent about the same amount of time on each of the three probes completed so far.
  2. There is a gap of a few weeks that begins at Week 3. The teacher added a note explaining these as holiday break weeks.
  3. The teacher adjusted the intervention plan after the student completed their Week 6 probe. This is noted with PC after Week 6.


Interpreting the Cumulative Item Responses

Below the narrative section is a grid that lists the specific skills or objectives included in each probe, whether the student answered the questions correctly or not, and their running average for each objective.

In the sample below, notice:

  • Two of the objectives focus on main idea and details and on the sequence of events
  • How the student answered each question is shown with a green or red dot
    • Green = correctly answered
    • Red = incorrectly answered
  • The student's running average is shown at the far right of each row of dots. This student has a running average of:
    • 14% for main idea and details
    • 86% for sequence of events
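
To see how running averages like the 14% and 86% above could arise, here is illustrative arithmetic only; the real question counts depend on how many probes the student has completed:

  • 1 correct answer out of 7 questions asked → 1 ÷ 7 ≈ 14%
  • 6 correct answers out of 7 questions asked → 6 ÷ 7 ≈ 86%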

Reviewing the Actual Question and Response

Click a red or green dot to open a new window. Here, the teacher will see the actual question asked, how the student answered it, and the correct answer (if different from the answer choice made by the student).

Still need help? Contact Us