RESOURCES FOR
MONITORING & SUPPORTING
Why Is Monitoring the Deal-Breaker?
Skip to Bottom for Practical Tools and Templates!
As an improvement leader, it is important to focus on a small number of high-leverage strategies that change adult behavior and, in turn, improve student achievement. Developing an effective monitoring and support plan will be the difference between successful implementation and running in circles with no measurable gains. This is where the polarity between management and leadership must be negotiated. Are you an instructional leader or a school manager? Let's take a look and see:
Reflection: Are you a leader or a manager?
What is Fidelity?
Fidelity is the degree to which a program is implemented as intended by the program developer, including the quality of implementation. Consistency + Accuracy + Integrity = Fidelity, and it must include both quality of delivery and adherence to the plan.

Why is fidelity important? Fidelity ensures instruction has been implemented as intended, helps link student outcomes to instruction, determines intervention effectiveness, and informs instructional decision-making (National Center on Response to Intervention, Using Fidelity to Enhance Program Implementation).

And the answer to the reflection above? It is both. It takes vision to lead people, but management to bring ideas to fruition.
Measuring the Impact of Educational Programs
Measuring the success of your strategy takes planning and a variety of tools. Relying on state summative assessments alone will not give you the data you need to make the changes necessary to ensure shifts in instructional practice that result in student improvement.
Here's an example many will recognize:
Monitoring Implementation, Providing Support
Educational Leadership: December 2011/January 2012 | Volume 69 | Number 4
After the initiative was launched, leadership team members systematically monitored implementation throughout the school by doing biweekly walk-through observations in each classroom. The leadership team members split up and visited classrooms in pairs or groups of three so they could observe instruction, share the data, and debrief what they had seen. They then compiled the data from all classrooms to understand what was happening schoolwide. They were pleased to find increases in the instances of student talk; technically they were moving toward their goal.
They were not satisfied, however, with the quality of the student talk. They observed that students did not usually formulate extended responses or interact with one another. The teacher was still typically the hub of communication in whole-class and small-group discussions.
Although the team had equipped teachers with stems and prompts to encourage more student talk, those tactics had yielded only basic responses to teacher questions. The mechanical use of the prompts did not result in discourse among students that would improve their oral-language development and higher-level learning. In addition, the team observed that implementation was uneven among different teachers across the school.
These observations enabled the leadership team to make timely adjustments to the initiative. After looking back to the research literature to better understand the strategy, the team changed its short-term goal to include both quantity and quality of student talk and created a rubric to define the quality of student talk. During a staff professional development session, team members introduced the rubric and clarified for teachers the purpose and updated implementation expectations of the initiative.
To address the uneven implementation, the school provided differentiated, targeted professional development to individuals and groups of teachers who needed it. Groups of teachers discussed their progress at weekly collaboration meetings. They shared their individual successes and challenges, and the instructional coach provided feedback from observations and modeled strategies that encourage student talk. The coach and administrators met individually with teachers who had specific concerns about implementation or needed more specific guidance, such as demonstrations in their own classrooms.
Continuous Improvement Abbreviated Plan Samples
Tools to Assist with Effective Monitoring
Instructional Learning Cycle (PDF) (Word)
This easy-to-use progress-monitoring dialogue provides opportunities for teachers and their colleagues to reflect on the quality of classroom instruction, particularly surrounding strategies in the school improvement plan.

Monitoring Map (PDF) (Word)
Use this matrix to organize administered assessments (formative, summative, interim) and determine overlap, effectiveness, and connection to student achievement.