Student Growth for Ed Eval:
State Level Data (0-20%)

MDE recommends the use of state level data for grades 4-8 only, in math and ELA. The law also includes grades 3 and 9-11, so many districts are using a building level growth measure to attribute to all teachers in the building.
Districts may choose group attribution (building, grade level, etc.) for all teachers, or may assign students to the teachers of the specific grade levels/content areas taught. Either way, the law is specific that growth and assessment data shall be 40% of the overall evaluation (20% state, 20% local, 60% observation). Some teachers are exempt from state level data (early childhood, primary grades, center based programs, educators who teach fewer than 15 students per year, etc.). For exempt teachers, the overall evaluation is 0% state, 40% local, 60% observation. (A short sketch of this weighting arithmetic appears after the list below.) There are four methods for measuring state level growth and assessment data that educators have likely heard about recently:

1. Student Growth Percentiles (SGP) are a normative measure recommended by MDE for ed evals for grades 4-8, math/ELA only. Additional resources are available on the SI Timeline.

2. Adequate Growth w/SE is becoming more available through platforms such as Eidex. AG w/SE is a more intuitive measure that identifies the percentage of students who grew enough to maintain proficiency or stay on track to become proficient. Since this measure combines growth and proficiency, it is less normative and more criterion based (a toy illustration appears after the list). The podcast below explains the WHAT; it is a portion of the longer podcast covering WHAT and WHY found on the AGwSE page. See also HOW to use Eidex for Adequate Growth.
3. AGPs are used on the Parent Dashboard and School Index by setting a target for student SGP. MDE does NOT RECOMMEND these for ed evals. Resources around AGPs are available on the Accountability Page. NOTE: AGPs are not the same as AG w/SE.
4. Value-Added Models (VAM) are often referred to as the "black box" of student growth because of the complexity of the statistical models used to estimate a student's predicted achievement compared to their actual achievement (a toy sketch of the predicted-vs-actual idea also appears after the list). MDE will be releasing information in September 2018. Here is a video overview from SAS EVAAS.
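The exact AG w/SE calculation lives inside Eidex, but a toy version can show why a measure like this reads as criterion based rather than normative. Everything below is assumed for illustration: the function name, the proficiency cut score, and the catch-up rule are hypothetical, not Eidex's actual formula.

```python
PROFICIENT_CUT = 60.0   # hypothetical proficiency cut score
CATCH_UP_SHARE = 1 / 3  # hypothetical rule: close 1/3 of the gap each year

def grew_adequately(pre, post):
    """Toy rule: proficient students must stay proficient; non-proficient
    students must close an assumed share of their gap to the cut score."""
    if pre >= PROFICIENT_CUT:
        return post >= PROFICIENT_CUT
    required = pre + CATCH_UP_SHARE * (PROFICIENT_CUT - pre)
    return post >= required

# (pre, post) scores for four hypothetical students.
students = [(65, 68), (62, 55), (45, 52), (30, 33)]
pct = 100 * sum(grew_adequately(pre, post) for pre, post in students) / len(students)
print(f"{pct:.0f}% of students showed adequate growth")  # 50%
```

Note the contrast with SGP: SGP ranks each student's growth against academic peers (normative), while a rule like this judges every student against the same fixed proficiency criterion, which is what makes it criterion based.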
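Value-added models differ widely in their statistics, but the predicted-vs-actual idea behind the "black box" can be sketched with a deliberately simple model: fit a line from prior scores to current scores, then read each student's residual (actual minus predicted) as the growth estimate. The data and the single-predictor least-squares fit below are illustrative assumptions; real VAMs such as SAS EVAAS use far more elaborate models.

```python
import numpy as np

# Hypothetical prior-year and current-year scale scores for six students.
prior = np.array([310.0, 325.0, 340.0, 355.0, 370.0, 385.0])
current = np.array([318.0, 330.0, 352.0, 360.0, 384.0, 390.0])

# Fit a least-squares line: predicted_current = slope * prior + intercept.
slope, intercept = np.polyfit(prior, current, deg=1)
predicted = slope * prior + intercept

# "Value added" in this toy model is actual minus predicted achievement.
# In practice the model is fit across many classrooms statewide, so a single
# classroom's average residual can be nonzero and is read as a teacher effect.
residuals = current - predicted
print(residuals.round(1))
```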
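To make the weighting arithmetic concrete, here is a minimal sketch of combining the three components into an overall score. The function name and the scores are hypothetical; the 20/20/60 and 0/40/60 splits are the ones described above.

```python
def overall_evaluation(state, local, observation, exempt_from_state=False):
    """Combine component scores (each on the same 0-100 scale) into an
    overall evaluation score using the weights described above.

    Non-exempt teachers: 20% state growth, 20% local growth, 60% observation.
    Exempt teachers:      0% state growth, 40% local growth, 60% observation.
    """
    if exempt_from_state:
        w_state, w_local, w_obs = 0.0, 0.4, 0.6
    else:
        w_state, w_local, w_obs = 0.2, 0.2, 0.6
    return w_state * state + w_local * local + w_obs * observation

# A 4th grade math teacher with individual state level data:
print(overall_evaluation(state=75, local=80, observation=90))  # 85.0

# An early childhood teacher exempt from state level data:
print(overall_evaluation(state=0, local=80, observation=90))   # 86.0
```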
Local Level Data (20-40%)

PA 173 calls for every educator teaching K-12 students to use student growth and assessment data. The majority of teachers may be exempt from individual attribution of state level data; for them, 40% of the overall evaluation would be based on local data and 60% on the observation.
PA 173 allows local data to come from "student learning objectives (SLOs), achievement of IEP goals, nationally normed (i.e. NWEA, STAR, iReady, etc) or locally developed assessments (interim, exams, portfolio, etc.) that are aligned to state standards." Local data should be used with a standard setting process or the SLO process. Each of the docs pictured above is a direct link to the PDF. Portions of the Practitioner's Guide from CCSSO focus on the two most common forms of growth for an SLO: Simple Gain and Categorical (a toy sketch of both follows below). In addition, Wayne RESA has divided the above Guidance Document for Measuring Student Growth into six helpful sections.
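Here is a toy illustration of those two SLO growth types. The cut scores, growth target, and function names are all hypothetical; see the CCSSO Practitioner's Guide for the actual standard setting procedures. Simple gain asks whether a student grew by at least a target number of points, while categorical growth asks whether the student moved up at least one performance level.

```python
# Hypothetical cut scores and performance categories for a local assessment.
CUTS = [(0, "Beginning"), (40, "Developing"), (60, "Proficient"), (80, "Advanced")]

def category(score):
    """Return the highest performance category whose cut score was met."""
    label = CUTS[0][1]
    for cut, name in CUTS:
        if score >= cut:
            label = name
    return label

def met_simple_gain(pre, post, target_gain=15):
    """Simple gain: grew by at least a target number of points (assumed 15)."""
    return (post - pre) >= target_gain

def met_categorical_growth(pre, post):
    """Categorical: moved up at least one performance category."""
    order = [name for _, name in CUTS]
    return order.index(category(post)) > order.index(category(pre))

# A student who scored 35 on the pre-test and 55 on the post-test:
print(met_simple_gain(35, 55))         # True  (gain of 20 >= 15)
print(met_categorical_growth(35, 55))  # True  (Beginning -> Developing)
```

The two types can give different answers for the same student, which is why the standard setting conversation about targets and cut scores matters before an SLO is written.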
The Colorado Department of Education has five guiding principles for evaluating educators fairly:
- Data should inform decisions, but human judgment will always be an essential component of evaluations.
- The implementation and evaluation of the system must embody continuous improvement.
- The purpose of the system is to provide meaningful and credible feedback that improves performance.
- The development and implementation of the evaluation systems must continue to involve stakeholders in a collaborative process.
- Educator evaluations must take place within a larger system that is aligned and supportive.
What about non-tested grades and content areas?
What about the other 20% of student growth for tested grades?
According to the American Institutes for Research (AIR), 60% of states across the nation answer this question for non-tested content areas and grades by using a PROCESS to create Student Learning Objectives (SLOs). Maryland, as an example, uses state level testing (if applicable), then weights the Student Learning Objective and the Building Index (aka SPI) at varying percentages depending on the content and grade level. Rhode Island likely has the best resources to date (2016) on Student Learning Objectives, including guidance, interactive modules, samples, and rubrics.