Digging Deeper into Gathering and Studying Data
To view the entire process in a linear format, use the guidance provided on the School Data Profile/Analysis Page. Remember, you may simply copy last year's CNA and update it for this year.
Prior to truly digging into data, start any meeting by engaging participants in an activating activity. One quick idea is to have each person find someone they have not connected with today and share one Good Thing they feel is working well in their classroom, grade level, or building (Stage 1). Another quick opening question might be "Why are you in education?" In addition, on the Getting Started page, we include a video in which Simon Sinek explains the importance of starting with why; if this has not been established, take a few minutes to view the video and get participants on the same page about the reason(s) behind the task they are about to complete.
Next, narrow the focus to who will be gathering and studying which data (Stage 2). For example, if the focus is on achievement data, you may divide into content-area groups. Once your groups have been established, use the Data Dialogue Protocol and the accompanying Data-Driven Dialogue spreadsheet to complete your data analysis.
Observations recorded fall under Stage 2: Explore and must be non-judgmental and descriptive, free of causality and inference. We must gather multiple data sources and drill down before drawing conclusions. A sample of the data gathered appears with an observation at each stage. The examples below are not comprehensive but are intended to show a pattern that can be duplicated.
M-STEP Proficiency Data (2014-15 school year; limited data)
Of course, MI School Data is one of the easiest sites for viewing proficiency data quickly. Unlike previous years, M-STEP data cannot be displayed graphically as trend data or compared with the state, ISD, district, or another school. The Spring 2015 M-STEP data is limited to the four proficiency categories (average scale score and standard deviation are not publicly available as they were in the past). Each portion of the bar chart provides a pop-up window with the percentage of students in that proficiency category. If you scroll down further, or download the PDF, you'll see a nice comparison of the school, district, ISD, and state under "Entity Breakdown."
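The same category breakdown can be tallied locally if you export your own counts. The sketch below is a minimal illustration using made-up student counts for a school, its district, and the state; it simply converts counts to the percentages shown in the MI School Data bar chart and entity breakdown.

```python
# Minimal sketch: convert hypothetical proficiency-category counts to percentages,
# mirroring the MI School Data "Entity Breakdown" view. All counts are made up.
CATEGORIES = ["Not Proficient", "Partially Proficient", "Proficient", "Advanced"]

counts = {  # hypothetical student counts per entity
    "School":   {"Not Proficient": 18, "Partially Proficient": 25, "Proficient": 32, "Advanced": 10},
    "District": {"Not Proficient": 210, "Partially Proficient": 305, "Proficient": 390, "Advanced": 120},
    "State":    {"Not Proficient": 41000, "Partially Proficient": 52000, "Proficient": 61000, "Advanced": 19000},
}

for entity, by_cat in counts.items():
    total = sum(by_cat.values())
    pcts = {cat: round(100 * by_cat[cat] / total, 1) for cat in CATEGORIES}
    pct_proficient = pcts["Proficient"] + pcts["Advanced"]
    print(f"{entity}: {pcts}  (% Proficient or Advanced: {pct_proficient:.1f})")
```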
NOTE: "Entity Breakdown" shows an aggregate of all students only; the data is not provided for subgroups.
Utilize M-STEP reports for grades 3-8 ELA and Math, plus ALL Science and Social Studies (including high school).
Every district should have at least one person who has access to the BAA Secure Site. Several districts also have a data warehouse that will generate additional reports for them (e.g., Data Director, Our School Data, Illuminate). You may download a Student Data file from the BAA Secure Site if you are looking to compare average scale scores as we have done in the past (NOTE: the 3rd grade proficiency cut score is now 1300, rather than in the 300-330 range as in previous years). Here is a PDF showing the State Averages and some interesting estimates for proficiency cut-offs based on percent correct. NOTE: In order for a 4th grade student to be proficient, they needed 43% correct in Math, 47% in ELA, and 85% in Science.
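If you do pull the Student Data file, a few lines of analysis can reproduce the average scale score comparison. The sketch below is a hypothetical illustration, not the BAA export's actual layout: the column names "Grade" and "ScaleScore", the file name, and the cut-score table are assumptions you would replace with the real field names and the published cut scores (e.g., 1300 for 3rd grade, as noted above).

```python
import csv
from collections import defaultdict

# Hypothetical proficiency cut scores by grade; replace with the published M-STEP cuts.
CUT_SCORES = {"3": 1300}

def summarize(path):
    """Report average scale score and percent at/above the cut for each grade.
    Assumes columns named 'Grade' and 'ScaleScore'; adjust to the actual export layout."""
    by_grade = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            by_grade[row["Grade"]].append(float(row["ScaleScore"]))
    for grade, scores in sorted(by_grade.items()):
        avg = sum(scores) / len(scores)
        line = f"Grade {grade}: n={len(scores)}, average scale score={avg:.1f}"
        if grade in CUT_SCORES:
            pct = 100 * sum(s >= CUT_SCORES[grade] for s in scores) / len(scores)
            line += f", at/above cut={pct:.1f}%"
        print(line)

# summarize("student_data_export.csv")  # hypothetical file name
```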
How might we inform curriculum with M-STEP?
This is the most common question many ISDs are receiving across the state. The simple response is that state-level assessments are not designed to drive instruction. Rather, their purpose is to evaluate a large student body on a sampling of questions tied to grade-level assessment targets.
However, depending on the content area, there are some resources that can better inform our curriculum. As you will see below, math is the cleanest, followed by ELA, with science and social studies more difficult to draw conclusions from. Since every district should have at least one person with access to the BAA Secure Site, educational leaders may download a sample Individual Student Report (ISR) from within that site, as seen below:
The ISR reports above show a sample student who not only scored in the proficient range, but whose confidence interval (the range within which the score would fall 95% of the time if the test were taken on a different day) also falls in the proficient range. For science and social studies, look closely at the claim/discipline areas to see what is emphasized on the state-level assessments and where a typical proficient student struggled. Most data warehouses will offer a report that aggregates the claim/discipline areas, like the one from Our School Data. When that aggregate is compared with the ISR, we can see that "Civics and Government" has the highest percentage correct (75%) but is the smallest portion of the test (4 points). When examining the science and social studies standards, most standards are assessed by only one question each.
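To make the confidence interval concrete, the short sketch below shows the underlying arithmetic. The scale score, standard error, and cut score here are made-up values for illustration; the actual ISR prints its own score band, so this is only a way to reason about what that band means.

```python
# Minimal sketch of the 95% confidence band shown on an ISR, using made-up numbers.
# An approximately 95% band is the observed scale score +/- about 2 standard errors.
scale_score = 1316          # hypothetical observed score
standard_error = 6          # hypothetical standard error of measurement
cut_score = 1300            # hypothetical proficiency cut

low = scale_score - 2 * standard_error
high = scale_score + 2 * standard_error
print(f"95% band: {low} to {high}")

if low >= cut_score:
    print("Whole band is at or above the cut: proficient even on an 'off' day.")
elif high < cut_score:
    print("Whole band is below the cut.")
else:
    print("Band straddles the cut: the proficiency call is less certain.")
```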
The 8th grade math ISR has the overall claim/discipline score indicator removed on this website. We recommend that schools do NOT use the overall claim/discipline indicators for ELA and Math. The reason: based on Claim Score Percent data from one sample ISD, some students received a green check ("At or Near proficiency for claim") for answering only ONE question correct out of 12 (8.3% correct). What seems very revealing is that the Target Analysis on the ISR aligns extremely well with the original testing blueprint from SBAC. These targets are still applicable since MDE is using the SBAC question bank to create the M-STEP and will continue to do so as the test moves to pseudo-adaptive (difficulty levels adjust within grade-level content) in 2016. Notice the math targets are identical to the standard clusters in the Michigan Adopted Standards (a.k.a. CCSS).
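The green-check concern above is easiest to see with the raw numbers: a claim flagged "At or Near proficiency" can still reflect a very low percent correct. The sketch below uses hypothetical claim names and point totals to show the arithmetic behind the 8.3% example; it is an illustration of why the raw percent is more informative than the check mark, not a reproduction of an actual ISR.

```python
# Hypothetical claim-level results for one student: (points earned, points possible).
# Illustrates why the raw percent correct is more informative than the green check:
# a claim can reportedly be flagged "At or Near" with only 1 of 12 points earned.
claims = {
    "Claim 1": (1, 12),   # 8.3% correct, yet still earned a green check in the sample ISD data
    "Claim 2": (6, 10),
    "Claim 3": (5, 8),
}

for claim, (earned, possible) in claims.items():
    pct = 100 * earned / possible
    print(f"{claim}: {earned}/{possible} points = {pct:.1f}% correct")
```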
Therefore, schools will likely want to reference MDE's Crosswalks for ELA and Math (found on the M-STEP page), the cluster and standards document for ELA and Math (CCSS), and possibly the SBAC Blueprint for Math and ELA.