
Too early to determine success or failure

The U.S. Department of Education (USED) recently released some data on year 1 School Improvement Grant (SIG) school performance. The release has reignited the turnaround firestorm. While some of the discussion is useful, many bloggers and ed leaders are jumping to conclusions far too quickly. For a sample of the blogs and comments, see here, here, or here.

Many of us who work in the school turnaround environment could have predicted the year 1 results USED released: some schools improved, some got worse. Unfortunately, that is to be expected in the first year of a turnaround effort. When modifying an existing system (whether through tinkering or dramatic change), one must still fight against the status quo. There is no perfect model for a school turnaround, and course corrections must be made along the way. As a result, the first programs and practices implemented might not be the most effective, and they will be removed or adapted throughout the three-year implementation period.

Honestly, I’m surprised that so many schools were able to achieve double-digit gains in year 1. Year 1 schools often focus on culture and climate. While some academic improvements are likely, the real academic growth won’t come until instruction is the focus, in years 2 and 3. Year 1 assessment data alone cannot tell us whether a turnaround has succeeded or failed. We must also look at culture and climate indicators (e.g., student, teacher, and parent satisfaction surveys; student AND teacher attendance; in- and out-of-school suspensions), in addition to student academic performance. If we evaluate turnaround efforts on year 1 assessment data alone, we are setting those schools up for continued failure. The schools (teachers and leaders) must know that they have the public’s support to implement a multi-year plan to make real and lasting changes. To say that a turnaround has failed so quickly implies that none of the improvements made were effective, and in most cases that is simply not true; these schools just have a long way to go.

The other major piece missing from much of this discussion (i.e., the blogs and the early research) is what changes are actually taking place in these schools. How does the achievement (and behavioral) data compare across turnaround, transformation, and restart models? How does the political will of leaders (school boards, superintendents, principals) affect the turnaround effort? If teachers were replaced, was there a sufficient (and highly effective) pool of teachers to rehire from? Were community partnerships formed to address the many social needs turnaround schools often have? How hands-on or hands-off was the state education agency in helping schools and districts make real and lasting changes? What systems, conditions, and practices were embedded in the larger district to ensure sustainability and continued growth once SIG funds run out?

Until we look at 1) a more robust (and diverse) collection of data, gathered over multiple years, to get a real picture of what’s happening in a school, and 2) the structures, supports, and implementation processes of turnarounds, it is far too early to draw definitive conclusions for or against the revised federal SIG program, or about why it is or is not working. Until then, we must support the incredibly hard work being done in these schools, continue to work toward equitable educational opportunities for all students, and wait for additional research.
