Measuring School Performance without a Strategy
An article in the Saturday, June 25 Wall Street Journal, "School Reform, Chicago Style," provided an interesting picture of performance measurement in practice.
The schools were gathering a lot of data. "Two number crunchers at Marshall [High School] digested tens of thousands of data points, from the frequency of fights to cheerleaders' GPAs." After a year's worth of data collection and analysis, some schools in Chicago were seeing "promising trends."
According to Ascendant's 2011 strategy management survey--which we conducted in conjunction with our 2011 strategy execution summit--data quality was one of the top ten barriers to performance management and strategy execution. So, from this point of view, Chicago was succeeding with one of the harder issues.
Our strategy management survey found the number one barrier to performance management and strategy execution was funding. Chicago's worst-performing schools were doing okay in the funding department as well--at least well enough in an era of declining budgets--because they received $20 million in federal money and an extra $7 million in local money.
However, even though Chicago Public Schools were able to overcome two of the biggest obstacles to performance management, the article presented a system that was achieving mixed results, at best.
In fact, the attitude of teachers and administrators toward the data collection and analysis was ambivalent. At one school, staffers complained about data collection and entry requirements, according to the Journal. One Chicago school administrator "embraced the process but worried the school was applying and measuring too many incremental changes without determining what works."
I think that points to the crux of the problem for Chicago Public Schools: they are measuring too many things. They decided they needed to measure their performance, but that performance measurement was not linked to a strategy. They hadn't taken the time to decide which issues weren't strategic and, therefore, which data they didn't need to track on a regular basis.
"It's very hard to get any sense of cause and effect," assistant principal Matt Curtis said. "We need to home in on a strategy or two and make the commitment to run with that."
As Chicago may find out, without the strategy piece, gains from measuring performance will be hard to maintain.