Reports: Not Getting Enough Bang For Our Buck

[Chart from the report: Chicago-area districts color-coded by spending productivity]

According to this wonky new report from a centrist Democratic think tank, CPS isn't getting much educational return on its spending, while Naperville, District 200, and Elmhurst (in yellow) are doing a little better, Oswego (in green) is doing the best, and Valley View (in red) is doing the worst in the area.


  • And I assume they are counting the money going to charters and not to real public schools.

  • Alexander does find some interesting policy papers, and this one, released by the Center for American Progress, clearly is interesting. Fundamentally, though, the report is in my opinion methodologically flawed in how it treats special education costs and the costs of interventions delivered before special education referral. It is flawed in its analysis of and accounting for poverty. It is also flawed in the achievement-outcome measurement system on which productivity is based for students with disabilities. I would argue that the report itself is consistent with the logic of the CPS Performance Management approach, which has been highly criticized on this blog.

    Because these issues are somewhat complex I am going to give a simplified explanation of why I believe the report is flawed.

    Problem #1: The report uses as its measuring stick of achievement the state assessment scores in reading and math required by NCLB, which the report itself admits are variable in their rigor (see page 24). The report is thus comparing school districts in different states with different assessments. This is a huge flaw.

    Problem #2: The school districts compared by the Predicted Efficiency index (which controls for poverty, ELL, and special education enrollment) not only have different percentages of students with disabilities, but also have different rates of disabled students who take the standard state assessment rather than a non-standardized alternative assessment. Hence the outcomes of school districts with tighter or looser special education testing exclusion standards are not taken into consideration at all.
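    The exclusion effect described above can be illustrated with a toy calculation (the numbers here are hypothetical, not drawn from the report): two districts with identical students differ only in what fraction of their special education students are shifted to an alternate assessment and therefore excluded from the standard-test average.

    ```python
    # Toy sketch (hypothetical scores): looser exclusion standards
    # mechanically raise a district's standard-test average.

    def standard_test_mean(general_scores, sped_scores, excluded_fraction):
        """Mean standard-test score after excluding a fraction of
        special-education students (lowest scorers excluded first)."""
        kept_sped = sorted(sped_scores, reverse=True)
        n_keep = round(len(kept_sped) * (1 - excluded_fraction))
        included = list(general_scores) + kept_sped[:n_keep]
        return sum(included) / len(included)

    general = [80] * 90   # 90 general-education students scoring 80
    sped = [50] * 10      # 10 special-education students scoring 50

    tight = standard_test_mean(general, sped, excluded_fraction=0.1)
    loose = standard_test_mean(general, sped, excluded_fraction=0.5)

    print(round(tight, 1), round(loose, 1))  # 77.3 78.4
    ```

    The district that excludes more of its disabled students looks more "productive" on the exact same population, which is the comparison problem the comment is pointing at.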

    Problem #3: Some school districts with lower percentages of special education students have fewer identified special education students precisely because the district is spending more money on remedial education. This is not analyzed or taken into consideration at all in this report. Therefore, districts with high remedial expenditures and lower special education rates are not being appropriately compared to districts that spend less on remediation and have higher special education referral rates.

    Problem #4: While many factors are accounted for in the regression equation used in the report (see page 19), the free and reduced lunch factor is an oversimplified measure of poverty. The reason is that free and reduced lunch eligibility has a threshold yearly income (before taxes), currently $33,874 for reduced lunch and $23,803 for free lunch for a family of three. As is obvious to teachers in many CPS schools, most of their children come from families who actually meet the overall federal poverty income level (for food stamps, or SNAP), which is about 30% lower than the lunch program income guideline for even free lunch, let alone reduced lunch. To put it simply, there are school districts whose poor are effectively the working poor, and there are districts like Chicago where the majority of poor children are the deeply poor. Using the same factor in the regression equation for these very different school districts is absurd, but in reality it is the only large-scale data available to conduct such a study.
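    The gap between the lunch-program guidelines and the deeper poverty line can be checked with simple arithmetic, using the dollar figures quoted in the comment (the 30% gap is the commenter's approximation, not an official figure):

    ```python
    # Income thresholds quoted in the comment (family of three, pre-tax).
    reduced_lunch = 33874   # reduced-price lunch eligibility ceiling
    free_lunch = 23803      # free lunch eligibility ceiling

    # The comment puts the SNAP-related poverty level roughly 30% below
    # even the free-lunch guideline (an approximation).
    approx_snap_level = free_lunch * (1 - 0.30)

    print(round(approx_snap_level))  # roughly 16662
    ```

    So a single free/reduced-lunch indicator lumps a family near $33,874 together with a family below roughly $16,662, which is the "working poor vs. deeply poor" distinction the comment argues the regression cannot see.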

    My conclusion is that the report may provide some inkling of "educational efficiency," but it is so fundamentally flawed as to not be worth much.

    Rod Estvan

  • Here's the EdWeek story on the productivity report.
