A Plus Colorado News

These posts are the opinions of the writers and do not necessarily reflect the views of A+ Colorado.

4 Reasons Why the State Board Vote to Keep HOPE Set a Troubling Precedent

Wednesday, July 20, 2016



2016 Rambles Issue No. 3: What Will Tom Do (WWTD)?

Tuesday, June 07, 2016



2016 Rambles Issue No. 3: Steps in the Right Direction: Aurora Public Schools Set to Increase High-Quality Options

Tuesday, June 07, 2016



2016 Rambles Issue No. 3: Out of Sight, Out of Mind: How Boulder and CDE are Sweeping Students Under the Rug

Tuesday, June 07, 2016



2016 Rambles Issue No. 2: Setting the Bar for Innovation

Wednesday, April 06, 2016



2016 Rambles Issue No. 2: An Effective Mill Levy

Wednesday, April 06, 2016

An Effective Mill Levy = Need + Plan + Evaluation + Feedback Loop


By: Van Schoales


Denver has recently begun discussions about a new mill levy proposal that would provide an additional $50 million to the city's schools. As in most debates, there is ample evidence on both sides of the school funding question: evidence that the money could make a big difference, and also a great deal of evidence that there is no relationship between increased funding and achievement. It's a paradox, but ultimately we know that how we spend money matters more than how much money we spend.


Check out any district that spends much more than Denver ($12,000 PPR) yet gets lower results: see Newark ($30,000 PPR), Providence ($17,000 PPR), and Laramie, WY ($17,000 PPR)*. We also know that per-pupil spending at any of Denver's highest-performing charters, e.g. DSST, KIPP, and University Prep, is significantly higher than at a typical district-managed school, because those schools fundraise on top of the per-pupil revenue they receive after the district withholds a portion to cover administrative costs. Adequate funding is necessary but insufficient on its own; funding must also be wisely managed. You only need to look at the results of Colorado's school improvement program for a lesson in how to waste educational improvement dollars to the tune of over $50 million.


Denver has targeted increased mill levy funds both to support non-core academic subjects and to increase resources for core academics. I think just about every mill in Denver has passed in the last twenty years, and all told, the district receives an influx of about $140 million annually from the mill levies. Mills benefiting PE and the arts received strong support from Denver voters because families (rightly) believe these programs have suffered over the years. The mills have also focused resources on literacy and math, providing the funds needed to supply time-intensive support for students, particularly those furthest from grade-level proficiency. More support and more teaching time in most cases (though not all) cost more money, not less.


One of the current proposals being discussed for the 2016 mill is an additional investment of $6-7 million in centralized professional development for elementary teachers on literacy. If this sounds familiar, it should. We have been down this road before, as have hundreds of other districts. Centralized professional development for literacy or other instructional practices does not work. Check out TNTP's The Mirage for the problems with typical teacher professional development. I hope this proposal will morph into something with a greater likelihood of success in addressing the huge challenge of raising literacy in Denver's elementary schools.


It is often hard to track, in detail, the impact of a district mill investment, but it is possible. Importantly, it requires setting aside some funds for evaluation and audits. As the former co-chair of the DPS mill oversight committee, I can say we made some progress with our mill levy investment scorecards, but they are still far from what is needed to truly understand whether additional taxpayer dollars have improved learning outcomes for kids.


Many investment areas, in particular the $40-million-plus investment in arts education, require that the district know what programming is happening in schools and have some measure of its quality. Our recent report on arts education in DPS finds that the district still has little to show other than roughly 30 more art teachers. These targeted investments should require third-party evaluation. I'm still dumbfounded that school districts and their boards do not demand more evaluation to really know whether a new investment is working.


The bottom line for the 2016 mill (and even more so for the bond) is that there is still much to do and many resources needed in our most challenged schools. The question is whether the district has the right plan, people to implement the plan, and accountability systems to make sure the district is focused on learning how to have more impact rather than checking boxes about spending going to schools. This will be doubly important should DPS reach its statutory limit on mill levy requests with the city after the next election. Here’s hoping district leadership, the citizens’ committee and ultimately the DPS school board are prepared to tackle this challenge. We cannot afford to throw money at the district.  


*Note: PPR figures are based on fiscal data from the NCES Common Core of Data.



2016 Rambles Issue No. 2: Neighborhood Schools and the Achievement Gap

Tuesday, April 05, 2016



2016 Rambles Issue No. 2: Measuring Educational Equity: Mission Accomplished or Merely Begun?

Tuesday, April 05, 2016

Measuring Educational Equity: Mission accomplished or merely begun?  

By: Van Schoales



Post originally appeared in the Thomas B. Fordham Flypaper blog, republished here with slight modifications. 

Education Cities and Great Schools recently released a useful new educational data tool called the Education Equality Index (EEI), which allows users to compare cities and states across the nation that are “closing the achievement gap.” The tool compiles school-level low-income student achievement data (2011–2014), compares it to state average proficiency rates for all students (by test and grade), and adjusts the school’s score based on the population it serves. The EEI then rolls up these school scores into city- and state-wide scores and quantifies the size of this gap.
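The roll-up described above can be sketched in a few lines of Python. This is a toy simplification of my own, not the actual EEI methodology: the real index also adjusts each school's score for the population it serves, and all numbers below are hypothetical.

```python
# Toy sketch of an EEI-style score: compare each school's low-income
# proficiency rate to the statewide rate for all students, then roll the
# school-level scores up into a city score. The real EEI's demographic
# adjustment step is omitted here.

def school_gap(low_income_pct, state_all_pct):
    # Positive = the school's low-income students beat the state average.
    return low_income_pct - state_all_pct

def city_score(gaps, enrollments):
    # Enrollment-weighted average of school-level gaps.
    total = sum(enrollments)
    return sum(g * n for g, n in zip(gaps, enrollments)) / total

# Two hypothetical schools in a state where 38% of all students are proficient:
gaps = [school_gap(42, 38), school_gap(30, 38)]   # +4 and -8 points
print(city_score(gaps, [500, 300]))               # -0.5 points overall
```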

Education Cities should be applauded for helping to draw attention to our nation's huge achievement gap. We need to pay more attention to disparities by race, income, and gender in our schools. We need to apply even more resources to understanding how schools are narrowing these gaps and devote greater attention to those efforts that are actually getting students to achieve at higher levels. EEI also has some great data visualizations from which the National Center for Education Statistics could take a few tips.

There are, however, a number of significant problems with the EEI tool—not least of which is the inherent limitation of using any single measure to evaluate complex systems like districts or schools. Here are my top five concerns.

1.      Defining “the gap” matters: EEI’s achievement gap is based on a comparison between the proficiency rate of low-income students in a given school and the same rate for all students in the state; it is not a direct comparison between low-income students and their better-off peers in the same school (or district or state). Because of where EEI sets this comparison bar, the tool can support some very misleading conclusions about gap-closing. According to the EEI measure, for example, Denver has one of the largest achievement gaps in the country (this echoes findings from lots of other research). Yet the EEI metric also rates Denver as the number-two city (behind Omaha) in terms of “closing the gap.” Denver looks like it is closing the gap quickly in large part because the state as a whole has made so little progress improving proficiency rates.

Looking within the district, however, Denver’s non-low-income students are improving at a rate very similar to that of their low-income peers (reading and writing), or a greater rate (math). In a school (or city) where low-income students outperform the state average, therefore, they will have “closed the gap” by the EEI measure, even if large gaps remain between low-income and non-low-income students. The gap within Denver is not actually closing, in other words. That’s an important point that could easily be lost on some readers. Denver should be celebrated for driving improvements for all kids, but not for closing gaps. We suspect that this may also be the case in other cities that EEI lauded, such as Washington, D.C.
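The arithmetic behind this critique is easy to see with made-up numbers: if the statewide average stays flat while both income groups in a district improve at the same pace, the EEI-style gap shrinks even though the within-district gap does not budge. All figures below are hypothetical, in percentage points.

```python
# Hypothetical proficiency rates, in percentage points.
state_avg = 40                                # statewide, all students, both years
low_income = {2013: 30, 2014: 36}             # district low-income students improve...
non_low_income = {2013: 55, 2014: 61}         # ...but so do their better-off peers

# EEI-style gap: low-income students vs. the statewide all-student average.
eei_gap = {y: state_avg - low_income[y] for y in (2013, 2014)}
# Within-district gap: low-income vs. non-low-income students.
within_gap = {y: non_low_income[y] - low_income[y] for y in (2013, 2014)}

print(eei_gap)     # {2013: 10, 2014: 4}  -- looks like rapid gap-closing
print(within_gap)  # {2013: 25, 2014: 25} -- the real gap hasn't moved
```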

2.      The EEI ignores absolute performance: It appears to be easier to narrow the EEI gap in states with lower overall performance. For example, take the case of Boston, which ranks sixty-ninth on the EEI. If we look at performance on an easily comparable test (TUDA), we see that 28 percent of students eligible for free or reduced-price lunch (FRL) in Boston reached proficiency on the 2015 eighth-grade reading test. Massachusetts has the highest performance of any state on that test (46 percent of eighth graders scored proficient or above), meaning that Boston has a large gap by this measure. Taken by itself, this could suggest that Boston is doing one of the worst jobs of educating low-income kids. But compare Boston’s performance to Chicago (ranked nineteenth on EEI; 19 percent of FRL students scored proficient) or Tampa (ranked twenty-second on EEI; 15 percent of FRL students scored proficient). Presenting gap data in a vacuum omits an important part of the story: we should be asking where outcomes are best for students. Which leads me to my next point.

3.      Be careful making comparisons on one measure: While I love having a new large data set, it is still a fairly limited snapshot of most schools and cities. And though I appreciate the effort to make cross-city comparisons, some of those comparisons are problematic when viewed through this single lens, particularly comparisons of cities or states across state boundaries. Should New Mexico be celebrated for having a low achievement gap by this method? Is Miami really six times better than Denver at closing the achievement gap, as determined by the portion of its students attending gap-closing schools? Given the constraints I’ve discussed, we do not think so, at least not without a much better understanding of other data about these places.

4.      Messaging and Spin: Education Cities (and much of the press) has lauded schools and cities that are closing the achievement gap by income. For example, Education Cities notes that D.C.’s achievement gap is lower than 90 percent of other cities’. But if we look at TUDA results, we see that 8 percent of the District’s low-income students meet proficiency, compared to 53 percent of more affluent students. There is no question that many of the schools and cities recognized by the EEI should be celebrated for improving achievement for low-income students; but in reality, there are no cities and very few schools that have closed the achievement gap by income. This is the most vexing problem in schools and American society today. Some of the most exalted schools on the Education Cities lists are doing great relative to most schools. But if schools and cities were truly closing the achievement gap, we would see low-income and non-low-income students matriculating to and completing college at similar rates, and scoring the same on AP, ACT, and SAT tests. That’s simply not happening.

5.      Myths and misunderstandings: I get the need to venerate schools and districts that are making progress. In addition to doling out praise, we need to shed more light on exactly what these schools are doing so that others might learn (as well as encourage these schools to do even better). Let’s not hang “Mission Accomplished” banners when we know even the best gap-closing schools are often far from fully achieving their aim. This methodology hides the work we have left to do. It is not just misleading, but also dangerous to suggest to policymakers and educators that all reforms have worked—some have, some have not. We need to be clear that we are making progress in some places—not in all—and by any measure, we still have a long way to go.

Even with all of these concerns, I’d still say that, on balance, the EEI tool is a valuable addition to the array of available information about American schools for an informed user. While I applaud Education Cities for quickly addressing the problem with their state ranking methodology, I hope that they can also constructively respond to these other concerns while building in more local data so that users don’t come to the wrong conclusions about particular places. Not only could this be misleading for funders, who need to be adequately informed to make the tough calls about how to improve school systems; it is even more problematic for families, who need to choose schools that work best for their kids. Brightening our students’ academic prospects won’t be the work of three years, or five, or ten. It will require decades to realize significant improvement, and we need to get away from thinking that it can be accomplished in the next grant cycle. We need to act with the “fierce urgency of now,” as Dr. Martin Luther King said, while being more thoughtful about how we measure success and finish the work we have left before us.




Announcing: The 2015 A+ Game Changers

Wednesday, March 23, 2016


Introducing: The A+ 2015 Game Changers 

Each year we reflect on the incredible work that has been done by educators and education advocates around Colorado. The nominations from friends and colleagues for the 2015 A+ Game Changer Award remind us that the energy and passion for improving public education is alive and well in Colorado. 

This year's Game Changers have all had a significant impact on students in Aurora, Denver, and across Colorado. Each lends a strong perspective to their field and has pushed to rethink the system so that more kids can thrive in their classrooms. We thank each of them for their total commitment to ensuring our public schools live up to their ideals.

March 2016: A+ Newsletter

Monday, March 21, 2016