The Politics of Education: 2010 MEAP Scores Reveal Significant Achievement Gaps Between Elementary Schools in A2

At the end of March, the Michigan Department of Education released MEAP scores for the state’s elementary and middle schools. The data is a treasure trove for those who enjoy sifting through such information. Test scores impact districts’ curricula and often determine what a particular school will “focus” on academically. For example, a school with high numbers of students who test as “not proficient” in writing will develop strategies to deal with the problem.

The latest MEAP scores reveal that significant achievement gaps are a problem in the Ann Arbor Public Schools, where the District collects 28 cents of every property tax dollar paid and, together with state money, spends about $12,000 per student per year. Schools in Washtenaw County spend about half a billion dollars each year educating roughly 50,000 students. As it turns out, in Ann Arbor and Washtenaw County, as in most districts around the country, relatively little of that money is spent on figuring out which teachers are effective and why. This, in part, is why it’s possible that two elementary schools in Ann Arbor—schools that are less than 1 mile apart—have MEAP results that are miles apart.

There is currently a debate raging nation-wide about whether teacher compensation, and even continued employment, should be predicated (in whole or in part) on student test scores. In Los Angeles, The Los Angeles Times stunned District parents and infuriated the leadership of one of the largest and most politically powerful education unions in the country, the United Teachers of Los Angeles, by compiling and analyzing several years of test scores, matching the scores with individual teachers in the District, and putting a searchable database of the information online. Parents (or anyone else) could type in the name of a teacher and see whether students taught by that individual showed progress as measured by standardized tests administered by the State of California.

The original August 2010 article on The L.A. Times web site has been shared close to 7,000 times on Facebook since it was published.

The article begins: “Yet year after year, one fifth-grade class learns far more than the other down the hall. The difference has almost nothing to do with the size of the class, the students or their parents. It’s their teachers. Seeking to shed light on the problem, The Times obtained seven years of math and English test scores from the Los Angeles Unified School District and used the information to estimate the effectiveness of L.A. teachers — something the district could do but has not. The Times used a statistical approach known as value-added analysis, which rates teachers based on their students’ progress on standardized tests from year to year.”

The article enraged the local teachers’ union leadership, which launched a boycott of The L.A. Times, and demanded that the paper take down the searchable database. It’s easy to see why the union leaders became upset. The piece doesn’t mince words:

In Los Angeles and across the country, education officials have long known of the often huge disparities among teachers. They’ve seen the indelible effects, for good and ill, on children. But rather than analyze and address these disparities, they have opted mostly to ignore them. Most districts act as though one teacher is about as good as another. As a result, the most effective teachers often go unrecognized, the keys to their success rarely studied. Ineffective teachers often face no consequences and get no extra help.

Value-added analysis rates teachers based on their students’ progress on standardized tests over a number of years. According to The L.A. Times’s article, “A student’s performance is compared with his or her own in past years, which largely controls for outside influences often blamed for academic failure: poverty, prior learning and other factors.” It is a tool favored by those in the school reform movement in the United States, as well as by the Obama administration. It is not embraced by the national leadership of the American Federation of Teachers, though AFT President Randi Weingarten has been more open to the use of value-added analysis than any of her predecessors. Interestingly, in March 2011 The L.A. Times reported that the teachers’ union was preparing to move forward with its own “confidential” value-added rating system, and the school board was planning to use value-added analyses in evaluating the district’s 6,000 teachers. Meanwhile, another of the AFT’s affiliates, the powerful United Federation of Teachers in New York—where the past several AFT national presidents have come up through the ranks—is fighting the public release of value-added ratings for 12,000 K-12 teachers there.
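
To make the mechanics concrete, here is a minimal sketch in Python of the gain-score idea behind value-added analysis. The student records, scores and teacher names below are invented for illustration; real value-added models, including the one The L.A. Times commissioned, use several years of data and far more sophisticated statistics than a simple average gain.

```python
from collections import defaultdict

# Invented records for illustration:
# (student_id, teacher, prior_year_score, current_year_score)
records = [
    ("s1", "Teacher A", 410, 438),
    ("s2", "Teacher A", 395, 421),
    ("s3", "Teacher A", 430, 452),
    ("s4", "Teacher B", 402, 405),
    ("s5", "Teacher B", 415, 419),
    ("s6", "Teacher B", 388, 390),
]

gains = defaultdict(list)
for student_id, teacher, prior, current in records:
    # Each student is compared with his or her own past score,
    # not with the scores of other students.
    gains[teacher].append(current - prior)

for teacher, g in sorted(gains.items()):
    print(f"{teacher}: average one-year gain = {sum(g) / len(g):.1f} points")
```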

AAPS officials said the District does not use value-added analysis to evaluate its 1,100 teachers, though the District does track the MEAP results of each teacher’s students. A question to Brit Satchwell, President of the AAEA teachers’ union, about whether there were plans to incorporate value-added analysis in Ann Arbor went unanswered.

MEAP scores released by the state give one-year snapshots of how schools and school districts are doing with respect to teaching the basics: math, reading, writing, science and social studies. As in Los Angeles, and in school districts across the country, there are significant disparities within the Ann Arbor school district with respect to student performance on the state’s standardized test. There are testing achievement gaps between students of different races, between grade levels and, most disturbingly, between schools.

The gap between Ann Arbor elementary schools in the share of students testing at the advanced level is persistent, reflected in MEAP data as far back as 2004.

Ann Arbor uses a unified curriculum in an effort to produce consistent results in the classroom. The MEAP scores released recently demonstrate quite clearly that the strategy isn’t working. MEAP scores from the past several years show that significant achievement disparities remain, and in some instances are growing. Significantly more students at some elementary schools in Ann Arbor are performing at the advanced level across the board in all five subjects, while students at other Ann Arbor elementary schools, taking the same MEAP tests, are not exceeding expectations.

Here are portraits of three grade levels at three Ann Arbor elementary schools. Two of the schools, Northside Elementary and Logan Elementary, are located not more than 1 mile apart, and both feed into Clague Middle School. In 2010, there were significant differences in the achievement of the students in the third, fourth and fifth grades at Logan and Northside. All percentages below reflect the percentage of students tested who scored at the Advanced level in the subjects tested. The AAPS scores are the District-wide percentages of students scoring at the Advanced level in those subjects.

There are significant achievement gaps between the students tested at the three schools. Only the students at Burns Park Elementary exceeded the District-wide Advanced-level MEAP scores in all of the subjects in each of the years tested. Logan students’ MEAP scores exceed the 2010 District-wide scores in grades four and five, but not for reading in grade three. Conversely, students at Northside Elementary didn’t meet or exceed the District-wide Advanced-level scores in any subject at any grade tested. By fifth grade, students at Northside have made up some ground, but their achievement at the Advanced level lags behind that of students at nearby Logan and Burns Park elementary schools, as well as students District-wide.

One important fact to remember is that these MEAP scores represent the achievement of the previous academic year. In other words, the MEAP scores of fourth graders reflect mastery of work done in the third grade, not the fourth. Fourth grade teachers spend several weeks on MEAP prep prior to the October test, but that work is meant as a refresher course, as it were.

Why are there such noticeable differences in the achievement of students at the three schools?

Ann Arbor officials have, in the past, used race as an explanation.

The first step toward answering that question would be to openly acknowledge that there are significant differences in student achievement at the Advanced level between schools in the District. As a rule, when the District presents MEAP data, officials don’t make comparisons between individual schools, but rather compare how the District performed at each grade level. The gap is also masked by the tendency to lump together the students who met expectations with those who exceeded them. There is, of course, a difference between simply meeting expectations and achieving at an advanced level.

This comes from the Academics section of the AAPS web site: “Over 96% of third graders, 94% fourth graders and 93% seventh graders met or exceeded state standards in math.”

This would lead a casual reader to believe that the majority of third and fourth grade students in the District are performing at the same level in math. That simply is not the case, as evidenced by the data graphed for individual schools above. In 2010, almost twice as many fourth graders at Burns Park Elementary demonstrated advanced mastery of math as at Northside Elementary.

National education reformers seek to link consistent under-performance such as that reflected in Northside’s MEAP data to individual teachers, who can then either be mentored or replaced. National reformers also seek to replicate student achievement, such as that reflected in the Advanced level MEAP scores at Burns Park Elementary School, throughout individual Districts.

Ann Arbor Public Schools MEAP data suggest that the elementary school a child attends within the AAPS goes a long way toward determining whether that child simply “meets” state standards (is proficient) or masters the material at an advanced level.

Lumping the “met or exceeded” groups together effectively disguises the reality that a parent who sends her or his child to one school in the District rather than another will see significantly different achievement results. Students demonstrate higher achievement if enrolled at Logan or Burns Park, according to MEAP data going back to 2004. Since 2004, students at Northside have scored lower in math, reading, writing and science on the MEAP than students at Logan or Burns Park.

Logan and Northside are not a world apart. They are one mile apart as the crow flies. Their demographics are not dissimilar, nor are their enrollments or class sizes.

According to the AAPS Web site:

AAPS Board of Education approved the School of Choice program for school year 2011/12. The Ann Arbor Public Schools will open a School of Choice window for Washtenaw County residents between April 15 – May 15. Applications will be available on this website beginning April 11. Enrollment applications will not be accepted until April 15th. Enrollment will be through a lottery process, NOT a first-come-first-serve process. The School of Choice grades approved for 2011/2012 school year are: Kindergarten, First (1st) and Sixth (6th) grades with limited seats available for second (2nd), third (3rd), fourth (4th) and fifth (5th) grades. Elementary schools will include Abbot, Bryant, Carpenter, Dicken, Eberwhite, Lakewood, Logan, Northside, Pittsfield & Pattengill. Middle schools will include Clague, Forsythe, Scarlett, Slauson, & Tappan.

Many of the elementary schools of choice are those whose achievement scores in reading, math, science and writing have, since 2004, been significantly lower than those of students who attend King, Burns Park and the Open School. In other words, a child attending one of these schools would be much more likely to receive instruction that results in proficiency rather than advanced mastery of the subject.

Is race a factor, as AAPS officials suggest? The extensive investigative analysis of The Los Angeles Times suggests that when one class, group or school of students learns far more than another in a school district, the difference has almost nothing to do with the size of the class, the students or their parents.

On March 11, 2011, The L.A. Times published a follow-up story on value-added analysis that asks just this question. The piece suggested that, “Theoretically, value-added models inherently account for these differences [race and poverty], because each student’s performance is compared each year with the same student’s performance in the past, not with the work of other students. But many experts say further statistical adjustments are necessary to improve accuracy.”
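
As a rough illustration of what “further statistical adjustments” can look like, the Python sketch below, built on invented numbers, regresses current-year scores on each student’s prior score, a teacher indicator and a poverty indicator, rather than simply averaging raw gains. This is only a toy example under assumed data; production value-added models use many more controls and more careful estimation.

```python
import numpy as np

# Invented data: columns are prior-year score, taught-by-Teacher-B indicator,
# and free/reduced-lunch indicator; y holds current-year scores.
X = np.array([
    [410, 0, 0],
    [395, 0, 1],
    [430, 0, 0],
    [402, 1, 1],
    [415, 1, 0],
    [388, 1, 1],
], dtype=float)
y = np.array([438, 421, 452, 405, 419, 390], dtype=float)

# Add an intercept column and fit ordinary least squares.
design = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

intercept, b_prior, b_teacher_b, b_lunch = coef
print(f"Estimated Teacher B effect (relative to Teacher A): {b_teacher_b:.1f} points")
print(f"Estimated prior-score coefficient: {b_prior:.2f}")
```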

9 Comments
  1. Dan says

    First, the reason the LA Times was criticized was that it violated students’ privacy in order to track individual students. Second, that is why value-added analysis should be done by schools, not newspapers. Third, if AAPS chose to do it, it should be regarded as a performance review tool, and as such employees should be granted a measure of privacy. I can imagine counterarguments, but I doubt they are persuasive, at least to me. Fourth, this article asserts race is probably not the factor, but provides no evidence. Innuendo only. Fifth, the fact that this data is easily obtained by school (as well as by race and income) indicates neither AAPS nor the MI DoE is hiding anything.

  2. A2Dem says

    Is it the teachers or the system? Well, at the moment, the system exists to employ the teachers first and educate the students second. The teachers are the system, and I agree with rose that the system needs an overhaul.

    What this MEAP analysis shows is that if I enroll my kids in Burns Park they will be better educated than if I enrolled them at Logan or Northside. That’s not the way the system should work; at least, I hope we can agree that’s not how the system should work.

    I’m not sure testing the kids is a stand-alone method of teacher evaluation, but for goodness sake, if we give the kids the tests, we should use the results to determine whether the kids are getting the best instruction possible.

  3. rose says

    Oh, I do agree with your last post.

  4. A2 Politico says

    @rose, here’s the thing. Within the Ed. community, leaders have, collectively, a billion years of higher education among them, and they are simply at a loss about how to institute post-tenure review metrics. If there is any industry (and I believe K-12 ed is an industry) where there is a need for accountability, it’s education. Value-added, if not the perfect system, has done one important thing: it has launched a nation-wide discussion of why there is a need to track the progress of students as they move through the classrooms of individual teachers.

  5. rose says

    The last sentence, I believe, encapsulates the problem: “…But many experts say further statistical adjustments are necessary to improve accuracy.”

    There are some teachers who are given the harder kids, and who carry a higher caseload, because they are better and more patient with them. Your arguments were based partly on the outcomes in the schools in the district, and those outcomes do vary.
    You changed your metrics in the article.

    There are some teachers who definitely don’t do as good a job, but basing it on younger versus older, I am not so sure I agree. It’s a hard thing to judge.

    And what about the teacher who has many solidly good performing years behind them, who in their later years is slowing down and isn’t so sharp, but who is coming to work and trying their best, even though the job performance is not what it was 25 years prior?

  6. A2 Politico says

    @rose the point of value-added analysis is that it controls for all of the factors you bring up, “because each student’s performance is compared each year with the same student’s performance in the past, not with the work of other students.” This is the key! Tracking individual students through the classrooms of individual teachers. When you do that, I’ve read over and over, patterns emerge for individual teachers. One of the other surprises is that weak teachers tend not to be clustered in particular schools, but rather are spread around a District.

    This debate doesn’t preclude the fact that people (teachers) can benefit from mentoring, instruction and supervision. At the moment, however, our District will keep a teacher based on nothing but seniority.

    As the student who wrote the latest Maelstrom piece noted, in her/his experience newer teachers tend to be the most motivated and motivating. In a budget crunch, the new ones are the first to go. This is a very real problem.

  7. rose says

    Children don’t need to go to a Title I school to get a reduced-cost lunch; that is something that’s available in all the schools. A Title I school has more at-risk, lower socioeconomic kids and gets more federal money to cover the cost of increased staff to help educate the children, arguably because the kids’ needs are greater.

    Does the opposite happen: do teachers lower the bar because it’s easier to do and they have a built-in excuse, and try harder when the kids are “better prepared,” often because they come from higher-income families? Do the kids get artificially sucked down the low-expectation vortex of special education or of prejudice, or is it the culture of the school, more relaxed because expectations for the children are just lower and the 7-hour work day is easier, with no other factor involved? I don’t have an answer, but I would think the parents of the children at the Title I schools would expect one.

    Is it the teachers or the system?

    Personally, I believe it’s the system.

    People have the capacity to perform, teachers have the capacity to improve, and get the best out of the kids, but for that to happen, the system has to be supportive and functional.

  8. ChuckL says

    Northside is a Title I school where the children may qualify for a reduced cost lunch.

    Is there any evidence that not-so-good teachers can become excellent teachers when properly mentored? I agree that systems that weed out a certain percentage of workers each year by firing them are really bad for morale; administrators love a system like this because they can fire anyone for any reason, just say it was for poor performance, and only get sued once in a blue moon. But identifying and promoting best practices does not need to put a not-so-great teacher at risk; you want this individual to work on improving.

  9. rose says

    I don’t buy your argument that the demographics are the same in all three schools, or that they are anywhere near equal across the district. Not every school is a Title I school, but if the district distributed the population more evenly, maybe AAPS wouldn’t even have Title I schools. But then again, those schools get more money for the extra services that a Title I school provides, so maybe it’s a forced move by the district.

    I think what is probably more revealing is the cost of housing in the attendance area of each school.

    Burns Park has significantly fewer minority students, and its rate of children qualifying for free and reduced lunch is much lower than at the other schools. Go to GreatSchools.net for demographic information, as well as other data. Look more closely and post those numbers.

    AAPS does a good job overall, but not a great one. There are significant areas where they need improvement, and with certain subsets of kids, but if a child is well prepared and well motivated, that child can get a good education. It’s the others who are at risk in this system.
