Formula tweak helps school rankings
'Underperforming' sites fell sharply this year vs. 2003
Pat Kossan
The Arizona Republic
Dec. 26, 2004
When Arizona ranked 1,203 public schools in
October, the number of "performing" and "highly performing" schools suddenly
swelled, while those ranked "underperforming" plummeted.
In a miraculous turnaround, the number of "underperforming" schools dropped from
136 to 43, including 11 "failing."
The labels released this fall left parents and educators wondering: Did schools
improve that much or were the standards lowered?
The more than 800 schools lumped into the middle "performing" rank have little
in common.
At the top end, there's Anthem Elementary in far north Phoenix, where nearly 9
of 10 third-graders passed the AIMS reading test. At the opposite end, there's
Desert Garden Elementary in Glendale, where just 4 of 10 third-graders passed
the test. Both received a "performing" rank.
Patrick Yennie, Anthem principal, can live with the gap.
"It's not frustrating, it just is," Yennie said. The principal said he did not
receive one parent phone call complaining about the school's label.
"I don't know if there's anything inherently wrong with performing," Yennie
said.
Catherine Stafford, superintendent of the Avondale Elementary School District
in the West Valley, said the complex formula the state uses to rank schools
each October needs to be easier to grasp.
"I want to understand it better so I can
help my teachers," Stafford said. "Every year the formula changes, and we're the
ones that have to stand in front of the (district) board in a public meeting and
explain it."
Arizona State Superintendent of Public Instruction Tom Horne said there is
always a tension between simplicity and fairness.
"As you make adjustments (in the formula) to
make it fair, you make it more complicated," Horne said. "The art of public
policy-making is to find the golden mean between simplicity and fairness."
The search for that golden mean will continue, Horne warned. While the formula
was tweaked this year, it is expected to change dramatically in 2005.
Mark Kemp, father of three students in the Kyrene Elementary district, said he
can't remember how the state ranked his boys' schools. He said parents with time
and know-how help ensure schools are good, while schools in poorer
neighborhoods, where parents are busy just trying to help their families
survive, need extra help. He views the state's system as "political jockeying."
"It's easy for a politician to say we have the AIMS test and now we have
accountability and everything's OK," said Kemp, who called it "papering over"
the real problem of the lack of funds for students who really need the money.
'Solutions teams'
State schools chief Horne credits the improved rankings to hard work by schools
and by the Arizona Department of Education, which sent coaches, known as
"solutions teams," to every struggling school. The teams retrained teachers and
principals and helped write goals that lay out a plan to raise student test
scores.
The state agency knew that last spring's test scores were critical to the
rankings: There were 81 schools that had "underperforming" labels for two years.
A third year with the ranking would place those schools on the state's first
list of "failing" schools. That, in turn, would mean costly intervention.
So it was in the state's best interest to keep those 81 schools off the failing
list. It succeeded. In October, just 12 schools were deemed failing.
"It's a tribute to teachers and principals, to the schools and the solution
teams and assist coaches who made the 81 schools their top priority," Horne
said.
Garthanne de Ocampo, the new principal at Phoenix's Emerson Elementary, credits
the state teams with giving her teachers the spark they needed to make changes.
Her school moved from "underperforming" the past two years to "performing" in
2004.
The principal isn't sure how that happened.
The percentage of third-graders passing AIMS at Emerson dropped in 2004, while
the percentage of fifth-graders increased slightly. The principal said the
state's confusing formula appears to change from year to year.
"It's very hard to understand exactly what needs to be done to continue as a
performing school," de Ocampo said.
State officials said Emerson jumped to "performing" because it improved student
AIMS scores, averaged over 2002, 2003 and 2004, compared with the school's AIMS
scores averaged over 2000 and 2001. It also met the federal standard called
Adequate Yearly Progress this year, which it did not last year.
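To picture that comparison, here is a minimal sketch in Python. The passing
percentages are hypothetical, invented only to show how the averaging works;
they are not Emerson's actual AIMS results.

    # Hypothetical illustration of the multiyear averaging described above;
    # the passing percentages are invented, not Emerson's actual results.
    baseline_years = [48, 46]        # percent passing in 2000 and 2001
    recent_years = [58, 60, 54]      # percent passing in 2002, 2003 and 2004

    baseline_average = sum(baseline_years) / len(baseline_years)   # 47.0
    recent_average = sum(recent_years) / len(recent_years)         # about 57.3

    # The 2004 figure dips below 2003, yet the three-year average still
    # beats the 2000-2001 average, which is one way a school's rank can
    # rise even while a single year's passing rate falls.
    print(recent_average > baseline_average)   # True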
Roger Freeman, testing director for the Paradise Valley Unified School
District, makes it his job to know exactly how the formula works. Freeman warned parents
that the formula isolates the state's few exemplary schools and those most in
need of help.
"If people think this system accurately rates schools, their ability to teach
and the students' performance in terms of shopping for the best schools, it
doesn't," Freeman said.
Formula changes
State Department of Education officials admit there's more to Arizona's
miraculous growth in performing schools than good coaching.
First, the state ranked an additional 105 schools. Second, Arizona schools were
helped by tweaks to the formula.
The formula, which works on a point system, used scores from two statewide tests.
Arizona's Instrument to Measure Standards, a math, reading and writing test, was
taken by third-, fifth-, and eighth-graders and high school students. The
national Stanford 9 test was taken by students from second grade through ninth
grade.
Here are the changes that helped create the 2004 miracle:
• Not all test scores count.
The state throws out scores of any student who hasn't attended the school for at
least a year. It throws out scores of disabled students who take special tests
or get help, such as having the test read to them, and it throws out the scores
of elementary students who have been learning English for less than three full
years.
For example, Paradise Valley's Palomino Primary is a "performing" school where a
large percentage of children are learning English. In 2004, the state used test
scores from only 29 of the school's 169 enrolled third-graders.
In 2003, many schools still relied on students to give personal information,
such as birth dates, gender, and how long they had attended the school. Kids used
nicknames, made mistakes, or purposely put in wrong information that skewed
the results. The state relied on schools to correct the data, but many schools
didn't. This year, the state verified more school data through its electronic
student-tracking system.
• A school received a "preliminary determination" of its rank a month before
parents and the public learned how their school fared.
During the intervening time, schools could send the state formal written appeals
if they believed their rankings were wrong. The state changed the rank if the
school could prove there was an error in the data, such as missing student test
scores, or if state officials agreed some occurrence beyond the school's control
negatively affected how students scored on tests. That could include bad weather
that stopped school buses or construction that hampered learning, such as
shutting down heating or cooling systems. Of the 1,203 schools ranked on Oct.
15, 82 appealed; 20 were successful.
• A small technical change caused a big jump in rankings.
In 2003, after a school threw out the AIMS scores that didn't count, a grade and
subject needed a total of at least 31 students tested in 2000 and 2001; otherwise,
none of that school's AIMS scores for that grade and subject, in any year, were
used. For example, if scores from only 29 students counted in a school's
third-grade reading in 2000 and 2001, third-grade reading scores for every year,
even 2003, were thrown out, and only fifth- and eighth-grade scores were included
in the school's formula.
In 2004, that minimum was lowered to a total of 16 student scores for 2000 and
2001. That allowed schools to get credit for scores in far more grades and
subjects. Under the new rule, the same school with 29 countable third-grade
reading scores from 2000 and 2001 could use all of its third-grade reading
scores, for every year, to help determine its rank. (A simplified illustration
of the threshold change appears after this list.)
Jennifer Regalado, the Arizona Department of Education's director of
accountability, said the formula is sensitive to small improvements in single
grades. Allowing schools to count far more student scores pushed more schools
into a higher ranking, helped mainly by third-graders, who historically score
highest on the tests.
• The formula gives schools one extra point if they meet a federal standard
called Adequate Yearly Progress. If schools failed to test 95 percent of
students in certain categories, as required by federal standards, the feds
allowed them to average the last two years to reach the 95 percent mark. The
state also allowed high schools to set a make-up date for students who missed
the first round of AIMS testing. Accountability Director Regalado said that one
extra point "absolutely made a difference" for many schools. This year, 307
schools failed to make Adequate Yearly Progress, down from 444 in 2003. That
means 137 more schools this year got to add a point, a point that helped many
schools that were sitting on the bubble.
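For readers trying to picture the counting threshold, here is a minimal sketch
in Python. The school, score counts and function are hypothetical illustrations
built from the example above; they are not the state's actual formula or data.

    # Hypothetical sketch of the minimum-count rule described above; the
    # numbers and names are invented, not the Department of Education's formula.
    def grades_that_count(baseline_counts, minimum):
        # Keep only the grade-and-subject groups with enough countable
        # student scores from the 2000 and 2001 baseline years.
        return [group for group, count in baseline_counts.items() if count >= minimum]

    # Countable baseline scores for one hypothetical school, after excluded
    # students (new arrivals, some disabled students, newer English learners)
    # are thrown out.
    baseline = {
        "3rd-grade reading": 29,
        "5th-grade reading": 40,
        "8th-grade reading": 35,
    }

    print(grades_that_count(baseline, minimum=31))  # 2003 rule: third-grade reading dropped
    print(grades_that_count(baseline, minimum=16))  # 2004 rule: all three groups count

Under the lower minimum, the hypothetical school's third-grade scores count
toward its rank in every year, the kind of shift Regalado said pushed many
schools into a higher ranking.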
Just as educators begin to understand the state formula, they should expect big
changes next year, forced by new tests.
The national Stanford 9 test will be replaced by a test called Terra Nova. The
state's elementary AIMS test will be combined with Terra Nova and become one new
test, now called the Dual Purpose AIMS. It will be given to students in third
grade through eighth grade. The high school AIMS test will remain basically the
same, but with new questions created by Arizona teachers; that change, along with
proposed new passing scores, could make it easier to pass. Second-graders and
ninth-graders will take only the Terra Nova.