Letter to the Editor: Methodology used to examine SOL performance was confusing


To the editor:

I’m writing in response to the August 24 article and editorial that attempt to summarize changes in SOL test performance by ACPS schools (“ACPS Test Score Decline Part of Two-Year Trend,” and “By the Numbers, ACPS Performance Declines”). Both pieces sparked confusion within the parent community because the authors’ method is not the method by which ACPS, or even the commonwealth, evaluates change in school performance.

Both the ACPS Compliance Office and the Virginia Department of Education assess schools by tracking performance across tested content areas (reading, writing, history, science, and math). In contrast, the articles’ authors appear to have examined each individual test (grade by grade, subject by subject) to arrive at very general conclusions about schools and the school system.

With this approach, small fluctuations in individual test subjects appear to carry equal weight against larger and more significant trends at the content area level. The editorial concluded, for example, that Jefferson-Houston “had equal numbers of subjects that went up and down” based on its examination of individual tests by grade. In fact, the school experienced gains in four out of the five tested subjects when whole-school averages for reading, writing, history, science and math are considered.

Though the article and editorial concede that many of the identified declines were small, we are left wondering how these small changes compare to other area school systems, or even how they fall on the arc of our schools’ performance over a longer period. Again using Jefferson-Houston as an example, the 2 percent decline experienced in one test area this year is dwarfed by the double-digit gains (15 percent and higher) achieved across all test subjects since 2014. Absent this type of contextual information, it’s hard to understand the authors’ more general conclusions regarding performance trends.

As we invest more in Alexandria’s public schools, more eyes should be on their performance and how they are serving all students, and the city is well served by a press that devotes resources to that end. The data subsets that the authors mine for these articles no doubt point to specific areas that ACPS, individual schools and grade levels all need to improve upon. More in-depth analysis and reporting on related issues would be extremely valuable, such as how Alexandria is serving its economically disadvantaged students, or the distribution of funds among grade levels.

Communicating conclusions drawn from detailed data sets, however, requires consistency of terminology and logic, and many parents found the article and editorial pieces lacking in both.

– Sarah Mehaffey, President, Jefferson-Houston School PTA