The Limits of Average Test Scores

May 11, 2007

This commentary originally appeared in the Pittsburgh Business Times on May 11, 2007.

In a horse race, numbers tell the story. You look at the finish time of each horse and you see which came in first, second and third. It would be nice if we could rank other things in life so simply and easily, but that's often impossible. School quality is a good example.

Students in Pennsylvania and around the United States are taking state tests more than ever as a result of the federal No Child Left Behind Act. Schools are reporting these test scores so they can be rated on whether they are meeting the goals of NCLB, which require steady progress to eventually ensure that all students perform at a level the state considers proficient.

These test scores provide one useful piece of information to parents, school and governmental officials, and other taxpayers. The Pittsburgh Business Times provides current and prospective residents of Western Pennsylvania with a useful service by reporting how students in schools and school districts across the region and state fared on the state's accountability test, the Pennsylvania System of School Assessment.

The most common information presented is the percentage of students at a school or in a grade who score at the “proficient” level on the PSSA. This is the measure the Business Times uses. PSSA scores are useful for determining which schools are currently serving high-achieving students – something that could be of interest to a parent deciding where to buy a house or whether to choose the local charter school over the assigned neighborhood school.

But while this type of reporting provides one way of judging school quality, other information is needed to understand how well schools are actually educating their students. For example, average test scores tell us little about which schools are most effective at promoting student learning. The Business Times' “expectations ranking” takes a step in this direction by allowing readers to compare the average scores of schools serving students with similar economic backgrounds, but it falls short of measuring actual gains in achievement.

Many states, including Pennsylvania, are now experimenting with systems that measure student growth, sometimes called “value-added” assessment or VAA systems. These systems are designed to indicate whether the student achievement in a school is exceeding, keeping pace with, or falling behind the level of growth that would be expected as students move from one grade to the next.

VAA systems can answer questions about the average growth of specific groups, such as low- or high-achieving students. Although most parents lack access to this information, researchers and analysts are working extensively to improve VAA systems and to find ways to make the information available to educators and parents.

The RAND Corp. recently produced a growth-based performance measure for the Pittsburgh Public Schools, and is currently involved in research to understand how principals and teachers across the state are using VAA information to make decisions about instruction.

Although the lack of growth data is probably the most significant limitation of today's testing regime, the current system has other drawbacks. These include an emphasis on whether students score above or below a specific cut score (called “proficient”), a lack of information about performance in important subjects and grades that are not tested by the PSSA, a lack of information about specific services and courses offered to students, and the complete absence of achievement data for private schools.

Fortunately, getting a closer look at individual schools and districts is easier now than it has ever been as a result of the proliferation of school and district Web sites. Although these vary in quality, the best Web sites allow users to learn a lot about a school's offerings.

Depending on the needs and interests of a child, information about Advanced Placement courses, college acceptance and attendance rates, vocational programs, arts courses, and extracurricular activities can give parents a sense of what kind of education a specific child is likely to receive.

The bottom line is that we should view reports such as the Business Times supplement as a valuable starting point, while recognizing that this information cannot tell us what is likely to happen to any specific child. Advances in technology and data are likely to make useful information increasingly accessible, which will improve everyone's ability to make good decisions about schools.


Laura Hamilton is a senior behavioral scientist in the Pittsburgh office of the RAND Corporation, a nonprofit research organization.
