Wall Street Journal
What Happens if SAT Scores Consider Adversity? Find Your School
Updated: Dec 4, 2019
By Douglas Belkin
The Wall Street Journal obtained the list, and it offers a glimpse of the effects on test scores
What if SAT scores could take into account whether a student went to an elite boarding school in New England or a struggling public school in Chicago’s poorest neighborhood?
The College Board, which administers the SAT, asked this question and developed an adversity score for every U.S. high school, measuring about 15 factors such as income level and crime rate in a school’s neighborhood.
It abandoned the single-number measurement over the summer after a public outcry from educators and parents. Instead, it plans to give colleges a range of socioeconomic data on high schools and their neighborhoods.
The Wall Street Journal obtained the College Board school-adversity scores, which ranked schools from 1 to 100 in degree of adversity. It then asked a Georgetown University data scientist to use those scores to adjust the average SAT results of 10,353 high schools where at least 30 students took the SAT.
Among the findings by the data scientist:
• More than half of the 50 high schools with the highest unadjusted SAT scores are private.
• Top public magnet schools performed exceptionally well in adjusted SAT scores, meaning their scores jump when adversity is accounted for.
• Of the 1,035 high schools in the top 10% by SAT scores, just 64 had an adversity score of 50 or higher on the College Board’s scale.
• Some of the poorest schools punched well above their weight while some of the wealthiest performed poorly.
At the nation’s wealthiest high schools, students score an average of 441 points higher on the SAT than students at the poorest high schools, according to a Journal analysis of College Board data. On average, white boys whose parents earned college and graduate degrees outperform others, according to the analysis.
The College Board says it never meant to combine the adversity and SAT scores, and instead wanted to give college admissions officers more context to consider in reviewing applicants’ SAT performance. “There is no such thing as a weighted SAT score,” said spokesman Zachary Goldberg.
The SAT has never been more widely used—or broadly threatened—as the gatekeeper to elite higher education. The University of California is considering dropping the SAT as a requirement for admission over concerns that the test has become a barrier for poor and minority students applying to elite schools.
“There are some glaring concerns about standardized tests, specifically the SAT, and the disparate impact it has on certain groups,” said Eddie Comeaux, associate professor of higher education at the University of California, Riverside, and co-chair of a faculty task force reviewing the role of standardized tests in UC admissions.
The task force expects to issue recommendations early next year, and the UC Board of Regents will then make its decision.
More than 2.2 million high-school students took the exam in 2019—the most ever. But more than 1,000 colleges and universities—also an all-time high—have stopped requiring a standardized entrance exam such as the SAT and made tests optional.
First administered in 1926, the SAT was designed to level the college admissions playing field by identifying smart students who would otherwise be overlooked by elite schools. The test is scored on a scale of 400 to 1600.
The College Board, a New York-based nonprofit, has said it has worried for years about race and income inequality influencing results. It has redesigned the test and introduced free online SAT tutoring to try to level the playing field.
Mr. Goldberg, the College Board spokesman, said the SAT isn’t discriminatory but rather highlights the inequalities in the nation’s education system.
The SAT is “strongly predictive of college performance,” he said. Taken together, grades and test scores “provide more insight into a student’s potential to succeed than either measure alone.”
Test scores determine more than admissions. They are also used in awarding billions of dollars in private, state and university merit scholarships.
How colleges consider a student’s race and economic level in making admissions decisions is a raging debate. Many colleges say a diverse student body is part of the educational mission of a school.
The College Board in the past tried to give colleges avenues to consider socioeconomic context along with test scores. In 1999, it launched a project to predict SAT scores based on socioeconomic factors including race, if schools chose to add it. Students who scored at least 200 points more on the SAT than predicted were called Strivers.
Because minorities often had lower predicted scores, they were more likely to be considered Strivers. The College Board canceled the project after public backlash from critics who charged the program was a proxy for race-based affirmative action.
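The Strivers rule described above amounts to a simple threshold test. The sketch below illustrates it under one assumption: the function name and the idea of passing a socioeconomically predicted score are ours for illustration; the College Board has not published its prediction model, and only the 200-point threshold comes from the article.

```python
def is_striver(actual_sat, predicted_sat, threshold=200):
    """Illustrative version of the 1999 "Strivers" rule.

    A student was flagged a Striver when the actual SAT score
    beat the score predicted from socioeconomic factors by at
    least `threshold` points (200 in the original project).
    """
    return actual_sat - predicted_sat >= threshold
```

Because predicted scores were lower for many minority students, the same actual score was more likely to clear the threshold, which is what drew the affirmative-action criticism.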
The Journal asked Jeff Strohl, director of research at the Georgetown University Center on Education and the Workforce, to weight the SATs using the College Board’s adversity scores. The result: Elite public magnet schools rose to the top, including Walter Payton College Prep in Chicago, Boston Latin School in Boston and Bronx High School of Science in New York.
By comparison, more than half of the 50 schools with the highest unadjusted SAT scores are private. They include Winsor in Massachusetts, Phillips Exeter Academy in New Hampshire, the Hopkins School and the Hotchkiss School in Connecticut, Sidwell Friends and St. Albans in Washington, D.C., Harvard-Westlake and the Marlborough School in California, and Trinity and Collegiate in New York.
To adjust the SAT scores, Mr. Strohl created a baseline by using the average SAT of schools with an adversity score of 50. Then he calculated the distance between that and the average of every other adversity score. Points were added or subtracted from individual schools depending on their distance from the baseline. Only schools with at least 30 SAT test takers were included in the analysis.
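Mr. Strohl’s adjustment can be sketched in a few lines of Python. This is a minimal reconstruction from the article’s description, not his actual code: the field names (`avg_sat`, `adversity`, `test_takers`) and the choice to average schools at each adversity score are assumptions; only the baseline at adversity score 50, the distance-based offset, and the 30-test-taker cutoff come from the article.

```python
from collections import defaultdict

def adjust_sat_scores(schools, baseline_adversity=50, min_test_takers=30):
    """Adjust school-average SAT scores using adversity scores.

    `schools` is a list of dicts with keys 'avg_sat', 'adversity',
    and 'test_takers' (field names are illustrative assumptions).
    """
    # Only schools with at least 30 SAT test takers were included.
    eligible = [s for s in schools if s["test_takers"] >= min_test_takers]

    # Average SAT for each adversity score (1 to 100).
    by_adversity = defaultdict(list)
    for s in eligible:
        by_adversity[s["adversity"]].append(s["avg_sat"])
    avg_by_adversity = {a: sum(v) / len(v) for a, v in by_adversity.items()}

    # Baseline: the average SAT of schools with an adversity score of 50.
    baseline = avg_by_adversity[baseline_adversity]

    # Add or subtract points based on the distance between the baseline
    # and the average for each school's adversity score.
    for s in eligible:
        offset = baseline - avg_by_adversity[s["adversity"]]
        s["adjusted_sat"] = s["avg_sat"] + offset
    return eligible
```

Under this reading, a school in a high-adversity neighborhood gains points equal to the gap between the baseline and its adversity peers’ average, which is why elite magnet schools in poor neighborhoods rise to the top of the adjusted ranking.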
Corrections & Amplifications Georgetown University Center on Education and the Workforce was incorrectly called Georgetown Center on Education and the Economy in the source line for the graphic in an earlier version of this article. (Dec. 3, 2019)