Bucknell Inflation Announcement Indicates Nationwide Pattern
Published: Friday, February 8, 2013
Updated: Sunday, February 10, 2013 16:02
Bucknell University announced Jan. 25 that its admissions department had misreported SAT and ACT scores between 2006 and 2012, fitting a pattern among universities of inflating students’ test scores.
“The outcome of all these errors was that our SAT scores across each of the seven years were reported to various organizations, most notably this Board, as being higher than they actually were,” Bucknell President John Bravman wrote in a statement to the Bucknell Board of Trustees. “During those seven years of misreported data, on average 32 students per year were omitted from the reports, and our mean SAT scores were on average reported to be 16 points higher than they actually were.”
Bucknell Director of Media Communications Andy Hirsch wrote in an email that the mistakes resulted from flawed practices in the admissions office.
“[The errors] were caused by the flawed practices of former enrollment management leadership no longer with Bucknell,” Hirsch wrote. “When our new Vice President of Enrollment Management [Bill Conley] discovered the calculation errors, [Bravman] immediately launched an internal review of the data. We cannot say why those flawed practices were in place.”
Bucknell’s miscalculation comes on the heels of similar incidents at Emory University, The George Washington University and Claremont McKenna College.
According to Georgetown Dean of Undergraduate Admissions Charles Deacon, such situations among colleges will become more prevalent as the number of applicants continues to decrease over the next several years and universities strive to appear more exclusive.
“Anything that will make you seem to be more selective than you really are is seen as a benefit to recruit people,” he said. “I think [Georgetown is] in a good position because we have almost a unique reputation of not doing that, so why would [we] then report false scores?”
One obstacle to obtaining accurate data is the lack of third-party auditing to validate admissions information.
“[U.S. News and World Report] is dependent upon us being honest while putting our statistics together. In our case, I think there’s a general sense that there’s integrity,” Deacon said. “The problem is most of this is voluntarily reported by the colleges. There’s no third party in the end that can give verifiable data.”
Beyond statistical miscalculations, Deacon said, manipulation of admissions data is present in almost all university admissions offices.
“The ultimate fallback for us, along with [the Massachusetts Institute of Technology], is that we’re probably the two places of all of them that don’t push up the numbers,” Deacon said. “Those are not things we’re doing right now and [are things] that we have never done.”
Schools also employ many tactics — including adopting the Common Application, implementing an early decision option, eliminating supplementary essays and waiving application fees, among others — to inflate their admissions statistics with little effort.
Deacon, however, said that Georgetown’s more demanding admissions process produces an applicant pool that is, in part, self-selecting.
“You have to do extra work to apply to us. You have to do the essays and put in effort — it’s work,” he said. “In turn, we’re going to keep our applicant pool to the right size where we can actually read your application and maybe get you the interview out there and still have a [low] admit rate. Who cares what it is, really, as long as you get the right people.”