The Herald-Sun (Durham, N.C.)
April 17, 2014
Copyright 2014 The Durham Herald Co. All Rights Reserved
By Wes Platt, wplatt@heraldsun.com; 919-419-6684

CHAPEL HILL - Overall, the trio of academics hired by the University of North Carolina at Chapel Hill backed up the university's assertion that Mary Willingham's data about student-athletes didn't support claims of widespread illiteracy.

Willingham reported in January that about 60 percent of UNC's student-athletes tested between 2004 and 2012 had a literacy level between fourth and eighth grade.

The consultants couldn't reproduce her findings using the data the university provided from the Scholastic Abilities Test for Adults (SATA).

In reports released last week, the three professors indicated that it was a mistake to assign grade-level equivalents to that test.

They noted, though, that they would need more information from UNC to fully assess literacy levels among student-athletes.

They also found that although the data didn't appear to support the severity of Willingham's claims, student-athletes at UNC still seemed to fall short on vocabulary.

Based on what they were given, two of the consultants - Lee Alan Branum-Martin of Georgia State University and Dennis Kramer of the University of Virginia - found that some student-athletes included in Willingham's study appeared to score below the 12th-grade level. The third described them as falling below the national average.

Branum-Martin found that 38 percent of UNC student-athletes tested scored below the 12th-grade level.

"This level of performance would suggest that these students may need additional assessment to better understand their complete reading skills and their capacities for college-level academic reading," he wrote.

Kramer acknowledged similar deficiencies, but noted that UNC provides added value to student-athletes who might face such challenges.

"While this report finds some instances where student-athletes were admitted to the UNC-Chapel Hill with a 'reading' grade-level equivalency less than grade 12, the interest should not be level of entrance but the value-added provided during their time in college," he wrote. "The incentives around student-athlete eligibility and academic performance create an environment where athletic departments provide academic support to student-athletes who need such to succeed."

Nathan Kuncel, a psychology professor at the University of Minnesota, found that 60 percent of the student-athletes in Willingham's sample were below the national average in vocabulary. However, their scores were anchored at the top end not by an eighth-grade equivalent but by the freshman year of college, he said.

"Overall, the normative data suggest that the student athletes in the sample are below average, but not dramatically, compared to the general population in reading vocabulary," he wrote.

Although he considered low scores a risk factor and cause for concern, Kuncel wrote that they "do not guarantee failure."

The consultants all agreed that standard scores on the SATA Reading Vocabulary subtest, part of a test developed in 1991, shouldn't be presented as grade equivalents.

"The SATA is a rather dated test, especially when compared to other achievement tests for college-aged students," Branum-Martin wrote. "The SATA was not designed to provide accurate grade equivalent scores below grade ten."

Kramer said the technical merits of grade-level equivalents are suspect.

"As someone who has developed assessments within both the K-12 and postsecondary enrollment, grade-level equivalents do not provide an accurate measure of student readiness or ability," he wrote.

Said Kuncel: "Despite their superficial intuitive appeal, grade equivalents should generally be avoided."

Branum-Martin took issue with one of the questions posed by the university when he was hired to analyze Willingham's findings: "Is the SATA Reading Vocabulary (RV) subtest a true reading test?"

His response: "The framing of this question is dangerously simplistic." He said it would be better to ask: "Is the SATA Reading Vocabulary subtest reasonably valid for use among college athletes to determine their reading abilities?"

With that in mind, he said that the SATA "does not appear to have been updated and it is possible that it does not adequately reflect contemporary educational and social influences."

He also suggested that UNC would need to provide more data to give "a better portrait of student reading."

"The given scores are not sufficient to substantiate grade levels with respect to overall reading ability," Branum-Martin wrote. "Other information would be required."

On Monday, Willingham said the data she used to reach conclusions about student-athlete literacy also included SATA Writing Mechanics scores and SAT/ACT scores.

Karen Moon, director of news services for UNC, said that the consultants received the vocabulary and writing subtests, as well as the university's internal analysis of Willingham's data.

"The university asked the external experts to focus specifically on the use of the tests to measure reading ability and the use of grade levels to explain the results of these tests," Moon said.

Kramer didn't think the SATA RV subtest should be considered a measure of literacy.

"Reading and literacy are not synonymous," he wrote. "Literacy is traditionally defined as knowing how to read, write, and speak formally. Reading is the oral interpretation of written language."

He expressed concern about how many student-athletes from revenue-generating sports - basketball and football - were used in Willingham's sample. Student-athletes from those sports represent about 18 percent of the total UNC student-athlete population, he said, but made up 81 percent of the individuals within the data provided.

He found that most discussion around Willingham's report and data turned on the assertion that football and men's basketball players at UNC "were admitted with significantly lower reading levels as compared to student-athletes in non-revenue generating sports."

"However, it appears that the SATA assessment is biased downward for males and African-Americans rather than football and men's basketball participants," he wrote. "Given that African-American males are highly represented within these two sports, it stands to reason that the potential gender and racial biases of the SATA assessment are leading to lower scores for that particular population."

Meanwhile, UNC continues to seek "ways to strengthen the academic experience of student-athletes and enhance connections between academics and athletics," Moon said.

Follow on Twitter at @HS_WesPlatt. Connect on Facebook at facebook.com/wesplattheraldsun.
