A Cornhusker concussion study appeared to accurately ID head injuries. Then it hit a brick wall.
By Ellie Kincaid, Retraction Watch
April 20, 2023, 4 p.m.

When Khalid Sayood and one of his students at the University of Nebraska-Lincoln set out to study concussions, they had an obvious group to study: the Cornhuskers football team.
To make things even simpler, a colleague who had already collected data on some football players said they could use it for their new study.
In the data – measurements of the players’ brain activity taken at the beginning of the season, and then again after some had sustained concussions – Sayood and his student hoped to find signals that would make diagnosing the brain injuries faster and cheaper.
In a published paper, they say that they succeeded, finding a way to identify the Husker football players who had concussions with 99% accuracy.
That early success opened up tantalizing possibilities, suggesting a potential path to more effective diagnosis of head injuries, both for Cornhuskers and for athletes across the country.
More effective diagnosis is a big deal, because each year, there are as many as 3 million concussions from sports in the U.S., according to the University of Pittsburgh Medical Center. Roughly 300,000 come from football alone.
In Nebraska, an average of 13,600 traumatic brain injuries happened per year from 2014 to 2018, according to a report from the Nebraska Department of Health and Human Services. Responding quickly and properly to concussions – essentially brain injuries that keep the brain from working normally – is key to preventing the injury from getting worse, according to the health department.
With much more work, Sayood says, other scientists could build on their study to develop a new medical device that would be useful in settings with fewer resources than Division I college football.
But after they published their paper, the university’s research ethics board spotted problems with it: The researchers didn’t have the proper permissions to use the football players’ data, and the board said they published information that could be used to find out which players participated. The ensuing investigation and its consequences – including the retraction of the paper – mean the once-promising research may have hit a dead end for now.
“I find the actions of the [board] to be very objectionable, very harmful both to the university and the research enterprise,” Sayood said. “It’s been a horrible experience.”
The research project was born of the idea that the human brain is too complex – and each person’s brain too unique – for researchers to find signs of trauma like a concussion by comparing the brain activity of someone who just had a head injury with the brain activity of someone who hasn’t.
Any differences scientists saw could be because people’s brains work differently, not because of a concussion. So Sayood, a UNL electrical engineering professor, wanted to look for differences in the same person’s results before and after a concussion, to be more confident that changes were due to trauma.
They used electroencephalogram (EEG) readings, which measure the brain’s electrical activity.
The players who participated likely would have worn caps fitted with electrodes that recorded their brain activity while they completed a simple memory exercise.
The scientists compared EEG readings of football players taken before the beginning of the season with readings taken after some of those players had concussions. Additional EEGs of the players who didn’t have concussions served as another comparison.
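The retracted paper’s exact algorithm isn’t described here, but the basic within-subject idea can be sketched in a few lines of hypothetical Python. Everything in the sketch – the alpha-band power feature, the sampling rate, the 50% change threshold – is an illustrative assumption, not a detail taken from the study:

```python
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Average spectral power of one EEG channel in a frequency band (via FFT)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def deviates_from_own_baseline(baseline, followup, fs=256.0, threshold=0.5):
    """Compare a player's follow-up EEG to their OWN preseason baseline.

    Flagging a relative change in 8-12 Hz (alpha) power sidesteps the
    problem that two different people's "normal" EEGs can look nothing
    alike. The band, threshold and sampling rate are assumptions made
    for this sketch, not values from the retracted paper.
    """
    before = band_power(baseline, fs, 8.0, 12.0)
    after = band_power(followup, fs, 8.0, 12.0)
    return abs(after - before) / before > threshold
```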
Sayood and his student used EEG data that other UNL scientists had collected from eight football players for earlier published research. They published their results in December 2021 in the journal Neurotrauma Reports.
The method that Sayood and his student developed identified the players who had recently had concussions with 99.5% accuracy, they wrote. But because they only had data from a small number of players, confirming whether their method would help in the real world would take more work and more study participants.
Because there’s currently no objective, clear-cut way to diagnose a concussion, many researchers are working on more definitive tests, including some based on EEG, said Jeffrey Tenney, a pediatric neurologist at Cincinnati Children’s Hospital. When he and others assessed that approach, however, they found there isn’t yet enough evidence that EEG is useful for diagnosing concussions.
“It’s quite possible someone might eventually find there is a way to use this,” Tenney said.
Researchers would need to study hundreds of people in a well-controlled environment, and take into account any medications or other brain disorders the participants might have, he said.
“It’s not going to win us any Nobel Prizes,” Sayood said of his Nebraska-based research. But, he said, the method of comparing the EEGs of participants before and after a concussion was original. It’s also relatively cheap.
But the data Sayood and his student used for their research had a problem he didn’t realize at the time.
Under standard rules for research involving human participants, scientists must ask an ethics watchdog called an Institutional Review Board (IRB) to review their plans and get the board’s approval.
One of the most important things these boards consider is informed consent for the participants, said Abbey Lowe, a University of Nebraska Medical Center bioethicist. “People should get the chance to understand what you’re asking of them, then say yes or no to it.”
Consent is “a huge part of what makes human research ethical,” Lowe said. If researchers collect data for a study, but then decide they want to do different research with it, the participants’ consent for the original study doesn’t automatically carry over to the new one.
The colleagues who collected the data and shared it with Sayood said it had been stripped of all information that could identify the players who participated, and that Sayood and his student could therefore use it without going through the IRB first.
But the data files weren’t deidentified, as Sayood would later discover: They included the players’ birthdates, which could be used to trace the data back to them.
The university IRB would come to a very different conclusion about whether the researchers could use the data without approval.
Soon after the paper was published, UNL received a complaint questioning the researchers’ access to the data they used, said Dan Hoyt, a UNL research integrity officer.
Research compliance staff looking at the dataset saw that while it had been originally collected as part of an approved study, the new paper described work that hadn’t been part of the sanctioned plan. They contacted Sayood with questions.
When Sayood sent the dataset to the staffers investigating the matter, he discovered that it contained the players’ birthdates.
“I went back and looked and said, ‘Uh oh.’”
In addition, the paper contained a table with information on the eight football players whose data was used in the study, including their age on the day they were scanned. The IRB said this could be used to identify the players, in combination with other public information, like team rosters.
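Why a lone leftover field matters is easy to demonstrate. The hypothetical Python sketch below uses entirely made-up records – nothing from the study – to show how a birthdate in a “deidentified” file can act as a join key against public information such as a team roster:

```python
# Made-up "deidentified" study rows: no names, but birthdates remain.
study_rows = [
    {"subject": "S1", "birthdate": "1999-03-14"},
    {"subject": "S2", "birthdate": "2000-07-02"},
]

# Publicly available information, e.g. a team roster.
public_roster = [
    {"name": "Player A", "birthdate": "1999-03-14"},
    {"name": "Player B", "birthdate": "2001-11-23"},
]

# One shared field is enough to undo the deidentification.
roster_by_dob = {p["birthdate"]: p["name"] for p in public_roster}
for row in study_rows:
    match = roster_by_dob.get(row["birthdate"])
    if match:
        print(f"{row['subject']} is likely {match}")  # S1 is likely Player A
```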
The full ethics board met to assess the study, and determined that the issues amounted to “serious noncompliance,” Hoyt said. In March 2022, the IRB wrote to Neurotrauma Reports, requesting the journal retract the paper.
Sayood appealed the decision but lost. The journal retracted the paper last April. It was one of nearly 5,000 retractions issued in 2022.
As is common practice, the paper remains online, but with a heading and watermarks declaring it “Retracted,” and a notice explaining the reason. At the IRB’s request, the journal also redacted data from the table with information about the players.
The thing that frustrated Sayood the most, he said, is that he’d hoped other researchers would pick up on the method to detect concussions and eventually build simple, cheap devices that could be used on the sidelines of a high school football game. But that doesn’t seem likely anymore.
“Now they’ve got this stupid ‘retracted’ thing written all over it, so I’m sure anybody who’s actually going to do something will ignore this paper,” he said.
Indeed, a doctor who contacted Sayood because he wanted to use the work “ghosted” the researcher after the IRB got involved, he said.
Besides the retraction, the university hasn’t imposed any additional consequences. But the retraction makes it difficult for Sayood or his student to advance the research themselves. The IRB ordered Sayood to destroy the data on the players, so they can’t continue the work or try to repeat it.
Even if they could repeat the study, Sayood is not sure another journal would publish it, because the results are already online, in the retracted paper, and so wouldn’t be novel. Yet they can’t use a retracted paper as the basis for new work, he said. “We are trapped.”
“We did do something wrong,” Sayood said of using the dataset with the birthdates. But he also said that the identifying information wasn’t a factor in their research.
Sayood doesn’t see why the paper had to be retracted, unless it was meant as punishment for not following regulations.
Hoyt said making requests to retract papers isn’t “routine practice, but it does happen. Nobody takes the notion of retracting an article lightly.”
This story was produced in a collaboration between the Flatwater Free Press and Retraction Watch, a nonprofit news outlet that reports on scientific misconduct and related issues, and maintains a comprehensive database of retractions.
The Flatwater Free Press is Nebraska’s first independent, nonprofit newsroom focused on investigations and feature stories that matter.