The Journal. - The Band of Debunkers Busting Bad Scientists

Episode Date: September 26, 2023

WSJ’s Nidhi Subbaraman on the scientists who moonlight as data detectives and whose discoveries have upended careers.  Further Reading: The Band of Debunkers Busting Bad Scientists

Transcript
Starting point is 00:00:00 Can you tell me about Francesca Gino? So Francesca Gino is a behavioral scientist. She researches questions that are very central to being a human, being a person. Why do people lie? What makes people want to cheat? Questions about deception, creativity, dishonesty, and the circumstances under which they are more or less common. That's our colleague Nidhi Subbaraman. She's been reporting on Francesca Gino's story.
Starting point is 00:00:38 She was widely considered to be a rising star in her field. She had a couple of books, was an award-winning researcher, had co-authored certainly more than 100 papers, was a professor at a top business school. She was at Harvard Business School. But earlier this year, Gino became the center of an academic scandal.
Starting point is 00:01:00 After years studying the reasons why people lie and cheat, she's now facing allegations that her own research was falsified, a claim that she denies. The allegations come partly from the work of a trio of behavioral scientists. They're among a growing wave of data vigilantes who are taking on and taking down bad science. These folks have been doing this for years and years and years and years,
Starting point is 00:01:23 and they've just come to a point where their work is being recognized. They've each carved out a niche for themselves in the ecosystem, and they're really triggering a shift from the powers that be, from universities, from journals. So this is an interesting moment to be watching this space. Welcome to The Journal, our show about money, business, and power. I'm Jessica Mendoza. It's Tuesday, September 26th. Coming up on the show, the debunkers trying to keep science honest.
Starting point is 00:02:17 The debunkers who looked into Gino's work run a website called Data Colada. They are also behavioral scientists. They have degrees in psychology and economics, and they are friends and co-authors. Two of the scientists, Leif Nelson and Joe Simmons, met when they were grad students at Princeton in 1999. They played in a cover band called Gibson 5000 and played on a softball team called the Psychoplasmatics. In 2007, Nelson met Uri Simonsohn when they were faculty members at the University of California, San Diego. The three started Data Colada a few years later. They came together over their shared interest in methodologies that were in play in the social
Starting point is 00:03:18 sciences. Essentially, they all were sort of convinced that lots of the studies they were reading were based on weak methods and didn't really represent the things that they were claiming to say. They were skeptical. Absolutely. So Data Colada started looking at the data behind published studies. Can you talk about the process of publishing a scientific study or a research paper? Typically, how that goes is the following. I'm a working researcher, and I have an idea, a hypothesis, about how something might work.
Starting point is 00:03:56 And I go out and test that hypothesis, say by recruiting a group of people to taste a basket of oranges or a basket of strawberries and then draw a flower. And then maybe I come to a conclusion about who made the better flower drawing: the people who ate the oranges or the people who ate the strawberries. And then maybe I need to have a control group of people who ate nothing and were really hungry instead. So I write up my findings and send them to a journal. The journal's editor then commissions a group of people, reviewers, to look at my study and the data and evaluate whether it says anything new, how new it is, and how convincingly the paper shows it.
Starting point is 00:04:39 And then they come back to the journal and say, this is worth publishing in your journal. Or they go back to the authors and say, if you could do another experiment that showed X, or if you could sort of rewrite your argument to show Y, then this is worth publishing. And then the journal publishes it. This is what is generally known as peer review, broadly speaking. Right. And so the peer reviewers aren't necessarily checking the facts of the experiment. They're not looking at the data it was based on. They're looking at the conclusions and seeing if they make sense. Chiefly, yes. They're eyeballing the data, but they're not forensically examining or scrutinizing it. And as with any process, things can go wrong.
Starting point is 00:05:22 Everything operates on sort of a trust basis. You trust that the people sending you data are sending you correct data and are not making it up. But of course, people are people, whether they're academics or politicians. And sometimes people lie. And so if somebody is willfully trying to deceive a journal using fraudulent data, that isn't always picked up,
Starting point is 00:05:43 because that's not really something that the peer review process is meant to do. And then there's sort of another piece here too. Can you tell us about the pressure that's on some scientists to produce these types of papers? Absolutely. It's come to be called publish or perish. Publish or perish. Publish or perish, yes, is a term that people say a lot. Wow. But basically, in academia, a publication in a journal is the currency of the realm. If you have a paper in a journal, and the more important the journal, the better for you, then it opens the doors to a better job, to
Starting point is 00:06:26 invitations to conferences, to the next best paper, and even beyond academia to, you know, seats on boards or invitations to talks, things like that. It's in this high-pressure world that the Data Colada scientists do their work. They were initially interested in the question of making science better, making methods better, and in trying to explain why psychology was relying on weak methodologies, like small sample sizes, or looking for a hypothesis only after you'd gathered all of the data for an experiment, both of which have now been confirmed to be bad ways of going about it. The field has shifted the way it does things in thinking about those questions: how do we really have robust findings? They began sort of tuning into when data looked like it might be problematic,
Starting point is 00:07:14 either because of inadvertent steps taken by the researchers, because they were using weak methods, or because there was some intentional falsification going on. And then they start unpacking, layer by layer, why that might be, or who might be involved, or what the explanation for it might be. And that's really hard to do, and often they didn't have answers, but sometimes they did. So they were basically conducting audits on the methodologies of all these different studies. They were auditing the underlying data. Over the years, the debunkers have called out dozens of papers, including some by Francesca Gino, the Harvard Business School professor.
Starting point is 00:07:57 In 2021, Data Colada received a tip about the data used in some of her papers. So they looked into it. What did the debunkers find in Francesca Gino's work? So they looked at the data that informed her studies and found that some of it appeared to have been tampered with. It appeared that entries had been changed by hand. In one case, they guessed at what the original data would have been, were it not changed, and found that the results weren't supported. Later that year, the debunkers sent their discoveries about Gino's research studies to her employer, Harvard.
Starting point is 00:08:46 And that has led the debunkers from the classroom to the courtroom. That's after the break.
Starting point is 00:09:42 After the Data Colada scientists shared their findings with Harvard, the school did its own investigation. It also found discrepancies in Gino's data. So the school put Gino on administrative leave. It also asked that those research studies be pulled from the journals that published them. Now Harvard is taking steps to revoke her tenure, which is a big deal. And how did Francesca Gino respond to that?
Starting point is 00:10:24 So her primary response has been to sue Harvard and to sue the three behavioral scientists who run the Data Colada blog. She posted a statement on LinkedIn soon after the lawsuit saying, in essence, that she did nothing wrong. In her post, Gino wrote, quote, I want to be very clear. I have never, ever falsified data or engaged in research misconduct of any kind. Harvard Business School declined to comment. Could you give us a summary of what's in the lawsuit? What is she suing Harvard for? And what is she suing the Data Colada scientists for? Boiled down, she says that Harvard's investigation was flawed and that the university is discriminating against her because of her gender. She's also saying that Data Colada falsely
Starting point is 00:11:13 accuses her of research fraud. And what do the Data Colada scientists have to say about that? They spoke to us essentially for the first time since the lawsuit. And their chief rebuttal was that they stand by their posts. For the debunkers, the lawsuit is a big deal. They have their day jobs as professors, but their work on the Data Colada website is a passion project. They don't actually get paid to do it.
Starting point is 00:11:52 What's at stake for them with this lawsuit? Well, they were very worried that a multi-million-dollar lawsuit would drain their savings, because it could take many months or years to resolve, regardless of who was right. And they were sort of shocked about what it could mean for public criticism of science. What do you mean? Researchers, beyond just these three bloggers, take it for granted, they said, that people should be able to critique each other's findings in a public forum, in a back-and-forth way. That's sort of the mainstay of the way science works. Somebody publishes findings, somebody else says, well, what about this?
Starting point is 00:12:29 And they respond. And taking that discussion to a legal forum is in some ways unfair, I think, is the vibe I was getting, and also kind of not the way the game is played, and could really stall the way that business as usual is conducted in science. Another high-profile example of bad science being called out came last year out of Stanford University. It's a shake-up at Stanford, starting at the top, where the research of university president Marc Tessier-Lavigne
Starting point is 00:13:02 is being investigated by Stanford, among others. PubPeer is another website where scientists dissect published studies. Posts on that website put a spotlight on work done by Marc Tessier-Lavigne, a neuroscientist and Stanford's president. The posts drew the attention of the student newspaper. A university investigation followed. Eventually, three studies that Tessier-Lavigne co-wrote were retracted. Stanford concluded that Tessier-Lavigne didn't personally engage in research misconduct,
Starting point is 00:13:33 but that he failed to correct mistakes. Tessier-Lavigne declined to comment. Last month, he stepped down as president, but he remains on the faculty. So, kind of big picture, are there a lot of other examples of bad scientific papers out there? Like, how big of a problem is this? So people who've been watching this space for a while point to a website called Retraction Watch, which tracks any time a paper is retracted from a journal, deemed unreliable because of faulty data, fraud, or any other reason. And they estimated that last year, 8 out of 10,000 papers
Starting point is 00:14:13 were retracted because of something wrong with them. 8 out of 10,000 papers. Is that a lot? That's not a lot at all. But they think it should be higher than that, more like maybe 200 out of every 10,000, because the journals are slow to acknowledge mistakes, or they aren't retracting as many papers as they ought to.
Starting point is 00:14:35 There's kind of a lag in the system. So the signal isn't, you know, fully accurate, some researchers would say. But it's still a small but not insignificant problem within the ecosystem, by the numbers. What's the real-world impact of fraudulent studies that have been published out there? Science informs health policy and all kinds of government and corporate decisions. And all of those could be based on findings that, if false, affect the way people live and are treated in medical facilities. In the clinical space, of course, medical studies inform how doctors think about how to treat patients. And that has very real life-and-death consequences. That's all for today, Tuesday, September 26th.
Starting point is 00:15:39 The Journal is a co-production of Spotify and The Wall Street Journal. If you like our show, follow us wherever you get your podcasts. We're out every weekday afternoon. Thanks for listening. See you tomorrow.
