Daniel and Kelly’s Extraordinary Universe - How does scientific peer review work?
Episode Date: September 11, 2025
Daniel and Kelly shine a light on the inner workings of one of the crucial parts of the scientific process: peer review. How does it work? Who are these peers? Can peer-reviewed papers be wrong, and does that mean that all of science is questionable?
See omnystudio.com/listener for privacy information.
Transcript
This is an iHeart podcast.
If a baby is giggling in the back seat, they're probably happy.
If a baby is crying in the back seat, they're probably hungry.
But if a baby is sleeping in the back seat, will you remember they're even there?
When you're distracted, stressed, or not usually the one who drives them,
the chances of forgetting them in the back seat are much higher.
It can happen to anyone.
Parked cars get hot fast and can be deadly.
So get in the habit of checking the back seat when you leave.
The message from NHTSA and the ad council.
I was diagnosed with cancer on Friday and cancer free the next Friday.
No chemo, no radiation, none of that.
On a recent episode of Culture Raises Us podcast,
I sat down with Warren Campbell, Grammy-winning producer, pastor, and music executive
to talk about the beats, the business,
and the legacy behind some of the biggest names in gospel, R&B, and hip-hop.
Professionally, I started at Death World Records.
From Mary Mary to Jennifer Hudson, we get into the soul of the music,
and the purpose that drives it.
Listen to Culture Raises Us
on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.
Lauren came in hot.
From viral performances to red carpet looks
that had everyone talking.
The podcast, The Latest with Lauren LaRosa,
is your go-to for everything being made.
We'll be right here breaking it all down.
I'm going to be giving you all the headlines,
breaking down everything that is going down behind the scenes,
and getting into what the people are saying.
Like, what is the culture talking about?
That's exactly what we'll be getting into here
at The Latest with Lauren LaRosa.
Everything VMAs.
To hear this and more, listen to The Latest with Lauren LaRosa from the Black Effect Podcast Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
It's 1943. A king dies under mysterious circumstances in the middle of World War II. Was it murder?
After 80 years of lies and cover-ups, his children need answers.
And, you know, kissed us and said, I'll see you tonight. And we never saw him again.
From Exactly Right and Blanchard House, The Butterfly King is a gripping
historical true crime series that dives deep into royal secrets, wartime cover-ups, and a mystery
that refuses to die. All episodes are available now. Listen to The Butterfly King on the iHeartRadio
app, Apple Podcasts, or wherever you get your podcasts.
Science is supposed to be our best way of revealing the truth about the universe. But science is
constantly being updated and corrected, and sometimes we learn after the fact that a study was
flawed or even was shoddy to begin with. So how do scientists decide whether a new result is
robust or not? And how does the general public know when the science is settled or about to be
upended? No system is perfect, but it's important that the process is transparent. So today we're
going to shine a light on the inner workings of one of the crucial parts of the scientific process,
peer review. How does it work? Who are
these peers? Can peer-reviewed papers be wrong? And does that mean that all of science is
questionable? Welcome to Daniel and Kelly's peer-reviewed universe.
Hi, I'm Daniel. I'm a particle physicist, and I've published over a thousand papers, but
haven't read most of them.
Hi, I'm Kelly Weiner-Smith, and I'm confused.
What does that mean, Daniel?
How have you published over a thousand papers?
At the LHC, does everybody who is re...
Ding, ding, ding.
That's it.
Oh, that's cheating, man.
That's cheating.
Well, the way the collaboration works is everybody who's contributed to the project is an
author, and when you're an author, you get included on every paper that comes out
during your authorship period.
And ATLAS is a huge collaboration of lots of clever people. We put out about
a hundred papers a year, and every single one has my name on it. And I haven't read most of them. I couldn't even explain the titles of some of them to you.
So fields have their own cultures. In my field, in my culture, that would ethically not be okay. You should be responsible for what is in every paper that your name is on, in my opinion.
And I was in another collaboration previously where things worked differently, where you weren't an author on a paper unless you asked to be.
And nobody questioned it if you had the right to be an author.
If you were on the list, you could be.
But it was an opt-in.
And I thought that was much better.
The lengths of author lists were shorter, and authorship meant something.
But here you can ask to be removed, but you have to go through this process.
And it would be like, I'd be doing it twice a week every week.
Wow.
So does your CV include all of those papers or just the ones that you feel you've actually contributed to?
So that depends.
The CV I make and share with people only includes papers that I actually did anything on or had ideas for or really contributed to.
But sometimes an institution wants a list of your official papers.
And then I have to, like, include all of them.
I don't even know the number. A thousand-plus papers at this point.
Oh, poor Daniel.
No, I know.
It's always me.
Poor new guy.
So how do you get them all?
Do you just copy them from Google Scholar?
Yeah, particle physics is pretty good at automation and stuff like this.
So we have our own databases of papers, and it's pretty easy to download that kind of stuff.
But yeah, it's pretty tough to have so many papers.
Oh, I know.
So many pages you have to print out.
The file for the PDF for your resume is so big.
How about you, Kelly?
How many papers do you have?
And have you finished that paper from your thesis yet?
Oh, I'm feeling angry now, Daniel.
I have not.
I've been busy with other things, including my new ducks and geese.
But, which I won't mention in front of my co-authors.
Do your ducks and geese and goats
go on your CV? That's my question.
No.
Students supervised. Come on.
Oh, yeah. No, I've got that list.
Kevin the Great.
The Great Goat.
I think I've got, I haven't counted.
I think I've got something like 30. My pace has slowed since I started writing
pop-sci.
But, you know, I still get a couple in there every year.
My CV has something whimsical and unserious on it.
Yours is totally professional?
Yeah.
What whimsical thing do you have on yours?
Oh, I'm just going to leave that as an Easter egg.
I put it in there to see if anybody actually reads my CV.
Because if they don't respond to that, I'm like, hmm, you didn't really read it.
Okay.
Yeah, check it out.
All right.
Well, I'll be looking up your CV.
Do you have an updated one?
Here's the question.
Most scholars don't update their CV but, like, once every half decade.
Is your online CV updated?
Yeah, I think I updated it a couple months ago.
Oh, all right.
Bravo.
All right. Well, today we're not talking about the thousands of publications that Daniel has, but we're all very proud of Daniel for all of his hard work.
Or the tens of geese and ducks that Kelly is currently raising, though we're very proud of her and her literally extended family.
Well, I don't have that many, and I wouldn't claim other people's ducks that I didn't help out with as my own ducks.
Oh, snap.
Snap.
But anyway, today we are answering.
a question from a superfan, and we're going to go into a bit more detail. This question inspired
us to make a whole episode about the process of peer review. So let's go ahead and hear what
Stephen from British Columbia has to say. Hi, Daniel and Kelly. This is show superfan
Stephen from British Columbia, Canada. My question today has to do with how scientific studies
are peer reviewed. Is it me, or is science denial spreading these
days? And it came to my attention recently that not all peer-reviewed studies are replicated.
And then it occurred to me, doesn't this automatically cast doubt on a discovery if a study
is not replicated but is still considered peer-reviewed? I would love to hear an explanation
about how the peer-reviewing process works and how scientists determine when a fact
is a fact, a finding is a finding, and how the community comes to a consensus on new
discoveries. I also think your audience could use some tools for how to determine what
studies are legitimate and how to spot a bad study. How should one respond to someone
who claims a scientific study is bogus? Thank you, Daniel and Kelly,
for this great podcast and for explaining science topics to the masses.
Thank you, Stephen, for asking this question.
I think it's really the right moment to dig into this.
You know, when we see institutions being attacked, when we see science being denied,
when we see the whole process of science being questioned,
I think it's important to shine some light on how does this all work?
What does it mean?
Especially for those folks who are not scientists who don't know,
what does it mean for a paper to be peer-reviewed?
What is bad science?
Thank you, Stephen, for asking this question and giving us the chance to dig into all of this.
That's right.
And so the question in particular is about peer review for scientific manuscripts.
But peer review actually happens at multiple stages in the process of doing science.
And so we thought we would take a step back and talk about peer review from the very beginning,
which is when your idea becomes a grant.
And this is one of the less fun parts about science is writing this grant because peer review for grants is harsh.
This is like the science version of when a bill becomes a law.
We should have written a song for this.
Oh, well, we still can.
You know, we could write it and then we can add it to the top.
No, we won't do that to everyone.
So, Kelly, tell me what is your process of grant writing?
First of all, who do you submit your grants to?
And what's it like for you to prepare that grant?
So I submit my grants to the National Science Foundation.
And actually, it's possible that in the last six months, the process has changed a bit
because I know the new administration is shaking things up.
The agency formerly known as the National Science Foundation.
No.
What is it called now?
I'm joking.
I'm joking.
Anything is believable, Daniel, at this point.
All right.
The institution is still known as the NSF or the National Science Foundation.
So for this, you write a grant.
And there's different programs that accept different kinds of grants, but often the grant is like 15 pages where you go into extreme detail on what
we know about the question you want to ask based on literature that's already out there,
extreme detail in the experimental design, and then a lot of detail about what you think the
implications will be of your results.
So you talk about why this question is interesting, you argue that your work will help
answer it, and then you lay out in detail how you're going to spend every dollar, who's
going to do what and when.
You've got a whole Gantt chart for, like, when everything is going to happen.
Yep.
And then you have to lay out also like the products for the grant, right?
Yeah, yeah, the budgets.
And so you're supposed to, there's a section called intellectual merit where you talk about how many papers you think you're going to end up producing, what big questions you're going to answer.
And then there's also a section called broader impacts where you talk about how your work is going to benefit society writ large.
And, you know, for my grants in the past, it's been things like, you know, my information is going to be helpful to fisheries managers who are managing fish populations, which is like a literally a multi-billion dollar industry for like people who want to go out fishing and stuff.
so most of my research has been on fish and their parasites.
Also, intellectual merit will include things like how many students you're going to train
and what skills you're going to give them that they can then use when they go out in the workforce.
And depending on the field you're in, sometimes your broader impacts will be things like,
hey, this could be what we learn about the brains of this fish,
could one day help us produce a new medication for anxiety or something like that.
So you talk about how your stuff would impact the broader society.
And so for those folks who might think,
hey, scientists are just cashing in on the government gravy train.
Tell us, like, is this the kind of thing you throw together in an afternoon
and then can confidently think it will be funded?
Oh, my God.
Okay, so one of the reasons, to be honest,
one of the reasons I sort of transitioned away from academia in the purest sense,
so I'm like academic adjacent now,
but it was because I spent too many Christmas Eves working on my grants
because they were due like sometime in January.
And they were, I would say, months of work in a lot of cases, if it's a new idea.
If you're just cleaning up an old grant, then it can be weeks of work.
And then so many great grants are submitted.
And when they go to review, you have three peers of yours who are in closely related fields, review it.
And then there's a panel discussion where they talk about their reviews of your grant in front of a bunch of other experts who can also weigh in.
And then all those reviews go to program directors who pick out of like the 20 best grants,
you know, maybe they can only fund 15 or something.
And then they pick some on the East Coast, some on the West Coast, some in the center of the
country, some from major research institutions, some from institutions that focus a lot
on undergraduate research and have research programs as well.
And so so many really great grants don't make it into the pile that get funded.
Meaning that people have read them, have said this is excellent, the science is good, it's interesting, it's important, it would benefit us as a society if we did this, it's all well thought through, but no.
That is exactly right. Yes, and it is soul-crushing. I know so many people who are like, I gave up Christmas Eve on this grant, and this is the third time I've submitted it, and it just never makes it quite high enough, even though everybody thinks it's amazing. And then quickly, just to mention for follow-up: if you do get a grant, every year you have to write a progress
report saying what you've done. Have you stuck to your timeline? How have you spent the money that
you've been given? Are you going to hit your timelines? And what have you done that impacts society?
And so every year you have to report on that. And then at the end, you have to write a, like, bigger
report. So you need to be justifying what you're doing with the money every step of the way.
And before we turn it around and ask me questions, who are the peers here? Who are the folks
reading your grants and commenting on them?
They're folks in the same discipline at other universities.
So you also have to fill out a detailed Excel sheet where you talk about who in the
field who does similar work has been your student or your postdoc or your mentor or a co-author,
anyone with whom you might have a conflict of interest,
and anyone who might like you and might be too nice to you when they review the grant.
So it has to be only people that you haven't worked with or haven't co-authored a paper with for
something like five to 10 years. I guess they assume that the liking of someone wears off in about
half a decade. But it can be tough because, you know, these fields are kind of small. But, you know,
usually it's other people at research institutions who do similar work, are familiar with the literature,
and can tell if what you're proposing makes sense or not and is good or not. And so the thing to
take away from this is getting a grant funded is hard, right? Not only is it a huge amount of work,
but you have to survive an excruciating process where really only the best are selected.
It's sort of like watching the Olympics and you're wondering like, oh, here's somebody from this country
and somebody from that country.
And you know that each person has already won some sort of like really competitive national
competition just to even be there at the Olympics representing their country.
And so like every grant that's funded, even if it seems silly and it's about like the sex lives
of ducks, you know that it's been like,
excruciatingly written, reviewed in detail, questioned by experts without conflicts of interest,
and found to be excellent, having beaten out lots of other grants. These are not slush funds shoveled
to scientists to, you know, just like do whatever they want with. These are hard-won funds
to do science that the agency thinks are a good idea. Yeah, it is absolutely excruciating to do
these things. It feels great when you get it funded. And I said I hate writing grants. Sometimes I enjoy the
process of like figuring out just the right experimental design for a question. That's like fun for
me. But in general, when the grant doesn't get funded, it sucks. But anyway, all right, so what's
your experience? Yeah, my experience is very similar. You know, we come up with an idea. We spend a lot of
time polishing it, often doing a lot of the work in advance that we need to demonstrate that the
work described is reasonable, right? If it's too far forward thinking, then you won't get the
money because they're like, well, that might work, but can you really prove it? It's too much
of a risk. If you've already done too much of the work, then they're like, oh, this is already
done. Why would we fund it? So it's really sort of on the edge there of like, you've done the
initial work for free, right, or on some other grant or whatever. So you prove that the idea
is valid and can fly, but not so much that they're like, why would we fund this? You've already done
it, which is often a delicate balance. And my feeling with grants is submit it and forget it,
because such a tiny fraction ever come back with money that you just got to, like, let it go.
Like, hey, I submitted it and I never expect to hear back, you know, anything positive.
And so I just sort of, like, give up emotionally on every one, because otherwise it's too hard, you know?
It's too hard to deal with.
So my process is very similar to yours with a couple of exceptions.
Sometimes I submit to private foundations, like I recently found a private foundation that likes to fund projects,
that are too blue-sky, that were rejected by the NSF for being, like, way too out there.
Oh, wow.
And that submission process was like, write a paragraph, send us your rejected NSF grant with the reviews, and that's it.
Whoa.
And that one actually just came back, and they just gave me some money to build my cosmic ray smartphone telescope down here at Irvine.
So, yeah, that was actually a really positive experience.
Congratulations.
Yay, that's awesome.
Thank you.
Yeah, yeah, exactly.
because, you know, private foundations can do whatever they like with your money.
What we talked about previously was mostly the process of applying to government institutions.
The NIH, the NSF, most of my funding comes from Department of Energy Office of Science,
which funds particle physics and lots of other stuff in the United States.
But private foundations can do anything.
Like the Bill Gates Foundation, you can't even apply to.
They have to, like, reach out to you.
Wow.
And, you know, McKenzie Scott just, like, gives money to people.
They don't even apply.
They just get a phone call saying, like, by the way,
here comes a million dollars or more.
Whoa.
So, yeah, private foundations are weird.
Yeah.
So I'm sure that they make solid choices about who they're going to give their money to,
but I guess one way to sort of check how much you can trust an article is you can look
in the acknowledgement section at the end and say, oh, this was funded by the National Science Foundation.
And then you know this went through rigorous peer review.
And just because something doesn't go through rigorous peer review at the grant stage doesn't mean it's bad science.
But at least, you know, if it went through a government agency, it has been sort of gone over with a fine-tooth comb.
And the thing that's frustrating to me is this process is so inefficient.
Scientists spend so much of their time preparing grant proposals and having them rejected.
And, you know, each of these is a great idea.
Each of them, if we did it, would benefit society in terms of sheer knowledge or, you know, new technology or something.
We have all these smart people constantly pitching great ideas to the government.
The government saying, yeah, that's awesome.
We'd love to do it, but we can't.
Like, why don't we just double, triple funding for science?
It would be, like, better for us.
Every dollar we spend we know comes back two, threefold in terms of like economic output.
It just seems crazy to me to reject all of these excellent ideas.
I mean, I got to say, not every idea I've reviewed on a panel has been excellent.
No, but there are plenty above threshold.
Yes, absolutely, absolutely.
There's always a grant
every round where my heart breaks a little that it didn't get funded, because I thought, oh, that is so cool.
Right.
But there wasn't enough money to cover it. And so, yes, there's a lot of good work that's not getting done.
And I agree. I'm a grant reviewer often. And I see grants that I'm like, no, this is not well thought out or there's a flaw here or this isn't cutting edge.
You know, somebody did this last year. And it's important that this stuff gets reviewed and gets reviewed in a fair way.
But there's so many that are above threshold and only a fraction of those get funded. And it just seems to me to be a waste.
but whatever. I'm not a political person. I don't understand these things. But I want folks out there
to realize that every grant you've seen that's been funded has been reviewed in excruciating detail
before dollar one was even sent to the institution. That's right. I don't think we need to get
into lots of detail about this, but every once in a while my husband and I will have chats
about what would be a better system for funding grants? Like maybe every four years you get a chance
to submit and that way you don't have to write grants the other three years and there's fewer people
in the pool and you're more likely to get it.
I think that's not the answer.
But I wonder if there's some way we could save scientists from spending all of this time
on grants that don't get funded.
Or maybe we should just throw 20 times as much money at the scientific community.
There's our solution.
Well, you know, it used to be, 50 years ago, that the philosophy was a little bit different:
the government funded people rather than projects.
And they were like, oh, Scientist X at this university, you're a smart lady.
You've done good work.
We'll just keep giving you money.
And it'll just go for a while. As long as you keep doing something, we'll keep giving you money.
And I think that used to be the major model. And it's just not that way anymore. Now it's projects. And so if you have a lab at a university, it's sort of like running a small business. You know, you have to constantly be pitching grants to get funds. You know, you're like running a store at the mall, Katrina likes to say. And you're constantly looking for new ways to contribute. The one holdover that I'm aware of is actually experimental particle physics is still a little bit of the older model.
They run competitions for junior faculty, and if you win one of these awards, like I won
an Outstanding Junior Investigator award when I was a very young professor, then you sort of get
in the club, and then they mostly fund you. Your funding can go up if you do
great and go down if you're less productive, but it's much more stable than typical.
I think because these particle physics projects last like 20 years. You know, we build a collider,
we expect to use it for 25 years. That's my thinking for why they still do it in the sort of
older model. But it means I don't have to write as many grants because I do have one more stable
source of funding. Wow. So even to this day, you still get a chunk of money every year?
I have to submit a grant every three years to propose new work and to tell them what I did in the last
three years. And so far, every time my grant has been renewed and continued. So, yeah, I've been
funded from the Department of Energy for a couple of decades now, which is very nice. And I'm
very grateful to the Department of Energy. Thank you very much. And to all the taxpayers who
support them. Oh, you government shill.
I'm just kidding. I am so happy for you. That's awesome. I know there's a couple grants for people in the medical field that work that way also, where they fund your lab and just trust that you're doing awesome stuff and you will continue to do awesome stuff with that money. And that sounds pretty sweet.
Yeah, it's pretty nice.
I think you're talking about, like, the Howard Hughes awards, for example.
That sounds right. Yeah.
All right. Well, my jealousy aside, let's all take a break. And when we get back, we're going to talk about the next phase when peer review is done. And that
is when you're about to start your experiments.
about everything that is going down behind the scenes
and getting into what the people are saying.
Like, what is the culture talking about?
That's exactly what we'll be getting into here
at the latest with Lauren the Rosa.
Everything VMAs.
Let's get into it.
I'm a homegirl that knows a little bit about everything
and everybody.
To hear this and more, listen to The Latest with Lauren LaRosa
from the Black Effect Podcast Network
on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.
Imagine that you're on an airplane and all of a sudden you hear this.
Attention passengers.
The pilot is having an emergency and we need someone, anyone to land this plane.
Think you could do it?
It turns out that nearly 50% of men think that they could land the plane with the help of air traffic control.
And they're saying like, okay, pull this, do this, pull that, turn this.
It's just, I can do it with my eyes closed.
I'm Manny.
I'm Noah.
This is Devon.
And on our new show, No Such Thing, we get to the bottom of questions like these.
Join us as we talk to the leading expert on overconfidence.
Those who lack expertise lack the expertise they need to recognize that they lack expertise.
And then, as we try the whole thing out for real.
Wait, what?
Oh, that's the runway.
I'm looking at this thing.
Listen to No Such Thing on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Welcome to Pretty Private with Ebeney, the podcast where silence is broken and stories are set free.
I'm Ebeney and every Tuesday I'll be sharing all new anonymous stories that would challenge your perceptions and give you new insight on the people around you.
On Pretty Private, we'll explore the untold experiences of women of color who faced it all.
Childhood trauma, addiction, abuse, incarceration, grief, mental health
struggles and more, and found the strength to make it to the other side.
My dad was shot and killed in his house. Yes, he was a drug dealer. Yes, he was a confidential
informant, but he wasn't shot on a street corner. He wasn't shot in the middle of a drug
deal. He was shot in his house, unarmed. Pretty Private isn't just a podcast. It's your
personal guide for turning storylines into lifelines. Every Tuesday, make sure you listen to Pretty
private from the Black Effect Podcast Network.
Tune in on the IHeart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.
Hey, sis, what if I could promise you you never had to listen to a condescending finance bro
tell you how to manage your money again?
Welcome to Brown Ambition.
This is the hard part when you pay down those credit cards.
If you haven't gotten to the bottom of why you were racking up credit or turning to credit cards,
you may just recreate the same problem a year from now.
When you do feel like you are bleeding
from these high interest rates, I would start shopping for a debt consolidation loan, starting
with your local credit union, shopping around online, looking for some online lenders because they
tend to have fewer fees and be more affordable. Listen, I am not here to judge. It is so expensive
in these streets. I 100% can see how in just a few months you can have this much credit card debt
and it weighs on you. It's really easy to just, like, stick your head in the sand. It's nice and dark
in the sand. Even if it's scary, it's not going to go away just because you're avoiding
it, and in fact, it may get even worse.
For more judgment-free money advice,
listen to Brown Ambition on the IHeartRadio app, Apple Podcast,
or wherever you get your podcast.
All right, we are back and we are talking about peer review,
thanks to a question from a listener,
and we dug into the process of peer review for grant proposals
and grant writing, and now let's talk about what it's like to review an experiment while it's
running before the paper is even sent to the journal. Kelly, I mostly do research on particles
that don't have rights and don't have institutional review boards protecting them. What's it like
to do an experiment on living creatures with emotions that can feel pain and have people looking
over your shoulder? Oh, it can be pretty stressful. So say you get that National Science Foundation
grant, they won't actually release the first
funds to you until you've shown that you've acquired certain permits and protocols, so your
institution is giving you permission to do the research. So my PhD work required collecting
some fish out of estuaries in California. So first I had to talk to the state government and
fill out permits to get permission to go collect those fish. So I had to convince the California
government that I wasn't going to take too many, that there were plenty of these fish out there,
that what I was going to do to them was asking a worthwhile scientific question.
What are you going to do to them?
That sounds so aggressive.
I'm going to ask of them.
That they will be contributing in a meaningful way to science.
And so first you have to get that permission to take the animals out of the wild.
And then you need to fill out a protocol through the Institutional Animal Care and Use Committee, which is the IACUC.
It includes five people.
And they include folks like veterinarians and outside members.
So, like, somebody who's part of the community who is just sort of going to weigh
in on what the general public feels about the work that's being done. Sometimes you'll also
have like an ethicist or a lawyer in there and often you'll have like a faculty member on there
also. And what do these people want? Do they want to say yes? Do they want to say no? Are they just
totally disinterested? Like what are their motivations? Why is like a rando from the public doing this
anyway? So if I'm being completely honest, my field in the past did some uncomfortable things to
animals. And, you know, they didn't use, for example, anesthetic when they were doing some
surgeries and they should have, things like that. And so this committee is trying to make sure that
you are treating the animals as nice as possible and using the fewest animals that you need to
get answers to your questions. So you need to convince them that you have read up on the
most effective anesthetics for whatever animal it is that you're using. You also often have to take a
bunch of online training courses where you show I am well-trained in the best way to treat these
animals and I know how these anesthetics work. You often have to prove to them that you have
teamed up with someone who can make sure you're doing the process correctly. And you need to
convince them that you've thought really hard about the exact number of animals that you're going
to use for these experiments. You know, their job isn't to stop science and they're not necessarily
trying to figure out if the question that's being asked is good or not. They assume that if you
have funding from the National Science Foundation, that has already happened. But their goal is to
just make sure that nothing unethical happens and that the animals who are contributing their
lives to science have the best life possible. Oh, that sounds nice. And it's nice to see the process
sort of self-correcting. Like, okay, we trust scientists. Actually, maybe they need some more eyeballs
on them because they have conflicts of interest. And so it's good to have other people who don't have
those conflicts looking over your shoulder and making sure you're doing things the right way. That's nice.
It is. And I also appreciate that there's a veterinarian on there. The veterinarian will, like, check in pretty regularly. I have training in parasitology, but I don't necessarily have training in, you know, care of fish or something. And I've done a bunch of reading. But it's nice to have a veterinarian check in every once in a while and offer their expertise to make sure that these animals really are being treated as well as you can in a lab setting. And I have never worked on humans, but if you are doing things with humans, including just like sending out a survey to humans that might ask questions that would make people
feel uncomfortable, you have to pass those procedures through an institutional review board.
So there's a similar procedure for working with humans. So I guess you don't have anything
like that for particle physics, because we don't care what you do to the particles.
No, we don't. But I did one time come home from work to find Katrina collecting samples for an
experiment from our children. And I was like, hmm, shouldn't you be asking the other parent
for approval before experimenting on your children? Like, you have a conflict of interest here.
And she was like, it's just a saliva sample.
And I'm like, the children can't consent.
So we have a lot of fun joking about that.
Did you give permission?
I did.
Yes.
Okay, that's good.
That's good.
So, Daniel, there's this new thing that's getting sort of big in science that my friends
have been talking about.
I haven't had a chance to do this yet.
But pre-registering a scientific study.
Have you done this yet?
We don't actually do this in particle physics because this is another way that particle physics is
weird is that we publish our studies even when we get a negative result.
Nice.
Like, in lots of fields of science, you might say, I have an idea for how we might learn something, and, you know, you do the study and then it didn't work. Or you learned that there's nothing new there. Like, okay, bears eat salmon, we already knew that, or whatever. And often those studies don't get published if the result isn't statistically significant or you didn't learn something new. And there's a statistical issue there, which is that this can lead to a bias in our data and our understanding. The effect is called p-hacking. And it means that sometimes things
that are not real can appear to be significant just due to random fluctuation. Say, for example,
I'm testing a bunch of coins to see if they're fair. I flip each one 100 times. Then, like,
most of the time, I'm going to get 50 heads. But occasionally, just due to random fluctuations,
I'm going to get one that gets like 70 heads, right? And the more I do this experiment,
the more likely I am to have one that's a fluctuation. And so if I only publish the ones that seem to
be significant that cross some statistical threshold to be weird, it's going to look like
these coins are not fair when in reality they are. That's what p-hacking is. The p refers to the
probability, under the null hypothesis that the coin is fair, of a random fluctuation that makes it look
like it's not fair. And so the way to counteract this is to publish negative results, to say,
look, I did all these experiments and the coin came out fair. We already knew that, you know. But it's
important that we include this context so that if one in a thousand studies says, look, I saw a
one-in-a-thousand effect, you would know what it means. And often we do this by pre-registering
studies by saying, I'm going to go out and do this study, and I don't know the results yet,
but I'm going to publish it either way, right? And so that's a way to counteract P-hacking.
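Daniel's coin example is easy to simulate. Here is a minimal sketch; the number of coins, flips, and the "suspicious" threshold are illustrative choices, not numbers from the episode:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

n_coins = 1000   # number of fair coins we test
n_flips = 100    # flips per coin
threshold = 60   # call a coin "suspicious" if it shows >= 60 heads out of 100

# Flip each fair coin and count how many cross the threshold by chance.
suspicious = 0
for _ in range(n_coins):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if heads >= threshold:
        suspicious += 1

# Every coin here is fair, yet a few still cross the threshold just from
# random fluctuation. Publishing only those "significant" coins, and filing
# away the rest, is exactly the selection bias being described.
print(f"{suspicious} of {n_coins} fair coins looked unfair")
```

With these numbers, a few percent of perfectly fair coins look unfair, which is why publishing the negative results too is the corrective.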
So in my field, what you've just described, I think we would call the file drawer effect.
And so the idea here is that, yeah, when you get a null result or a result, that's less
interesting, you might try to publish it somewhere, but you're less likely to put in the
effort when you'd have to publish it in a lower-tier journal that you're going to get less credit from your
institution for having published in. And that could make it look like an effect is there because
when you randomly get a positive effect, it gets published and otherwise it doesn't. For us,
P-hacking is when you didn't get the result that you wanted, but you're looking at your data
and you're like, oh, you know, I wasn't actually asking a question about this other thing,
but if I look at my data, there's actually a statistically significant result.
there. And so I'm going to publish on that. And, you know, maybe I'll mention that this wasn't
the initial thing, but I found this, you know, significant relationship that's positive. So I'll
publish on that. And the deal there is, you know, like you said, randomly you would expect to get
results that look like something is really going on. And if you just are searching through your
data set, you're more likely to find something that's significant. And that wasn't the question
you were originally asking. And so the reason we do pre-registering a scientific study is you'll say
ahead of time. I am specifically looking at the relationship between, I don't know, parasites and
how often a fish darts. And so if I go on and I publish a paper about parasites and how active a
fish is, you could say, hey, did you just find that result? And that you changed your paper to be
about that because it looked interesting. And so anyway, this is how we make sure that you're not
looking for significant results and just publishing whatever you find. And in particle physics,
we're lucky enough that we always publish because a negative result is still interesting.
Like if you look for a particle because you think it's there and the universe would make more
sense for it to exist like the Higgs boson, then if you see it, great, you publish it.
If you don't, that's still interesting.
You still want to know, oh, there isn't a Higgs boson because if we smash particles together
often enough when we don't see it, we can say something about the likelihood of it not existing
because if it existed, we would have seen it.
Now, we still are susceptible to P-hacking because we will sometimes see fluctuations, like, data will just, like, look like a new particle sometimes.
And to counteract that, we have a very stringent criterion.
It's called the 5-sigma threshold, which means that we only claim a particle is discovered if it's very, very, very unlikely to come from a fluctuation.
But we still, in principle, could get fooled, and that's why we always have duplicate experiments.
Like, at the Large Hadron Collider, we have two of these big detectors, and they're totally independent.
And that's why you expect to see the same physics in both.
And if you don't, you know, something is squirrelly.
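As a rough numerical sketch of what that threshold means, the significance quoted in sigmas can be converted to a tail probability; this uses the one-sided convention commonly quoted in particle physics, and the code is illustrative, not from the episode:

```python
import math

def sigma_to_pvalue(n_sigma):
    """One-sided tail probability of a standard normal beyond n_sigma.

    Particle physics quotes discovery significance this way: 5 sigma means
    the chance of a background fluctuation at least this large is roughly
    1 in 3.5 million.
    """
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 3 sigma ("evidence") vs. 5 sigma ("discovery")
for s in (3, 5):
    print(f"{s} sigma -> p = {sigma_to_pvalue(s):.2e}")
```

The jump from 3 sigma (about 1 in 740) to 5 sigma (about 1 in 3.5 million) is why fluctuations that look exciting at first so often evaporate before reaching the discovery threshold.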
Yeah.
And in my field, we're getting better where if you get no result or negative results,
as long as you can show that you did a sound experiment and the answer is no,
then you can still publish it somewhere.
And so the Public Library of Science, or PLOS, is a journal that encourages folks to submit their no results
as long as the science was good and you can defend what you did, to try to make sure that you're not getting this file
drawer effect stuff. So you get the no answers just as often as the yes answers. And that's helpful,
right? It saves people time because if Kelly had a great idea and it turned out the answer is no,
and then she just throws it in the file drawer, then Sam across the country is going to try the same
thing someday and waste their time. So it's good to get this stuff out there. Right. And you know,
in my field, wasting time also often means that some animals may have died in the process. And so to me,
I feel like there's this additional ethical requirement that you get anything that you learned from
these animals out so that nobody has to go and repeat it and it, you know, more animals might
have to pass away for science. So anyway, good to get it out there. All right. So then let's talk
more about what happens when you think you have an interesting result and you've written it up
and you want to publish it. And we're going to leave everybody on a ledge for just a second. And when
we get back from the break, we'll talk more about the fascinating world of peer review.
All right, Daniel, you have a significant result.
You've got a result that looks good, or at least even if it's no, you're sure it was no,
because the experiment was done well.
What do you do next?
So we write it up, and when all of our co-authors are happy with it,
the first thing we do is we post it online.
So particle physics is a very, very online science.
And we invented this thing called the arXiv, where scientists post
their papers, and it's called a preprint because we post it before we submit it to the journal.
So, for example, I finished a paper this week, and yesterday it appeared on the arXiv.
And nobody's reviewed it, nobody's confirmed it.
It's just out there for other scientists to look at.
And in a week or so, if we don't get like screaming fits from somebody saying, you stole my idea
or this is deeply wrong, then we'll submit it to a journal.
So the process for particle physics is first posted online, then submitted to a journal
where it's going to get reviewed, and that's going to take months and months.
But it's so slow that particle physicists just read the preprint server and read those papers.
And nobody really cares when it comes out in the journal later, because we already have seen the paper months earlier.
But what if the peer-reviewed version catches an error?
Do you go back and update the archive version?
Absolutely.
You can update it, and usually papers on the arXiv are like version three, version four,
and there's comments in between, you can compare them, and you can see the mistakes getting fixed.
Absolutely.
And some papers appear on the arXiv and then are never published.
And that's fine.
I actually had one paper like 10 years ago put on the arXiv.
It took like more than a year to review.
And then they rejected it because the paper already had like 30 citations.
And so they were like, why publish this?
It's already out there.
People are reading it and using it.
Huh.
And I was like that sucks.
Such a stupid reason.
Yeah.
It was really dumb.
But then we just gave up.
And now it's like one of my more cited papers.
It was never published.
The system is silly.
It is silly.
But I know that in other fields it's different, like my friends in astronomy, they never put papers on the arXiv until after they've been reviewed.
And they think that if they send it to a journal and it's already out on the arXiv, those journal referees are going to find that disrespectful, like that they didn't wait for them to chime in.
So it's definitely a field-by-field culture kind of thing.
How does it work for you guys?
Yeah, so y'all were definitely the trailblazers.
We subsequently came up with bioRxiv.
And I remember when I was initially wanting to submit to bioRxiv, I was hesitant to do it because some of our journals did say your paper should not be available anywhere else because we want to be the only place where you can find it.
And that's because journals make money off of people who download their articles.
And so they didn't want someone else to be able to get another version for free somewhere else.
And on bioRxiv or the arXiv, all of the articles are free.
But it seems like it's now become acceptable that you can
put papers in these preprint servers like bioRxiv. And so now a lot of people will do that.
And they do it for a variety of reasons. Some of it is they want to get feedback early. They want
to get the results out there early. But sometimes it's also like if you've got a PhD student who
wants to go looking for a job, on their resume, if they've finished a study, they'll write "in
preparation" next to the manuscript. But "in preparation" could mean I've got the data and I still need
to do all the stats and I haven't even started writing. Or it could mean I'm about
to submit it to a journal. And if you've put it on bioRxiv, you're saying, look, it's done.
I just need to start the peer review process. Or even, it's in peer review, because that process
can take a long time. So anyway, it's becoming a lot more common to put your papers on bioRxiv
just to make them easier to access for other people and just to show that you really have
actually finished that study finally after a decade. Or however long it took.
The clock is still ticking on your thesis there, Kelly. We'll see.
I'll get it. In preparation. In preparation.
Then you send it to a journal. And you pick a journal: if you think it's a really exciting paper, you
send it to Nature or Science; if you think it's less exciting, you send it to a journal with a smaller
impact factor. So in particle physics, we often publish in, like, Physical Review or the Journal of High
Energy Physics or these kinds of journals. And it goes to an editor, and the editor finds people to review it.
These are the peers. And they reach out to folks they know who are experts in the field, who aren't
your supervisor or your brother or this kind of thing, and ask them to review these things.
And Katrina is a senior editor on a journal, so I see this process all the time. And she has to
read the paper, think about who might know about it, and then find people to review it. And
usually you have one to three people read this paper and give comments on it. But finding reviewers
can be tough. So for our grants that we talked about earlier, you have to make sure that the person
reviewing the grant doesn't have a conflict of interest. They're not your brother or your supervisor
or something like that.
And in my field, the same holds true for when you're reviewing manuscripts.
Is that true for your field as well?
Yeah, absolutely.
Yeah.
You have to say, oh, I can't read this one because that was my student or something like that.
Yeah.
How long does review usually take for your field?
Yeah, I'd say it's, you know, like four to eight weeks or something like that.
Wow.
How about you guys?
I'd be happy four to eight weeks.
That sounds good, but it can be sometimes six months.
And at six months, you start writing the editor being like, come on, man.
It's just crazy, but yeah, it can be hard to find reviewers.
Because reviewers are just other scientists, busy writing their own grants and writing their
own papers and doing all their work and picking up their kids from daycare and stuff
like this and trying to get through that mountain of laundry.
You know, I think that people forget that, like, science is just people, right?
And people are busy.
Yeah.
And if you're out there and you've written some treatise on the fundamental physics of
the universe and nobody has read it, it's not because we're gatekeeping, it's just because
we're busy, you know, and there's lots of those.
And I often say no to editors who ask me to review stuff because I'm just too busy to do it in a timely manner.
Yeah, no, me as well.
And I think it's worth noting that while we're doing this review work, we're doing it for free.
Yeah.
Like when you're on a grant review panel, you can lose weeks of time to reviewing these things and you don't get reimbursed for that time.
And the same thing goes when I review a manuscript: I read it on one day, which takes me a couple hours to read carefully.
And then I sit on it for a few days.
And that's the main thing that is occupying my brain
in the background for those days.
And that could be, like, I could have been thinking about the introduction to my next book or
something, but instead I'm thinking about the methods to make sure it made sense.
And then I spend another three to four hours reading it again and writing in detail my comments.
And if I think there's some literature they missed, I go and I find it.
And, like, it's a multi-day process when I commit to reviewing a paper.
And reviewers don't get paid for that, nor do the editors.
How long does it take you?
Well, I wish that all of my papers were reviewed by somebody as thorough as you.
But my process is similar, like an initial read, some thoughts, and then let it sit.
Sometimes I'll discuss it with people in my group.
We'll read it together.
And then I read it again in detail.
And then, especially if the review is negative, I wait and I sit on it and I come back later.
I'm like, was this fair?
And also, most importantly, was I nice enough?
Are my comments constructive and helpful and not, like, just negative or harsh in any way?
I like to try to take sandpaper to anything that's negative and smooth it over, because I've been
on the other end of harsh reviews, and it doesn't help anybody to get zingers in there. And, like,
especially because a lot of my papers are written by young people, and it's often their first paper,
and then I have to show them this review that's like, this guy's a jerk. Yeah, this isn't necessary.
I have to explain to them that the anonymity here is protecting them. But, you know, most reviewers
are fair and are thoughtful and are courteous, and so often we get, like, helpful feedback,
like, what about this?
Or have you thought about this question?
Or what would you say to this concern?
Do you find your feedback to be mostly useful?
Yeah, most of the time, the feedback I get improves the paper.
I did have one reviewer when it was one of my first papers who told me they were very
disappointed in me.
And I was like, you're a jerk.
Very disappointed in you.
Thanks, Dad.
And that was because my supplemental materials were too long.
I was like, God, give me a break.
But anyway, so in my field, we're moving towards
double-blind review. So the reviewers are not supposed to know the names of the authors, and the
authors are not supposed to know who reviewed their paper. And that's a really great idea so that you
feel like you can honestly tell someone if their paper was good or not. You don't have to worry
about someone getting mad at you in particular. Though in practice, there's usually like four people
who could review your paper. So you know, like, you know, it was Joe or Francine. And Francine tends
to be meaner. But anyway, so yeah, what about in your field? Is
it blind or double-blind? It's only blind in one direction. The review is anonymous, but you see
who the authors are. I know in some other fields, like in top computer science, conferences,
for example, it's double blind and that's helpful, but also it's not that hard to figure it out.
If you wanted to know who these people are, you could figure it out. But you might also be
wondering, like, what is the job of the reviewer exactly? Does a reviewer have to make sure the
experiment was right? Like, if I'm checking Kelly's work as the reviewer, do I have to go out and do
the experiment and make sure she was right? And that's the thing:
peer review is not replication.
The job of the reviewer is to ask, do the claims of the paper, are they supported by the
evidence provided?
Is there logic there?
Is there mathematical tissue between the work that's been done and the conclusions that are
being claimed?
And also, is it well written, with proper citations?
And finally, is it interesting?
Is it important?
Does it play a role in the scientific conversation?
Which is, you know, a little bit subjective, of course, but science is by the people and for
the people.
But if you're working on something nobody's interested in and nobody's going to read your paper,
then a journal might say, this is solid work, but nobody cares.
It's a boring question or you didn't learn anything interesting.
The reviewer's job is not to say, hey, I did this experiment also, and I know that this is a feature of the universe.
That's not the task of the reviewer.
It's not their job to do that for you.
It's also not the task of the reviewer to say, like, hey, this is a cool idea.
I would have done this differently.
And so you now need to go back and do all these additional studies that I think are
important. And you see this a lot in peer review that reviews come back and say, this is cool,
but also do X, Y, Z. And like, that's frustrating to me because that's not the job of the reviewer.
Yes. Yep. Nope. That's super frustrating. And I think it's worth noting that sometimes bad papers get
through this procedure. And one of the ways that bad papers get through is that reviewers aren't
required to, and this makes sense, but they're not responsible for going through, for example,
the code for the statistical models
that you ran. And so if somebody
checked their code a bunch of times, but
they forgot a minus sign somewhere,
the result could be wrong.
And maybe nobody knows. And I have known
a couple people who afterwards
have gone to use those models
again for some other question and caught their
mistake. And then a good scientist
will contact that journal and say,
I have to retract my paper,
or I have to submit an erratum and let
everybody know, I forgot the minus sign.
It's a different result.
And that's really painful, but at least they're being honest, and that's great.
And I think that's really important.
Or some people are dishonest, and there's no way for the reviewer to know that.
My field had somebody where they were one of the biggest names in the field.
And there's this new requirement in our field,
which is a couple decades old now, I'm old,
where you have to put the data that you collected as part of the experiment online somewhere
in a public place where people can download your data.
And again, you know, in a field where animals are being used to collect data, it's great to know that those data could be used by other people to ask questions.
You can get more information out of them.
But it also means that if someone gets suspect results, you can pull their data and rerun the models.
And when somebody was looking at the data, it was clear that the numbers just didn't make sense.
Like one column was always the prior column times three.
And anyway, after more scrutiny, it became clear that this person was making up
their data. But, you know, we have this new check where your data have to be available to
everyone else. And that has helped us sort of tease out people who are being dishonest.
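The kind of pattern Kelly describes, one column that is always exactly three times another, is the sort of thing a simple automated check on a public dataset can flag. This is a hypothetical sketch, not the actual analysis that caught the fabrication:

```python
# Hypothetical check for a suspiciously exact relationship between two data
# columns, like the fabricated "column B = 3 * column A" pattern described
# in the episode. Real measurements essentially never satisfy an exact
# multiple across an entire column, so a perfect match is a red flag.

def exactly_proportional(col_a, col_b, factor=3.0, tol=1e-9):
    """Return True if every entry of col_b equals factor * col_a (within tol)."""
    return all(abs(b - factor * a) <= tol for a, b in zip(col_a, col_b))

measured = [1.2, 3.4, 5.1, 2.2]
suspect = [3.6, 10.2, 15.3, 6.6]   # exactly 3x the measured column
print(exactly_proportional(measured, suspect))  # True -> red flag
```

Checks like this only work because the data are required to be publicly archived, which is the point being made: the availability requirement is what makes the audit possible.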
So the system is evolving and getting better over time. But still, sometimes you get stuff
through. You certainly do. And there are folks out there who are like combing through papers
to find this stuff. And there's a scientist, for example, her name is Elisabeth Bik. And this is
her passion. She combs through old papers and finds evidence of, like, photoshopping in biology.
You take a picture of your gel and it's supposed to have this blob, and she finds, oh, this blob is the
same as that blob, or it's been reversed or whatever. And so there are definitely lots of ways that
you could find this stuff now that we couldn't have done beforehand. But peer review is not
a guarantee that this hasn't been done. Like reviewers should look for this stuff and call it out
if you see it, but stuff definitely can get through. It's not a perfect system. You know, it's one of these
like, it's a terrible system, but it's also the best that we have so far. Just an aside here, I am a massive
wimp. If I, like, purposefully fabricated data in this era where everything is, like,
public and people can peek behind the curtain, I would never sleep again.
I'd be like, someone's going to find me out.
And it would be like, I would be miserable the rest of my life.
I'd rather have a bunch of low, low impact papers than worry that I was going to get called
out for lying.
But anyway, I am a wimp.
Yeah.
And so the highest standard, really, is not just that something has been peer reviewed,
but that something has been independently replicated.
Like, you might have a scientist who's doing totally solid work and see some effect in their lab,
but doesn't realize that it's an artifact due to some condition, the humidity or, you know, the local gravity or the trains or something are causing some influence on their experiments.
And so you want people on the other side of the world who built it differently, who made different assumptions,
who are sensitive to different stuff, to reproduce it.
You might remember this excitement about high-temperature superconductors a couple of years ago, LK-99.
Korean scientists claimed to have created this room temperature cheap superconductor, which would revolutionize the industry.
And so very quickly, people were out there trying to replicate it.
And people were excited, but until another independent group built it and showed that it was real,
nobody really accepted it and thought we were in a new era.
And personally, for example, when we were discovering the Higgs boson, I saw the data accumulating around a mass of 125 GeV.
We were looking at it constantly. It was building up and up and up. But until I heard that my colleagues around the ring also saw an effect at the same place, I didn't personally think, okay, yeah, we found this thing. And so this sort of independent replication is really sort of the highest standard. What do you think, Kelly? I think so. And to me, this is one of the current weaknesses in science, at least in my field. So replicating data is absolutely critical. But if you write to the National Science Foundation, for example, and you're like,
oh, I just want to do the same experiment that Maria did,
but I'm going to do it on another continent.
It's going to be hard to get money for that because it's not a new idea.
And it's also probably not going to get published in a top-tier journal.
And so if you are training PhD students who are going to want to get jobs
and they're, quote unquote, just replicating somebody else's work,
that's not going to help them get a job as much as following up on their own exciting new idea.
And so we've got this incentive structure that's actually not super great.
for encouraging people to replicate each other's studies.
But I do think it's absolutely critical,
and I would love to see us sort of work on that incentive structure
to make replication just as important
as the initial thing that got done.
That's interesting.
In particle physics, there's maybe a slightly healthier environment.
We have a few signals of new physics
that we've seen in experiments that we're all curious about,
but nobody really accepts because it doesn't quite make sense.
And then there's been another generation of experiments
to follow up on those.
So, for example, there's a very significant signal of dark matter, an experiment in Italy called DAMA.
And nobody really believes it because we've never seen it in another experiment.
And people have set up other experiments very similar to DAMA, but in another part of the world,
with different conditions in a different cave, for example, and not seeing the same effects.
And so those were experiments definitely inspired by this signal to test this in other conditions.
And in the last few years, there was this, quote, unquote, discovery of a fifth force in an experiment
in Hungary, and folks in Berkeley are trying to replicate it. And there's an experiment in
Italy trying to probe it as well. And so there's definitely like follow-up work. But usually those
follow-up experiments are a little bit broader and they try to not just check this one new
result, but also learn something else along the way to make it so they're also covering new
ground. But I agree we should definitely have replication. But it comes back to funding, right?
Like, if you're a reviewer, what would you rather fund? Let's check Kelly's study, or let's do
this brand new thing that could tell us something new about the universe.
Yeah.
And when I write a grant, I try to work my own replication in there.
Like, as part of asking a new question, I'll repeat the experiment with maybe
a little tweak, and then I can at least make sure that I'm still getting the same result
in, you know, a different space or with slightly different lighting or something like that.
And, you know, to defend my field a little bit: especially when
animals need to be euthanized as part of the experiment, you know, you might be a little bit
hesitant to be like, well, this is just for replication.
Well, if you already got an answer, do animals need to be euthanized again?
So anyway, so we do also try to ask additional questions while trying to replicate, but it
would be nice if we had more incentive for that.
And there's folks out there just doing this.
Like, you might have heard of the replication crisis.
That comes out of people finding papers in the literature and saying, like, all right,
let's reproduce this.
Let's see if it holds up.
And this is one reason that, like, p-hacking is a thing people talk about, because we've
discovered that some of the results in the literature are just statistical
artifacts that were selected in order to get a paper out there. And so I think what you're seeing
in a broader sense is science is self-correcting. Just like we saw that we needed to add external
reviewers and members of the public when we're talking about the experiments you do on animals,
now we see like, okay, we need some sort of way to protect against this kind of abuse as well.
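The "statistical artifacts" point is easy to see in a quick simulation: run enough experiments on pure noise and a few will look significant by pure chance, and if only those are written up, the literature fills with artifacts. A minimal sketch in Python (illustrative only; the coin-flip setup and the roughly-5% threshold are this editor's choices, not anything from the episode):

```python
import random

def run_experiment(n_flips=100, rng=None):
    """One null 'experiment': count heads in n_flips fair coin tosses."""
    rng = rng or random
    return sum(rng.random() < 0.5 for _ in range(n_flips))

def looks_significant(heads, n_flips=100):
    # For 100 fair flips, landing outside 40-60 heads happens with
    # probability ~3.5% -- roughly the usual p < 0.05 cutoff.
    return heads <= 39 or heads >= 61

rng = random.Random(0)
n_experiments = 1000

# Every experiment here is pure noise, yet some fraction of them
# will clear the 'significance' bar anyway.
false_positives = sum(
    looks_significant(run_experiment(rng=rng)) for _ in range(n_experiments)
)
print(f"{false_positives} of {n_experiments} null experiments look 'significant'")
```

With 1,000 null experiments you expect a few dozen apparent "discoveries"; publishing only those, and quietly dropping the rest, is the selection effect replication efforts are designed to catch.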
And so, you know, the process is a living thing. It's not like science is a crisp, fixed philosophy.
What we call science has changed over the last 10 years, 50 years, 100 years, and it will
continue to evolve. And, I hope, keep delivering great truths about the universe. Yeah, me too. And so
Stephen asked, how can you know if something is good science or not? I feel like that was the
big push behind the question. And so what do you think? When you read a paper for the first
time, what do you look for? Well, you know, science has to stand on its own. So the thing I look
for is like, does the paper make sense? Does it lay out an argument? Do the conclusions follow from
the evidence presented? That's the most important thing to me. But before I'm going to personally
believe that something is a real part of the universe, yeah, I need to see it reproduced by another
group. I've often reproduced papers myself, especially if they're like statistical, just to make
sure I understand like, how is this calculation being done exactly? How do you get from step three to
four? Because I really want to understand what are the assumptions being made and think about
whether those are broad enough. So I would say that yes, peer-reviewed papers can be wrong,
but that's part of science. And the highest standard, I think, is independent replication.
What do you think, Kelly? Yeah, no, I agree. I mean, when I read a paper, I'll look to see,
you know, what journal is it in? So in this day and age there are some new,
what we call predatory journals, where they will encourage people to submit, but they publish
just about anything they get. And their peer review isn't so much about peers having a chance to turn down
a study; reviewers give a little bit of input, but the paper gets published one way or another.
So you need to look to see what kind of journal it was published in. And then, you know,
if it's a field I know about, I'll look to see, did you cite the papers that I would expect
you to be citing? Was your literature search deep enough? Were your experiments designed well?
What else could those results have meant? And then who funded the study, and stuff like that.
But, you know, if you're a member of the general public and you can't do all of that and you're
just reading a pop-sci summary, I would look for, again, like,
where is this being published?
You know, like if it's The Atlantic by Ed Yong, it's probably great.
And if the science journalist had other scientists weigh in on what might have been wrong about the study, that's a good sign.
And so you look for where you're getting the information, how critical they seem to be, and stuff like that.
What do you look for in a pop-sci article?
Yeah, in a pop-sci article, I definitely look to see whether they have talked to people who are not authors, you know, other people in the field who know this
stuff. And is it just a press release from the university, or is it a journalist who's actually
thought about this stuff and written something up? Those are definitely the things I look for
in pop-sci articles, because there's the temptation, in the sort of marketplace of ideas,
to over-inflate the meaning of an incremental study. Yep, yep. All right. So in general, I would
say that, you know, we have this process in place to try to make sure that the best ideas are the
ones that move forward and that we're all checking to make sure no one missed anything important
along the way. Some stuff gets through either by mistake or because some people are unscrupulous,
but hopefully that doesn't happen that often. But, you know, the system is evolving at all times.
And if you have a study that you're interested in, but you don't know if you can trust the press
release or whatever, you can send it to us. And if it happens to be in our wheelhouse, we're happy
to weigh in and tell you what sort of, you know, set off our alarm bells in our heads or what we
liked about the study. And we're happy to help people figure
out what was done well and what was just squeaking through.
All right.
Thanks very much, Stephen, for asking this question and for shining a light on the inner
workings of science.
And if you'd like to reach out to us, send us an email at questions at danielandkelly.org.
And let's see what Stephen had to say about our answer.
Hi, Daniel and Kelly.
It's Steve here.
Thanks so much for answering my question.
I think you shed a lot of light on some topics that most of the general public are not
aware of. And I really appreciate that. It's actually really fascinating to learn that just because
something is peer-reviewed doesn't mean it's 100% fact. And it's definitely a lot to take away here.
So I appreciate you guys diving into the topic and looking forward to the next episode.
We would love to hear from you.
We really would.
We want to know what questions you have about this extraordinary universe.
We want to know your thoughts on recent shows, suggestions for future shows.
If you contact us, we will get back to you.
We really mean it.
We answer every message.
Email us at questions at danielandkelly.org.
Or you can find us on social media.
We have accounts on X, Instagram, Blue Sky, and on all of those platforms, you can find
us at D and K Universe.
Don't be shy. Write to us.