Stuff You Should Know - SYSK Selects: Research tips from SYSK
Episode Date: June 20, 2020
People often ask us how we do our research. We're not going to disclose all of our secrets, but we'll give you some tips on how to root out the bad studies from the good ones. Learn all about shady studies and reporting in this classic episode!
Transcript
Hey everybody, it's me, Josh,
and for this week's SYSK Selects,
I've chosen our Guide to Research Tips.
It's a surprisingly good episode
that shares the ins and outs of keeping
from being duped online by bad information
and how to read between the lines
on sensational science reporting,
all sorts of stuff like that.
And you might notice in this episode,
Chuck sounds different than usual.
That's because this is during the period
that he was transitioning into a person
with a full set of teeth.
So that adds to the hilarity of the whole thing.
I hope you enjoy this as much as we did making it.
Welcome to Stuff You Should Know,
a production of iHeartRadio's How Stuff Works.
Hey, and welcome to the podcast.
I'm Josh Clark with Charles W. Chuck Bryant and Jerry.
This is stuff you should know.
Josh, we're gonna do something weird today.
We're gonna do a listener mail at the head of the podcast.
What?
I know, right?
What?
All right, let's do it.
Okay, this is from Bianca.
Wait, wait, hold on.
Do we have the listener mail music going?
Oh, I don't know.
Jerry.
Should we go the whole nine yards?
So let's do it.
People might freak out.
I know.
All right, this is from Bianca Boisich,
is what I'm gonna say.
I think that's great.
Hey guys, wrote you not too long ago
asking about how you research your own podcast.
I just got back from a class where we talked about
research misrepresentation in journal articles.
Apparently journals don't publish everything
that is submitted.
A lot of researchers don't even publish their studies
if they don't like the results.
Some laws have been put into place
to prevent misrepresentation,
such as researchers having to register their studies
before they get results and journals only accepting
pre-registered studies, but apparently this is not
happening at all, even though it is now technically law.
This ends with the general public being misinformed
about methods and drugs that work.
For example, if there are 25 studies proving a drug works
and 25 that don't,
it's more likely that 20 of the positive results
have been published and only one or two of the negative.
And that is from Bianca.
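To make Bianca's arithmetic concrete, here is a minimal sketch in Python, using only the hypothetical numbers from her email, showing how publication bias skews the published record.

```python
# Bianca's hypothetical: 25 studies find the drug works and 25 don't,
# but 20 positive results get published versus only 2 negative ones.
positive_total, negative_total = 25, 25
positive_published, negative_published = 20, 2

true_share = positive_total / (positive_total + negative_total)
published_share = positive_published / (positive_published + negative_published)

print(f"Positive share of all studies run:      {true_share:.0%}")       # 50%
print(f"Positive share of the published record: {published_share:.0%}")  # 91%
```

So a drug with a 50/50 evidence base looks like a 91% success story to anyone reading only the journals.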
And that led us to this article on our own website.
10 Signs That Study Is Bogus.
Yeah.
And here it is.
Nice, Chuck.
Well, we get asked a lot about research from people,
usually in college, they're like,
you guys are professional researchers.
How do I know I'm doing a good job in getting good info?
And it's getting harder and harder these days.
It really is.
One sign that I've learned is if you are searching
about a study and all of the hits that come back
are from different news organizations
and they're all within like a two, three day period
from a year ago, nothing more recent than that,
then somebody released a sensational study
and no one put any actual effort into investigating it
and there was no follow-up.
If you dig deep enough, somebody might have done follow-up
or something like that, but for the most part,
it was just something that splashed across the headlines
which more often than not is the case
as far as science reporting goes.
So that's a bonus.
That's the 11th.
Boom.
How about that?
Yeah.
Should we just start banging these out?
Let's do it.
Or do you have some other clever segue?
Well, part and parcel with that.
I don't know if it's clever.
You do come across people who you know
can be trusted and relied upon to do good science reporting.
So like Ed Yong is one.
Another guy named Ben Goldacre
has something called Bad Science.
I don't remember what outlet he's with.
And then there's a guy, I think Scientific American
named John Horgan, who's awesome.
Yeah, or some journals and organizations
that have been around and stood the test of time
that you know are really doing it right, like Nature.
Yeah, Scientific American is like real science.
Yeah, like I feel really good about using those sources.
Yeah, but even they can, there's something called
scientism where there's a lot of like faith
and dogma associated with the scientific process.
And you know, you have to root through that as well.
Try it, I'm done.
The first one that they have here on the list
is that it's unrepeatable.
And that's a big one.
The Center for Open Science did a study,
it was a project really where they took 270 researchers
and they said, you know what, take these 100 studies
that have been published already, psychological studies,
and just pore over them.
And in 2015, just last year, they reported back.
It took them a while, several years.
They said, you know what, more than half of these
can't even be repeated using the same methods.
They're not reproducible.
Nope, not reproducible.
That's a big one.
And that means that when they carried out the studies again,
they followed the same methodology.
Scientific Method Podcast, you should listen to that one.
That was a good one.
And they found that their results
were just not what the original researchers published,
not anywhere near them.
For example, they used one as an example
where a study found that men were terrible
at determining whether a woman was giving them
some sort of clues to attraction
or just being friendly.
Yeah, sexy, sexy stuff.
Or just be friends.
Or yeah, or good to meet you.
Yeah, or buzz off, Jerk.
Sure, yeah.
And they did the study again as part of this
Center for Open Science study, or survey.
And they found that that was not reproducible.
Or that they came up with totally different results.
And that was just one of many.
Yeah, and in this case specifically,
they looked into that study and they found that
one was conducted in the United Kingdom
and one in the United States.
So that may have something to do with it.
But the point is, Chuck, is if you're talking about
humanity, I don't think the study was like
the American male is terrible at it.
It's men are terrible at it.
So that means that whether it's in the UK,
which is basically the US with an accent
and a penchant for tea,
I'm just kidding, UK, see you soon.
It should be universal.
Yeah.
You know?
Agreed.
Unless you're saying, no, it's just,
this only applies to American men.
Right, or these 100 American men.
Then it's not even a study.
Yeah.
The next one we have is, it's plausible,
not necessarily provable.
And this is a big one because,
and I think we're talking about observational studies here
more than lab experiments.
Because with observational studies,
you sit in a room and get asked 300 questions
about something, and all these people get asked
the same questions, and then they pore over the data
and they draw out their own observations.
Right, and one of the, very famously,
an observational study that led to false results
found a correlation between having a type A personality
and being at risk for heart attacks.
And for a long time, you know that the news outlets
were like, oh yes, of course, that makes total sense.
This study proved what we've all known all along.
And then it came out that, no, actually,
what was going on was a well-known anomaly
where you have a 5% risk that chance
will produce something that looks like
a statistically significant correlation.
But it's not there at all.
It's just total chance.
And science is aware of this,
especially with observational studies,
because the more questions you have,
the more opportunity you have for that 5% chance
to create a seemingly statistically significant correlation
when really it's not there.
It was just random chance where if somebody else goes back
and does the same study,
they're not gonna come up with the same results.
But if a researcher is,
I would guess willfully blind to that 5% chance,
they will go ahead and produce the study
and be like, no, it's true, here's the results right here.
Go ahead and report on it and make my career.
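To see that 5% anomaly in action, here is a minimal simulation, a sketch in Python with NumPy and SciPy; the 300-question survey and the group sizes are made-up numbers for illustration, not from any real study.

```python
# A sketch of the "5% anomaly": ask enough questions of two identical
# groups and pure chance will hand you "significant" correlations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_questions = 300  # hypothetical survey length
n_per_group = 50   # hypothetical participants per group

# Both groups are drawn from the SAME distribution, so every
# "significant" difference below is a false positive by construction.
group_a = rng.normal(size=(n_questions, n_per_group))
group_b = rng.normal(size=(n_questions, n_per_group))

# One t-test per question.
_, p_values = stats.ttest_ind(group_a, group_b, axis=1)

false_positives = int(np.sum(p_values < 0.05))
print(f"{false_positives} of {n_questions} questions look significant")
# Expect about 5% of 300, i.e. roughly 15 spurious "findings".
```

Any one of those 15 or so chance hits could be written up as the headline result, which is exactly the cherry-picking problem described here.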
Yeah, well, and they also might be looking for something.
In fact, chances are they are.
It's not just some random study and like,
let's just see what we get
if we ask a bunch of weird questions.
It's like, hey, we're looking to try
and prove something most likely.
So that Baader-Meinhof thing might come into play
where you're kind of cherry picking data.
Yeah, that's a big problem.
That kind of comes up.
A lot of these are really kind of interrelated to-
Oh, totally.
The other big thing that's interrelated
is how the media reports on science these days.
Yeah, you know.
It's a big deal.
Like John Oliver just recently went off on this
and NPR did a thing on it.
That's great.
Like, they might even,
like the researcher might say plausible,
but it doesn't get portrayed that way in the media.
Sure, remember that poor kid
who thought he found the ancient Mayan city?
The media just took it and ran with it.
You know?
Yeah, I think there was a lot of,
maybe or it's possible,
we need to go check kind of thing.
And then it's like, no,
he discovered an ancient Mayan city, never known before.
Yeah, and let's put it in a headline.
And that's, I mean, that's just kind of the way it is.
These days.
Yeah.
So you have to be able to sort through it.
And I guess that's what we're doing here,
aren't we, Chuck?
We're telling everybody how to sort through it.
Or at the very least,
take scientific reporting with a grain of salt.
Yes.
Right, and that will,
like you don't necessarily have the time
to go through and double-check that research
and then check on the research behind it, you know?
Right.
So take it with a grain of salt.
Yeah.
Unsound samples.
Here was this study that basically said,
how you lost your virginity is going to have a very
large impact and play a role in how you feel about sex
and experience sex for the rest of your life.
Yeah.
It's possible.
Sure, it seems logical.
So we'll just go with it.
But when you only interview college students
and you don't, you only interview heterosexual people,
then you can't really say you've done a robust study.
Now, can you?
Plus, you also take out of your sample population
anybody who reports having had a violent encounter.
Yeah.
Throw them out.
Yeah.
Throw that data out.
Because that's not gonna inform how you feel about sex.
Right, exactly.
You're just narrowing it down further and further.
And again, cherry picking the data
by throwing people out of your population sample
that will throw off the data that you want.
Yeah.
And I'd never heard of this acronym WEIRD.
A lot of these studies are conducted
by professors and academics,
so a lot of times you've got college students as your sample.
And there's something called WEIRD: Western, educated,
and from industrialized, rich, and democratic countries.
Right.
Those are the participants in the studies.
Yes.
The study subjects.
But then they will say men.
Right.
Well, what about the gay man in Africa?
Right.
Like you didn't ask him.
So that's actually a really, really big deal.
In 2010, these three researchers did a survey
of a ton of social science and behavioral science studies.
Found that 80% of them used WEIRD study participants.
So basically it was college kids for 80% of these papers.
And they surveyed a bunch of papers.
And they took it a little further.
And they said that people who fit into the WEIRD category
only make up 12% of the world population.
But they represent 80% of the population of these studies.
And a college student, Chuck, in North America, Europe,
Israel or Australia is 4,000 times more likely
to be in a scientific study than anyone else on the planet.
Yeah.
And psychology and the behavioral sciences
are basing their findings about everybody else
on this small tranche of humanity.
Yeah.
And that's a big problem that's extremely misleading.
Yeah.
And it's also a little insulting
because what they're essentially saying is like,
this is who matters.
Well, also, yeah, but what's sad is
this is who I am going to go to the trouble
of recruiting for my study.
It's just sheer laziness.
And I'm sure a lot of them are like,
well, I don't have the funding to do that.
I guess I see that.
But at the same time,
I guarantee there's a tremendous amount
of laziness involved.
Yeah. Or maybe if you don't have the money,
maybe don't do that study.
Yeah.
Is it that simple?
I'm probably oversimplifying.
I don't know.
I'm sure we're going to hear from some people
in academia about this one.
Well, stop using WEIRD participants.
Or at the very least say heterosexual Dartmouth students,
this applies to them.
Not everybody in the world.
I mean, 80% of these studies use those people
as study participants.
And they're not even emblematic
of the rest of the human race.
College students are shown to see the world differently
than other people around the world.
So it's not like you can be like,
well, it still works.
You can still extrapolate.
No, it's like flawed in every way, shape, and form.
Right.
We should probably take a break, huh?
Yeah, let's take a break,
because you're getting a little hot under the collar.
I love it.
We'll be right back after this.
Just like the number of stars in the sky,
there is so much stuff you should know.
Learning stuff with Joshua and Charles,
stuff you should know.
All right, what's next, buddy?
Very small sample sizes.
Right.
If you do a study with 20 mice,
then you're not doing a good enough study.
No, so they use this in the article.
They use the idea of 10,000 smokers
and 10,000 non-smokers.
Yeah.
And they said, okay, if you have a population sample
that size, that's not bad.
It's a pretty good start.
And you find that 50% of the smokers develop lung cancer,
but only 5% of non-smokers did.
Then your study has what's called a high power.
Yeah.
If you had something like 10 smokers
and 10 non-smokers, and two of
the smokers develop lung cancer,
and one non-smoker develops lung cancer as well,
you have very little power,
and you should have very little confidence in your findings.
But regardless, it's still gonna get reported
if it's a sexy idea.
Yeah, for sure.
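For anyone curious what that power calculation actually looks like, here is a rough sketch in Python with statsmodels, plugging in the two hypothetical smoker studies from the article; the rates and sample sizes come straight from those examples.

```python
# A sketch of statistical power for the two hypothetical smoker studies.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

solver = NormalIndPower()

# Big study: 10,000 per group, 50% vs. 5% lung cancer rates.
big_effect = proportion_effectsize(0.50, 0.05)
big_power = solver.power(effect_size=big_effect, nobs1=10_000, alpha=0.05)

# Tiny study: 10 per group, 2/10 vs. 1/10 observed rates.
tiny_effect = proportion_effectsize(0.20, 0.10)
tiny_power = solver.power(effect_size=tiny_effect, nobs1=10, alpha=0.05)

print(f"big study power:  {big_power:.3f}")   # essentially 1.0
print(f"tiny study power: {tiny_power:.3f}")  # roughly 0.1
```

In other words, the tiny study would detect a real difference of that size only about one time in ten, which is why its findings deserve so little confidence.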
And because these are kind of overlapping in a lot of ways,
I want to mention this guy, a scientist named Ulrich Dirnagl.
He and his colleague Malcolm Macleod have been trying,
I mean, and there are a lot of scientists
that are trying to clean this up,
because they know it's a problem.
But he co-wrote an article in Nature
that's called 'Robust research:
Institutions must do their part for reproducibility.'
So this kind of ties back into the,
reproducing things like we said earlier.
And his whole idea is, you know what,
they should tie funding to good institutional practices.
Like you shouldn't get the money
if you can't show that you're doing it right.
Yeah.
And he said that would just weed out a lot of stuff.
Here's one staggering stat for reproducibility
and small sample size.
Biomedical researchers at drug companies reported
that only 25% of the papers
that they publish are even reproducible.
You know, like an insider stat.
And that doesn't matter.
The drugs are still going to market.
Yeah.
Yeah.
Which is, that's a really good example
of why this does matter to the average person.
You know, like if you hear something like,
monkeys like to cuddle with one another
because they are reminded of their mothers, study shows.
Right.
You can just be like, oh, that's great.
I'm gonna share that on the internet.
It doesn't really affect you in any way.
Yeah.
But when there's studies being conducted
that are creating drugs that could kill you
or not treat you or that kind of thing
and it's attracting money and funding
and that kind of stuff, that's harmful.
Yeah, absolutely.
I found another survey.
Did you like that terrible study idea that I came up with?
No, I liked it.
Monkeys like to cuddle.
It surveyed 140 trainees at the MD Anderson Cancer Center
in Houston, Texas.
Thank you, Houston, for being so kind to us
at our recent show.
They found that nearly a third of these trainees
felt pressure to support their mentors' work,
like to get ahead or not get fired.
So that's another issue
is you've got these trainees or residents
and you have these mentors
and even if you disagree or don't think it's a great study,
you're pressured into just going along with it.
I could see that.
For sure.
There seems to be a huge hierarchy in science.
Yeah, for sure.
Like in a lab, you got the person who runs the lab,
it's their lab.
Yeah, you go against them.
Right.
But there are people like Science and Nature,
two great journals, updating their guidelines right now.
They're introducing checklists.
Science hired statisticians to their panel
of reviewing editors, not just other peer reviewers,
like they actually hired numbers people specifically.
Oh, gotcha.
Because that's a big process.
That's a huge part of studies.
It's like this mind breaking statistical analysis
that can be used for good or ill.
And I mean, I don't think the average scientist
necessarily is a whiz at that,
although it has to be part of training.
But not necessarily.
I mean, that's a different kind of beast altogether, stats.
We talked about it earlier.
I took a stats class in college.
Oh man, I had so much trouble.
I was awful at it.
It really just, it's a special kind of hell.
Is it even math?
Yeah, I didn't get it.
I passed it though.
I passed it because my professor took pity on me.
Oh, that's nice.
Is that Ulrich Dirnagl?
Ulrich Dirnagl?
Uh-huh.
He is a big time crusader for his jam,
making sure that science is good science.
One of the things he crusades against is the idea of,
you remember in that virginity study
where they just threw out anybody
who had a violent encounter for their first sexual experience.
Apparently that's a big deal with animal studies as well.
If you're studying the effects of a drug or something,
like there was this one in the article,
if you're studying the effects of a stroke drug,
and you've got a control group of mice
that are taking the drug, or that aren't taking the drug,
and then a test group that are getting the drug,
and then like three mice from the test group die,
even though they're on the stroke drug,
they die of a massive stroke,
and you just literally and figuratively
throw them out of the study,
and don't include them in the results,
that changes the data.
And he's been a peer reviewer on a paper before.
He's like, no, this doesn't pass peer review.
You can't just throw out,
what happened to these three rodents?
You started with 10, there's only seven reported in the end.
What happened to those three?
And how many of them just don't report the 10?
Yeah.
They're like, oh, we only started with seven,
we're not gonna, you know?
Well, I was about to say I get the urge,
I don't get it, because it's not right,
but I think what happens is you work so hard at something.
Yeah, yeah.
And you're like, how can I just walk away
from two years of this because it didn't get a result?
But that's the point of real science, though.
Yeah, you have to walk away from it.
Well, you have to publish that.
Yeah.
And that's the other thing too,
and I guarantee scientists will say,
hey man, try getting a negative paper published
in a good journal these days.
They don't want that kind of stuff.
But part of it also is, I don't think it's enough
to just have to be published in like a journal.
You want to make the news cycle as well.
That makes it even better, right?
So I think there's a lot of factors involved,
but ultimately, if you take all that stuff away,
if you take the culture away from it,
you're, if you get negative results,
you're supposed to publish that
so that some other scientists can come along
and be like, oh, somebody else already did this
using these methods that I was going to use.
I'm not going to waste two years of my career
because somebody else already did.
Thank you, buddy.
Yeah.
For saving me this time and trouble and effort
to know that this does not work.
You've proven this doesn't work.
When you sought to prove it does work,
you actually proved it didn't work.
That's part of science.
Yeah, and I wish there wasn't a negative connotation
to a negative result,
because to me, the value is the same.
Sure.
As proving something does work,
as proving something doesn't work, right?
Right.
Again, it's just not as sexy.
Yeah, but I'm not sexy either.
So maybe that's why I get it.
Here's one that I didn't know was a thing,
predatory publishing.
I didn't know about it either.
Never heard of this.
So here's the scenario.
You're a doctor or a scientist
and you get an email from a journal
that says, hey, you got anything interesting for us?
I've heard about your work.
And you say, well, I actually do.
I have this study right here.
They say, cool, we'll publish it.
You go, great, my career is taking off.
Then you get a bill that says, where's my three grand?
For publishing your article.
And you're like, I don't owe you three grand.
All right, give us two.
And they're like, I can't even give you two.
And if you fight them long enough,
maybe they'll drop it and never work with you again.
Or maybe let's be like, we'll talk to you next quarter.
Exactly.
That's called predatory publishing.
And it's a, I'm not sure how new it is.
Maybe I-
It's pretty new.
Is it pretty new?
But it's a thing now where you can pay,
essentially, to get something published.
Yes, you can.
It's kind of like, who's who
in behavioral sciences kind of thing, you know?
And apparently it's new because it's a result
of open source academic journals,
which a lot of people push for, including Aaron Swartz,
very famously, who like took a bunch of academic articles
and published them online and was prosecuted heavily for it,
persecuted, you could even say.
But the idea that science is behind this paywall,
which is another great article from Priceonomics,
by the way, really just ticks a lot of people off.
So they started to open source journals, right?
And as a result, predatory publishers came about
and said, oh, okay, yeah, let's make this free.
But we need to make our money anyway.
So we're gonna charge the academic
who wrote the study for publishing it.
Well, yeah, and sometimes now it's just a flat out
scam operation, 100%.
There's this guy named Jeffrey Beall,
who is a research librarian.
He is my new hero,
because he's truly like one of these dudes,
he's trying to make a difference
and he's not profiting from this,
but he's spending a lot of time
creating a list of predatory publishers.
Yeah, a significant list too.
Yeah, how many, 4,000 of them right now?
Yeah, some of these companies flat out lie
like they're literally based out of Pakistan or Nigeria
and they say, no, we're a New York publisher.
So it's just a flat out scam,
or they lie about their review practices.
They might not have any review practices.
And they straight up lie and say they do.
There was one called Scientific Journals International
out of Minnesota that he found out was just one guy.
No, yeah. Like literally working out of his home,
just soliciting articles, charging to get them published,
not reviewing anything and just saying, I'm a journal.
Yeah. I'm a scientific journal.
Look at me go.
He shut it down, apparently, or tried to sell it.
I think he was found out.
And this other one,
the International Journal of Engineering Research
and Applications, they created an award
and then gave it to themselves.
Oh, yeah.
And even modeled the award from an Australian TV award,
like the physical statue.
Wow. That's fascinating.
I didn't know you could do that.
I'm going to give ourselves the best podcast
and the universal award.
I like that.
It's going to look like the Oscar.
Yeah. Okay.
The Oscar crossed with the Emmy.
This other one, MedNo Publications actually confused
the meaning of STM, Science, Technology, Medicine.
They thought it meant sports, technology and medicine.
No.
Well, a lot of science journalists, or scientists too,
watchdogs, like to send gibberish articles
into those things to see if they'll publish them
and sometimes they do, frequently they do.
They sniff them off the case.
It's big time.
How about that callback?
It's been a while.
It has been.
It needs to be a T-shirt.
Should we take a break?
Yeah.
All right.
We'll be back and finish up right after this.
Just like the number of stars in the sky,
there is so much stuff you should know.
Learning stuff with Joshua and Charles.
Stuff you should know.
So here's a big one.
You ever heard the term follow the money?
Mm-hmm.
That's applicable to a lot of realms of society.
Yeah.
And most certainly in journals,
if something looks hinky, just do a little investigating
and see who's sponsoring their work.
Well, especially if that person is like, no,
everyone else is wrong,
climate change is not manmade kind of thing.
Sure.
You know, if you look at where their funding's coming from,
you might be unsurprised to find that it's coming
from people who would benefit from the idea
that anthropogenic climate change isn't real.
Yeah.
Well, we might as well talk about them.
Okay.
Willie Soon.
Yeah.
Mr. Soon.
Is he a doctor?
He's a physicist of some sort, yeah.
All right.
Well, I'm just gonna say Mr. or Dr. Soon.
Okay.
Because I'm not positive.
He is one of a few people on the planet Earth,
professionals that is.
Right.
Who deny human climate change,
human-influenced climate change, like you said.
Yeah.
You said the fancier word for it though.
Anthropogenic.
Yeah, it's a good word.
Is that it?
And he works at the Harvard Smithsonian Center
for Astrophysics.
So hey, he's with Harvard.
He's got the cred, right?
Right.
Turns out when you look into where he's getting his funding,
he received $1.2 million over the past decade
from ExxonMobil, the Southern Company.
The Kochs.
And the Koch brothers, their foundation,
the Charles G. Koch Foundation.
Exxon stopped funding him in 2010.
But the bulk of his money and his funding came,
I'm sorry, I forgot the American Petroleum Institute,
came from people who clearly had a dog in this fight.
And it's just, how can you trust this, you know?
Yeah, well, you trust it because there's a guy,
and he has a PhD in Aerospace Engineering, by the way.
All right, he's a doc.
He works with this organization,
the Harvard Smithsonian Center for Astrophysics,
which is a legitimate place.
It doesn't get any funding from Harvard,
but it gets a lot from NASA and from the Smithsonian.
Well, and Harvard's very clear to point this out
when people ask them about Willie Soon.
Right.
They're kind of like, well, here's the quote,
Willie Soon is a Smithsonian staff researcher
at Harvard Smithsonian Center for Astrophysics,
a collaboration of the Harvard College Observatory
and the Smithsonian Astrophysical Observatory.
Like, they just want to be real clear,
even though he uses a Harvard email address.
Right.
He's not our employee.
No, but again, he's getting lots of funding from NASA
and lots of funding from the Smithsonian.
This guy, if his scientific beliefs are what they are
and he's a smart guy, then yeah,
I don't know about like getting fired for saying,
here's a paper on the idea that climate change
is not human made.
Yeah, he thinks it's the sun's fault.
But he doesn't reveal in any of his conflicts of interest
that should go at the end of the paper.
He didn't reveal where his funding was coming from.
Yeah.
And I get the impression that in academia,
if you are totally cool with everybody thinking
like you're a shill, you can get away with it.
Right.
Well, this stuff, a lot of this stuff is not illegal.
Right.
Even predatory publishing is not illegal.
Yeah.
It's just unethical.
Right.
And if you're counting on people to police themselves
with ethics, a lot of times they'll disappoint you.
The Heartland Institute gave Willie Soon a courage award.
And if you...
For not caring about what other scientists think of them.
If you've heard of the Heartland Institute,
you might remember them.
They're a conservative think tank.
You might remember them in the 90s
when they worked alongside Philip Morris
to deny the risks of secondhand smoke.
Yeah, that's all chronicled in that book
I've talked about, Merchants of Doubt.
Oh really?
The Heartland Institute.
Just a bunch of scientists, legitimate,
bona fide scientists who are up for being bought
by groups like that.
It's sad.
It is sad.
And the whole thing is they're saying like,
well, you can't say beyond a shadow of a doubt,
with absolute certainty that that's the case.
And science is like, no, science doesn't do that.
Science doesn't do absolute certainty,
but the average person reading a newspaper sees that,
oh, you can't say with absolute certainty,
well, maybe it isn't man-made.
And then there's that doubt that the people just go
and get the money for, for saying that,
for writing papers about it.
Yeah, millions of dollars.
It's despicable.
Yeah, it really is.
Self-reviewed, you've heard of peer review.
We've talked about it quite a bit.
Peer review is when you have a study.
And then one or more, ideally more of your peers,
reviews your study and says, you know what?
You had best practices, you did it right.
It was reproducible.
You followed the scientific method.
I'm gonna give it my stamp of approval
and put my name on it, not literally.
Or is it?
I think so.
It says who reviewed it.
Yeah, I believe so.
Okay, put my name on it.
And like in the journal when it's published.
But not my name as the author of the study,
you know what I mean?
Right.
As a peer reviewer.
Yeah, as a peer reviewer.
And that's a wonderful thing.
But people have faked this
and been their own peer reviewer,
which is not how it works.
No.
Who's this guy?
Well, I'm terrible at pronouncing Korean names.
So all apologies.
But I'm gonna say Hyung-In Moon.
Nice.
Dr. Moon?
I think, yeah, let's call him Dr. Moon.
Okay, so Dr. Moon worked on natural medicine, I believe.
And was submitting all these papers
that were getting reviewed very quickly.
Because apparently part of the peer review process
is the journal saying, this paper looks great.
Can you recommend some people in your field
that can review your paper?
And Dr. Moon said, I sure can.
Yeah, he was on fire.
Let me go make up some people
and make up some email addresses
that actually come to my inbox.
And just posed as all of his own peer reviewers.
He was lazy though, is the thing.
Like I don't know that he would have been found out
if he hadn't been careless, I guess.
Because he was returning the reviews within 24 hours
sometimes.
A peer review of a real study should take,
I would guess weeks if not months.
Like the publication schedule
for the average study or paper,
I don't think is a very quick thing.
There's not a lot of quick turnaround.
And this guy was like 24 hours.
Well, they were like, Dr. Moon,
I see your paper was reviewed and accepted by Dr. Moony.
It's like, I just added a Y to the end.
It seemed easy.
If you Google peer review fraud,
you will be shocked at how often this happens
and how many legit science publishers
are having to retract studies.
And it doesn't mean they're bad.
They're getting duped as well.
But there was one publisher based in Berlin
that in 2015 had 64 retractions because of fraudulent reviews.
Oh, wow.
And they're just one publisher of many.
Every publisher out there probably has been duped.
Maybe not everyone.
I'm surmising that.
But it's a big problem.
We should do a study on it.
I'll review it.
It'll end up in the headlines now.
Every single publisher duped, says Chuck.
And speaking of the headlines, Chuck,
one of the problems with science reporting
or reading science reporting is that
what you usually are hearing,
especially if it's making a big splash,
is what's called the initial findings.
Somebody carried out a study and this is what they found
and it's amazing and mind-blowing and it supports
everything everyone's always known,
but now there's a scientific study
that says yes, that's the case.
And then if you wait a year or two
when people follow up and reproduce the study
and find that it's actually not the case,
it doesn't get reported on usually.
Yeah, and sometimes the scientist or the publisher,
they're doing it right
and they say initial findings,
but the public, and sometimes even the reporter
will say initial findings,
but we as people that ingest this stuff
need to understand what that means.
And the fine print is always like more study is needed,
but no one, if it's something that you want to be true,
you'll just say, hey, look at this study.
Right.
It's brand new and they need to study it for 20 more years,
but hey, look what it says.
Right, and the more you start paying attention
to this kind of thing, the more kind of disdain you have
for that kind of just offhand sensationalist science reporting.
But you'll still get caught up in it.
Like every once in a while,
I'll catch myself saying something to you,
and you'll be like, oh, did you hear this?
And then as I'm saying it out loud,
I'm like, that's preposterous.
There's no way that's gonna pan out to be true.
I got click baited.
I know.
We have to avoid this stuff,
because we have our name on this podcast,
but luckily we've given ourselves the back door
of saying, hey, we make mistakes a lot.
It's true though.
We're humans.
No, we're not scientists.
And then finally, we're gonna finish up with the header
on this one, it's a cool story.
Yeah.
And that's a big one because it's not enough these days,
and this all ties in with media
and how we read things as people,
but it's not enough just to have a study
that might prove something.
You have to wrap it up in a nice package
to deliver to people.
Get it in the news cycle.
And the cooler, the better.
Yep.
Yep.
It almost doesn't matter about the science
as far as the media is concerned.
They just want a good headline
and a scientist who will say, yeah, that's cool.
Here's what I found.
Yep.
This is gonna change the world.
Mm-hmm.
Loch Ness Monster is real.
This kind of ended up being depressing somehow.
Yeah.
Not somehow.
Yeah.
Like, yeah, it's kind of depressing.
I know.
We'll figure it out, Chuck.
Well, we do our best, I'll say that.
Science will prevail.
I hope so.
If you wanna know more about science
and scientific studies and research fraud
and all that kind of stuff,
just type some random words into the search bar
at howstuffworks.com, see what comes up.
Yeah.
And since I said random, it's time for listener mail.
Oh, no.
Oh, yeah?
You know what it's time for.
What?
Administrative details.
All right, Josh, administrative details.
If you're new to the show, you don't know what it is.
That's a very clunky title.
Yeah.
We're saying thank you to listeners
who send us neat things.
It is clunky and generic,
and I've totally gotten used to it by now.
Well, you're the one who made it up.
To be clunky and generic, and it's stuck.
Yeah.
So people send us stuff from time to time,
and it's just very kind of you to do so.
Yes.
And we like to give shout outs,
whether or not it's just out of the goodness of your heart,
or if you have a little small business
that you're trying to plug.
Either way.
It's a sneaky way of getting it in there.
Yeah, but I mean, I think we brought that on.
Didn't we say if you have a small business,
then you send us something,
we'll be happy to say something.
Exactly.
Thank you.
All right, so let's get it going here.
We got some coffee from 1000 Faces,
right here in Athens, Georgia, from Kayla.
Yeah.
Delicious.
Yes, it was.
We also got some other coffee too,
from Jonathan at Steamworks Coffee.
He came up with a Josh and Chuck blend.
Oh yeah.
It's pretty awesome.
I believe it's available for sale too.
Yeah, the Josh and Chuck blend is dark and bitter.
Jim Simmons, he's a retired teacher
who sent us some lovely handmade wooden bowls.
Oh yes.
And a very nice handwritten letter,
which is always great.
Thanks a lot, Jim.
Let's see, Chamberlain sent us homemade pasta,
including a delicious savory pumpkin fettuccine.
It was very nice.
Yum.
Jay Graft, two F's,
sent us a postcard from the Great Wall of China.
It's kind of neat.
Sometimes we get those postcards
from places we've talked about.
No, he's like, look where I am.
Thanks, Jay.
You guys aren't here.
Let's see, the Hammer Press team,
they sent us a bunch of Mother's Day cards
that are wonderful.
Oh, those were really nice.
They were really great.
You should check them out, the Hammer Press team.
Yeah, yeah.
Misty, Billy, and Jessica,
they sent us a care package of a lot of things.
There were some cookies.
Okay.
Including one of my favorite,
white chocolate dipped Ritz and peanut butter crackers.
Oh yeah.
Man, I love those.
Homemade, right?
Oh yeah.
And then some 70s macrame for you,
along with 70s macrame magazines.
Yeah.
Because you're obsessed with macrame.
We have a macrame plant holder
hanging from my microphone arm.
Uh-huh.
Holding a cup.
A coffee mug sent to us by Joe and Linda Hecht.
Oh, that's right.
And it has some pens in it.
And they also sent us Misty, Billy, and Jessica,
a lovely little hand-drawn picture of us
with their family, which was so sweet.
That is very awesome.
We've said it before, we'll say it again.
Huge thank you to Jim Ruane.
I believe that's how you say his name,
and the Crown Royal people for sending us all the Crown Royal.
We are running low.
Ha ha ha.
Mark Silberg of the Rocky Mountain Institute
sent us a book called Reinventing Fire.
Oh yeah.
They're great out there, and they
know what they're talking about.
And I think it's 'Reinventing Fire:
Bold Business Solutions for the New Energy Era.'
Yeah, they're basically like green energy observers.
But I think they're experts in all sectors of energy,
but they have a focus on green energy.
Which is awesome.
Yeah, they're pretty cool.
John, whose wife makes delightfully delicious doggy
treats.
Delightfully Delicious is the name of the company.
There's no artificial colors or flavors,
and they got Sweet Little Momo hooked on sweet potato
dog treats.
I thought you were going to say hooked on the junk.
The sweet potato junk.
She's crazy cuckoo for sweet potatoes.
Nice.
Oh man.
That's good for a dog too.
It is, very.
Strat Johnson sent us his band's LP.
And if you're in a band, your name is Strat.
That's pretty cool.
Sure.
Diomaea, still.
I think that was great.
Yeah, I'm not sure if I pronounced it all right.
D-I-O-M-A-E-A.
Frederick, this is long overdue, Frederick at the 1521 store,
1521store.com, sent us some awesome low profile cork iPhone
cases and passport holders.
And I was telling them, Jerry walks around
with her iPhone in the cork holder,
and it looks pretty sweet.
Oh yeah.
So he said, awesome.
Glad to hear it.
Joe and Holly Harper sent us some really cool 3D printed stuff
you should know things, like SYSK, like a little desk.
Oh, like after Robert Indiana's LOVE sculpture?
Yeah, that's what I couldn't think of what that was from.
Yeah, it's awesome.
It's really neat and like a bracelet made out
of stuff you should know, 3D carved, like plastic.
It's really neat.
Yeah, they did some good stuff.
So thanks, Joe and Holly Harper, for that.
And then last for this one, we got a postcard from Yosemite
National Park from Laura Jackson.
So thanks a lot for that.
Thanks to everybody who sends us stuff.
It's nice to know we're thought of, and we appreciate it.
Yeah, we're going to finish up with another set
on the next episode of administrative details.
You got anything else?
No, that's it.
Oh yeah, if you guys want to hang out with us on social media,
you can go to SYSK podcast on Twitter or on Instagram.
You can hang out with us at facebook.com
slash stuff you should know.
You can send us an email to stuffpodcast
at howstuffworks.com.
And as always, join us at our home on the web,
stuffyoushouldknow.com.
Stuff You Should Know is a production of iHeartRadio's
How Stuff Works.
For more podcasts from iHeartRadio,
visit the iHeartRadio app,
Apple Podcasts, or wherever you listen
to your favorite shows.