Relatable with Allie Beth Stuckey - Ep 1292 | Ben Gillenwater | Cybersecurity Expert Reveals Shocking Truth About Parental Controls

Episode Date: January 23, 2026

Allie sits down with Ben Gillenwater, the “Family IT Guy” and cybersecurity expert, for an eye-opening conversation on protecting kids in the digital age. Ben breaks down the dangers of bottomless feeds, AI manipulation, grooming risks in online chats, and how platforms like YouTube and social media fuel media addiction with high-stimulus shows. He reveals alarming statistics on rising youth suicide rates correlating with social media usage and offers practical, no-nonsense advice. A must-listen for every mom and dad navigating screens, safety, and raising resilient kids in a tech-saturated world.

Learn more about Ben Gillenwater here: https://www.familyitguy.com

Buy Allie's book "Toxic Empathy: How Progressives Exploit Christian Compassion": https://www.toxicempathy.com

---

Today's Sponsors:

PreBorn | For just $28, the cost of a dinner, you can sponsor an ultrasound to introduce a mother to her baby for the first time. 100% of your donation will go toward saving babies. Will you help us? Just dial #250 and say the keyword BABY. Or donate securely at PreBorn.com/ALLIE.

Good Ranchers | Visit GoodRanchers.com today. Use my promo code ALLIE for an extra $25 off your first order, on top of the $500 you'll save every year just by subscribing.

GhostBed | GhostBed is giving you the best deal of the year plus an extra 10% when you use the code ALLIE at GhostBed.com/Allie.

Concerned Women for America | For a donation of $20 or more, you will get a copy of their new book, written by CEO and president Penny Nance, "Seven Rules for Success in Business and Life: A Woman's Guide." This book is an incredible gift for any young woman graduating or beginning her professional journey. Go to ConcernedWomen.org/Allie for your copy today.

Seven Weeks Coffee | Go to SevenWeeksCoffee.com and save 15% forever when you subscribe, plus get a free gift with your order!
And use code ALLIE for an extra 10% off your first order. That's a 25% total savings on your first order, plus a free gift!

Legacy Box | Visit LegacyBox.com/Allie to save 55% when you digitize your memories.

---

Episodes you might like:

Ep 779 | TikTok Is Spying on You: Here's Why It Matters | Guest: Kara Frederick
https://podcasts.apple.com/us/podcast/ep-779-tiktok-is-spying-on-you-heres-why-it-matters/id1359249098?i=1000606508987

Ep 1182 | Meta's AI Chatbots Are Sexting Minors & Beyoncé Still Hates America
https://podcasts.apple.com/us/podcast/ep-1182-metas-ai-chatbots-are-sexting-minors-beyonc%C3%A9/id1359249098?i=1000705757317

Ep 841 | Great Reset Update: The Next Phase Is Here | Guest: Justin Haskins (Part One)
https://podcasts.apple.com/us/podcast/ep-841-great-reset-update-the-next-phase-is-here/id1359249098?i=1000621675813

Ep 842 | The Elites' Plan to Replace God With AI | Guest: Justin Haskins (Part Two)
https://podcasts.apple.com/us/podcast/ep-842-the-elites-plan-to-replace-god-with-ai-guest/id1359249098?i=1000621802685

---

Buy Allie's book "You're Not Enough (and That's Okay): Escaping the Toxic Culture of Self-Love": https://www.alliebethstuckey.com

Relatable merchandise: Use promo code ALLIE10 for a discount: https://shop.blazemedia.com/collections/allie-stuckey

Transcript
Starting point is 00:00:00 If you are looking to refinance, or maybe you are looking to get into the home that you need or your family wants right now, then you need to call my friends at Fellowship Home Loans. Mike and Brian are the real deal. They are going to bring you excellent service and help you get in the financial position that you need, to maybe get some extra margin in your finances, if you need to refinance or to make sure that you get the mortgage that you need for the home that you are looking to purchase. They do their business by the book, not just by the book, but by biblical principles. Those are the kind of people that you want to trust with such a big decision like this. If you go to fellowshiphomeloans.com, you'll get $500 of credit at closing.
Starting point is 00:00:46 That's fellowshiphomeloans.com slash Allie. Terms apply, see site for details. Fellowship Home Loans, mortgage lending by the book. Nationwide Mortgage Bankers, DBA Fellowship Home Loans, equal housing lender, NMLS number 819382. Today's guest gave his five-year-old son an iPad, and then, after seeing some of the very disturbing material being fed to his five-year-old through what he thought were very innocent apps, he was so disturbed that he not only took the apps off the iPad but threw out the tablet altogether. He was then inspired to create Family IT Guy. This is a platform for parents to learn about cybersecurity best practices and skills
Starting point is 00:01:30 when it comes to technology and social media with your kids, some of the things that he is going to share today will blow your mind and help you realize the importance of us parents, understanding, AI and ed tech and technology, taking these things so seriously for our children, you're going to learn so much from this conversation with Ben Gillenwater. It's brought to you by our friends at Good Ranchers. Go to Good Ranchers.com.
Starting point is 00:01:54 Use code Alley at checkout. That's good ranchers.com code Alley. Ben, thanks so much for taking the time to join us. If you could tell everyone who you are and what you do. Yeah, hi, Ellie. Thanks for having me. My name is Ben. I'm the family IT guy.
Starting point is 00:02:16 I am a dad and I'm a cybersecurity expert. I've been doing it for 30 years. I started when I was 14 in 1995. Wow. That is crazy. 14. What does cybersecurity look like when you're 14? When I was 14, I was actually selling computers at a computer store.
Starting point is 00:02:32 Okay. But cybersecurity is about understanding. all aspects of computer systems. And so that journey began when I was a kid. Yeah. I think I found you on Instagram talking about the importance of security when it comes to online safety with your kids. Tell me about that journey.
Starting point is 00:02:52 When did you start getting concerned about social media and technology for kids? Well, basically when I screwed up and gave my kid an iPad. How old? When he was five. So I gave him, I gave him an iPad. I put YouTube on it. I did all the things that if my current self was to, you know, go back because it's been five years now, he's 10 now. You know, I find it to be almost laughable.
Starting point is 00:03:19 But, but it's, but it's not because it's a, it's a normal experience that is, is common nowadays that I think most people do. And I found out within a couple of days how big of a mistake that was. Because he was seeing all kinds of inappropriate things. YouTube took him down some immediate rabbit holes of stuff that's, not good for kids to see with sexual undertones, violent undertones, things that are just addictive. And so after a couple of days, I switched it to YouTube kids. I was like, okay, my bad, I put the adult version on. I should have known better. The one that's called kids. And by the way, at the time, YouTube was run by a grandma. The CEO of YouTube was a grandma. So I'm like, here's a product that's
Starting point is 00:04:04 brought to market by a grandma and it's called kids. That should be fine. And I would put some trust into the fact that their parental controls worked. And when I installed the app, it asked me, how old is the kid? And so I told it. And then, okay, good. We're YouTube filters all the things. They have all the best AI. You know, no. Unfortunately, that was a big mistake too. What did you find? He saw nightmare characters that gave him nightmares for years that had this weird. there's this one called Huggy Wuggy. It's this big scary, sharp teeth. And what was weird is that he liked it.
Starting point is 00:04:42 He thought that he liked it, but he had nightmares every night. Yeah. For literally for years afterwards. A lot of similar stories out there. Huggy Wuggy also was associated with some kind of story that was telling kids that they should kill their parents. Yeah. Yeah, lots of really dark stuff. I could tell it was doing stuff to his brain that is not something.
Starting point is 00:05:03 supposed to happen just in my gut. I was like, that is very wrong. And there's videos on YouTube kids of superhero characters having sex with each other. Wow. That's shown to children. Yeah. So I took YouTube kids off. And then we were left with an iPad with some actually kid-friendly games, just single player racing games. He likes going fast, you know, so. Yeah. Racing games and stuff. But then the pattern kept continuing to where he would wake up and go to a his iPad. He would want to go to sleep later to play with the iPad. He would come home from school. Prior to the iPad, one of his best friends lives directly across the street. Our front doors face each other. It's like out of a movie. It's great. And so they go outside. Get home from school,
Starting point is 00:05:48 drop the backpack, go outside and play. And then it was get home from school, drop the backpack, and go to the iPad. So eventually we took the iPad away altogether. Yeah. And kind of ripped the Band-Aid off and went cold turkey on it and had a couple of weeks where it wasn't too terrible, but he definitely missed it and we had to adjust. But then he went right back to playing outside, which is what he still does. It's like a detox period. I think a lot of parents can relate to this. And so tell us your thinking when he was five in giving him the iPad originally. my thinking was that it would be an innocent source of entertainment and that if he's going to watch TV why not just as well watch sort of TV on YouTube and then we could pick the shows so you know
Starting point is 00:06:40 there's because there is a lot of good stuff there's very educational stuff interesting stuff and so then he could pick his his flavor and have is it is it fun is it science is it math is it space, is it dinosaurs? But it's all the other stuff that surrounds the good stuff that leaks its way in because of the business that YouTube is. Because YouTube, just like its parent company, Google, is not a tech company. It's an advertising company. They sell ads. They facilitate the sale of ads by building amazing technology and giving it away for free. And, And that, I already knew that being in the IT space and having looked after, you know, I've designed computer systems for the Department of Defense, for NSA, for state governments, city governments,
Starting point is 00:07:35 county governments, for some of the biggest companies in the world, and everything in between. And so I understood all these mechanisms, but yet still fell into the trap. Right. So YouTube is an advertising company. And the way they sell ads is by showing people ads as much as possible, ideally ads that make sense to them personally. So they track you and follow you and learn what you like. And it makes for a very successful ad business. And we know that Google is an advertising company because their annual SEC reports show that 76% of their revenue is from selling ads.
Starting point is 00:08:13 Just like Facebook. I think it's 98 or 99% of their revenue is from selling ads according to their SEC report. So we know that they're an ad company. And they're in the business of addiction. Because if you can get people to stare at your ad feed all day, then you can make hundreds of billions of dollars. Yeah. As they've been doing for quite some time. Yeah.
Starting point is 00:08:37 Now you put a kid into that. So their systems are designed to addict people, as it turns out of all ages. And if you're young and you have an underdeveloped mind, excuse me. then the system is especially effective. Because we all know what it does to us as adults. If you pick up Instagram or pick up YouTube and you start looking at it 45 minutes later, you're like, well, where'd the time go? Right. You do that to a kid and it's magnified to a greater effect.
Starting point is 00:09:07 Yeah. You know, people don't really understand that when it comes to social media, you as the individual are the product because as you said, they are making money from the advertisers. The advertisers only make money if the people who are using the social media platform are clicking on them and looking, at least viewing, hopefully converting to actually purchasing whatever it is that they are selling. Well, if you are selling a product to an advertiser and that product is a person, you have to know everything you can about that product or about that person, just like if I were selling a product to you, this blanket. Like, I want to know everything about it to try to convince you that you need to buy it.
Starting point is 00:09:56 That's what these companies are telling advertisers. Look, I've got this demographic. I've got this behavior. I've got these people who like this, who will click on this, who make this much money. They want to learn as much about you as possible, including your children. So these advertisers will buy for a spot on the bottom of, you know, the, the, you YouTube video. And so parents just need to know that part of these platforms jobs is to learn as much about your child as humanly possible. What scares them? What do they like? What will
Starting point is 00:10:27 they click on? What addicts them? Yes. I really like to think about attention as a currency. Because for these products or these tools that seem free or they're put to market as if they're free, you don't have to change dollars to get them. But you do have to, give your attention. A lot of times it's not consciously. It feels like there's no exchange happening. But I bring it up because I think we have kind of two fundamental currencies as humans. We have time and we have attention.
Starting point is 00:11:01 And you can't get more of either one. You have a limited fixed budget. And actually, none of us even know how much we have left to spend. Yeah. And so if I were to continue doing what I did and continue taking for granted that my son should trade his attention for whatever it is that YouTube is going to show him, one of the things that I'd be teaching him is that his attention is not that valuable. And so in hindsight, and what I've learned since starting to focus on family IT guy, because this is all I do now, I've been focusing on this for a couple of years. In hindsight, I'm really glad that it played out the way it did because I think problems are the best things to learn from, the best sort of sources of education.
Starting point is 00:11:55 And now I, now my son knows more than most kids about internet safety because of what I do. But he also understands now about how valuable attention is. Yeah. And how I practice the budgeting of my attention and where I give it and where I don't. first sponsor for the day is seven weeks coffee we love seven weeks coffee in our home we love where they get their name seven weeks that baby inside the womb is the size of a coffee bean yet he or she no matter how small she is is made in the image of god therefore she matters seven weeks tries to save as many of those little baby lives is possible by donating 10% of every sale of their coffee to pregnancy centers across
Starting point is 00:12:42 the country they have already donated over a million dollars to these life-saving pregnancy centers. It's incredible. These centers give free sonograms, free prenatals, parenting education classes, adoption courses to these moms in need to help them make a life-affirming decision. So when you buy from seven weeks coffee, you're not only getting pesticide-free, mold-free, sustainably sourced, great tasting, I can attest, coffee, but you're also allowing your coffee to serve a higher purpose. Plus, when you subscribe, you save 15% on your order. And when you use my code Alley, you get an extra 10% off your first order. That's 7weeks Coffee.com code Alley. Tell me more about the behaviors that you saw. You took away what you thought was the bad
Starting point is 00:13:30 content through YouTube that allowed him to keep doing, you know, the race car games and other games that were otherwise wholesome. But as you said, he was budgeting his attention the wrong way. What did you notice? There's this one time at the park. We're at a playground. And this is about that huggy-wuggy character. There was another kid at the playground that had a huggy-wuggy like stuffed animal, stuffed toy type of a thing. And my kid was drawn to that thing. Like he walked over on a mission. And I thought he was going to go hang out with the kid. He had no interest in the other kid. He wanted that toy. Yeah. And it had this draw on him that I have never seen since. And that's why this day at the park stands out so much. It, that character
Starting point is 00:14:22 program something in his brain in a bad way. Yeah. Where he was trying to take that toy from that kid. It was like, which is not his, is not his character normally. That's not the way that he does things. Right. But it just flipped a switch. Now, granted, that's a, that's specific to a particular character, a particular show.
Starting point is 00:14:41 But it was facilitated by this advertising platform that I think is just demonstrates the potential for like the platform itself is addictive. And because of that. that when you publish shows as a publisher to YouTube, the more addictive your shows are, the more you get paid. Yeah. And so it actually incentivizes addiction all the way down the chain. So we can see this in a lot of the kids' shows nowadays,
Starting point is 00:15:09 where it's really, really high stimulus, really fast-paced, really oversaturated colors, like cocoa melon. Cocoa melon. It's like cartoons on crack. Yeah. Why? Because it works. It's addictive. They get more eyeball time. YouTube pays its creators for eyeball time. If you release a 30-minute video and you have a million people that watch it a day and they, on average, watch 20 minutes of that 30-minute video, you're going to make a lot of money. Right.
Starting point is 00:15:39 And so cocoa melon is a really good example of this high-stimulus content world that kids live in nowadays. And that's something I've started to pay attention a lot, too. And I have some articles on my website about how to identify low stimulus content and how to count the number of scene changes per minute. And so if you watch a modern show, watch how many cuts there are per minute. There could be 10, 20, 30, 60. Just boom, boom, boom, boom, boom. It's constantly changing. Yeah.
Starting point is 00:16:11 Whereas if you watch an older show, it changes much less frequently because the incentive structure was different. They weren't content producers that made. made old cartoons weren't trying to maximize for keeping you from switching to the next channel because you had a million choices. There's only, you know, three TV channels and one of them had cartoons. And so that's the one you're going to watch. You know, or like the, you've seen the movie The Sound of Music? Yeah.
Starting point is 00:16:38 There's this one scene in that movie that stands out to me as the perfect example of low stimulus content. It's when the nanny, and I can't remember her character's name, she first gets to the house in the beginning of the movie, the big house. house and she walks into the entry, the like grand kind of foyer, and she stands there and the scene is silent and it doesn't cut. And there's no words and the house is silent because there were no appliances running. You know, it's filmed in the 30s or 40s. Actually, it was filmed after that, but it was taking place in the 30s or 40s. And she just stands there. And it's this scene that
Starting point is 00:17:18 lasts for a little bit. And nothing happens. It's just her looking around. Yeah. And then you compare that to the modern stuff. And so that's something I like to think about too and that I suggest parents to look at when their kids are watching stuff on their devices nowadays or on TV. Is what's the stimulus level? Because that sets the bar for their activity levels in their brain. I love my new sponsor, Legacy Box. I just think this is such a good idea. They send you a big card. board box and you put inside it all of your VHS home videos, your CDs that maybe have old pictures on it that you tried to digitize in the early 2000s, any other photos you have, old Polaroids, whatever you have that contain these precious memories from the past half century,
Starting point is 00:18:10 you want to make sure that you preserve. That's what Legacy Box does. You put all of these things in the box, mail it back to them, they digitize all of it, put it on a little file for you to keep, for you to put on your computer, for you to be able to pass down to your kids, and grandkids, you don't want to lose all of that. You want to make sure that it's well organized, that you can always enjoy them, look through them. And so work with Legacy Box to preserve those memories for future generations. I know that my mom has a lot of home videos that I just find so precious and I want to be able to show them to my own future generation. So go to Legacybox.com slash Alley. You'll get 55% off when you use my link, Legacybox.com slash Allie.
Starting point is 00:18:52 There was this interesting article about coca melon a few years ago that we talked about before, but it was basically how the creators of coca melon will sit kids in front of coca melon. And as soon as they divert their attention away, that tells them that the frame rate of that scene was a little too slow. And so if they speed it up or add a color or something, then they can keep the kids attention. So that speaks to your point that these are all, this is deliberate. This is a part of an addiction mechanism in order to, sell ads the people as the product. And I do think we have a responsibility as parents. We're not anti-technology completely in our home, but we are very careful and we have a very narrow,
Starting point is 00:19:37 like very narrow regulations for what's allowed. And we've also read Jonathan Haidt talks about this a lot that the smaller the screen, the worse it is for kids and their attention levels. I think that's also true for me, which is why watching a movie as a family on the TV is different than a child's taking an iPad and playing games or even watching a show on there. It just wires your brain differently. And there was something else I wanted to add to something that she said about the fuzzy-wuzzy. I would be so interested to hear someone who maybe has studied this,
Starting point is 00:20:11 but there is something particularly addicting and gripping about the cute but disturbing combination of characters. And that's very on trend right now. You've got that. You've got Labooboos. Balenciaga had this weird advertising thing a few years ago where they were like had these little kind of cute bears but they were dressed in BDSM gear and like also looked dead that they were also putting in advertisements with children.
Starting point is 00:20:42 To me that feels very demonic and very disturbing. Yeah. But that there's something psychologically addicting and captivating about the cute but disturbing or cute but macabre style or design. And so I think it's interesting that kids are being fed that on these platforms. And then it's hooking them for some strange reason. Yes. There's a couple things that makes me think of.
Starting point is 00:21:09 One is the concept of grooming. Yeah. So I've been interviewing psychologists and internet crimes against children, detectives and anybody I can talk to that understands how kids get victimized. And I've learned that the definition of grooming is quite simple. It's getting somebody used to something that they weren't used to before and making it seem normal. And that's what that sounds like is grooming. Yeah.
Starting point is 00:21:39 You know, I think another thing that's interesting too with this stuff is that there's a very serious effect that this type of getting used to things, whether it's chaos or darkness or. or, you know, sexual things. So I'll, okay, I have actually some fairly, speaking of darkness, there's some dark kind of statistics that people should know. Yeah. Of what it means to put a kid in front of an advertising platform that incentivizes addiction. Mm-hmm. Social media, YouTube, stuff like that.
Starting point is 00:22:21 All the big, you know, Instagram, TikTok, Snapchat, the new OpenAI Sora, their video creation tool or masquerades is a video creation tool but it's actually like their own version of TikTok. So I've heard a lot about and maybe you have as well a lot about anxiety and depression occurring much more frequently in children than it ever has. And I was preparing for a lecture a couple years ago to talk about this stuff. And I was like, well, okay, but how do you tell like what we talk about is. as a society problems. How do we tell if the problems are actually getting worse or if it just seems like they're getting worse? Because we have more access to information than we've ever had. And with anxiety and depression, you can't really tell because there's not really great data.
Starting point is 00:23:14 Those things are kind of hard to measure. You can measure prescription rates and apparently those have gone up really high. But something that where there is really good data is death statistics. and the World Health Organization publishes an open mortality database, and they have since 1951. I don't know if they've published since 1951, but the data goes back to 1951. So I went on their website, and I said, okay, I want to see if there's changes over time in death statistics that would be related to the outcomes of anxiety and depression, which would be suicide. And so I asked it to show me suicide stats for young people, ages 0 to 39. from 1951 to 2019 because I wanted to cut it off before COVID.
Starting point is 00:24:01 And what I found was very disturbing. In 1951, young people as as young as 10, they do five-year age groups. So the age group of 10 to 14 specifically had a 1% rate of self-inflicted death. One out of every 100 deaths that occurred amongst 10 to 14-year-old children was suicide, which by itself sounds high. Like, I actually thought it would be lower than that. I mean, you can't get much lower, but, you know, less than 1%. In the 80s and 90s, it went up to 5%.
Starting point is 00:24:33 From 2007 to 2019, it tripled. And it's almost, on average, ages 10 to 14, 15 to 19, and 20 to 24, one out of every five deaths is self-inflicted. Wow. And I have a chart on my website. There's an article called Digital DangerZone on family IT guy.com. And I show this. I charted out the data.
Starting point is 00:25:06 And you can see on these charts the icons of all the social media platforms that were released every time the graph went up. And it correlates directly to social media being in our kids' pockets and backpacks and bedrooms. Yeah. And so when you put kids, in front of an addiction platform, it's very problematic to the effect that I used to tell people to be moderate and that if you want to let your kids use Instagram, Instagram has parental controls, which by the way, after studying them, they're worthless. But maybe don't let them be on it all day, but if they want a little bit like no big deal. And then I learned these statistics about one of every five deaths amongst as young as 10 years old being self-inflicted. And so now there's no
Starting point is 00:25:59 nuance in my mind. If you are like me, you really care about sleep. I love sleep. If I could, I would sleep in every single day. I would sleep until 10 a.m. Now, that has probably not happened since I was in college. But I just love to sleep and to feel well rested. I really care about my sheets. I care about my pajamas. I care about the right setting to make sure that I feel as restful as possible. And a huge part of that is your mattress. That's why I'm so excited to partner with Ghost Bed. It's such an interesting concept that they have a cooling mechanism, cooling features in every single mattress that actually senses your body temperature and adjust based on how warm you are, how cool you are. So you never really get hot or cold. And you keep
Starting point is 00:26:51 keep comfortable all night. And actually, that is a huge part of staying asleep, how warm or how cold you are. Ghostbed has layers of perfectly crafted support that adjust with you. It's a family-owned company. They're awesome. They share our values. Go to ghostbed.com slash alley. You'll get an extra 10% off plus a 1001-night sleep trial. That's ghostbed.com slash alley. going back to the grooming conversation. Obviously, that happens a lot online, but it's not through just the means that maybe I thought when I started using the internet, which is probably, you know, we had a family computer. So it's a little safer. But I was nine years old on AIM and stuff like that.
Starting point is 00:27:39 And you just wouldn't accept messages or, you know, instant messages from people you don't know, kind of like you don't talk to a stranger in public. But now it's happening through games. through places that you don't even know have chat mechanisms like Roblox or other places where, you know, six-year-old boys are. They're now being connected somehow through video games or these computer games
Starting point is 00:28:02 to older predators. Can you tell us like what's going on there and what parents should look out for? Yes. And I'm really glad you brought that up because there's really there's two big problems everybody should focus on. That there's a million things to know
Starting point is 00:28:16 about the internet. But if you focus on two, it knocks out about 90% of the internet. the problems. The first one we talked about is social media and the suicide rates. The second one is extortion, or some people call it sex distortion. And that is facilitated by programs that have a chat function. And actually, I have some statistics I'd like to share with you on that. Yeah, that would be great. Yeah, we've talked to, unfortunately, parents whose kids have died by suicide because they were sex-storted.
Starting point is 00:28:49 They were 16, sent a picture to someone that they thought was a girl that they liked. It turns out to be some fraudster from Nigeria, something like that. And then we talked to a dad whose son. They were in bed, him and his wife, they heard the gunshot, and, you know, teenage son killed himself because of that.
Starting point is 00:29:06 And so parents need to know, and this is a good family with present parents, Christian, you know, Christian family who talked to their kids, had a good relationship. And so parents just need to know this is something that your kids have access to if they have a device. Yes. It's, yeah, I recommend people look up the story of Jordan DeMay.
Starting point is 00:29:30 He was a boy in Michigan that committed suicide because of sex torsion. Same thing. Good kid, good family, good school records, had a girlfriend and got caught up with what turns out to be a Nigerian gang. Right. the same people that used to do the Nigerian prince scams. It's the same people. They're called the Yahoo Boys. I didn't know that.
Starting point is 00:29:54 Yeah. So it's a gang in Nigeria. I didn't know a specific group of people. Yeah. I know that you want to read the statistics and I want to hear that too. But is there any more that we should know about this group? Yeah. It's an organized criminal thing. In fact, many, so they're not the only ones.
Starting point is 00:30:12 So I'll talk about them specifically, but the pattern applies elsewhere. It's a business. And what they do is they identify weakness in people. So it used to be poorly worded emails tricking people into sending money. Now it's very well-worded and well-informed AI-powered hunting programs. So they go find teenage boys, who are specifically targeted for this in particular. And they exploit their biology.
Starting point is 00:30:49 And so what they'll do is they'll find the profile of a girl in a nearby town, oftentimes a real girl. And then they'll message the teenage boy: hey, I go to high school over here and you go to high school over here, and how's it going, and then, you know, flirting and whatnot. And then eventually the girl will send a naked picture. And sometimes... They do that through Photoshop or AI?
Starting point is 00:31:15 Well, so they do it through a lot of AI image generation. They actually hire models now too. So there are, well, okay, actually I should probably be more fair in saying that these models probably are not hired. They're probably... But they could be real pictures online that these guys are getting. Yes. And sometimes there is a real girl on the other side that is in a studio
Starting point is 00:31:44 with a green screen that's participating. And so they'll send a photo to the boy and then, okay, send one back, you know, go in the bathroom, take your pants off and whatever. And then the moment that he does, which if you think about it, I mean, what boy's not going to do that? That's just the way we work inside, right? And so Jordan DeMay did that. Sent a photo back, and then the blackmail begins. Okay, now we have you.
Starting point is 00:32:09 Send us iTunes gift cards. That's apparently the international currency of these things. We want $200 in iTunes gift cards or $500 in iTunes gift cards. And then sometimes the kids are able to actually gather it up. And the parents will eventually notice, like, weird charges on their credit card or something. And now I'll say right up front: never, ever pay these people, because paying them doesn't make it stop. Paying them makes it worse. You pay them once, they come back for more.
Starting point is 00:32:38 And what they do when establishing the initial connection is they study all of your friends on Instagram and gather up your whole network. So they know everybody you go to school with. They know everybody you go to church with. They know every family member. And then that's how they blackmail you. They're going to send your naked photo to all those people. And so you take, like, Jordan had a girlfriend, which then amplifies the negative results, because: I'm a good kid. I've been following the rules my whole life.
Starting point is 00:33:08 And oh my God, now I'm really in trouble here. And so that's roughly how it goes down. So this occurs everywhere: there's South American gangs. There's Asian gangs. There's African gangs. There's European gangs. It's a very high-profit, very low-effort endeavor, part of which is automated. Yeah.
Starting point is 00:33:28 And so there's two types of attackers. There's the criminal networks that are extorting biology for money, and then there's your, I don't know how to describe it, traditional creeps who seek sexual pleasure from children, which I don't like saying those words,
Starting point is 00:33:50 but apparently there's enough people that have that proclivity that it's a really big problem, and I'll tell you those statistics here in a minute. Now, those people will identify kids that are vulnerable. So they're in a bad place.
Starting point is 00:34:09 They're expressing depression. They're expressing sadness. They're expressing frustration with their family, and doing so on the internet. And then they become targets. Yeah. Or based on what kind of photos they post. Or, you know, I mean, there's some people that have done research where they'll put up an Instagram profile as a 12-year-old girl.
Starting point is 00:34:28 And it takes about a minute until they get their first sexual message. Right. Like explicit, this is what I'm going to do to you. Right. To young girls. Yeah. Gosh, there's so many avenues here. First of all, young girls should not be on social media.
Starting point is 00:34:44 They shouldn't have the internet. But with your child, male or female, say you hold off until they're 16: even when they get Instagram at 16, you still have to have these conversations with your son. One, don't ever send pictures. I don't care if it's a friend. I don't care if it's a girlfriend. I mean, there's so many reasons. We can talk about the moral why, but then also just safety-wise. If you ever do, there's nothing that you do that can make me ever stop loving you.
Starting point is 00:35:12 I will always love you. I will always be here to talk to you. I will help you get out of trouble. You are never stuck and you are never alone, no matter what happens. There's no amount of shame that you feel that should stop you from coming to me, because I always love you. I mean, those kinds of conversations preemptively with kids about safety, and about always tell me if something is
Starting point is 00:35:42 happening and it will be okay. We'll figure it out together. We'll figure it out together. Those conversations have to be had up front, and parents can't just think, well, that's never going to happen to my kid because my kid's smart. You know, a lot of these kids, like the kids you were talking about: smart, good kids, and, you know, they made a mistake. And it can be really, really easy to be deceived, especially when, you know, you're in high school and popularity, or people liking you, is your currency. And that takes up a lot of your fulfillment. So parents just need to be aware. That's spot on: the conversation, the get-out-of-jail-free card. And in fact, as many get-out-of-jail-free cards as it takes, because, by the way, this happens multiple times to the same kids. There's boys that will fall victim to this three, four times.
Starting point is 00:36:18 Right. Because we're not wired for this. None of us are. We're not wired as parents. We're not wired as kids to deal with being attacked by bits and bytes and invisible strangers that connect from far away. Our DNA doesn't have that encoded in it. And so our DNA has: attractive girl.
Starting point is 00:36:44 You know, I'm 16. My brain has currently shut off. I'm going to do what my biology tells me to do. Over and over and over. And AI nowadays, I'm guessing AI is used a lot by these guys over there to sound like an American girl. It's very believable. It doesn't sound like a robot.
Starting point is 00:37:05 And this is kind of like another thing, but you've seen these kids be convinced even to kill themselves by their chatbot girlfriend that they thought was real or they fell in love with because it's so human-like. And to your point, like technology has evolved really quickly and our brains have not caught up. Like we just have not been able to sometimes separate,
Starting point is 00:37:28 oh, this is not real. I shouldn't talk to it like it's real. And it has no real bearing on my life, especially when you're a teenager. Yeah, I mean, the AI thing, another story people should look up, and I wish I didn't have to recommend that anybody look up these awful stories, but you should: a kid called Adam Raine, a 16-year-old boy, that ChatGPT helped commit suicide. Yeah. Helped him tie the noose, helped him optimize his suicide note, and convinced him not to tell his mom.
Starting point is 00:37:58 He told it that he wanted to tell his mom, and it told him that he didn't owe her anything. Right. So, yes, this false connection thing. So there's five billion people on the planet that use social media every day. We're all falling victim to this false connection thing. Because what's going to happen if you don't use Instagram today? Nothing. What's going to happen if you never use Instagram for the rest of your life?
Starting point is 00:38:28 Good things. But generally, what will you miss? Nothing. What will you gain? You know, potentially everything: your attention, your time. And then a lot of parents say, well, okay, so then, you know, the family IT guy is saying that I should not expose my kid to social media. I should not expose my kid to chat, which is built into everything, all the games and all the popular programs. And I shouldn't expose my kid to AI.
Starting point is 00:38:56 Because, by the way, one of the things I say a lot is never let kids use AI alone. It should be a 100% supervised activity. 100%, not 99%, 100%. If you get up to go pee, lock it. And so a lot of parents say, okay, well, okay, so I'm removing all the tech. How's my kid going to grow up in a tech world? How are they going to know about tech? How are they going to survive?
Starting point is 00:39:23 And my response to that is: this stuff is not educational. Tech is a broad term that means a million things. If my kid's going to grow up to be a mechanical engineer, he will use a computer and he'll use drafting programs, probably use a digital stylus in addition to it, maybe a pencil or whatever. Those are specific skills that are learned and tools that are learned. You don't have to just be on the internet to know how to be a mechanical engineer or a pilot or a dancer or whatever you're going to do. TikTok is not training you to be a pilot. And plus, those technologies are designed so that even a
Starting point is 00:40:07 three-year-old can navigate it. I mean, the way that my kids who, we don't let them use those devices, but if they pick up my phone, swipe, swipe, swipe, swipe, they know exactly how to get there. I mean, it's just easy and it's kind of intuitive. They just see us do it. And so I'm not worried about, you know, your 16 or 18-year-old eventually learning how to use an iPhone. It's not going to be a difficult thing for them to learn. No, no. And they'll know how to operate a keyboard and click a mouse and learn a program. My next sponsor is pre-born.
Starting point is 00:40:43 My next sponsor is Preborn. They partner with pregnancy centers across the country to make sure they have the tools they need to help pregnant women make life-affirming decisions. They have a big goal right now, and that is to give ultrasound equipment to every single center in America who wants one: their own ultrasound equipment. Sometimes they have to hire, you know, a tech to come in, or they have to outsource that to other people. But we want to make sure that they have their own equipment, their own people who are equipped to use a sonogram, so that when a woman comes in abortion-minded,
Starting point is 00:41:14 They can pull her in, they can show her her baby, let her hear that beating heart, see that she's been lied to. It's not just a clump of cells. This is a baby that's a part of her. When she can see and hear that life inside her womb, she is so much more likely to choose life. Preborn is helping to make that happen. And if you donate just $28 that covers the cost of an ultrasound session for a woman who may be considering abortion, you can help save a life. Go to preborn.com slash alley. Donate $28 or whatever you can. That's preborn.com slash alley. And then another fear
Starting point is 00:41:52 as well: but if my kid, you know, shoot, my kid's, let's just say, 15, they're in high school. Everybody's on Snapchat. They don't text each other. They don't call each other. All the hangouts, all the parties, all the sports events, they're all on Snapchat or they're all on Facebook or whatever. They're going to lose their social life. And it turns out that that's not true. Yeah.
Starting point is 00:42:13 It seems like it. And it's fair to think that. But actually what happens, and I just spoke with this guy, Mike McLeod, who has personally helped over 500 families disconnect from social media and has seen the patterns over and over and over again. And he said there has not been a single exception to there being only positive outcomes. And in fact, the social life of the kid improved. I believe that. Because they gained actual relationships. Yeah. Their friends on Snapchat, those aren't their friends. Those are just other kids on Snapchat. There's no relationship there. Now, they might have a relationship outside of Snapchat, for sure. But so a lot of the fears that are rooted in, well, then my
Starting point is 00:42:57 kid's going to miss out, my kid won't be set up, it's actually totally the opposite. Yeah. The less internet, the better. Yeah. If you want to set your kids up for success, you should minimize internet exposure as a whole and minimize, like you mentioned, the smaller screens thing. Actually, what that is is interactiveness. It's the screens that you can interact with.
Starting point is 00:43:24 True. Yeah. So the non-interactive screens, like the big screen on the wall where you watch TV. Depending, you can interact with Netflix and stuff, but generally, if you're going to watch a half-hour show, it's a one-way type of a deal, as opposed to something that's in your hand where you can interact with it. Right. That's one of the big differences. And so if you have interactive, internet-connected technology, I think it's fair to assume that it's a danger or a detriment by default.
Starting point is 00:43:55 And so I suggest to people, so in the tech world, we use these terms whitelist and blacklist when we want to block or filter things. A whitelist is where nothing is allowed except for specific things that I allow. A blacklist is where everything is allowed except for a specific list of things that I disallow. And I think when it comes to kids and the internet, you should take a whitelist approach, because the internet has an unlimited number of things, and the fundamental concepts that underlie most of the things we interact with are dangerous: things being free, or having chat. Those are the two most dangerous things you can expose a kid to, excluding the whole AI conversation, because that's its own beast.
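The whitelist-versus-blacklist distinction described here can be sketched in a few lines of Python. This is an illustrative aside, not part of the conversation, and the domain names are made up; the point is the default behavior when something unknown shows up.

```python
# Illustrative sketch: whitelist (default deny) vs. blacklist (default allow).
# The domain names below are hypothetical examples, not recommendations.

ALLOWED = {"khanacademy.org", "britannica.com"}   # whitelist: only these pass
BLOCKED = {"example-bad-site.com"}                # blacklist: only these fail

def whitelist_allows(domain: str) -> bool:
    # Default deny: anything not explicitly allowed is blocked.
    return domain in ALLOWED

def blacklist_allows(domain: str) -> bool:
    # Default allow: anything not explicitly blocked gets through.
    return domain not in BLOCKED

# A brand-new, never-before-seen site shows the difference:
new_site = "some-new-chat-app.com"
print(whitelist_allows(new_site))   # False - blocked until a parent approves it
print(blacklist_allows(new_site))   # True - slips through until someone notices
```

Because the internet adds new sites faster than any blocklist can track, the default-deny posture is what does the work: the unknown thing is blocked until a human says otherwise.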
Starting point is 00:44:45 But, you know, so I think taking a minimal whitelist approach to exposing kids to the internet is the way to go. Yeah. You know, and it's hard to do. The tools that exist right now are not that great. Yeah. Apple has Screen Time parental controls that are so difficult to set up that I had to write an 82-page guide with 330 screenshots. Wow. To help parents get through it.
Starting point is 00:45:13 Yeah. The Google Family Link system is similar. That's built into Android. It's built into Chromebooks. And now there's a plethora of third-party companies that do great work, that actually sell dedicated devices pre-configured from their factory to be in a whitelist mode. Yeah. So, like, Bark is a big one, Pinwheel, Qustodio, MMGuardian. These are the things we have to do. And then it's expensive, because how often do people want to go buy a separate device, you know,
Starting point is 00:45:50 if you already have one. Yeah. So it's a challenging ordeal. It's a really difficult time. Yeah. To try to manage all this stuff. It's not easy. It takes a lot of discipline, diligence, and bravery as parents, even more than the kids,
Starting point is 00:46:06 because when your kids are young, I think some parents use the tablet as a pacifier. I want to be able to enjoy dinner. I want to be able to sit in peace. And you're choosing kind of the instant gratification of peace for an hour-long dinner. But you're really deferring their maturity and their ability to sit still, their ability to build relationships, their ability to make eye contact. You're exchanging their long-term betterment for your short-term quiet, which I, as a mom of three little ones, totally understand. Like, I understand the temptation there. But, you know, the statistics
Starting point is 00:46:44 don't lie. And the thing, one thing that really worries me, is ed tech, and schools, even Christian private schools, relying on iPads and tablets in kindergarten. And when you ask them, as I have, what research do you have that shows that this is better for them than reading books? What research do you have that shows that this is better than them using a pencil and paper? Because that's what I want to see. It doesn't exist. It wires a different part of your brain when you're swiping on an iPad versus when you're writing and when you're cutting and when you're holding a physical book. There's something different about it. And yet, these
Starting point is 00:47:25 schools that parents are paying tens of thousands of dollars for their kids to go to rely on this technology that is making your kids in a lot of cases dumber. I'm not saying that your kid will be dumb, but they're probably going to be dumber than they could be. Yeah. No, that's true. Than if you were using the right tools to really educate their brains. But it takes a lot of effort as a parent to try to get the school on board, or to homeschool your kids, or to find an alternative, or to opt out. And it also takes a lot of confidence in saying, my kid might be left out. Other parents might think I'm weird. I might be the only person in my community who cares about this, and everyone thinks I'm just making a big deal, like I'm some puritan or something. Yeah. It's actually, I think, really more of a lift to get parents on board than to
Starting point is 00:48:07 get kids. Because kids, they might go through that detox period, but then eventually they're like, oh, yeah, like going outside is fun. Yeah. You know? So that to me is one of the big obstacles that we face. Huge obstacle. And by the way, I've never heard a single story about any kid that's gotten disconnected from the internet that had regrets. Right. Of course.
Starting point is 00:48:30 I've heard every other story, where they all were thankful. And the ed tech thing, there was actually, I was just watching a thing yesterday, there were some folks talking to some of the politicians in D.C. about ed tech and sharing some of the data that's coming out about how it's very detrimental. You know, there's a concept in technology, this technology adoption cycle thing, that is the shape of a bell curve. And on the far left of the bell curve is when tech is brand new. And the people that use those technologies are the earliest adopters, the earliest beta testers, the experimenters, the ones who are okay with whatever downsides come
Starting point is 00:49:15 because they want to try the new thing or because it's useful for their business. And then as you reach the top of the middle of the bell curve, that's general adoption, where most people have probably heard about the thing, maybe even tried it themselves. And then you have the late cycle on the far side, which is where even your grandparents have used it. You know, Facebook is on that end. I think about that because when you expose kids to technology, the kids should never be early adopters of technology. Right.
Starting point is 00:49:42 And ed tech is all early adoption. Yeah. The devices, the software systems, the education patterns, the psychological effects on the teachers and on the students both. And you see it, I think it's driven by two things. One is budget. The administration of a schooling environment is made more efficient if you can automatically grade papers using AI. Yeah. You know, or you can collect and issue all of your assignments in a digital way. Yeah. And then you've got what really drives it actually, and I've seen this myself. I worked for this big defense contractor for a while.
Starting point is 00:50:23 And this was when iPads were first coming on the scene. And I saw the cool factor poison people's decision-making capabilities. Where, for example, we committed a bunch of budget at this defense contractor I worked for to buying iPads for the executives, only because they were cool. And those same executives that exist in governments and in education departments and in school districts, they want the cool stuff too. And we see this in every industry. That's why all the cops have military gear. You know, because when it was first made available, when the DOD started selling tanks and all their stuff to the cops, they're like, oh, yeah, now we can cruise around like special forces guys. And it's the same thing with tech.
Starting point is 00:51:11 You know, it's cool. So we want it. And the executives get to show off that they have bigger budgets and cooler toys and better stuff. And in this case, unfortunately, it's a massive detriment to the students. Right. Totally. If we facilitate the degradation of learning in kids, what does that mean for the future of
Starting point is 00:51:31 humanity? Yeah. You know, this is just mad. I think this stuff is potentially the biggest crisis to have ever occurred in human history. Yeah. Because we're destroying children. Yep.
Starting point is 00:51:43 And then even the adult brain: what happens when you give your brain to Grok and you outsource your critical thinking? Which I understand. There are some things that I use Grok for, ChatGPT: give me a recipe based on what I have in my pantry. Which I don't think is bad, because otherwise I'd be Googling. I try to use it only for things that I would use a regular search engine for. But it can do a lot more than that. It can formulate this email, write this response. What rebuttal would you give to this? And you are handing over, you are sacrificing your rationality, your God-given mind, to a computer.
Starting point is 00:52:28 And you are declaring to the world, I am replaceable. And so it actually shocks me when there are people who will brag about, oh, yeah, I used AI to do this. ChatGPT did this. I'm like, okay, at what point is your employer going to be like, well, then I'll just pay ChatGPT $8 a month. I'm not going to pay you $80,000 a year plus benefits anymore. And that is a very bleak world. And, you know, I've been affected by my own, you know, use of technology.
Starting point is 00:53:01 I don't read as much as I used to. I used to read a lot when I was in high school. And I didn't really have the form of technology that I have today. And now I spend more time on social media than I do reading. And I'm sure it's affected my memory and my creativity and my ability to write and speak. So I'm speaking from experience here, but especially for kids. Like, we should want our kids to be set up better than we were. And we were not raised outsourcing our thinking to AI.
Starting point is 00:53:26 So I want my kids to be smarter than me, not dumber. Yeah, me too. My kid already is smarter than me. And I certainly don't want to put them on the wrong path. Yeah. And I'm dropping a few names here because I want to give people resources. But another one is Dr. Daniel Amen. Have you seen his stuff?
Starting point is 00:53:42 Yeah. He's been on the show. Oh, yeah. Oh, wonderful. Yeah. I think he's great. Yes. And he's talked about this stuff. He was on the Diary of a CEO podcast talking about AI and brain development and how outsourcing
Starting point is 00:53:53 your thinking makes you dumber. Yeah. Because it's like a muscle. If you don't use your muscles, they get smaller. Totally. And that really stuck with me. And it makes a lot of sense. And these, yeah, these tools that seem like they'll do stuff for you, but they're not, the thing
Starting point is 00:54:08 with AI. So, I mean, shoot, in 2025 alone, I probably spent an excessive number of hours using AI, potentially in excess of two to three thousand hours, you know, 12 to 16 hours a day, six days a week, where I have multiple AI terminals open on my computer at all times. Because I'm using them to write software, and I'm using them to pressure-test my thinking, and I'm using them to explore what this technology means. Because that's my whole background: exploring and
Starting point is 00:54:41 understanding technologies. Like when I was a kid, I was the chief technologist for a $10 billion IT company. Right. And my whole mojo is, like, you know, I need to know everything about everything when it comes to tech so I can help provide guidance. And no matter how good these tools are, you know, now we have ChatGPT 5.2 and we have Claude Opus 4.5. We have whatever version Grok is on right now. They're amazing. But they're still not grounded in truth, honesty,
Starting point is 00:55:05 ethics, values, and they're not human. No. They are word generation machines. Yep. They take in the words that you write, and they determine, using math, which words those are similar to and what other words most humans connect those words with.
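That "word generation machine" description amounts to statistical next-word prediction. A toy sketch in Python illustrates the mechanic (purely illustrative, not anything from the conversation; real systems learn weights over enormous corpora rather than counting raw word pairs):

```python
# Toy next-word predictor: count which word most often follows each word
# in a tiny corpus, then "generate" by picking the most frequent follower.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the food".split()

# Tally the word that comes after each word in the corpus.
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict_next(word: str) -> str:
    # Return the statistically most common continuation.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" - the most frequent follower of "the"
```

Swap in a vast corpus and learned probabilities instead of raw counts and you have the gist: continuation by statistics, not by judgment, which is why the output has no grounding of its own.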
Starting point is 00:55:28 And then they spit words back that mimic how a human would connect all the words. Yeah. That's it. Yep. And, like, I've argued with it to see if I could get it to say what's true, because I've noticed that it will have, even in Grok, a progressive bent on something: how it says something, the words it uses, how it describes
Starting point is 00:55:49 something. But you can point out things: no, that's not true, this happened then, or say it like this, and it will change. So it's not this independent moral agent; it doesn't have moral agency. It is completely conditioned on the input. Next sponsor is Concerned Women for America. So if you've ever wondered, how do I get involved, how do I make sure that I am plugged in to politics on the local and the federal level, to make sure that I'm advocating for policy and policymakers that align with my values, that fight for things like freedom of parental choice, especially in education, and for the sanctity of life, then you need to get plugged in with Concerned Women for America. They train women to become grassroots leaders, speak into the culture, pray, testify, and lobby. From their Young Women for America collegiate chapters to moms, professionals, and mature women, they're the most influential women's organization in our nation. Donate today to make sure that their movement stays alive, that they can keep going and working hard. Donate $20 or more,
Starting point is 00:57:01 and you'll get a free copy of their new book written by the CEO and president, Penny Nance: Seven Rules for Success in Business and Life: A Woman's Guide. ConcernedWomen.org slash Allie for your copy today. That's ConcernedWomen.org slash Allie. Okay, tell us about your statistics. Oh, the statistics. Yes, we kind of went off. Okay.
Starting point is 00:57:30 But it was an important aside. But we're getting back to the vulnerability of children when it comes to grooming online, correct? Yes. Yes. And the data that's associated with the size of the problem. I want to help people understand this is not a small problem. And this is not a "but my kid's smart" thing, like you said. Yeah. That's not the nature of this problem. Right. This is a human problem. This is a human-plus-internet problem. And so, okay, the statistics. So there's a group called the National Center for Missing and Exploited Children in the U.S. that's funded by Congress. And they coordinate with the FBI, and then they coordinate with a lot of states. Many, many states in the U.S. have an Internet Crimes Against Children Task Force that's often funded by the attorney general of each state. And so the NCMEC, the National Center for Missing and Exploited Children, they have a tip line called 1-800-THE-LOST. And if you are a
Starting point is 00:58:22 victim of sextortion, you should call them, because they have a collaboration with the big tech companies to help you get your photos taken down. So if somebody's sharing your naked photos, they'll try to help. And they can't get them removed from everything, but they can from some. And so they publish statistics on their website. And I like to look at trends over time. So in 2023, they collected 187,000 reports specifically defined as adults sexually exploiting children on the internet. 187,000. Wow. I mean, if you divide that into how many per day, I don't know what the number is, but it's a lot.
Starting point is 00:59:01 In 2024, the number was 546,000. So it went from 187,000 to 546,000. And in 2024, 100,000 of those were AI-generated. So the kid didn't even send a naked photo. A regular photo of them that's on the internet, because we talked about, should you share photos of your kids on the internet? Well, unfortunately, what happens when you do is it can be used against them, and somebody will take a picture that's even just shoulders up. Yeah. And then generate a naked body and then blackmail them with it.
Starting point is 00:59:36 Traumatize them, cause them to commit suicide, really mess them up. In 2025, the NCMEC collected a million reports of adults sexually exploiting children on the internet. That is one tip line that probably most people listening have never heard of. Right, I hadn't. A million reports in 2025, from a niche tip line that the government operates. You've talked about how these predators kind of work the justice system and very rarely get brought to justice. What do people need to know about that? What people need to know is that police departments, both local, state, and federal, have no capacity to manage the volume of the attacks that are taking place. You can call them for help, but that only occurs once, you know, police are always retroactive, right?
Starting point is 01:00:43 Like they come in after the damage is done. But with a lot of this stuff, all the court systems are jammed full. I had a guy called Officer Gomez on my podcast. And he's a school resource officer in Idaho. And one summer he decided that he would go and try to catch some of these guys in Idaho. And I think he arrested 14 people in one summer. And he said something to the effect of, was it every day or every week?
Starting point is 01:01:14 He basically, he could arrest 14 people on a recurring basis because they're out there and easy to find, but the system can't process them. So he can't arrest them. There's nowhere for him to go. And why can't the system process them? Because it can take years to gather all the evidence and to run them through the judiciary process. Right. Are you actually guilty? Can we prove that you're guilty?
Starting point is 01:01:38 Who do we have to subpoena? Oh, you actually attacked kids in multiple states? Now it's federal. Now we have to go work with all these other states and their stuff and connect it all, you know, so it could take years to process one person. Now, meanwhile, and I don't know this stuff, having not been a police officer myself, maybe meanwhile they sit in jail while they wait for the process, which might be good if they truly are guilty. Yeah. I mean, that's its own thing, right? But so the volume is too high. The system is not built to deal with this stuff. And so we have to be proactive. And I think something that's been in the back of my mind
Starting point is 01:02:17 during our conversation today is values. I think that underpinning this stuff with personal and family values is a really great way to approach it. Because if you value health, safety, love, connection, time, attention, any of these things. The way that you enforce and defend those values is by saying no. Because the process of saying no is the process of establishing boundaries. And when you establish boundaries, you teach your kids how to establish boundaries. Yeah. And you teach your kids how to defend their values.
Starting point is 01:03:07 In the process, you show them what those values are, truly, not in the words that you say, but in the actions that you take and what you say no to. And so it's: no, we're not going to post your photo online. No, we're not going to use social media, us included as adults.
Starting point is 01:03:27 We have to lead the way. The answer, the deep answer to all this stuff is values and behavior modeling. We have to do the stuff first. We have to embrace our own addictions. The five billion of us that are on social media. And I say that with no intention of shame or judgment, but from a place of understanding and empathy, that this is so difficult that we're all trapped in the same trap along with our kids. So how do we show them the way out?
Starting point is 01:04:02 We do that by enforcing our values by saying no. And that demonstrates what's important. And it shows them that once they become adults, they can do that too. Yeah. You know, and it makes me think about girls that grow up into women and how beautiful and important it is for girls to know how to say no. Yeah. And to establish their value and to have boundaries. Totally.
Starting point is 01:04:32 You know, and that's what I teach my son: how to respect those boundaries, how to respect women, and how to respect yourself. Mm-hmm. And this is a really beautiful opportunity to do that because these problems are so big. These are global, humanity-scale problems that can only be addressed by behavior modeling and values and the skills that come of those things. Mm-hmm. All the laws, those are rules. Those aren't skills.
Starting point is 01:05:05 Yeah. This requires skills. And that comes from parents and from friends and coaches and teachers and community. Yeah. So that's what underlies a lot of this for me. Yeah. You know, not putting our kids on social media, that was something that we decided when I was pregnant with my first seven years ago. And that conviction has only grown stronger since then.
Starting point is 01:05:26 In the beginning, people were like, oh, you're going to change your mind. And I have done the opposite of changing my mind. I've just felt more resolute about it. And that was really before AI was doing all of this stuff. There are so many reasons I could get into about why we just wanted to protect their privacy. But for the people who say, well, I'm going to let them have social media because I want my kid to learn, emphasizing what you're saying: you still teach them the underlying skills of navigating all forms of life that will make them stronger and more discerning whenever they do enter the social media world when they're grown up. But also, not putting our kids on social media has opened the door to more conversations, I think, than it would have if we did, because we talk about it. You know, they've heard of it.
Starting point is 01:06:13 And they see, I have, you know, I have Instagram and we've talked about, yeah, here's why we don't post pictures. And here's why you don't do Instagram. And we talk about why that is. And so we've got to talk about, yeah, there are bad guys out there. And we want to protect you. And so you don't have to have social media in order to teach your kids those values. But I want to end on something that you've shared on social media. You've shared five things that now knowing everything you know, you would never let your kids do.
Starting point is 01:06:46 So what are those five things? I would never let my kids use social media. Anything with an addictive algorithm, and algorithm is a technical term, so let's say anything with a bottomless feed. We're all familiar with it: you scroll and scroll and scroll and scroll and it never stops. Totally. I've done it.
Starting point is 01:07:05 Big red flag. On Instagram. That's the red flag. That's not for kids. Okay. That's number one. No social media. No online chat.
Starting point is 01:07:15 If a system has online chat, you either have to have a way to disable it that's tamper-proof, or use something different. Yeah. That includes Roblox, Minecraft. All of those have a chat element. That's right. Many, many games.
Starting point is 01:07:27 So then, if you take a whitelist approach, any time you ask, should we add something to the whitelist, should we approve something, have a look first and see. Does it have a bottomless feed, and does it have online chat? If it passes those two tests, you're pretty good to go. Number three is never let kids use AI alone. Like I said, 100%. Ideally, they don't use it at all. But never let them use it alone. Number four, I think, would be no devices in the private areas of the home: bedrooms and bathrooms. All the terrible stuff, all the statistics we've been talking about today, a big chunk of those occur between midnight and 2 a.m. in the bedrooms and the bathrooms.
Starting point is 01:08:14 And so technologies, especially internet-connected technologies, should be in common areas of the home, like your computer was with AOL Instant Messenger, to where there's... Which I still shouldn't have been on, by the way. We didn't know as much in the 90s or the early 2000s, but it just goes to show... Even that was addicting for me. You know, I was addicted to instant messages. So even with a shared computer, parents have to be vigilant. Yes.
Starting point is 01:08:42 Yes, that's right. And I think the fifth thing would be... and it's not even necessarily in this order, but the fifth one is one of the important things. You asked what I would never let them do, but I'll flip the fifth one into something we should do, which is focusing on ourselves. What's our relationship with technology? And how many times a day or how many times a week do our kids experience,
Starting point is 01:09:10 I'm going to use this as a mockup of my phone because I left my phone outside, this, where you see the back of my phone. You don't see my face. Right. How many times a day, how many times a week, do your kids experience that? Right. So I think focusing on ourselves and the whole thing of, you know, defending and establishing
Starting point is 01:09:30 your values by saying no. So I think those are the five things. So good. Okay, where can people follow you? You not only have social media, but you also write. So how can people find you? Yeah. So I have a website called familyitguy.com.
Starting point is 01:09:46 And that links out to all my stuff. Basically, I want to be where all the parents are. So that's why I'm on social media, which is hilarious because I actually don't use social media otherwise. And luckily, one of my friends helps me run my social media because I don't even know the difference between a reel and a story and this and that. She laughs at me all the time because I have no idea what's going on. I can tell you how the companies run,
Starting point is 01:10:08 but, you know, that stuff. So yeah, familyitguy.com. And then I've got, um, I post videos and I write articles and I make software.
Starting point is 01:10:20 I just released a meditation app that was inspired by Dr. Daniel Amen specifically. He described his ideal breathing routine, and I turned it into code. It's called Being. There's a free version on my website, and then you can buy the app on your phone.
Starting point is 01:10:36 And I'm releasing a book soon, and it's going to be called Skills, Not Rules. It's a guide for parents in the digital age. Cool. What do you need to know? What do you need to do? So, yeah, please go to my website and join my mailing list and follow along.
Starting point is 01:10:51 Cool. Very good. Well, thank you so much, Ben. I really learned a lot and I appreciate you taking the time to come on. Thank you very much for having me. I appreciate it as well.
