Family IT Guy Podcast
Ben Gillenwater helps families protect children from digital dangers, bringing 30 years of cybersecurity expertise to the parenting journey. His background includes working with the NSA and serving as Chief Technologist of a $10 billion IT company, where he built global-scale systems and came to understand technology's risks at every level.
His mission began when he gave his young son an iPad with "kid-safe" apps—only to discover inappropriate content days later. Despite his deep technical background, Ben realized that if protecting children online was challenging for him, it must be even more difficult for parents without his expertise.
Through Family IT Guy, Ben creates videos and articles that help parents and kids learn how to leverage the positive parts of the internet while avoiding the dangerous and risky parts. His approach bridges the knowledge gap between complex technology and practical family protection, making digital safety accessible to everyone.
Episodes

4 hours ago
Mike McLeod is a speech-language pathologist and ADHD executive function specialist who has worked with over 500 families to eliminate screens since 2016. He hosts the ADHD Parenting Podcast and has two books released in January 2026.

In this conversation, we get into the hard stuff: why screen addiction looks identical to drug addiction, what the withdrawal period actually looks like when you take a phone away, what EdTech has done to education (and how to opt out of school Chromebooks), and why parents need to organize with other families instead of waiting for institutions to fix this.

Timestamps:
0:00 - Introduction
1:00 - Mike's background in ADHD and executive functioning
3:05 - ADHD is really an executive functioning disorder
5:05 - The four pillars of executive functioning
9:55 - How screens stop brain development
12:25 - The youth mental health crisis
14:59 - Suicide rates tripled after the iPhone launched
18:31 - Getting a real ADHD evaluation
20:14 - Screens stole boredom from childhood
26:14 - Why parents keep screens in their kids' lives
29:48 - Four cognitive distortions parents tell themselves
33:40 - What phone withdrawal actually looks like
47:12 - You can't teach a kid to manage an addiction
53:04 - 500 families eliminated screens, zero regrets
58:55 - EdTech is destroying education
63:34 - Life at 18 with vs. without executive function
69:17 - The one thing every parent should do right now

If you're a parent dealing with screen battles at home, you're not alone. Mike has seen this hundreds of times and it does get better: https://www.grownowadhd.com/about/

Check out his new book: https://www.grownowadhd.com/grownow-book/

Family IT Guy helps parents block harmful content, limit screen time, and prevent contact from strangers. Guides: www.familyitguy.com

Subscribe for more conversations with experts who work directly with families.

5 days ago
For the first time, social media executives are being forced to answer to a jury for the impact their platforms have had on children.

This trial in Los Angeles (MDL 3047) brings together claims from more than 2,000 families who allege that platform features such as infinite scroll, autoplay, notifications, and reward systems were intentionally designed to encourage compulsive use in kids and teens.

Evidence presented in court includes:
• Internal memos comparing Instagram to a drug
• Research showing vulnerable teens were especially at risk
• Warnings about beauty filters contributing to body dysmorphia
• Testimony distinguishing "clinical addiction" from "problematic use"
• Allegations linking platform contact to exploitation, drug access, and suicide

At the center of the case is a critical question: are these technology companies, or advertising companies built on capturing attention?

Follow Nicki Petrossi of Scrolling to Death for ongoing courtroom coverage and analysis.

To track the proceedings, search: MDL 3047

Wednesday Feb 11, 2026
Dr. Lisa Strohman spent 30 years studying what hurts kids, from profiling at FBI Quantico after Columbine to serving as an expert witness in New Mexico v. Meta. Her conclusion? The phone in your child's pocket is more dangerous than a gun on your kitchen table. At least the kid knows to be afraid of the gun.

We get into the CDC data, what she saw inside Meta's own research, why 400 girls at one school deleted social media on Valentine's Day, and what happened when she gave her own son Snapchat and immediately regretted it.

Timestamps:
0:00 - Lisa's background: FBI, Columbine, 30 years in digital safety
2:15 - 800,000 kids follow the Columbine ideology
5:10 - CDC data: self-harm spikes after social media
7:00 - False narratives from platforms
9:30 - "A phone is more dangerous than a gun on the table"
12:05 - Inside the case against Meta
19:40 - 400 girls quit social media on Valentine's Day
22:50 - "Tech is a tool, not a toy"
27:00 - Warning signs for parents of girls
33:37 - The expert's own parenting story
39:40 - "I gave my son Snapchat"
43:17 - One thing parents can do this week
45:11 - Digital Citizen Academy
49:45 - Final question

About Dr. Lisa Strohman: clinical psychologist, attorney, and founder of Digital Citizen Academy (digitalcitizenacademy.org). Her free book "Digital Distress" is available at digitalcitizenacademy.org/digital-resources.

Resources:
Family IT Guy: https://www.familyitguy.com
iPhone Setup Guide: http://familyitguy.com/go/iphoneguide

Tuesday Feb 03, 2026
How do you protect your kids online when even adults can't tell what's real anymore?

AI-generated videos, deepfakes, and synthetic audio are not just a tech issue. They are showing up inside the apps our kids use every day, mixed in with cartoons, music clips, and "safe" educational content. Most children, and plenty of adults, are being trained to trust whatever looks and sounds real.

In this episode of the Family IT Guy Podcast, I sat down with Jeremy Carrasco (@showtoolsai), a media producer and AI analyst, to talk about what parents need to understand right now: how AI content is made, how algorithms push it, and how families can spot it before it causes harm.

Jeremy is not guessing from the outside. He has spent years in professional video production, live streaming, and audio engineering. He knows what real human media looks like when it is made by actual people, and where AI still gives itself away.

One of the biggest tells? 👉 AI doesn't breathe.

AI videos can look believable, especially on a small phone screen. But once you know what to listen and look for, the cracks show up fast. Those cracks matter because kids do not have the life experience or media literacy to notice them on their own.

In this conversation, we break things down in a way parents can actually use.

First, AI videos versus deepfakes. They are often treated as the same thing, but they are not. Jeremy explains the difference, why deepfakes tend to be targeted, and why mass-produced AI videos are now flooding platforms at scale, often designed to hook kids with familiar characters, faces, or voices.

Second, why audio matters more than visuals. Parents are taught to watch what their kids see, but listening is just as important. We talk about unnatural speech pacing, missing breaths, flat or mismatched emotion, and why the human voice is still one of the hardest things for AI to fake convincingly.

Third, visual and behavioral red flags parents can learn. Subtle background warping, strange eye movement, awkward timing, and non-human rhythm. These are things media professionals spot quickly, but they can also be taught to parents who want to be more proactive instead of reactive.

We also zoom out to the bigger issue parents are up against. Algorithms do not understand childhood, safety, or values. They understand engagement. A feed that starts with something harmless, like Bluey, Miss Rachel, animal videos, or learning content, can shift quickly after one curious search or autoplay chain. That is how kids end up exposed to disturbing, violent, or sexualized AI-generated content that looks playful but is not.

We talk about:
- Why kids' algorithms are some of the most profitable and dangerous systems online
- How "safe" feeds slowly drift without parents realizing
- Why YouTube Kids is safer than regular YouTube but still not a set-it-and-forget-it solution
- The rise of AI-generated sexualized content involving children
- Why sharing kids online can create exposure parents never intended
- Safer ways to share family photos using privacy-first tools
- Why adults have to act as stewards of their children's digital privacy, even when the platforms will not

This episode is not about fear or banning technology. It is about giving parents clarity in a digital world that is changing faster than most families realize.

If you are raising kids right now, or care about the internet they are growing up in, this conversation is worth your time.

🎙️ Guest: Jeremy Carrasco, Media Producer & AI Analyst
🎧 Podcast: Family IT Guy

Saturday Jan 31, 2026

Tuesday Jan 27, 2026

Monday Jan 19, 2026

Sunday Jan 11, 2026
In this episode of the Family IT Guy Podcast, I sit down with Shawnna Hoffman, CEO of the International Center for Missing and Exploited Children (ICMEC), for a raw and deeply personal conversation about online exploitation, AI-enabled scams, human trafficking, and the growing risks facing kids and teens online.

Shawnna shares her journey from decades in Big Tech and AI leadership to leading a global organization focused on returning missing children to their families. She also opens up about her own family's experience with a long-term online scam that targeted her autistic son, exposing how sophisticated, patient, and psychologically damaging modern online exploitation has become.

This episode covers:
• How online grooming and long-term scams target kids and young adults
• The role AI and social platforms play in exploitation and manipulation
• Why parental controls alone are not enough
• The reality of missing children and trafficking on a global scale
• How ICMEC measures success by one metric only: kids reunited with families
• The difference between facial detection and facial recognition
• Why digital safety requires community action, better safeguards, and real accountability

If you are a parent, caregiver, educator, or anyone concerned about child safety online, this conversation is essential listening.

🔒 Learn more about protecting kids online: https://familyitguy.com
🌍 Learn more about the International Center for Missing and Exploited Children: https://www.icmec.org

Wednesday Jan 07, 2026
Earlier this year, I spoke with Jason Sokolowski about the loss of his 16-year-old daughter, Penelope, after she was targeted by the online criminal network known as 764.

This video is an update, and the situation is escalating.

764 is a decentralized exploitation network targeting vulnerable kids, primarily girls ages 10–17, across platforms like Discord, Roblox, TikTok, Instagram, and Snapchat. The FBI and DOJ are actively investigating, but reports are increasing, not slowing down.

In this video, I cover:
• What 764 is and how it operates
• Why it's designed to turn victims into perpetrators
• Warning signs parents should never ignore
• Why monitoring alone is not enough
• What to do if your child may be targeted

If your child can receive messages from strangers online, they are at risk. This is information every parent needs to hear.

📌 If you suspect exploitation:
Report to the FBI at ic3.gov
Or call DHS Know2Protect: 833-591-5669

Please share this with other parents.

Friday Dec 12, 2025
The internet is constant noise: endless scrolling, reacting, and stimulation. It's rewiring our brains to consume information in tiny bursts, and it's affecting everyone, including our kids.

So what's the antidote? Stillness.

I've spent 30 years in cybersecurity and I help families navigate technology and online safety. One pattern shows up again and again: we can't teach our kids to manage digital chaos if we can't manage it ourselves.

Psychiatrist Dr. Daniel Amen teaches a simple 15-second breathing protocol: inhale for 4 seconds, hold, exhale for 8 seconds, hold again, then repeat several cycles. That longer exhale sends a signal to the body that things are safe and it's okay to calm down.

I looked for an app that uses this pattern without endless menus or decisions and couldn't find one. So I built one called Being - One Minute to Calm. When you open it, it starts immediately. No signups, no subscriptions, just breathing and stillness. It uses gentle haptic taps so you can follow the pattern with your eyes open or closed.

Available on iPhone, iPad, Apple Watch, and Apple TV. Android is coming soon. Search "Being - One Minute to Calm" in the App Store or visit https://www.familyitguy.com/being/

Do you have any experience with meditation or breathwork? Share your story in the comments.
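For readers who like to see the timing spelled out, here is a minimal Python sketch of the breathing cycle described above. The 4-second inhale and 8-second exhale come from the episode description; the 1.5-second hold lengths are an assumption chosen so one full cycle totals the stated 15 seconds. The Being app itself may pace the holds differently.

```python
import time

# Sketch of the 15-second breathing cycle described above.
# Inhale (4 s) and exhale (8 s) come from the episode description;
# the 1.5 s holds are assumed so one full cycle totals 15 seconds.
PHASES = [("Inhale", 4.0), ("Hold", 1.5), ("Exhale", 8.0), ("Hold", 1.5)]

def breathe(cycles=4, sleep=time.sleep):
    """Guide the breathing pattern; returns total guided seconds."""
    total = 0.0
    for _ in range(cycles):
        for name, seconds in PHASES:
            print(f"{name} for {seconds:g} seconds")
            sleep(seconds)  # injectable, so tests can skip real waiting
            total += seconds
    return total

# Four 15-second cycles make one minute, matching "One Minute to Calm".
```

Note that four cycles of 15 seconds give exactly the one minute the app's name promises.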





