Keeping Your Family Safe Online in 2026

April 13, 2026
Hey, Jason, welcome back. Thanks. It's been a few months since we last talked, and I think we've got a good conversation ahead of us. Last time we talked a lot about emerging technologies and what's happening in the threat landscape. Today I want to pivot the conversation to something that hit home for me as a parent of 13-year-old twins: how this is affecting families and children, and what you're seeing with respect to threat actors, people with pretty bad intentions who want to exploit children. What's happening?

Yeah, and it's just getting worse, Chris. Anybody can Google what's going on in the world right now, and it's not pretty. I think we're going to have a good conversation about how parents can help protect their children online. There's a certain segment of the population that's fairly tech savvy, and they know what to do to protect themselves. But then there are folks who aren't as tech savvy, and I'm thinking about minors, I'm thinking about the elderly. I think that's a good place to start today.

Yeah, so I've got 13-year-old twins, and I'm dealing with this every single day. They think they know everything, just like we did when we were kids. They have access to a lot of these technologies, and it's hard as a parent. You've got to balance it: social media is a part of the world we live in, so you can't just block them off from it. But how do you help them enter social media and engage with it in a safe fashion? I work a lot with my son, who is constantly bouncing around, changing accounts, opening new accounts. There are new social media platforms out there, and gaming platforms are a big one. What sorts of attacks are you seeing? Maybe let's go to gaming first. It's an interesting area that a lot of people think about as a social platform for kids.
It is. What are you seeing in the online gaming space?

Yeah, I think it's worthwhile talking about the motivation behind some of these attacks first. You and I have been in cybersecurity for a long time, and traditionally what we've seen is financially motivated attacks. That's people attacking companies or individuals via phishing attacks, other types of hacking, or threats like ransomware, and they do that in order to get a financial return, in order to steal money from an organization or an individual. We can rationalize that. We understand why they're doing it; that kind of makes sense. Obviously it's a crime and they need to be prosecuted, but that's one set of attacks we're seeing. The other set of attacks, the other driver, is what I would describe as nihilists, or folks who want to exploit minors. And that's an entirely different thing, and to me, that's the scarier of the two. I think that's what we're seeing as particularly prevalent with children and online platforms, because they know children typically don't have large bank accounts, so they're not going to steal funds from them, but they are going to attempt to exploit them into doing things that they wouldn't otherwise do.

Yeah, as a parent, that's incredibly scary. And as you said about motivation, we unfortunately understand people who are looking for money, right? That makes sense. This other group that's looking to exploit minors is something I don't necessarily understand the motivation behind, which means I struggle to figure out how to teach my children to stay safe in that environment. What are these threat actors doing? How are they actually trying to get at minors, as an example?

Well, think about it. When we were kids, my parents always warned me: don't talk to strangers.
Beware of the guy who pulls up in a van and offers you candy. And those threats are real. I actually had an incident like that when I was 10 years old, walking back from a swimming pool by myself, when a guy I didn't know pulled up in a car and tried to get me into the vehicle. Those things don't happen as much anymore, because of cameras and cell phones and everything else, but there's an online equivalent now, where you've got these attackers trying to coax children off of these platforms, whether it's Facebook or Roblox or TikTok or Discord, and then get them into more private chats, whether it's Telegram or Signal or private Discord groups. And that's where they use psychological pressure to coerce these children into doing things that they wouldn't otherwise do.

Yeah, and from personal experience, these bad people (I'm not sure there's a better term for them) are offering things that are enticing to kids. It's not candy, but it might be Robux if you're on the Roblox platform, or it might be an account with a bunch of skins if you play Fortnite or something like that. They're offering something that has a lot of value to the child in order to gain trust, in order to gain familiarity. And then: oh, by the way, it's easier for me to talk on this other platform, do you have an account there? So one of the things as a parent, when my child comes and says, hey, I want to add Discord, or I want to add Snapchat, or I want to add Telegram or something like that, is asking: why?
Who are you talking to? Making sure you're aware of what sorts of conversations they're having is incredibly important, because it starts off very innocent in many cases, especially in their minds, and then it can quickly accelerate to something more malicious.

Yes, exactly right. When children are online talking to other folks on these platforms, they just assume they're other children their age. But they might not be, right? These could be adults who are very well practiced in psychological techniques to entice these children to befriend them and pull them off into areas where they can have darker types of conversations. Another threat vector, another way these folks operate, is to get them to go to links on the web that contain malicious software, which will download onto the child's computer or phone and start to pull off personal information that can then be used against that child. Or they'll send them phishing emails, which essentially do the same thing: they can get their username and password by getting them to click on a link that they control. So there are various ways these attackers can get information about the minor and use that information to get the minor to do whatever they want them to do.

Yeah, and in some cases these threat actors have financial motivation as well. A lot of times they know a child could be using a parent's computer (not a lot of kids have a dedicated gaming computer), and that computer could have passwords and credentials and access to things you wouldn't want them to have access to. And then, exactly as you said, they lead them in: why don't you click on this link? This will get you a free account. This will get you Robux.
It'll get you something that you're looking for. And then, all of a sudden, you're exposed. A lot of those things are silent listeners, too. So as a parent, make sure you've got the right malware detection set up on your PC, and again, talk to your children about awareness: if it's too good to be true, or if it's something you really want, let's speak about it before you follow through.

Yeah, and I hate to say this, but frankly, that's the best-case scenario, right? The best case is that they're trying to somehow siphon some money from the child or the child's parents. The worst case is that they're going to blackmail or coerce these minors into doing things like sending them nude photographs, or harming themselves, or harming other people. Some of these people, and I don't use this term lightly, are evil. There are evil people out there exploiting children on these platforms.

Yeah. The number of stories where children have harmed themselves, or gotten involved in things they really shouldn't be, and then the guilt and everything that follows, is really scary. So you talked about your son. What controls do you have in place, other than saying "don't do this"?

Yeah, a couple of different things, depending on the OS and the platform of the phone. One thing you can do is set timers around how long they can be in these apps and at what times they can be using them, plus controls on them downloading new apps onto their phones.
There are similar technologies you can put on any of their devices, whether it's an iPad or a laptop. Making sure that I have control over what's being installed is a really important thing, because it allows me to engage before they put some new platform or technology on, and then I can talk to them about it. Putting controls around the screen time they have for any particular app is also important, because a lot of times when kids are getting pulled into this, they're not only being pulled into places you wouldn't want them (more private chats), they're being pulled in at different times. Later at night, when it's quieter, threat actors know that parents are less likely to be involved, so they prefer to engage at those times rather than in the middle of the day when people are around. So again, making sure there are controls on when they're using these apps is important, and then having a lot of dialogue around what is normal and okay behavior. One of the really interesting things, especially in gaming, is that kids' engagements happen with a lot of adults, because gaming is so popular. We grew up with gaming, and it's become part of our core culture, so you have a lot of adults who game. They actually play Call of Duty, or Fortnite, or whatever it might be. So kids are used to hearing adult voices, especially if there's voice chat on these games. And then there are things like proximity chat, which is something I recommend you turn off in games; don't allow your children to be involved in proximity chat.
Proximity chat, just in case you're not aware, is when you're in a game and there are other people you don't know inside that game who are in your proximity: they can actually hear you talking to the group you're with, and they can engage with you and talk to you as well. So again, that's an uncontrolled engagement with whoever is playing on the platform, and you don't know what goes on in that conversation. So I make sure that proximity chat is turned off in any of the games my son is playing. I usually know the parties, so if he's playing with three or four different people, I'm at least aware of who they are and how he's met them. And overall, we have a pretty important discussion around what you talk about and what you don't: you don't talk about addresses, you don't talk about anything inappropriate, you don't talk about bodies, you don't talk about races. There's just a series of things you don't speak about in order to keep the environment safer.

Other than disabling proximity chat, is there anything else these platforms are doing to assist parents, to help control these situations?

For the most part, no. There's a pretty big gap between what these platforms are required to do and what they want to do in order to protect children. You can see there's a lawsuit going on with Facebook at the moment, with respect to an insider who came out and said there are hundreds of thousands of cases of child solicitation happening every single day on the Facebook platform, and there is very little, if anything, that they're doing to control it.
You're seeing legislative activity and things like that trying to rein it back in, but there aren't a whole bunch of safety features around what sorts of messages are coming across, or the content of those messages, that would push those back to a parental authority figure who can engage if needed. So really, it's a lot about you taking a proactive approach: understanding what apps they have, who they're talking to, and making them aware of what's an appropriate conversation to have or not have.

Yeah. This is very similar to our life in corporate cybersecurity, right? We've worked with federal law enforcement (Secret Service, FBI), amazing people, but at the same time they just don't have enough resources to chase all this stuff down. So it falls back on the platforms, and it sounds like they're not always doing the best job; they're probably overwhelmed as well. So that leaves the parents. The parents are the last, or maybe the first, line of protection for children who are online. Yep.

We talked a little bit about these individuals who are looking to exploit children. I think there was an article that came out recently, I believe in The Guardian, about a study done out of the UK on this Com group that was specifically focused on targeting children for sexual abuse, self-harm, or suicide, which is really concerning. Have you heard of that?

So I haven't heard about the Com group specifically, but these sorts of groups have been active for several years now. Probably the best-known one is the 764 network, which started out of Texas. It's probably similar to what you're talking about: they coaxed minors, primarily from Minecraft, into private Discord groups, then blackmailed those minors to send nude photos, harm their pets, or even carve usernames from the Discord group into their bodies. Absolutely vile individuals.
The good news is that the founder of that group, who, interestingly, was 15 years old when he started doing this, was charged and subsequently sentenced to 80 years in prison in 2023. The bad news is there's been a ton of copycat groups sprouting up since then doing very similar things. So the Com group may be similar to the 764 network.

That's incredibly scary. I want to pivot into some of the ways they get these kids off of the platforms, and I think phishing is one thing they're using pretty effectively. What are you seeing with phishing attacks, and how are they getting minors out of the platform and into different environments?

Yeah, phishing is interesting, right? You'd think that would be a solved problem by now. Phishing has been going on for 30 years at this point, and the big email providers, like Google with Gmail, still haven't solved it. There's a lot of corporate software that will help protect companies against phishing attacks, but from an end user's perspective it's still very, very challenging. I don't know about you, but I still get emails and texts sometimes where I think, well, this looks real, let me take a closer look. So I don't think phishing is going away anytime soon. If it's challenging even for tech-savvy people, for minors it's going to be almost impossible sometimes, and the same goes for elderly people who didn't grow up on email like we did. Another means of getting minors off of these platforms is what I would call psychological blackmail. Oftentimes they're just repeating a playbook they know to be successful. They know what words to use to coax these children off of the platform, because they've done it before, or they've seen other people do it before. So it's not like they're inventing a new routine every single time.
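For the more technically inclined parents, the phishing tactic described here often hinges on a link whose visible text names a trusted site while the underlying URL actually points somewhere else. A minimal sketch of that check in Python follows; the domains shown are made-up examples for illustration, not real addresses:

```python
from urllib.parse import urlparse

def link_looks_suspicious(display_text: str, href: str) -> bool:
    """Flag links whose visible text names a different host than the real target."""
    actual_host = (urlparse(href).hostname or "").lower()
    shown = display_text.strip().lower()
    # If the display text looks like a domain, it should match the link's real host.
    if "." in shown and not shown.endswith(actual_host):
        return True
    return False

# Visible text claims roblox.com, but the href goes elsewhere: suspicious.
print(link_looks_suspicious("www.roblox.com", "http://free-robux.example.net/login"))  # True
# Visible text and real destination agree: not flagged.
print(link_looks_suspicious("www.roblox.com", "https://www.roblox.com/"))  # False
```

A check like this is only a heuristic; lookalike domains and URL shorteners can still slip through, which is why the conversation keeps coming back to talking with kids before they click.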
They're just executing a script they've seen work before. And so it goes: befriend this individual, maybe make them think a romantic relationship is budding, or, hey, there's something really interesting I want to show you over on Discord or Telegram or wherever else. Rarely does it start with a threat. It starts with: hey, I'm a friend, and I want to show you friendly things. And before you know it, things can spiral very, very quickly.

Yeah, and what's concerning about this, having kids myself, is that both of them have had people reach out and try to get them off of the platform, or befriend them, who turned out to be adults living in foreign countries. Had I not been paying attention, they could have become one of the many stories out there. You can just Google and see the number of kids who have sent inappropriate photos, harmed themselves, committed suicide. Horrible, horrible situations. So it's likely happening to most children, and I don't think people are aware of the proliferation of this sort of thing, and technology is just making it easier and easier. You mentioned earlier that AI has lowered the barrier for threat actors to go after this. We've talked about how phishing has been around for 30 years and is getting more and more sophisticated. The stories are out there. What are you seeing, and what's the focus for some of these threat actors?

Yeah, AI is amazing, right? We use it every day in our jobs and in our personal lives. But there's also a dark side to it, and that's especially prevalent with deepfakes, and not just video but audio as well. AI has gotten so good at this point that these attackers, these vile individuals,
I'm not sure what the right term for these folks is, but they could take a child's photo from Facebook and create a nude deepfake of that child. Okay, so they have a fake picture of a nude child. What are they going to do with that? They can blackmail the child at that point and say: hey, I have a nude photo of you, and I'm going to make this public unless you do the things I want you to do. And that technology is commonplace these days. There are open-source models, so if you have a beefy computer at home, you can create these things inside your own home. You don't have to use some fancy AI provider anymore; you can do it yourself. And the same with voice. What I mean is cloning children's or minors' voices, and that's very easy to do as well. If you have a minor posting a TikTok video or a Facebook video, all it takes is 30 seconds of them talking, and attackers can replicate that voice to say whatever they want it to say. Maybe it's not perfect, but they could, for example, make a fake phone call to that minor's grandparents, pretending to be that child asking for money, and an elderly person who's not up to date on these technologies would struggle to tell. It's very hard to distinguish what's actually real from what's fake these days.

Yeah, one of the challenges is that as we become more digital, our identities become more digital. And for kids, it's almost the totality of how they engage with people. All of their friends, if they're on Facebook or Instagram or any of these platforms, all of their connections, you can just go out and see. So an attacker can easily look at all their connections and say: hey, I'm going to send this to your parents, to your friends.

That's exactly right. And especially at their age, kids' brains aren't fully formed yet.
The world around them is their whole world; it means everything to them. So they get really pressured by these threat actors, and will do almost anything to keep a threat actor from embarrassing them or sending things to their parents or friends. And again, they get scared. They don't want to tell their parents what's going on.

That's right, because they're embarrassed by it.

Exactly. So as a parent, I try to tell my children: this is a safe space for you to bring issues like this. If something happens, it's okay; it's not a reprimanding space. If something seems off, or you're worried about something, bring it forward instead of internalizing it and trying to deal with it on your own. That's an important aspect and an important conversation to have with your children.

Yeah, absolutely. We talked about this earlier in the conversation: the first line of defense is the parents. Just like when we were growing up, don't talk to the guy who wants to offer you free candy. The medium has changed; it's no longer the physical world, it's now the online world, but the message is the same.

Yeah, and I think one of the things that's changed is the physicality. When we were kids, the person would actually have to be there. The likelihood of someone physically showing up on your street, at the time you're outside without a parent, with some candy, is relatively low. Now, their ability to talk to hundreds, if not thousands, of children in parallel with very little effort means you don't need that proximity. If you have an internet connection, you can be anywhere in the world and be engaging with your kid, or your nephew or niece or grandchildren, and having very adult conversations with really bad intent behind them. Yeah, absolutely.
And I don't know where this is going to go in the next few years. It makes me very uncomfortable and very uneasy when I think about it, because I don't think it's going to get better. I think it's only going to get worse, and parental oversight is going to become so much more important. You're a good dad, a good parent, but there are a lot of parents out there who are not as engaged with their children, and those children are going to be more susceptible to these things online, unfortunately.

Yeah, what makes me scared is this: I do cybersecurity for a living, I'm very proactive about talking to my kids about it, and I know they've been approached. So I feel good that they can bring that to me. But if my kids have been approached, what about other kids whose parents don't know about this, who haven't talked to them at all, and who are trying to deal with this on their own? I think that's what scares me the most. Many people don't have this level of knowledge, education, and engagement; they don't know what's going on. That's a really big concern. You don't want a 13-year-old, or a 10-year-old, or an 8-year-old, or a 15-year-old trying to navigate this on their own.

Right. And I guess the other aspect that's different from when we were children: the hypothetical man with the puppies in his van was more than likely a lone actor, acting by himself. These days, the people exploiting minors tend to congregate in communities online, and they share tactics, they share techniques, and they share the exploitation of these minors, which helps elevate them within their community. That amplifies the entire problem.

Yeah, they've almost become professionals at exploiting weaknesses in the minor population, which is even scarier.

Yeah, it's terrifying.
So you mentioned that your son enjoys playing Roblox, and that you give him some guidance. Maybe you could go step by step: what are the exact instructions and guardrails you give him to keep him safe?

Yeah, I talked about proximity chat, which I think is just a good idea to turn off, but we're a bit more stringent. I think of it as almost like a two-factor authentication process: I separate who he's chatting with from the platform itself and the way he meets these people. If he's in Roblox or any other game, whether it's Fortnite or Call of Duty or anything like that, there's no chatting allowed in the platform whatsoever. He's not allowed to chat through the native platform environment with anyone in the game. What that means is it keeps him from chatting with anyone he randomly meets online. There's just no chat; it's turned off, both for him to hear and for him to actually chat with people.

But he can still chat with his friends?

He can still chat with his friends. We have a separate chat platform; in this case we use Discord, because I'm on Discord, I'm in the Discord channel, and I can see what's happening there. And he can only chat with people that he's met: friends he goes to school with, or friends he's in sports with, and we validate that as parents. So what ends up happening is, one, he's only chatting with people he's met in real life through school or some shared community event, and I can validate with those people's parents. And two, there's just no ability for anyone online to start a conversation with him. That means we're turning off text chatting, too. Some of these games have a lot of text capabilities and kids do a lot of text chatting; we turn that off as well. Everything is handled through a separation of the gaming platform from the communication channel.
And I have much more control over that communication channel and who he's actually engaging with that way.

Okay, so it sounds like you've got very strict guardrails set up for your son. I'm curious about his friends, though. Do his friends' parents have similar guardrails set up?

I've encountered both. Some don't even think about it. So when we reach out to engage, or see them at a sporting event or in the neighborhood or what have you, and we talk, like, hey, I want to validate that our kids are communicating with each other online, a lot of them are like, that's an odd question; why are you asking me that? And when I start to explain why, many of them say: tell me more. How do I get more engaged? Should I do something similar? So I've seen a lot who just adopt that same principle, because it's a good best practice. Some, unfortunately, are passive about it and a bit more laissez-faire about managing their kids' presence online. But a lot are like: oh, this is something we should do as a best practice. And I would encourage anyone to try to separate the platform from the communication; that gives you a lot more control over what your children are doing.

And have your children come to you and said that people have reached out to them through nefarious means?

Oh, yeah. So when they first started to move into the digital world (again, they're 13 now), I felt it was important for them to know how to intelligently engage with the internet and digital technologies, because that's the world they're going to grow up in, and essentially the world they need to flourish in. So as we started to introduce them, first with iPads and eventually phones, we had conversations about what was appropriate and what was inappropriate.
A lot of these technologies have evolved so rapidly that the conversation just continues to evolve. But early on, before I actually separated the gaming platform from the platform they talk on, my son came to me about a group of individuals he was talking to. They didn't live in the States; they were actually in Europe. And some of the conversations were adult conversations. They weren't necessarily solicitation, but they were conversations you wouldn't want a 10-year-old engaged in, and he actually said, hey, there are these things going on and these things they're talking about. So we changed the way he was communicating. My daughter, the same age, is also engaged online and likes to play Roblox and a lot of other things. She plays this game (I forget the name) where you can be a horse, and she loves horses. You run around the land and meet other horses, and there are ponies, and you can collect horses and all these things, and then you talk to other people in the area. Again, it's more of a proximity chat. And there was definitely someone on there who, knowing this game is played largely by little girls, was roaming around, meeting and befriending girls, and trying to get them onto another platform. This happened to my daughter in this particular game, and it's one of the reasons I decided proximity chat was no longer a thing. So a lot of these best practices come, frankly, from having very open conversations with my children, having them be open enough to bring these things to me, and then saying: okay, we're going to change the way we're engaging with the technology that you're using.
So is it an exaggeration to say that the majority of children, or maybe even every child, on these platforms is being approached at this point?

I mean, I have a sample size of two, right? But it's 100% for my sample size of two. And I think they're pretty savvy, to be honest with you. So I would venture to guess that it's pretty significant. Again, we talked about that lawsuit involving Facebook: if there are 300,000 incidents a day of someone reaching out to try to solicit a child on the Facebook platform alone, that tells you it's probably pretty prevalent across any platform where attackers can get access to minors. What I will say is, if you're not paying attention to what your children are doing, there's a very high risk that someone has reached out. Maybe your kids are smart enough to catch it, but again, these people know how to prey on the psychology of a child. To your point, there are groups talking about how to do this successfully and how to prey on the vulnerabilities of a child, so they're just going to get better and better at getting what they want. So I would take a very proactive approach to making sure you understand what your children are doing, who they're communicating with, how they're communicating, and how they're securing their accounts overall, so that people don't get into their accounts.