We live in a world of risk, especially digital risk. I don’t know anyone who has been physically robbed beyond someone breaking into a parked car. I know many people, however, who have been digitally robbed: stolen credit cards, Social Security numbers, the whole bit. The world of online commerce is just flat-out tricky and comes with risk, but companies that offer online services take this very seriously. If your credit card shows an odd charge, you get notified. If you want to wire money, you have to go through multistep authorizations. If you have something delivered by Amazon and someone steals it, Amazon will usually issue a refund if you’re within the correct timeframe and meet other criteria. Online commerce is, therefore, acceptably safe because companies need to make safety a feature. The reason social media is so risky is that, quite to the opposite, predation and high risk are features of the platforms.
If we look back to the original iCloud, iPhones, and social platforms, some companies took real, serious steps to work with law enforcement. If you were a detective, you could gain full access to someone’s phone or cloud storage pretty easily. Over the last decade, that has gotten harder and harder and harder. Now you have to get a full search warrant to attempt to see one picture, which means if you suspect an individual of possessing hundreds of images of child sexual abuse material, you need hundreds of search warrants just to reach evidence that is right there. Snapchat has always been predator-friendly, as users enjoy end-to-end encryption and the pictures sent back and forth get deleted after they’re viewed. But instead of avoiding that model, companies like Meta are actually leaning into it and giving users end-to-end encryption. Why, some ask? Why would we do this when we know it makes it harder for law enforcement to prosecute? Why would companies do this when they know young people are all over these fun, social apps? Why would anyone do this when we know it leads to predation? Because again, predation is a feature of these apps. It’s not an unfortunate accident.
Social platforms could stop predation quickly if they wanted to, or at least cut it way, way back. We could treat these platforms like we treat financial institutions. We could have age verification for all parties, especially adults wanting to interact with kids. We could stop employing end-to-end encryption, stop making evidence harder to find, stop deleting all photos. We could take meaningful strides to protect our children like we protect our money, but that’s not what makes money for social platforms. If they stop enabling predation, they turn off millions and millions of users who go there expressly to prey on children. If they stop deleting photos and encrypting messages, people stop sending nude photos, which means millions of teens would stop using the platforms. If they put a stop to this nonsense, the companies perpetuating it start losing billions and billions of dollars, and let’s face it: our kids aren’t worth that much to the social media giants. The question is, knowing what we know about how these companies effectively sell our children to predators, how much are our kids worth to us?