Finding a Way Forward on Lawful Access: Bringing Child Predators out of the Shadows
Remarks as delivered.
This past June, in a New England town, a cyber tip came in to FBI agents and local law enforcement. The tip suggested that a 9-year-old girl was being sexually abused. The abuser was using a particular app to send out images of what he was doing to that little girl while remaining anonymous.
Our agents, along with our state and local partners, contacted the app provider. Using legal process, we got information that allowed us to locate the little girl in less than 24 hours. We obtained multiple search warrants, rescued her, and arrested her abuser. In another case over the summer, a different child predator used a different app to distribute sexually explicit images of two young girls—one 12 years old, the other 13. Responding to a tip, agents served legal process on that app provider and located and rescued those two girls in less than 12 hours.
Both of those cases could have ended very differently. Because without the information from the tech companies—both tips and responses to lawful orders—we wouldn’t even have known about those children. And we wouldn’t have been able to rescue them. I gave you just two heart-wrenching examples. Law enforcement receives millions of tips like these every year.
Success stories are great, but the landscape has been changing under our feet. With the spread of user-controlled default encryption, providers frequently can’t identify horrific images within encrypted data. That means tips like the ones that allowed us to rescue the three girls in those examples—those tips just don’t get sent. The harm doesn’t stop. The victims—those little kids—are still out there enduring the abuse.
Only the tips—the information that could help us identify them—disappear. Our ability to use legal process to quickly investigate and save the kids in those images is eroding, too. All too often, vital electronic evidence has been made unavailable through encryption that doesn’t allow for the execution of legal process, including court-approved search warrants.
That’s why we’re here today: to talk about the challenges of default encryption and lawful access and what we can all do together to find a way forward. Today, you’ll hear about other cases like the two I just mentioned. Some of these stories will come from the victims themselves—survivors of abuse who can tell us firsthand about the sobering costs and consequences we all face if we lose the ability to keep people safe. These stories are hard to listen to—and they should be hard to listen to—because no one should ever have to endure what these victims lived through. It’s hard for us to contemplate what those images actually show. Horrific abuse. Scarring, awful crimes against kids, even infants and toddlers. Photographed and videotaped, so it can follow them for years to come.
But if we don’t talk about this, if we don’t confront these real-life horrors happening to real people, if we don’t act soon to address the lawful access problem, it will be too late, and we’ll lose the ability to find those kids who need to be rescued. We’re going to lose the ability to find the bad guys who need to be arrested and stopped. And we’re going to lose the ability to keep the most vulnerable people we serve safe from harm. We just cannot let that happen.
Technology has made life much easier for the good guys—there’s no doubt. But it’s also made life much easier for a wide range of bad guys—including international and domestic terrorists, hackers, opioid traffickers, and child predators. Like other criminals, child predators routinely rely on encrypted phones and laptops to store explicit photographs, exchange illegal media, contact victims, and coordinate with co-conspirators over encrypted messaging platforms.
These devices and platforms have become spaces where vital rules—against soliciting child abuse, against trading in and feeding that abuse, against threatening abuse victims struggling to make a normal life—can no longer effectively be applied.
It makes sense to gather here today, because we’re at a turning point. Some of our partners in the tech industry have been a huge help in getting us the digital evidence we need. Facebook and some other tech companies employ thousands of people to help identify child sexual abuse imagery, and then notify NCMEC—the National Center for Missing & Exploited Children—a vital partner you’ll hear more from later.
That alone is an uncomfortable fact—the sheer volume of this awful imagery really does keep thousands of people fully occupied just handling it, and that’s not even counting the law enforcement response. But uncomfortable facts are still facts. Every year, Facebook provides more than 90 percent of the referrals received by NCMEC—and NCMEC now receives more than 18 million referrals a year, any one of which might be a tip that leads us to the next predator.
In that sense, Facebook is saving lives with those tips. But there are other tech companies that have already chosen to blind themselves to the content on their platforms. Those companies now provide few—if any—leads to law enforcement. We know their platforms host content involving abused children. The companies themselves just can’t identify it anymore, so they don’t warn us about the vast bulk of what’s happening. And some of those companies have millions or even billions of customers—both here and abroad. So there’s a whole lot of abuse going undetected.
Unfortunately, Facebook could be headed in the same direction. Facebook announced a “privacy first” plan in March. Their intention is to make all communications on Facebook and Instagram end-to-end encrypted. If Facebook carries out that plan, it will have access to metadata—for example, the time a message was sent, and its recipient—but not (and I want to emphasize not) the content of any messages, including attached photos and videos.
When it comes to people who create and distribute child pornography, it shouldn’t be hard to see why timestamps and address blocks are poor substitutes—as leads, and certainly as evidence—for the images and videos they are disseminating.
This is incredibly concerning, to put it mildly. When it comes to protecting children, we’re at a real inflection point—and we risk falling off a cliff. Most of the tips Facebook currently provides are based on content. With end-to-end encryption, those would dry up. Facebook itself would no longer be able to see the content of its users’ accounts.
That won’t just stop the tips. It will prevent Facebook from providing content to law enforcement in response to legal process—the content we need to actually identify and locate a victim. The fact that NCMEC receives over 18 million tips a year shows we aren’t talking about some handful of abusers. This is a huge problem. Fighting it with metadata just isn’t going to work. To conduct a search and bring criminal charges in this country, we in the government have to meet a high standard. To convict, an even higher one. Metadata will almost never meet either standard. While an algorithm or AI might flag suspicious patterns of use, that kind of information—standing alone—will rarely be adequate to make a case and bring the perpetrators to justice.
Even when it can, we will find ourselves laboring only on the tip of the iceberg: working the small number of cases that authorities actually learn about, while the vast bulk of the kids who need us remain hidden from view below. The real impact would fall on those victims, those kids. Facebook would transform from the main provider of child exploitation tips into a dream come true for predators and child pornographers. A platform that allows them to find and connect with kids, and with like-minded criminals, with little fear of consequences. A lawless space created not by the American people, or their elected officials, but by the owners of one big company.
But this is not just about Facebook. We’ve got to make sure tech companies—all of them—aren’t taking steps that place content beyond the reach of the courts, or deliberately blinding themselves to what’s happening on their platforms, where so much child exploitation takes place. We’ve got to make sure that companies can’t keep creating unfettered spaces beyond the protection of law. Because there are kids out there we haven’t found, and dangerous criminals we haven’t caught, who are already moving on to their next victims.
So what are we going to do about it? I’m well aware that encryption is a provocative subject for some. Although I will tell you, I get more than a little frustrated when people suggest that we’re trying to weaken encryption—or weaken cybersecurity more broadly. We’re doing no such thing. And dispensing with straw men would be a big step forward in this discussion. Cybersecurity is a central part of the FBI’s mission. It’s one part of the broader safety net we try to provide the American people: not only safe data, safe personal information, but also safe communities, safe schools.
We also have no interest in any “back door,” another straw man. We—the FBI, our state and local partners—we go through the front door. With a warrant, from a neutral judge, only after we’ve met the requirements of the Fourth Amendment. We’ve got to look at the concerns here more broadly, taking into account the American public’s interest in the security and safety of our society, and our way of life. That’s important because this is an issue that’s getting worse and worse all the time.
As FBI Director, I’ve now visited all 56 of our field offices, and I meet frequently with law enforcement leaders from all over the country and around the world. I can tell you that police chief after police chief, sheriff after sheriff, our closest foreign partners, and other key professionals are raising this issue with growing concern and urgency. They keep telling us that their work is too often blocked by encryption schemes that don’t provide for lawful access.
So while we’re big believers in privacy and security, we also have a duty to protect the American people. That’s the way it’s always been in this country; no technological advance or company’s business model changes that fundamental precept. But make no mistake: if we don’t come together to solve this problem, those unfettered, lawless spaces are exactly where we’re headed. So to those out there who are resisting the need for lawful access, I would ask: What’s your solution? How do you propose to ensure that the hardworking men and women of law enforcement, sworn to protect you and your families, actually maintain lawful access to the information they need to do their jobs? What will you say to victims who are denied justice—or left unrescued—in the name of some incremental amount of additional data security?
I know we’ve started hearing increasingly from experts that there are solutions to be had that enable both strong cybersecurity and lawful access. And I believe those solutions will be even better if we move forward together.
Today we’ll hear from experts on the dangers we face if lawful access slips away from us. We’ll hear from state, local, and federal investigators and prosecutors, who see the impact of lawless spaces on the safety of our communities and can tell you from first-hand experience what a world without lawful access to content looks like. And we’ll hear from those outside law enforcement who work to protect our children, and from some of our foreign partners, who are struggling with the same issues we are.
We’ve made a point of inviting the tech industry to attend today, and we continue to reach out to industry to find ways forward that protect both our data and our families—our children. They aren’t on stage today—today is about showing the human cost of technology that undermines the protections our kids deserve. But there will be many more discussions to come. Let me be clear: we are not here to demonize tech, and we couldn’t do our day jobs without technological tools. We know the concerns of some tech companies—about privacy and protecting their users’ data—are laudable. With the context we can provide today, about how data security fits in with all the other flesh-and-blood safety needs of our communities, we’ll be better able to forge ahead.
This summit couldn’t come at a better time. Lawful access isn’t just a future problem; it’s here now. Together, we in government, law enforcement, the victims’ advocacy community, and the tech industry have the power, the ability, and the skills to find a mutually acceptable solution. We’ve put some of the brightest minds in the country on this issue, and we’ve learned that it can, responsibly, be done.
We’re not prescribing a particular technical solution—every company that’s instituting default encryption is different, and the companies themselves are likely in the best position to develop lawful access solutions. We all want safe, secure, private data, but we also want safe and secure communities. And we can have both; I really do believe that. I hope you have a great conference, and I look forward to hearing some of the ideas that come out of today’s discussions.