Samuel Levine recently gave a speech at Fordham University Law School in New York. As director of the Bureau of Consumer Protection at the Federal Trade Commission, Levine highlighted the ways the agency is working “toward a safer, freer, and fairer digital economy.”

Levine outlined three goals the FTC has for a better digital economy.

“Our work reflects a concerted strategy to make our digital economy work better for people, rather than a handful of tech giants,” Levine told attendees at the law school’s Manhattan campus.

Establishing a zone of privacy on the internet

Levine said that at a high level, this part of the solution is straightforward.

“Firms need to collect and retain less data about us, and secure it better. Yet the behavioral ad-driven business model that has shaped the internet for decades has pushed firms in the opposite direction — contributing to privacy abuses growing worse, and data breaches growing ever more catastrophic. And until very recently, there was little in the way of government pushback — binding efforts to realign incentives in the public interest, rather than simply urging companies to disclose what they’re doing,” Levine said.

“But today, that is changing. We are seeing significant momentum in Congress and in states across the country to pass privacy legislation. And at the FTC, we are securing real limits on the handling of people’s data that people did not believe was possible just a few years ago,” he continued.

Levine said he supports both state and federal efforts to pass strong data protection legislation.

“Indeed, these efforts are urgent. But at the FTC, we are not just watching and waiting. We are undertaking a concerted strategy to demonstrate how the FTC Act requires substantive protections for people’s data, rather than simply more disclosures,” he said. “And the wins that we are securing point the way toward how our digital future can afford us more autonomy, more privacy, and more freedom than is possible under the status quo.”

Making the internet less like a casino

Levine explained why he chose to compare the web to the gambling halls so prevalent in places like Las Vegas and Atlantic City.

“One of the key consequences of our status quo of unchecked surveillance has been the emergence of a casino-like interface across the web, with firms using sophisticated behavioral techniques — refined and perfected by hoovering up data — to manipulate us. That is why another key goal for the FTC is to make the internet safer by cracking down on harmful online interfaces,” he said.

“We have a lot of work to do,” Levine continued. “In 2022, we published a staff report warning about the increasing prevalence of dark patterns across the internet. Gig platforms are reportedly using nudges and other gaming-like features to keep drivers on the road. Investment apps are designed to lure users into making bigger and riskier bets. And as many firms turn from a model of selling goods to a model of selling subscriptions, too many of them are embracing user interfaces that obscure fees or deter cancellation.”

Levine pointed out that the FTC’s work to limit overcollection is addressing one of the root causes of this phenomenon, and the agency is also challenging harmful and coercive interfaces more directly.

“Our work is not going to fix the internet overnight,” he said. “And I worry constantly that for kids today, who are facing an onslaught of harmful design practices, a better internet may come too late. But as with our efforts on data minimization, it’s important to step back and recognize the leap we’ve made. Manipulative design choices do not always involve misleading claims.

“But through our rulemaking, our unfairness actions, and our forward-leaning remedies, we’re making clear that they are not beyond the reach of our authority. And this shift — while early — creates a path for more assertive actions across the board,” Levine went on to say.

Ensuring AI works for us, and not the other way around

As seems to happen in any conversation about technology nowadays, Levine touched on artificial intelligence when describing the FTC’s third goal.

“I don’t need to tell you that we are seeing a lot of hype around artificial intelligence. At the FTC, just as we examined the broader credit ecosystem in crafting the Holder Rule, we are looking at every layer of the AI tech stack, including cloud infrastructure, microprocessors, and foundation models. Today, I want to focus particularly on AI tools, where the FTC is working to ensure they work for people, and not the other way around,” he said.

“In these early days, people are using AI tools in all sorts of exciting ways — from learning a new language to disputing medical bills,” Levine continued. “But we know from the development of the web over the last three decades that these dynamics can change. Young people today might enjoy using image generation tools to imagine how they’ll look in the future. But these same tools — trained on millions of faces — can be used by private and public actors alike to surveil and manipulate us in harmful ways.

“Likewise, while job applicants might appreciate being able to enhance their resume with the help of AI, we may soon find employers relying on AI recommendation systems that replicate discriminatory practices we’ve tried to eradicate in the real world,” he went on to say.

Levine pledged that the FTC “will not repeat the mistakes of the 2000s,” a reference to the regulator’s failure to act more swiftly when the internet was in its infancy.

“First, we want to make sure AI tools aren’t used to defraud people. We’re deploying our rulemaking authority to arm us with new tools to combat AI-powered impersonation frauds, and to hold accountable firms that provide the means and instrumentalities to commit such fraud,” Levine said.

“We’ve made clear that AI robocalls are not exempt from our Telemarketing Sales Rule. We launched a groundbreaking Voice Cloning Challenge to generate ideas on tools to fight back against voice cloning fraud. And we proposed a rule cracking down on firms that generate fake reviews — yet another online scourge that AI threatens to turbocharge,” he continued.

“We also want to make sure that AI tools are developed responsibly, and are not trained on illegally collected data. We’ve put out business guidance making clear that firms can’t sneak changes into their terms of services to hoover up more data to train AI, and we’ve laid the groundwork for challenging such practices in a recent enforcement action,” Levine went on to say.

Final thoughts

Levine wrapped up his prepared remarks by noting that the FTC received more than 60,000 comments on its proposal to ban “junk fees” and more than 27,000 comments on its proposal to ban noncompetes.

“Having such broad engagement with our work is something the FTC has not seen in decades,” Levine said. “Whether you look at our rulemaking docket, watch one of our open commission meetings, or attend one of our recent workshops, it is clear we have tapped into grassroots frustration with how markets are working. And our engagement with the public is helping us lay out clear proposals for how we can make markets work better.

“So, at the FTC and across the government, the tide is turning. We are actively tackling the biggest problems facing the public, and we are building a track record of real wins for the American people,” he concluded.