Senators put executives from YouTube, TikTok and Snapchat on the defensive Tuesday, questioning them about what they’re doing to ensure young users’ safety on their platforms.

In Tuesday's subcommittee hearing, one key senator said that YouTube, TikTok and Snapchat are offering only “tweaks and minor changes” in their operations to ensure young users’ safety amid rising concern over the platforms’ potential harm to children.


What You Need To Know

  • Sen. Richard Blumenthal, D-Conn., charged that YouTube, TikTok and Snapchat are offering only “tweaks and minor changes” in their operations to ensure young users’ safety amid rising concern over the platforms’ potential harm to children

  • The panel took testimony recently from a former Facebook data scientist, who laid out internal company research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens

  • Sen. Blumenthal called this "a Big Tobacco moment" for tech: "It is a moment of reckoning. There will be accountability."

  • The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy, with the aim of developing legislation to protect young people and give parents tools to protect their children

“Everything you do is to add more eyeballs, especially kids’, and keep them on your platforms for longer,” Sen. Richard Blumenthal, D-Conn., said at the start of a hearing by the Senate Commerce subcommittee on consumer protection that he heads.

Citing the harm that can come to vulnerable young people from the sites — ranging from eating disorders to exposure to sexually explicit content and material promoting addictive drugs — the lawmakers also sought the executives’ support for legislation bolstering protection of children on social media. But they received little firm commitment.

"The problem is clear," Sen. Ed Markey, D-Mass., said. "Big Tech preys on children and teens to make more money. Now is the time for the legislative solutions to these problems, and that starts with privacy."

The panel took testimony recently from a former Facebook data scientist, who laid out internal company research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is widening its focus to examine other tech platforms, with millions or billions of users, that also compete for young people’s attention and loyalty.

“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, Blumenthal said.

“This is for Big Tech a Big Tobacco moment,” Blumenthal said. “It is a moment of reckoning. There will be accountability. This time is different.”

To that end, Markey asked the three executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc. — if they would support his bipartisan legislation that would give new privacy rights to children, and ban targeted ads and video autoplay for kids.

In a lengthy exchange as Markey tried to draw out a commitment of support, the executives avoided providing a direct endorsement, insisting that their platforms already are complying with the proposed restrictions. They said they’re seeking a dialogue with lawmakers as the legislation is crafted.

That wasn’t good enough for Markey and Blumenthal, who perceived a classic Washington lobbying game in a moment of crisis for social media and the tech industry. “This is the talk that we’ve seen again and again and again and again,” Blumenthal told them. Applauding legislative goals in a general way is “meaningless” unless backed up by specific support, he said.

“Sex and drugs are violations of our community standards; they have no place on TikTok,” Beckerman said. TikTok has tools in place, such as screen-time management, to help young people and parents moderate how long children spend on the app and what they see, he said.

"I encourage all the parents that are listening to the hearing today to take an active role in your teen's phone and app use," Beckerman said in his opening remarks.

The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.

Early this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for users under 18.

Pressed by Sen. Amy Klobuchar, D-Minn., about a 19-year-old said to have died from counterfeit pain medication he bought through Snapchat, Stout said, “We’re absolutely determined to remove all drug dealers from Snapchat.” She said the platform has deployed detection measures against dealers but acknowledged that they are often evaded.

Stout made the case that Snapchat’s platform differs from the others in relying on humans, not artificial intelligence, for moderating content.

"Snapchat was built as an antidote to social media," Stout said.

Snapchat allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers.

Though only 10 years old, Snapchat is used by an eye-popping 90% of 13- to 24-year-olds in the U.S., the company says. It reported 306 million daily users in the July-September quarter.

Miller said YouTube has worked to provide children and families with protections and parental controls, such as time limits, to restrict viewing to age-appropriate content. The offshoot YouTube Kids, available in around 70 countries, has an estimated 35 million weekly users.

"Between April and June of this year we removed nearly 1.8 million videos for violations of our child safety policies, of which about 85% were removed before they had 10 views," Miller said in her opening remarks.

“We do not prioritize profits over safety. We do not wait to act,” she said.

The three platforms are woven into the fabric of young people’s lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.

The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy. Blumenthal specifically asked the executives whether independent research had been conducted on the platforms’ impact on young people. He said the lawmakers want to receive information from the companies on such research soon.

TikTok, testifying before Congress for the first time, received especially fierce criticism during the hearing, particularly from conservative Republican lawmakers who highlighted its Chinese ownership. The company says it stores all TikTok U.S. data in the United States, with a backup facility in Singapore.

“TikTok actually collects less data than many of our peers,” Beckerman said.

Sen. Ted Cruz, R-Texas, told Beckerman that he dodged questions more than any witness he’s ever seen in Congress.

"You have dodged the questions more than any witness I have seen in my 9 years serving in the Senate," Cruz said. "You answer non sequiturs and refuse to answer very simple questions.

When a witness does that, it is because they are hiding something," Cruz continued.

TikTok’s privacy policy states, “We may share all of the information we collect with a parent, subsidiary or other affiliate of our corporate group.” Senators drilled down on whether “other affiliate” includes ByteDance and what that means for Chinese access to data.