(Bloomberg) — Some of the tech industry’s most prominent and powerful leaders will descend on Capitol Hill Wednesday for a Senate hearing focused on protecting children online.
The CEOs of Meta Platforms Inc., X, TikTok, Snap Inc. and Discord are scheduled to testify. Congress has increasingly scrutinized social media platforms as mounting evidence suggests that overuse and the spread of harmful content can damage young people’s mental health.
Several bipartisan proposals aim to hold tech companies accountable, strengthen protections for young users and stop the sexual exploitation of children online. Yet numerous technology trade groups and civil liberties organizations have criticized many of the proposed measures as flawed and counterproductive, arguing that they would worsen online privacy and security if implemented. A handful of social media companies, including TikTok, owned by ByteDance Ltd., and Meta, are facing lawsuits in California that allege the companies were negligent and ignored the potential harm their platforms caused teens.
Questions from elected officials rarely stay narrowly focused on the topic at hand when tech CEOs visit Washington, especially during hearings on online content moderation, so expect Wednesday’s session to range widely. Here’s who from each company will testify and what they are likely to discuss. Read more: Meta, X and TikTok face senators’ scrutiny over children’s online safety
Mark Zuckerberg — Meta
Meta, owner of the social networking apps Facebook and Instagram, has faced significant backlash in recent years over its child safety practices. Whistleblower disclosures, news reports and academic research have found that the company’s sites can harm the mental health of young users and connect networks of predators to child sexual abuse content. In October, more than 30 states sued Meta, claiming its social media apps push harmful content to young people. Meta had plans in 2021 to create a version of Instagram for children under 13, but later scrapped them after criticism over Instagram’s impact on teens’ mental health.
CEO Mark Zuckerberg, who has testified extensively before Congress in the past but has recently turned away from policy issues, will focus on the company’s efforts to improve child safety. Earlier this month, the company announced plans to tighten default messaging settings for teens on Instagram and Facebook, as well as restrict teens from seeing age-inappropriate content. Meta has an advertising policy that prohibits marketers from showing certain types of ads to teens or targeting them based on certain factors, such as gender or their activity on the network.
Linda Yaccarino — X
Linda Yaccarino will appear before Congress for the first time as CEO of X, a role she took on last June. Former NBCUniversal advertising boss Yaccarino has spent her first eight months on the job trying to win back advertisers and convince skeptics that X’s owner Elon Musk still cares about oversight of the social network. Musk has spoken or posted several times about the importance of protecting children online, making it a key part of the company’s public campaign to regain user trust and approval.
Yaccarino was in Washington this week to meet with senators ahead of Wednesday’s hearing and talk about the company’s efforts to combat content related to child sexual exploitation. During those meetings, she emphasized that X is a very different company from its predecessor, Twitter, and during the hearing she will likely attempt to further distance X from it. The company announced this weekend that it will build a new trust and safety center in Austin, with staff primarily focused on combating content related to child sexual exploitation. Yaccarino is also likely to face questions about the rise of AI-generated content, including explicit content, and its spread on X. Last week, explicit images that appeared to depict Taylor Swift circulated on X for hours before being removed, generating millions of views and raising questions about the platform’s ability to quickly moderate offensive and illegal posts.
Evan Spiegel — Snap
CEO Evan Spiegel oversees Snapchat, an app popular with teens that focuses more on personal messages than public posting. But that hasn’t protected it from criticism. Snap is facing a lawsuit in California filed by families who claim their children died of overdoses after buying drugs through the app.
In 2022, Snap introduced a feature that allows parents or guardians to see certain activity on their child’s account and implement controls, such as whether their child can interact with the company’s AI chatbot. Last year, the app also introduced a strike system for accounts that publicly post content to Stories or Spotlight that is inappropriate for teens.
Before Wednesday’s hearing, Snap became the first tech company to endorse the Kids Online Safety Act, breaking with trade group NetChoice’s position on the bill.
Shou Chew — TikTok
CEO Shou Chew returns to Congress nearly a year after his first solo testimony before the House of Representatives. He was then questioned about child safety concerns surrounding the addictive nature of TikTok and its content promoting eating disorders, drug sales and sexual exploitation. Chew, who faced a confrontational audience at that hearing, argued that these issues are not unique to TikTok.
Last year, the company introduced a default one-hour time limit for users under 18, after which the app asks for a passcode to continue watching. Users who say they are 13 to 15 years old get a private account by default and cannot send messages. Like other apps, TikTok has a dashboard that can share usage information with parents and guardians.
Chew is also likely to face questions about parent company ByteDance and its relationship with China. Senators may also raise recent hot-button issues unrelated to children, including perceived bias around conflicts such as the war between Israel and Hamas, as well as the spread of AI-generated videos.
Jason Citron — Discord
Originally a chat app for gamers, Discord has been involved in several high-profile investigations into child predation, extremism, and even terrorism. Today, Discord is mainstream among millennials and Gen Z for daily communication with friends and online acquaintances. In 2021, the company reported 150 million monthly active users and even explored an acquisition by Microsoft for $12 billion.
That increased popularity has brought a higher level of abuse. According to data from the National Center for Missing and Exploited Children, reports of child sexual exploitation on the platform rose nearly sixfold to 169,800 between 2021 and 2022. That figure is 73% higher than X’s, although the increase is also partly due to better detection methods.
CEO and co-founder Jason Citron will represent the company before the committee and discuss initiatives to protect children on the platform. That likely includes its new open-source model for detecting previously unseen child sexual abuse content and its partnership with the cross-platform child safety organization Lantern. In late 2023, Discord introduced new features to help teens and families better manage their online experience.
–With help from Aisha Counts and Oma Seddiq.