Instagram chief Adam Mosseri defends app, calls for regulation at Senate hearing

Head of Instagram Adam Mosseri testifies before the Senate Commerce, Science, & Transportation Subcommittee at a hearing titled ‘Protecting Kids Online: Instagram and Reforms for Young Users’

Instagram head Adam Mosseri testified about the company’s efforts to address user safety Wednesday during a Senate hearing on concerns that the social media app is having a harmful effect on the mental health of teenagers.

In prepared opening remarks, Mosseri argued Instagram was working to address the app’s negative effects. The Instagram chief said he was "proud" of the company’s efforts to "help keep young people safe," though he reiterated the company’s call for the introduction of industry-wide regulations to govern how social media platforms operate.

"I recognize that many in this room have deep reservations about our company," Mosseri said, "but I want to assure you that we do have the same goal. We all want teens to be safe online. The internet isn’t going away, and I believe there’s important work that we can do together – industry and policymakers – to raise the standards across the internet to better serve and protect young people."

This Aug. 23, 2019, photo shows the Instagram app icon on the screen of a mobile device in New York. (AP Photo/Jenny Kane, File)

Mosseri agreed to testify at the fifth hearing of the Senate Commerce Committee’s consumer protection panel, which is working to address public concerns about online safety for children. He called for the creation of an independent oversight body that would set standards for the tech industry on key safety policies such as age verification, parental controls, and building what he described as "age-appropriate experiences."

"This body should receive input from civil society, parents, and regulators. The standards should be high and protections universal," Mosseri said. "And I believe that companies like ours should have to adhere to these standards to earn some of our Section 230 protections."

Mosseri also took a shot at rival social media platforms TikTok and Snapchat, arguing efforts to protect users’ mental health are an industry-wide challenge rather than one specific to Instagram.

"The reality is that keeping people safe is not just about one company. An external survey from just last month suggested that more U.S. teens are using TikTok and YouTube than Instagram. This is an industry-wide challenge that requires industry-wide solutions and industry-wide standards."

Instagram head Adam Mosseri takes his seat before the Senate Commerce, Science, and Transportation Committee’s Consumer Protection, Product Safety, and Data Security Subcommittee hearing on "Protecting Kids Online: Instagram and Reforms for Young Users."

On the eve of the hearing, Instagram released a set of tools meant to promote user health. The tools include a "take a break" feature and one that will "nudge" teen users to view a different topic if they have engaged with one for too long.

Instagram said it would release its first tools for parents and guardians early next year.

During his opening statement, the subcommittee's chairman, Sen. Richard Blumenthal, D-Conn., said the country was "in the midst of a teen mental health crisis."

"I believe that the time for self-policing and self-regulation is over. Some of the big tech companies have said ‘trust us.’ That seems to be what Instagram is saying in your testimony," Blumenthal said. "But self-policing depends on trust and the trust is gone."

Blumenthal said Congress was prepared to act with legislation to address the crisis.

Sen. Richard Blumenthal, D-Conn., speaks during a Senate Judiciary Committee hearing on Capitol Hill in Washington, Tuesday, Nov. 10, 2020, on a probe of the FBI’s Russia investigation. (AP Photo/Susan Walsh, Pool)

"What we need now is independent researchers, objective overseers not chosen by big tech but from outside, and strong, vigorous enforcement of standards that stop the destructive, toxic content that now too often is driven to kids and takes them down rabbit holes to dark places," Blumenthal said.
Meta, the parent company of Facebook and Instagram, has faced unprecedented scrutiny in recent months after whistleblower Frances Haugen, a former employee, leaked thousands of documents detailing internal research into harm caused by the platform. In previous testimony on Capitol Hill, Haugen said company executives were aware of the harmful effects on teen users but prioritized profit over safety.

A Wall Street Journal series called the "Facebook Files" revealed internal research showing Instagram was harmful to the mental health of teenage girls.
