POLICY
Instagram Chief Executive Adam Mosseri sat in front of U.S. senators today to explain why the app isn’t as harmful to children as many people are saying.
In his opening remarks, Mosseri (pictured) said he believes the app can be a “positive force” in children’s lives. That claim appears to contradict leaked internal research showing that Meta Platforms Inc., then Facebook Inc., had found the app to be harmful to young people in some respects, the main reason Mosseri was before Congress today.
After talking about some of the positives, he said it was now time to create an “industry body” that would improve safety based on input from parents, regulators and civil society. Together, Mosseri said, they should agree on standards relating to what children do and see on the app and how much control parents have.
“The standards need to be high, and the protections universal,” said Mosseri. “I believe that companies such as ours should have to earn their 230 protections by adhering to these standards.” Section 230 is a provision of the Communications Decency Act that shields companies such as Meta from legal liability for content posted by their users.
Mosseri then addressed the “pause” on the development of Meta’s “Instagram Kids,” which he said will remain in place for some time while safety issues are ironed out. He did say, though, that the company is currently working on better ways to verify a child’s age, while also exploring ways to make Instagram safer for users under the age of 18.
As for the leaked research suggesting the app could be detrimental to a child’s mental health, he said the public and media had mischaracterized the findings. Some of the research, in fact, shows that Instagram provides relief for kids rather than worsening mental health problems, he added. At one point, he said, “Respectfully, I don’t believe that research suggests that our products are addictive.”
Sen. Richard Blumenthal asked if Instagram would commit to having independent researchers look into possible harm caused by algorithms, to which Mosseri said he would “provide meaningful access to data so that third-party researchers could design their own studies.”
“Will you commit to a legal requirement?” Blumenthal asked in relation to that commitment, to which Mosseri seemed to give his assent. Blumenthal concluded, “Self-policing based on trust is no longer a viable solution.” Throughout the hearing, Mosseri maintained his position that Instagram is not a negative force in children’s lives.
“I’m a father of three,” he said. “To any parent who’s lost a child [or] even had a child hurt themselves, I can’t begin to imagine what that would be like for one of my three boys. As the head of Instagram, it’s my responsibility to do all I can to keep people safe. I’ve been committed to that for years and I’m going to continue to do so.”