Facebook unveils new controls for kids on its platforms
10/10/2021
Facebook unveils controls it says will protect child users, including prompts to take breaks from Instagram and a ‘nudge’ for teens who spend too long looking at one topic, days after a whistleblower claimed the firm endangered children
- Facebook will roll out a series of features aimed at protecting children from being harmed on the social media giant’s platforms
- The announcements come just days after whistleblower Frances Haugen claimed that founder Mark Zuckerberg puts profits over the safety of its users
- One feature will allow parents/guardians to supervise what their kids are doing on their social media accounts
- Critics say they’re skeptical that the features will be effective in protecting users
Facebook announced plans to roll out a series of controls and features for children and teenagers using its main platform and Instagram, just days after whistleblower Frances Haugen claimed that the company puts profits over the safety of its users.
Nick Clegg, Facebook’s vice president for global affairs, made the rounds on various news networks Sunday to unveil the new features to protect teens who do use the app, though critics say they lack details and risk being ineffective.
Among the new controls will be one on Instagram prompting teens to take a break from the app and another ‘nudging’ teens who are looking at one topic for too long.
Another feature is an optional control allowing parents/guardians to supervise what their kids are doing on social media.
Facebook will roll out a series of features aimed at protecting children from being harmed on the social media giant’s platforms, following backlash that Mark Zuckerberg ‘puts profits over people’
In an interview with Dana Bash on CNN’s ‘State of the Union’ Sunday, Clegg, the former Deputy Prime Minister of Great Britain, said, ‘We are constantly iterating in order to improve our products. We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use.’
The social media giant has also come under increased scrutiny over its role in the January 6th Capitol riot and for allegedly ignoring research revealing that the platform harmed the mental health of teenage girls.
Following the backlash, Instagram revealed just last month that it would be halting its plans for Instagram Kids, a version of the app for those under 13 which would have featured optional parental controls.
The announcements come just days after whistleblower Frances Haugen claimed that the social media giant is harmful to children and contributes to the spread of hate speech and misinformation
Josh Golin, executive director of the children’s media watchdog Fairplay, said that he doesn’t think introducing controls to help parents supervise teens would be effective, since many teens set up secret accounts anyway.
He was also dubious about how effective nudging teens to take a break or move away from harmful content would be. He noted that Facebook needs to show exactly how it would implement the features and offer research demonstrating that these tools are effective.
‘There is tremendous reason to be skeptical,’ he said. He added that regulators need to restrict what Facebook does with its algorithms.
He said he also believes that Facebook should cancel Instagram for Kids permanently.
Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust, and Consumer Rights, said that it’s time to update children’s privacy laws and offer more transparency in the use of algorithms.
‘I appreciate that he is willing to talk about things, but I believe the time for conversation is done. The time for action is now,’ she told CNN’s Dana Bash on Sunday.
Clegg was also grilled in separate interviews on CNN and ABC’s ‘This Week with George Stephanopoulos’ about the role of algorithms in amplifying misinformation ahead of the January 6 Capitol riot.
However, he responded by claiming that Facebook’s algorithms are ‘giant spam filters’ and, if removed, people would see more potentially harmful content like hate speech and misinformation.
Clegg said that Facebook has invested $13 billion over the past few years to keep the platform safe and that the company has 40,000 people working on such issues.
However, last Tuesday, Frances Haugen, a 37-year-old data expert, said during her blistering testimony in front of Congress that Facebook founder Mark Zuckerberg is only ‘accountable to himself’ and has been directly involved in company decisions that saw Facebook put profit over ‘changes that would have significantly decreased misinformation, hate speech and other inciting content.’
Her claims were devastating for Facebook’s public image and prompted multiple senators to attack founder Mark Zuckerberg, who Haugen alleges knew of and encouraged his site’s harmful practices.
She said that executives were aware that Facebook and Instagram, which it owns, were harmful for children, with a leaked internal study revealing that teenage girls had increased suicidal thoughts from using Instagram.
The 37-year-old said that Facebook’s algorithms, centered around ‘likes’ and ‘shares’, rewarded dangerous online talk that ‘has led to actual violence that harms and even kills people.’
Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
The tech giant slapped down Haugen after she testified, saying that the data scientist never attended meetings with top executives and that she was wildly misinformed about the company.
Mark Zuckerberg wrote in an open letter to his staff: ‘I think most of us just don’t recognize the false picture of the company that is being painted.’
Facebook’s director of policy communications, Lena Pietsch, responded to Haugen’s testimony by pointing out she worked at the company for less than two years.
Pietsch added that Haugen ‘had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question.’