For Immediate Release
November 23, 2022
On Monday, Meta announced updates to Facebook and Instagram to better protect children and teens from online harm on its platforms. These changes come two months after the monumental bipartisan passage of the California Age Appropriate Design Code (AADC) in September, a first-of-its-kind measure requiring online platforms to protect the health, safety, and privacy of children and teens by design and by default. The AADC is setting the stage for a new era in online accountability by requiring platforms to assess the risks their services and features pose to young people and to implement systemic changes to mitigate harm. Meta's recent updates will apply largely to children under the age of 16 and include automatically defaulting new Facebook accounts into more private settings, prompting children and teens to report accounts after they block them, and expanding work to limit the spread and exploitation of young people's intimate images online.
Accountable Tech Co-Founder and Executive Director Nicole Gill released the following statement:
“These updates show why passing laws that change the incentive structure for Big Tech is imperative. The California Age Appropriate Design Code is already pushing companies like Meta to proactively strengthen their limited existing privacy and safety measures for children and teens. The AADC is transformative, taking direct aim at exploitative Big Tech business models that rely on manipulating and profiling young people for profit. With this announcement, one thing is clear: Meta is not exempt from following the law.
For far too long, social media companies have preyed on young people online, prioritizing profits over the well-being of children. We must continue to fight for legislation that will ensure Big Tech platforms prioritize the safety, well-being, and privacy of children and teens, and put an end to the pervasive tracking, targeting, and manipulation they currently face online.”
# # #