Meta to Train AI Models Using Public U.K. Facebook and Instagram Posts

Meta has announced plans to use public content from adult users on Facebook and Instagram in the U.K. to train its artificial intelligence (AI) systems. The social media giant stated that this initiative will help its generative AI models reflect British culture, history, and language, while providing U.K. businesses and institutions access to advanced AI technology.
Starting this week, users aged 18 and over will receive notifications on Facebook and Instagram explaining how their public content will be used for AI training. The company assured users that private messages and data from minors will not be included. Additionally, users will be able to object to the use of their data through a simple opt-out form, and Meta emphasized that it will honor those objections.
Meta's move follows consultations with the U.K.'s Information Commissioner's Office (ICO) and relies on "Legitimate Interests" as the legal basis for processing first-party data. The company highlighted its efforts to improve transparency, incorporating feedback from the ICO to make the opt-out process more user-friendly.
While Meta has paused similar efforts in the European Union following a request from the Irish Data Protection Commission (DPC) in June 2024, the company remains committed to its U.K. plans. However, it has faced criticism from privacy groups like noyb, which argue that Meta is unfairly shifting the burden onto users by making the process opt-out instead of opt-in.
The ICO has stated that it will closely monitor Meta’s actions to ensure transparency and data protection compliance. According to Stephen Almond, the ICO’s executive director of regulatory risk, organizations using personal data for AI training must provide clear information and safeguards.
Meta has also suspended the use of generative AI in Brazil following objections from the country's data protection authority, reflecting a growing trend of regulatory scrutiny around AI and privacy issues globally.