Meta Platforms was told by European Union (EU) tech regulators on Friday to provide more details by December 22 on measures it has taken to tackle child sexual abuse material on its photo and video sharing app Instagram, or risk a formal investigation under the bloc's new online content rules.
The European Commission in October sent a first request for information on measures taken to counter the spread of terrorist and violent content, and a second last month on measures to protect minors. “Information is also requested about Instagram's recommender system and amplification of potentially harmful content,” it said.
The request for information was made under the EU's Digital Services Act (DSA), new tech rules requiring Big Tech to do more to police illegal and harmful content on their platforms.