According to the report, the music video referenced a 2017 shooting and could lead to “retaliatory violence” in the London gang scene. As a result, Instagram manually removed 52 posts containing the track, and automated systems removed it a further 112 times. Enforcement of the decision was inconsistent: the creator appealed and had the song reinstated by a reviewer outside the escalation team, only for that decision to be overruled and the song re-banned a week later, following another request from the police. Now the oversight board has ruled that removing those videos was a mistake. The board said law enforcement should provide useful context for moderation decisions and questioned whether Meta had assessed the police’s claims independently. It argued that Meta violated basic principles of free speech, equality and transparency by allowing the police to censor a musician in secret. While Meta publishes reports on formal government requests to remove content, it has been less transparent about informal “soft” requests, like the Metropolitan Police email. “While law enforcement can sometimes provide context and expertise, not every piece of content that law enforcement would prefer to have taken down should be taken down,” the board said in its ruling.

Crackdown on drill music

As part of the investigation, the board filed multiple freedom of information requests with the Met police, finding that the force had made 286 requests to take down or review posts about drill music in the 12 months from June 2021, and that 255 of those had resulted in the removal of content. During the same period, there was not a single such request concerning any other music genre. “This intensive focus on one music genre among many that include reference to violence raises serious concerns of potential over-policing of certain communities,” the board argued. By the police’s own account, the drill music video did not break any laws. Instead, they argued that it violated Facebook and Instagram’s community standards because it was “potentially threatening or likely to contribute to imminent violence or physical harm”. However, the police did not provide evidence to back up that claim. The oversight board recommends that Meta establish a public, standardized system for content removal requests from law enforcement, with evidence required to support claims of policy violations. Meta should also review such decisions in aggregate to check for systemic bias against particular communities or minorities.