European Commission Releases Results of Sixth Monitoring Exercise
On 7 October 2021, the European Commission published the results of its sixth evaluation of the Code of Conduct on countering illegal hate speech online. The results of this EC Monitoring Exercise show a mixed picture: IT companies reviewed 81% of the notifications within 24 hours and removed an average of 62.5% of flagged content. These results are lower than the averages recorded in 2019 and 2020.
Speaking on the results, Didier Reynders, Commissioner for Justice, stated: "The results show that IT companies cannot be complacent: just because the results were very good in the last years, they cannot take their task less seriously. They have to address any downward trend without delay. It is a matter of protecting a democratic space and fundamental rights of all users. I trust that a swift adoption of the Digital Services Act will also help solving some of the persisting gaps, such as the insufficient transparency and feedback to users."
What is the EC Monitoring Exercise?
The EC Monitoring Exercise was originally set up to respond to the proliferation of racist and xenophobic hate speech online. The original Code of Conduct was presented on 31 May 2016 by the EC together with Facebook, Microsoft, Twitter and YouTube. Since then, Instagram, Google+, Snapchat, Dailymotion, Jeuxvideo.com and TikTok have joined the Code, with LinkedIn also joining on 24 June 2021.
The implementation of the Code of Conduct is evaluated through a regular monitoring exercise set up by the EC in collaboration with a network of organisations located in the different EU countries. Using a commonly agreed methodology, these organisations test how the IT companies are implementing the commitments in the Code.
To read more about the Code, and to see the results of past monitoring exercises, check the EC website here.
Several partners in the Get The Trolls Out! project took part in this year’s EC Monitoring Exercise, specifically LICRA (our partner in France) and ‘NEVER AGAIN’ (our partner in Poland). The project’s lead partner, the Media Diversity Institute (MDI), also completed the 6-week monitoring period; however, due to Brexit and the United Kingdom no longer being part of the EU, its results were not included in the overall results of the EC Monitoring Exercise. In total, the MDI team logged 79 cases during the 6-week period, 61 of which were on religious grounds – most of them antisemitic. We recorded an overall removal rate of 41% (a 75% removal rate on Instagram, 18.4% on Twitter). Together with two other organisations from the UK, the average removal rate for the country was 43%. We feel that it is important to keep working together on research such as the EC Monitoring Exercise despite political decisions such as Brexit. In order to fight hate speech, we need to continue cross-country collaboration.
The exercise gave us further insight into current hate speech trends on different social media platforms, and into how these platforms are responding to reports. As we monitor hate speech on social media on a regular basis in the project, it is useful to have this additional data to make comparisons. It is important to note that the EC Monitoring Exercise has in the past received some criticism because social media companies appear to be somewhat aware of the exercise taking place (and thus tend to respond much faster than during regular monitoring activities); however, we feel this is still a beneficial monitoring exercise to be part of.
To read the full results, visit the official website here.
You can download the factsheet here.