Illicit Content on Elon Musk’s X Draws E.U. Investigation
The European Union on Monday announced a formal investigation into X, the social media platform owned by Elon Musk, for failing to counter illicit content and disinformation, for a lack of transparency about advertising and for “deceptive” design practices.

The inquiry is perhaps the most substantial regulatory move against X since the platform scaled back its content moderation policies after Mr. Musk bought the service, once known as Twitter, last year. The company’s new policies have led to a rise in incendiary content on the platform, according to researchers, causing brands to scale back advertising.

In going after X, the European Union is for the first time using the authority gained after last year’s passage of the Digital Services Act. The law gives regulators vast new powers to force social media companies to police their platforms for hate speech, misinformation and other divisive content.

The European Commission, the 27-nation bloc’s executive branch, had signaled its intention to look more closely at X’s business practices. In October, regulators initiated a preliminary investigation into the spread of “terrorist and violent content and hate speech” on X after the start of the Israel-Gaza conflict.

X did not respond to a request for comment.

The investigation highlights a major difference between the United States and Europe in policing the internet. While online posts are largely unregulated in the United States as a result of free speech protections, European governments, for historical and cultural reasons, have put more restrictions in place around hate speech, incitement to violence and other harmful material.

The Digital Services Act was an attempt by the E.U. to compel companies to put procedures in place to comply more consistently with rules around such content online.

Monday’s announcement is the beginning of an investigation without a specified deadline. The inquiry is expected to include interviews with outside groups and requests for more evidence from X. If found to have violated the Digital Services Act, the company could be fined up to 6 percent of its global revenue.

E.U. officials said X may not be in compliance with rules that require online platforms to quickly respond after being made aware of illicit and hateful content, such as antisemitism and incitement of terrorism. The law also requires companies to conduct risk assessments about the spread of harmful content on their platforms and implement mitigation measures.

Officials also raised concerns about X’s content moderation policies in languages other than English, particularly ahead of elections across the continent in 2024.

In addition, the investigation will examine X’s efforts to address the spread of false information. The company relies on a feature, called Community Notes, that lets users add context to posts that they believe are misleading, an approach that E.U. officials said may not be sufficient. Regulators will also look into the ways in which posts by X users who pay to be authenticated, symbolized by a blue check mark, are given more visibility.