Saturday, February 14, 2026

France Wants Access to X’s Algorithm. Elon Musk’s Company Says Absolutely Not

Elon Musk’s social media platform X, previously known as Twitter, is under investigation by French authorities over allegations that the company manipulated its algorithm to influence the visibility of certain content. The probe, which began earlier this year, has now been escalated to a specialized cybercrime unit within France’s national police. In response, X issued a public statement rejecting the allegations and announcing that it would not cooperate with investigators.

French Cybercrime Investigation Targets Algorithm and Data Practices

The case was first opened in January after French Member of Parliament Eric Bothorel filed a complaint alleging that changes to the platform’s algorithm may have interfered with public discourse, concerns that had intensified after Elon Musk’s acquisition of the company. According to reporting from Le Monde, the complaint cited a lack of transparency and raised fears of foreign interference, claims that became the basis for a legal argument that X may have violated French law.

In July, the Paris prosecutor’s office took over the investigation and formally cited potential charges of data tampering and fraud, offenses that carry penalties of up to ten years in prison. The case has since been handed to France’s national cybercrime division, which has formally requested detailed access to the company’s algorithm along with real-time user data. Investigators intend to analyze the platform’s content ranking systems to determine whether manipulation has occurred.

Authorities are looking into whether the platform’s algorithm has unfairly boosted or suppressed certain types of content. This concern has been echoed by researchers and lawmakers, who continue to raise questions about bias in content recommendation systems. One academic study from the Queensland University of Technology identified a shift in user engagement patterns around mid-July 2024, shortly after the attempted assassination of former U.S. President Donald Trump, and suggested that right-leaning content may have gained increased visibility during that period.

X Denounces the Probe as Politically Motivated

In a statement released Monday through its Global Government Affairs account, X described the investigation as a politically motivated effort and denied all wrongdoing. The company referred to the inquiry as an overreach that threatens user privacy and freedom of speech. It described the allegations of algorithmic manipulation and fraudulent data extraction as inaccurate and unfounded.

X also rejected the idea that it is legally required to comply with the demands of French authorities. The statement noted that investigators requested full access to the company’s recommendation engine and real-time data for every user post. According to X, the company has refused to comply, asserting that it is within its legal rights to do so.

The company emphasized its commitment to protecting user data and upholding freedom of expression. It stated that the refusal to cooperate was not taken lightly but was based on legal principles and the need to defend against political censorship. As of now, the French government has not issued a formal reply to the company’s decision.

Ongoing Scrutiny Over Algorithmic Transparency

This situation reflects broader international calls for transparency in how algorithms operate on social media platforms. Within the European Union, the Digital Services Act requires large platforms to conduct risk assessments related to algorithmic amplification and content moderation. Although these rules are still being implemented, several member states have begun taking their own steps to evaluate platform behavior more closely.

The case involving X is particularly notable because the company has refused a direct request from a national law enforcement agency. This places corporate policy in potential conflict with government authority. It also raises questions about whether private technology companies can be compelled to reveal how their content systems function, especially in politically sensitive contexts.

For years, researchers and advocacy organizations have warned that opaque algorithms can contribute to the spread of misinformation and political polarization. While companies like X argue that their systems are neutral and focused on engagement, critics continue to examine how design choices shape what users see. The French inquiry, although still in its early phase, is one of the most assertive government-led efforts to obtain forensic access to such systems.

Whether X’s refusal will trigger legal escalation or more aggressive regulatory action remains uncertain. What is clear is that this case highlights the growing tension between global technology platforms and national governments over accountability, transparency, and control of digital ecosystems.