
Poland: X (formerly Twitter) & Anti-LGBTQ+ Hate Speech

Poland: X (formerly Twitter) Enabled the Spread of Anti-LGBTQI+ Hatred and Harassment: A Deep Dive

Amnesty International's latest examination reveals a disturbing pattern: X (formerly Twitter) has demonstrably failed to protect LGBTQI+ individuals in Poland from a surge of hateful content and targeted harassment. Our findings highlight systemic shortcomings in X's content moderation, risk assessment, and compliance with the EU's Digital Services Act (DSA). This isn't simply a platform failing to police its content; it is a platform actively contributing to human rights abuses.

Understanding the Scope of the Problem

Poland has become a focal point for anti-LGBTQI+ rhetoric, and X has served as a key amplifier. The platform's inadequate resources and policies have allowed hateful narratives to flourish, creating a hostile online environment with real-world consequences.

Here’s a breakdown of the critical issues:

Severe Understaffing: X employs a shockingly small number of Polish-speaking content moderators — just two, one of whom speaks Polish as a second language.
Massive Coverage Area: These two individuals are tasked with monitoring content for a population of 37.45 million, including 5.33 million X users. This is simply unsustainable.
Lack of Response: Amnesty International wrote to X with detailed questions regarding its operations in Poland (August 2024) and again to present our findings (June 2025). We received no response.

Failure to Uphold Digital Services Act Standards

The EU's Digital Services Act (DSA) mandates that Very Large Online Platforms (VLOPs) like X proactively assess and mitigate systemic human rights risks. X's 2024 risk assessment acknowledged the presence of hate speech but crucially failed to specifically address harms targeting the LGBTQI+ community.

Further, an independent DSA audit (covering the period up to August 23, 2024) found X's risk assessment and mitigation measures to be:

Weak.
Ineffective.
Lacking safeguards for algorithmic systems.

This demonstrates a clear disregard for the DSA's requirements and a failure to protect vulnerable groups. You deserve a safe online experience, and X is falling short.

The Human Cost: A Direct Quote

“This combination of poor resourcing, policy and practice has contributed to X becoming a platform awash with hateful content targeting the LGBTI community,” says Alia Al Ghussain of Amnesty International, highlighting the direct link between X's failings and the harm experienced by LGBTQI+ individuals in Poland.

A Pattern of Neglect: Amnesty International's Previous Findings

This isn't an isolated incident. Amnesty International has consistently documented how X's design and policies contribute to the spread of harmful narratives.

We’ve previously reported on:

The amplification of false narratives targeting Muslims and migrants in the UK (August 2025).
The prevalence of violence and abuse against women on the platform (March 2018).
The inherent human rights risks posed by surveillance-based business models, a concern first raised in our 2019 report on Google and Meta, which concluded that their models are “inherently incompatible” with essential human rights.

These findings paint a clear picture: X's core business model incentivizes engagement, even if that engagement is fueled by hate and misinformation.

What Needs to Happen Now

The European Commission must take decisive action. We urge them to:

Expand current investigations into X to specifically examine the company's ability to address the risk of targeted hate and harassment against LGBTQI+ individuals.
Demand urgent reforms to ensure X stops contributing to human rights abuses.
Require increased investment in Polish-language content moderation.
Re-evaluate X's surveillance-based business model and its inherent incompatibility with human rights.

Your Role in Demanding Change

You can help hold X accountable. Share this report, contact your representatives, and demand that platforms prioritize safety and respect for all users. The DSA provides a crucial framework for accountability, but it requires robust enforcement to be effective.

X has a responsibility to protect its users. It's time they fulfill that obligation.
