The Hundred Heroines charity’s Facebook page has been reinstated after it was falsely flagged for drug content

When the UK charity Hundred Heroines had its Facebook group removed, the only explanation was a message from the social media company saying the page “goes against our community standards on drugs”.

Now, after more than a month of appeals, the photography charity is celebrating the reinstatement of its group, which the tech company’s AI tools had mistaken for one promoting the class A drug heroin.

The Gloucestershire-based organisation, which celebrates female photographers, had its Facebook group removed twice in 2025 for apparent breaches of community guidelines on drug promotion.

The most recent removal came in September. Following the charity’s second appeal in 12 months, the Hundred Heroines: Women in Photography Today page was restored last week, without explanation or apology.

The charity’s founder, Dr Dale Barrett, a former president of the Royal Photographic Society, said the decision had had a “devastating” impact on the organisation, which relies on Facebook to attract visitors.

“The AI technology picks up the word heroin without the ‘e’, so we get banned for violating community guidelines,” she said. “And no matter what you do, you can’t reach anyone, and that really affects us because we rely on Facebook to reach our local audience.”

Founded in 2020, the charity has a physical space in Nailsworth, near Stroud, with around 8,000 items in its collection focusing on the work of female photographers and spanning the history of the art form.

In 2024, Meta stepped up its scrutiny of drug-related groups in light of the opioid crisis in the United States, where 80,000 people died of overdoses last year.

Meta says buying and selling drugs is strictly prohibited on its platforms, and claims it has “robust measures” in place to detect and remove such content.

In a statement on its website, Meta says: “We recognize the importance of the drug crisis and are committed to using our platforms to keep people safe… and strictly enforcing our community standards.”

But when its software wrongly identifies groups as violating its standards, the result can be Kafkaesque, according to users, as feedback forms are often the only way to report errors.

Meta says AI tools are “essential to [its] content review process”, adding that “AI can detect and remove content that goes against our community standards before anyone reports it”.

Sometimes the technology refers content to “human review teams”, though Barrett said there was a lack of human interaction when Hundred Heroines complained.

“We thought, should we change our name? But why should we? Why should we have to mess with our brand just because of Facebook?” said Barrett, who estimates that about 75% of Hundred Heroines’ visitors come through Facebook.

She added: “It’s frightening and funny at once. To think these robots run the world and can’t tell the difference between a heroine and an opioid drug. Heaven help us.”

Earlier this year, Meta came under fire for wrongly banning or suspending swathes of accounts on Facebook and Instagram.

Users blamed its AI moderation tools for the faulty bans, but while the tech company admitted there was a “technical bug” affecting Facebook groups, it denied an increase in incorrect enforcement of its rules across its platforms.

Meta said it was working to resolve the issue, which emerged over the summer after groups, including those sharing memes about the bugs, were told they did not follow its standards on “dangerous organizations or individuals”.

Meta has been contacted for comment.