NGOs.AI

NGOs and AI-Generated Imagery: A Reputation Risk?

Dated: March 23, 2026

New research from the University of East Anglia has raised concerns about the growing use of AI-generated imagery in NGO communications, suggesting it may unintentionally undermine the very causes organisations are trying to promote. The study examined 171 AI-generated images used by 17 major organisations, including Amnesty International, Plan International, and WWF, along with more than 400 public comments those images received. Fewer than one in five comments engaged with the humanitarian issue being highlighted; most responses instead debated whether the images were real or picked at flaws in their technical quality, leaving the core message overshadowed.

The research argues that trust, which is central to the NGO sector, may be at risk when organisations rely on AI-generated visuals. Although such imagery is fast, flexible, and increasingly affordable, it can quietly erode public confidence. Even transparency did not fully solve the problem: although 85% of the images in the study were clearly labelled as AI-generated, audiences still reacted with scepticism, often scrutinising the images for inaccuracies and questioning the ethics behind their creation rather than responding to the underlying cause.

Another major concern highlighted is the mismatch that can occur between an organisation’s values and the tools it uses to communicate. WWF Denmark, for example, faced backlash after using energy-intensive AI tools in a sustainability campaign, with supporters arguing that the method contradicted the organisation’s environmental mission. This kind of “message-medium misalignment” can damage credibility, especially when AI-generated visuals are seen as inconsistent with an NGO’s ethical, social, or environmental commitments. Critics have also pointed out that such tools may threaten the livelihoods of local photographers and filmmakers, while AI-generated films in particular can have significantly higher energy demands than still images.

At the same time, the research acknowledges that AI-generated imagery can be useful in certain ethical contexts, particularly when working with survivors of conflict, abuse, or displacement, where traditional photography or filming may risk harm, retraumatisation, or privacy violations. In such cases, synthetic visuals may offer a safer alternative. Even here, however, the study notes a tension: some donors and audiences still value "authentic" imagery more highly than participant privacy, so organisations must weigh carefully how such choices affect supporter trust and emotional connection.

Rather than calling for a ban on AI in NGO communications, the study encourages more thoughtful and responsible use. It recommends that organisations develop clear policies outlining when AI-generated imagery is appropriate, how it should be reviewed, and how it must be disclosed. It also stresses the need to train communications teams so they understand the ethical implications of choices around representation, including skin tone, clothing, cultural markers, and setting, all of which shape how communities are perceived.

The findings further suggest that NGOs should avoid highly photorealistic AI visuals, as these tend to attract the most scrutiny and backlash. Instead, more stylised, illustrative, or clearly non-photographic visuals may be better received by audiences. The study also emphasises the importance of involving the communities being represented in the creative process, allowing them to help shape prompts, review outputs, and approve final images so that the resulting visuals reflect lived realities rather than assumptions from outside.

Finally, the research urges NGOs to move beyond narrow and repetitive charity tropes often reproduced by AI, such as poverty, crisis, and vulnerable children, and instead tell broader, more nuanced stories that reflect the diversity, resilience, and complexity of the communities they serve. At a time when public trust in institutions is already fragile and audiences are increasingly quick to identify synthetic content, the study concludes that AI is not inherently harmful to humanitarian storytelling. However, using it as a shortcut to emotional engagement carries significant reputational risks that organisations can no longer afford to ignore.


© NGOs.AI. All rights reserved.
