
NGOs.AI

AI in Action


NGOs and AI-Generated Imagery: A Reputation Risk?

Dated: March 23, 2026

New research from the University of East Anglia raises concerns about the growing use of AI-generated imagery in NGO communications, suggesting that it may unintentionally undermine the very causes organisations are trying to promote. The study examined 171 AI-generated images used by 17 major organisations, including Amnesty International, Plan International, and WWF, along with more than 400 public comments on those images. Fewer than one in five comments engaged with the humanitarian issue being highlighted; most instead debated whether the images were real or picked at flaws in their technical quality, leaving the core message overshadowed.

The research argues that trust, which is central to the NGO sector, may be at risk when organisations rely on AI-generated visuals. Although such imagery is fast, flexible, and increasingly affordable, it can quietly erode public confidence. Even transparency did not fully solve the issue. Despite 85% of the images in the study being clearly labelled as AI-generated, audiences still reacted with scepticism, often scrutinising the images for inaccuracies and questioning the ethics behind their creation rather than responding to the underlying cause.

Another major concern highlighted is the mismatch that can occur between an organisation’s values and the tools it uses to communicate. WWF Denmark, for example, faced backlash after using energy-intensive AI tools in a sustainability campaign, with supporters arguing that the method contradicted the organisation’s environmental mission. This kind of “message-medium misalignment” can damage credibility, especially when AI-generated visuals are seen as inconsistent with an NGO’s ethical, social, or environmental commitments. Critics have also pointed out that such tools may threaten the livelihoods of local photographers and filmmakers, while AI-generated films in particular can have significantly higher energy demands than still images.

At the same time, the research acknowledges that AI-generated imagery can be useful in certain ethical contexts, particularly when working with survivors of conflict, abuse, or displacement, where traditional photography or filming may risk harm, retraumatisation, or privacy violations. In such cases, synthetic visuals may offer a safer alternative. Even here, however, the study notes a tension: some donors and audiences still value “authentic” imagery more highly than participant privacy, so organisations must weigh carefully how such choices affect supporter trust and emotional connection.

Rather than calling for a ban on AI in NGO communications, the study encourages more thoughtful and responsible use. It recommends that organisations develop clear policies outlining when AI-generated imagery is appropriate, how it should be reviewed, and how it must be disclosed. It also stresses the need to train communications teams so they understand the ethical implications of choices around representation, including skin tone, clothing, cultural markers, and setting, all of which shape how communities are perceived.

The findings further suggest that NGOs should avoid highly photorealistic AI visuals, as these tend to attract the most scrutiny and backlash. Instead, more stylised, illustrative, or clearly non-photographic visuals may be better received by audiences. The study also emphasises the importance of involving the communities being represented in the creative process, allowing them to help shape prompts, review outputs, and approve final images so that the resulting visuals reflect lived realities rather than assumptions from outside.

Finally, the research urges NGOs to move beyond narrow and repetitive charity tropes often reproduced by AI, such as poverty, crisis, and vulnerable children, and instead tell broader, more nuanced stories that reflect the diversity, resilience, and complexity of the communities they serve. At a time when public trust in institutions is already fragile and audiences are increasingly quick to identify synthetic content, the study concludes that AI is not inherently harmful to humanitarian storytelling. However, using it as a shortcut to emotional engagement carries significant reputational risks that organisations can no longer afford to ignore.


© NGOs.AI. All rights reserved.

Grants Management And Research Pte. Ltd., 21 Merchant Road #04-01 Singapore 058267
