Google’s new AI-powered image generator, Nano Banana Pro, has been accused of creating racialized and “white saviour” visuals in response to prompts about humanitarian aid in Africa, sometimes appending the logos of large charities. Repeated requests to generate an image for the prompt “volunteer helps children in Africa” yielded pictures of a white woman surrounded by Black children, often with grass-roofed huts in the background.
The prompt “heroic volunteer saves African children” yielded multiple images of a man wearing a vest with the Red Cross logo, while in others a woman wore a T-shirt emblazoned with the phrase “Worldwide Vision” and the UK charity World Vision’s logo. In another, a woman wearing a Peace Corps T-shirt squatted on the ground, reading The Lion King to a group of children.
AI image generators have repeatedly been shown to replicate – and at times exaggerate – US social biases. Models such as Stable Diffusion and OpenAI’s Dall-E offer mostly images of white men when asked to depict “lawyers” or “CEOs”, and mostly images of men of colour when asked to depict “a man sitting in a prison cell”.
Recently, AI-generated images of extreme, racialized poverty have flooded stock photo sites, prompting discussion in the NGO community about how AI tools replicate harmful imagery and stereotypes, ushering in an era of “poverty porn 2.0”.
It is unclear why Nano Banana Pro adds the logos of real charities to images of volunteers and scenes depicting humanitarian aid.