Digital Diplomacy in the Age of Visual AI

Last week I began exploring possible biases in popular Artificial Intelligence (AI) tools. Within the context of AI, “bias” refers to the generation of skewed output or content. AI tools such as ChatGPT or Microsoft’s Copilot may suffer from biases because they were trained on skewed data, or because humans with biases and prejudices programmed the algorithms on which these AIs are based. For instance, in 2018 Amazon scrapped an AI recruitment tool that was found to be biased against women. The algorithm automatically assigned lower scores to resumes that included the word “women’s”, or to resumes of candidates who attended all-women’s colleges. Amazon’s AI tool had a systematic, consistent bias against women. This bias also had real-world consequences, as women were less likely to be hired by Amazon. The same may be true of AIs such as ChatGPT or Copilot, which are trained on vast swathes of readily available information from the internet, information that may itself be biased or skewed.
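To make that mechanism concrete, here is a deliberately simplified sketch of how such a bias operates. It is not Amazon’s actual model; the tokens and weights are invented purely for illustration. The point is that a scorer trained on historically male-dominated hiring data can end up with a negative weight on a gendered token, so that two otherwise comparable resumes receive systematically different scores.

```python
# A toy resume scorer, NOT Amazon's actual system: the tokens and
# weights below are invented purely to illustrate systematic bias.

# Hypothetical weights a model might learn from historical hiring data
# in which most past hires were men.
LEARNED_WEIGHTS = {
    "engineered": 1.0,   # terms over-represented in past (male) hires
    "executed": 1.0,
    "women's": -2.0,     # gendered token the model learned to penalize
}

def score_resume(text: str) -> float:
    """Sum the learned weights of all known tokens in the resume."""
    return sum(LEARNED_WEIGHTS.get(token, 0.0) for token in text.lower().split())

# Two candidates with identical qualifications; the second mentions a
# "women's" organization and is scored lower for it.
print(score_resume("engineered and executed trading systems"))
# -> 2.0
print(score_resume("engineered and executed trading systems women's chess captain"))
# -> 0.0
```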

The main question is how one can detect AI biases, and what implications these biases might have for society. This is a pertinent question given the speed with which individuals are adopting Generative AI tools such as ChatGPT or Copilot. To uncover possible biases in AIs, I used a text-to-image AI tool, which converts written prompts into images. My assumption was that biases in the AI would be evident in the images it generates.
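For readers who want to reproduce this kind of probe, the sketch below shows how it could be run programmatically rather than through Copilot’s chat window. It assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable, and it targets DALL·E 3, the model that, at the time of writing, powers Copilot’s image generation; the prompt list mirrors the queries used in this post. The generated images would still need a human annotator to record perceived gender and race.

```python
# A minimal bias probe: feed the same role-based prompts to a
# text-to-image model and collect the images for human annotation.
# Assumes the OpenAI Python SDK (`pip install openai`) and an
# OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPTS = [
    "a diplomat",
    "a world leader from Western Europe",
    "a world leader from Eastern Europe",
    "two world leaders meeting each other",
    "an intelligence agency chief",
    "a NATO diplomat",
]

for prompt in PROMPTS:
    result = client.images.generate(
        model="dall-e-3",
        prompt=prompt,
        size="1024x1024",
        n=1,
    )
    # Record the image URL so a human annotator can later code the
    # perceived gender and race of each depicted figure.
    print(f"{prompt!r} -> {result.data[0].url}")
```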

Last week I asked Copilot to generate images of diplomats, uncovering biased results: “diplomats” were visually depicted as white men, with almost no images of female or non-white diplomats. This week I continued this exploration. I first asked Copilot to generate an image of a world leader from Western Europe. The image generated, shown below, was that of a white male. I next asked Copilot to generate images of world leaders from Eastern Europe or Russia, and again all the generated images were those of white men. Not a single image depicted a woman as a world leader, nor did any image depict a non-white leader. These images may be indicative of an AI bias that skews results against women and racial or ethnic minorities. In Copilot’s world it is unimaginable that France, the UK, Lithuania or Ukraine could be led by a woman or a Black man.

Next, I asked Copilot to generate images of two world leaders meeting each other. This request yielded three images, all of which included women, as can be seen below. Yet even within these images, the male world leaders were always white and ‘Western’, while the women were usually dressed in suits like their male counterparts. Only one image depicted a female world leader who was not wearing a suit. Here again one notices an enduring bias in Copilot: world leaders are white and mostly male, and when a woman does become a world leader it is because she has embraced the behavior and appearance of a man. According to Copilot, there are no world leaders from Africa, Latin America or even Southeast Asia.

World leaders are important diplomatic actors. As Piki Ish-Shalom has noted, leaders are increasingly becoming King Diplomats as they assume diplomatic roles, given the high frequency at which leaders now meet, be it at G7 summits or UN General Assemblies. The same is true of intelligence agencies, which are also diplomatic actors that provide diplomats with valuable information. I thus asked Copilot to generate an image of an intelligence agency chief. Once again, the results depicted only white ‘Western’ men.

Lastly, I used Copilot to generate an image of a NATO diplomat, hoping to gain insight into the visual representation of multilateral diplomats. Copilot generated the four images shown below: four images of white ‘Western’ men shaking hands with other white ‘Western’ men.

At this stage one can conclude that Copilot suffers from a bias when it comes to depicting diplomats, world leaders and even intelligence chiefs. Copilot automatically assumes that positions of power are occupied solely by white ‘Western’ men. Women, and men from other parts of the world, are not part of the imaginary of Copilot, or of the imaginary it creates in the minds of users. Indeed, Copilot users, hoping to learn about their world, will encounter this bias, which may lead to skewed beliefs and attitudes. After learning from Copilot that diplomacy and global leadership are inherently male positions, users may be less likely to vote for women candidates, support the appointment of female Ambassadors, or have faith in women diplomats. This problem is compounded when it comes to female users, who may conclude that they lack the traits and skills necessary to become world leaders and diplomats: a complete fallacy with dramatic real-world implications.

I initially assumed that these gendered and racial biases might be limited to this specific AI. Thus, I turned to another text-to-image application called “AI Chat”. I first asked this AI tool to generate an image of three diplomats at the United Nations (UN). Next, I asked it to generate an image of diplomats meeting for a summit at the UN. Finally, I asked for an image of a NATO diplomat. These images may all be seen below. I found the same bias in this AI tool: diplomacy is a profession in which white men in suits engage with other white men in suits.

Notably, there was one image that broke this mold; it is shown below. This AI-generated image was supposed to depict Ambassadors at the UN. As can be seen, there are two women in this image, yet they are positioned behind the dominant male figure. Moreover, both women are white, and both are dressed in suits matching the style of their male counterparts.

So, once again, this AI tool segments the world into the West and the rest, while women can be found behind every successful man. The fact that a similar bias was found in two different AI tools could suggest that these gendered and racial biases are widespread among AI applications. This highlights the fact that diplomats and states must now work with AI and tech companies to address persistent biases that can lead to skewed worldviews. This is very much a part of contemporary digital diplomacy.
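Should one wish to make this two-tool comparison systematic rather than impressionistic, a simple approach is to tally human annotations of each generated image per tool. The sketch below does exactly that; the annotation rows are illustrative placeholders, not the actual images discussed above.

```python
# Tally human annotations of generated images per tool. The rows
# below are illustrative placeholders, not a real annotated dataset.
from collections import Counter

# Each row: (tool, perceived gender, perceived race) for one image,
# as coded by a human annotator.
annotations = [
    ("Copilot", "man", "white"),
    ("Copilot", "man", "white"),
    ("Copilot", "woman", "white"),
    ("AI Chat", "man", "white"),
    ("AI Chat", "man", "white"),
    ("AI Chat", "woman", "white"),
]

for tool in ("Copilot", "AI Chat"):
    counts = Counter((g, r) for t, g, r in annotations if t == tool)
    total = sum(counts.values())
    for (gender, race), n in counts.most_common():
        print(f"{tool}: {gender}/{race} = {n / total:.0%}")
```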
