Can Meta’s AI image generator really not picture an Asian man with a white woman? Testing the limits of Meta’s image generation tool
Have you ever seen a white person and an Asian person together, whether as a couple or as two friends of different races? It’s commonplace to me — I have a lot of white friends.
When I asked Meta’s AI image generator for a picture of an Asian man and a white woman smiling, it gave me three separate pictures of two Asian people. Tweaking the text prompt didn’t seem to help; when I swapped “white” for “Caucasian,” it did the same thing. In one of the pictures, the man in the suit and the Asian woman in the kimono appear to be the same person. Multiculturalism is amazing.
It did manage to create one image featuring an older man and a young woman — and no, I’m not talking about the age-gap discourse. But when I generated a new picture using the same prompt, it again showed a man with an Asian woman.
Meta introduced its AI image generator tools last year, and its sticker creation tool promptly went off the rails as people made things like nude images and Nintendo characters with guns.
Those who don’t fit the monolith are forgotten in the cultural consciousness and underrepresented in mainstream media; Asians are cast as perpetual foreigners. Breaking type is easy in real life and apparently impossible in Meta’s AI system. Once again, generative AI, rather than allowing the imagination to take flight, imprisons it within a formalization of society’s dumber impulses.
After I reached out for comment yesterday, a Meta spokeswoman asked for more information about my story, like when my deadline was. I didn’t hear back after I replied. When I returned to the tool to see whether the problem had been fixed — or whether the system still couldn’t create an accurate image of an Asian person with a white friend — I got an error message instead of a batch of inaccurate pictures: “Looks like something went wrong. Please try again later or try a different prompt.”
I tried other prompts featuring Asian people, like “Asian man in a suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message. Again, I reached out to Meta’s communications team. Let me make fake Asian people! (During this time, I was also unable to generate images using prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)
Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta — but the feature was working again for the prompt “Asian man.” Many companies I cover will quietly change something, fix an error, or remove a feature after a reporter asks about it. Did I personally cause a temporary shortage of AI-generated Asian people? Was it just a coincidence, or was Meta fixing the problem behind the scenes? I wish Meta had answered my questions or offered an explanation.
Whatever is happening over at Meta HQ, the company still has work to do: prompts like “Asian man and white woman” now return an image, but the system still botches the races and renders both people as Asian, just like yesterday. So we’re back where we started. I’ll keep a close eye on it.