Here’s How Generative AI Reflects Queer People


Another way to improve the output of AI models is for developers to add guardrails and modify user prompts, nudging the software toward more inclusive results. OpenAI appears to have taken this approach. When I asked Dall-E 3 via ChatGPT to "draw a photo of a queer couple enjoying a night out in the Castro," it expanded the prompt, without my asking, into a full paragraph that included gender, race, and additional details. Here is the full prompt ChatGPT generated from my initial request:

A portrait of a queer couple enjoying a night out in the Castro District, San Francisco. They are happily walking hand in hand, smiling and chatting. One person is a Caucasian woman with short red hair, wearing a stylish denim jacket and jeans. The other is a Black man with short black hair, sporting a casual green t-shirt and black pants. The background features the vibrant street life of the Castro, with rainbow flags, bustling restaurants, and crowds of people. The atmosphere is joyful and welcoming, reflecting the diversity and inclusivity of the area.

Although helpful in some cases, prompt modification can be frustrating for users when it is poorly implemented. Google's CEO apologized after Gemini, Google's generative-AI platform, altered user prompts and generated photos of Black Nazis and other ahistorical images. Was there a secret plot inside the company to erase white people from history? It is more plausible that Gemini's engineers found the tool initially overproduced images of white men, as many AI tools currently do, and that Google's developers overcorrected in the rush to launch the company's subscription chatbot.

Even with better model data and software guardrails, the fluidity of human existence can elude the rigidity of algorithmic categorization. "They're using the past to make the future," says William Agnew, a postdoctoral fellow at Carnegie Mellon and a longtime organizer with Queer in AI. "It seems like the antithesis of the never-ending potential for growth and change that is such a big part of queer communities." By amplifying stereotypes, AI tools not only risk misrepresenting minority groups to the general public, these algorithms also have the potential to constrict how queer people see and understand themselves.

It's worth pausing for a moment to acknowledge the breakneck speed at which some aspects of AI output continue to improve. In 2023, the internet lit up with mockery of a monstrous AI video of Will Smith eating spaghetti. A year later, clips from OpenAI's unreleased Sora model are still imperfect but often uncanny in their photorealism.

The AI video tool is still in the research phase and has not been released to the public, but I wanted to better understand how it represents queer people. So I reached out to OpenAI and provided three prompts for Sora: "a diverse group of friends celebrating San Francisco's pride parade on a colorful, rainbow float"; "two women in stunning wedding dresses getting married at a farm in Kansas"; and "a transgender man and his nonbinary partner playing a board game." A week later, I received three exclusive video clips that the company said were generated by its text-to-video model without modification.

This AI-generated video was created with the prompt "a diverse group of friends celebrating San Francisco's pride parade on a colorful, rainbow float." As you rewatch the video, look closely at the different people who pass by to spot the generation's strangest details, from warped faces to oddly rendered feet.

Sora via OpenAI

The clips are messy but stunning. Paradegoers on a San Francisco Pride float wave rainbow flags that defy the laws of physics as they fade into nothingness and reappear out of thin air. Two brides in white dresses smile at each other standing at the altar, as their hands meld into an ungodly tangle of fingers. While a queer family plays a board game, they appear to pass through the playing pieces, like ghosts.

This AI-generated video was created with the prompt "a transgender man and his nonbinary partner playing a board game." Real astronauts would be well advised to keep their helmets on while floating through space.

Sora via OpenAI

The clip meant to show a nonbinary person playing a board game is the strangest of the three. Lilac-colored locks frame their face, smeared tattoos scatter across their skin, and an odd, reptilian-like scaled pigmentation covers parts of their face. Even for an impressive AI generator like Sora, rendering nonbinary people appears to be challenging.

This AI-generated video was created with the prompt "two women in stunning wedding dresses getting married at a farm in Kansas." While it appears realistic at first, notice how the brides' hands melt together.

Sora via OpenAI

When WIRED showed these clips to members of Queer in AI, they questioned Sora's definition of diversity in the group of friends at the Pride parade. "Are models our baseline for what diversity looks like?" asked Sabine Weber, a computer scientist from Germany. Beyond pointing out the conventional attractiveness of the people in the video, a common occurrence in AI generations, Weber asked why there wasn't more representation of queer people who are older, bigger-bodied, or visibly disabled.

Near the end of our conversation, Agnew explained why algorithmic representations can be unsettling for LGBTQ people. "It's trivial to get them to combine things that on their own are fine but together are deeply problematic," they said. "I'm very worried that portrayals of ourselves, which are already a constant battleground, are suddenly going to be taken out of our hands." Even if AI tools come to include more holistic representations of queer people in the future, the synthetic depictions may carry unintended consequences.
