Computer-generated inclusivity: fashion turns to ‘diverse’ AI models

4/3/2023

The star of Levi’s new campaign looks like any other model. Her tousled hair hangs over her shoulders as she gazes into the camera with that far-off high-fashion stare. But look closer, and something starts to seem a little off. The shadow between her chin and neck looks muddled, like a bad attempt at using FaceTune’s eraser effect to hide a double chin. Her French-manicured fingernails appear scrubbed clean and uniform in a creepy real-doll kind of way. The model is AI-generated, a digital rendering of a human being that will start appearing on Levi’s e-commerce website later this year.

The brand teamed with LaLaLand.ai, a digital studio that makes customized AI models for companies like Calvin Klein and Tommy Hilfiger, to dream up this avatar. Amy Gershkoff Bolles, Levi’s global head of digital and emerging technology strategy, announced the model’s debut at a Business of Fashion event in March. AI models will not completely replace humans, she said, but will serve as a “supplement” intended to aid in the brand’s representation of various sizes, skin tones and ages.

“When we say supplement, we mean the AI-generated models can be used in conjunction with human models to potentially expand the number of models per product,” a Levi’s spokesperson said. “We are excited about a world where consumers can see more models on our site, potentially reflecting any combination of body type, age, size, race and ethnicity, enabling us to create a more personal and inclusive shopping experience.”

Michael Musandu, the founder of LaLaLand.ai, created the software in part because he struggled to find models who look like him. He was born in Zimbabwe, raised in South Africa, and moved to the Netherlands to study computer science. “Any good technologist, instead of complaining about a problem, will build a future where you could actually have this representation,” Musandu said.

What about simply hiring a diverse cast of models? Musandu said that LaLaLand.ai is not meant to “replace” models, but to let brands afford to show their clothes on as many bodies as possible. “It is not feasible for brands to shoot nine models for every single product they sell, because they’re not just hiring models, they’re hiring photographers, hair stylists and makeup artists for those models.” AI-generated images don’t need glam squads, so brands can cut the costs they would spend on set by using fake avatars.

A spokesperson for Levi’s added: “The models Levi’s hires are already diverse and this will continue to be a priority for us. Over the past year, we’ve been focused on ensuring that those working on the content both in front and behind the camera are reflective of our broad consumer base.”

Yet the diversity that AI can provide is always going to be virtual – a computer-generated sense of inclusivity. Are brands that generate, for example, Black models for pieces that were only photographed on a white human model engaging in a kind of digital blackface?

This is not a new question. There are already “digital influencers” like Lil Miquela and Shudu, fake avatars with millions of followers on social media. They model Prada, Dior and Gucci clothing with the idea that their (human) audience will purchase the pieces. Neither model is white, but both have at least one white creator (Shudu was created by the British fashion photographer Cameron-James Wilson, and Miquela by Trevor McFedries and Sara Decou).
Criticism of Levi’s for casting AI models instead of real ones echoes the wave of response Lil Miquela got when she was first launched in 2016, or when Shudu made her debut two years later. The New Yorker’s Lauren Michele Jackson called Shudu “a white man’s digital projection of real Black womanhood”.

Lil Miquela’s creators also filled her fake life with “events” to try to give her personality. Calvin Klein apologized for a Pride ad that showed Lil Miquela kissing the real model Bella Hadid. A few months later, Lil Miquela came out with a story of experiencing sexual assault in the back of a ride-share, and followers accused her creators of making up a traumatic event for clout.

Unlike their mortal counterparts, these models also never age. Miquela, a “19-year-old Robot living in LA”, is forever 19 – making her a hot commodity in a youth-obsessed industry.

Deep Agency, another Netherlands-based AI company, made headlines this month after debuting its own “AI modeling agency”. The service, which costs $29 a month, brands itself as a way for creators to “say goodbye to traditional photoshoots”. Users type in a description of what they want their photo to look like and receive “high-quality” photos of fake models in return. Paid subscribers gain access to 12 models of various races, though all appear to be smaller-bodied and in their 20s and 30s. Users browse through the site’s catalog of existing images, which include photos of models engaging in activities like reading books or giving the camera a peace sign. Those photos serve as the inspiration for the final result.

In a photo rendered by the Guardian, one model named “Chai” had an unnervingly plastic-looking face and extra-long, slender fingers that belonged in a horror film. Another, “Caitlin”, had a concerning number of veins popping out from under the skin of her neck. A male model, “Airik”, seemed incredibly uncomfortable and stick-straight as he posed in front of a drab gray building.

How long before these models are taking away jobs from real people? Sara Ziff, founder of the advocacy group the Model Alliance, is concerned. “Capitalizing on someone else’s identity to the exclusion of hiring people who are actually Black could be compared to blackface,” Ziff said.

Ziff’s New York office hosts a support line where models call in to discuss things that have made them uncomfortable on set. Lately, the topic of conversation has been AI, and specifically body scans, which brands can use to create digital, 3D replicas of models’ bodies. “We’ve received an increasing number of calls from models who, after receiving body scans, found that the rights to their body were being assigned to a company, which meant that they were losing the rights to their own image,” Ziff said. “We’ve particularly heard this from fit models, who are concerned over how their personal information would be used or capitalized on without their permission.”

Fit models work in the initial process of fashion design. They are essentially human mannequins for creatives, trying on drafts of clothing to show how a garment looks on a real body. Summer Foley, a 25-year-old model in New York, said it was not uncommon to make about $400 an hour as a fit model. “If someone wanted to scan my body, I’d want to charge them every time they used it!” Foley said. “That’s my body, and I work hard to keep these measurements.
You can’t make a scan of me and use my likeness in perpetuity without me making any money.”

Sinead Bovell has modeled for six years and wrote about the topic of AI models for Vogue in 2020. She frequently posts on social media about the ethical dilemma that comes with companies using models’ bodies to create their images.

Last year, the portrait app Lensa went viral for generating highly stylized portraits of users. It used Stable Diffusion, a text-to-image model trained to learn patterns from an online database of images. Those photos are sourced from across the internet, which led artists to say Lensa was stealing their work to create the pictures. Similarly, brands could train their AI on real-life photos or body scans of human models. But who gets paid when a photo generated from their likeness lands the next big ad campaign? “Who would own that data? Where would it live? I’m sure there are ways that you have full rights over it, but as that area of tech is being ironed out, I’d rather not be the guinea pig,” Bovell said.

Musandu, the LaLaLand.ai founder, said that his algorithm works only off data that the company owns. But he agrees that companies should compensate models if they base images on their likeness. “I think if any algorithm has used you in the training set, you should have the rights for licensing those images,” he said.

It’s easy to remain pessimistic about the long-term effects this will have on fashion and body image. “I can see a future with AI where beauty standards become even more unrealistic because clothing is literally worn by people who aren’t real,” Bovell said. “If you look at the history of how tech has evolved – things like selfies and filters – it’s not super positive.”

Bovell, who is Black, does not believe that someone can only create a virtual identity that reflects their own. But she worries about the ethics of who will ultimately profit from images of models of color. “I call that robot cultural appropriation,” she said. “The core question is: who has the right to own and speak on identities that AI models represent?”