In today’s Tech@Work, a Facebook page that commemorates the lives of food delivery workers who’ve died on bicycles highlights perilous working conditions; the advent of generative AI points to the need for labor protections for models in the fashion industry; and Sports Illustrated is accused of using AI-generated journalists to write AI-generated articles.
The New York Times reported this week on a Facebook page that commemorates the lives of bicycle gig delivery workers who’ve died on the job, highlighting what has become one of the deadliest jobs in New York City. A 2022 report by the City found that construction, previously considered one of the most dangerous industries, had a rate of about seven fatalities per 100,000 workers in 2020 (the most recent available data), compared to about 36 fatalities per 100,000 non-car delivery workers in 2021. New York’s app-based delivery workers won a major victory last year when they pressured the City to set an $18 minimum wage. And yet, as the Facebook page documents, the working conditions of app-based delivery remain perilous.
The advent of generative AI is changing the role of human fashion models, posing a threat to jobs and prompting calls from models for labor protections. As Bloomberg reports, fashion companies that have traditionally used human fashion models are beginning to use AI-generated faces and bodies in their stead. AI-generated fashion models, such as “Shudu,” are inspired by real people’s faces, but don’t have to be compensated like their human counterparts. As noted by Bloomberg, a “survey by the Model Alliance, a nonprofit advocacy group, found nearly 18% of the 106 responding models reported being asked to undergo a body scan for a 3D model of their body or face, without knowing how the scan would be used.” This lack of knowledge, consent, or compensation for future use of models’ images parallels background actors’ concerns in the SAG-AFTRA negotiations: that companies would exploit workers’ relatively weak economic position and reuse their likenesses without proper compensation. Replacing models would ripple throughout the industry, affecting hair stylists, makeup artists, and others who work alongside models. Unlike background actors, however, models are independent contractors with few existing labor protections. As a first step toward basic protections, the Model Alliance is pushing for a bill in the New York State Legislature; it has passed the Senate but is now stalled in the Assembly.
At the end of November, the online publication Futurism reported that Sports Illustrated created fake profiles of AI-generated journalists to publish stories that were potentially written by generative AI. One of these “authors” was Drew Ortiz, who according to his byline enjoyed the outdoors. Futurism couldn’t find prior records of Ortiz’s work online, but did find his face on a website that sells AI-generated headshots, with the description: “neutral white young-adult male with short brown hair and blue eyes.” One of the articles written by Ortiz sounded a bit off by human standards, saying that volleyball “can be a little tricky to get into, especially without an actual ball to practice with.” According to Futurism, as soon as its story went up, Sports Illustrated took down the articles in question. The union that represents Sports Illustrated writers issued a statement denouncing the AI authors, saying “if true, these practices violate everything we believe in about journalism.” Sports Illustrated’s publisher, The Arena Group, denied that the stories were AI-generated, saying that writers sometimes use pen names to protect their privacy. This episode raises questions about how the journalism industry will address not only the use of AI in journalistic output, but also its use to create fake workers. It also suggests that fields generally thought of as creative are just as at risk of displacement as industries that have more traditionally grappled with the impact of automation.