
Tech@Work — September 7, 2023

Maddie Chang

Maddie Chang is a student at Harvard Law School.

In today’s Tech@Work, Wisconsin launches a state taskforce on AI and the workforce; FTC Commissioner Bedoya, in an LA Times op-ed, expresses concern about Hollywood’s coercive practice of scanning background actors’ bodies; and the Department of Labor’s federal contract watchdog will require that the contractors it audits disclose their use of AI hiring tools.

Late last month, Wisconsin Governor Tony Evers signed an executive order creating a new statewide taskforce focused on the impact of artificial intelligence (AI) on the workforce. Housed within the state’s Department of Workforce Development, the taskforce will include members of state and local governments, organized labor, technology company representatives, and representatives from the University of Wisconsin and Wisconsin Technical College systems. The group is tasked with studying how generative AI will affect Wisconsin’s key industries, such as manufacturing, healthcare, education, transportation, and agriculture, and with producing recommendations to address those effects. According to AP reporting, in creating this taskforce Wisconsin joins Texas, North Dakota, West Virginia, and Puerto Rico, which have in recent months launched similar state committees dedicated to examining various aspects of AI.

On Labor Day, FTC Commissioner Alvaro Bedoya authored an op-ed in the LA Times expressing concern about the implications of artificial intelligence for workers in Hollywood. Referring to workers’ likenesses and creative work, Bedoya asks readers: “Who should decide whether those are used to train a for-profit algorithm — you? Or someone else?” He links to a series of social media posts in which background actors tell of agreeing to have their bodies or faces scanned in highly coercive contexts. An actor on the television series Westworld recounted that the show “scanned us fully nude with explicit coercion that if we didn’t get scanned, we’d be terminated. They were going to make life-size molds of our bodies to use in background. I hated it, but needed the job and was non-union at the time.” Without going into detail, Commissioner Bedoya goes on to note that practices such as scanning actors’ bodies and feeding writers’ scripts into generative AI models to produce future scripts are as much competition concerns as they are labor issues. He warns that these practices will reverberate beyond Hollywood, writing: “I worry that what’s happening in the entertainment industry is part of a broader effort to digitize and appropriate our capacity for human connection — starting with the exact workers with the least power to say no.”

Finally, the Department of Labor’s Office of Federal Contract Compliance Programs (OFCCP) will require the federal contractors it audits to provide information about the technology tools they use to recruit and hire employees. The OFCCP’s updated supply and service scheduling letter and itemized listing asks audited companies to list and explain their policies around “artificial intelligence, algorithms, automated systems, or other technology-based selection procedures.” This is the first time Labor’s contractor watchdog will request this kind of information on data-driven hiring practices. It comes at a time when other federal agencies are taking action on AI hiring bias, most notably in the Equal Employment Opportunity Commission (EEOC)’s draft enforcement plan and the Federal Trade Commission’s joint statement with the Consumer Financial Protection Bureau and the EEOC against bias and discrimination in automated systems.
