An analysis of recent statements about technology by unions and other worker organizations

By Mishal Khan and Kung Feng

Over the past few years, various unions and worker organizations have published a series of principles, public statements, frameworks, and resolutions articulating a vision for how AI and other digital technologies should be developed and deployed in the workplace. We analyzed 17 of these documents by 15 organizations and identified key values being put forward. This report represents our interpretation of these documents. (See the "Legend of union and worker organization acronyms" below for an abbreviation key.)

LEGEND OF UNION AND WORKER ORGANIZATION ACRONYMS

American Federation of State, County, and Municipal Employees: AFSCME
American Federation of Teachers: AFT
California Faculty Association: CFA
California Federation of Labor Unions: CFLU
Communication Workers of America: CWA
Health Career Advancement Program: HCAP
Human Artistry Campaign: HAC
International Alliance of Theatrical Stage Employees: IATSE
National Education Association: NEA
National Nurses United: NNU
National Writers Union: NWU
Teamsters: IBT
Trades Union Congress: TUC
UNI Global Union (Algorithmic Management): UNIA
UNI Global Union (Data Protection): UNID
UNI Global Union (General): UNI
UNITE HERE: UH

THEME 1: Laying down rules for responsible tech use

Transparency and disclosure

Transparency rights are vitally important to labor. They are understood to be foundational to all other rights (UNI) and to strengthening collective bargaining (CWA). Transparency around the use of digital technologies in the workplace is articulated as both the right to advance notice (CFLU, CWA, NNU, NEA, UNI, TUC, IATSE) and the right to a post-use explanation (UNI, UNID). Companies should disclose any copyrighted works used to train an AI system (NWU, IATSE), clearly identify AI-generated content (AFSCME), and share the results of impact assessments with workers (UNI). For UNITE HERE, transparency means ensuring that algorithmic systems provide not only clear instructions but also clear explanations for why specific tasks are assigned (UH). For others, the data and ethical considerations that go into making an AI system should be available for investigation when questions of liability arise (AFSCME, UNI).

Guardrails around the collection and use of data

For many groups, worker data rights are fundamental as new digital technologies are used in the workplace. Workers must be able to know about, access, correct, and delete any data gathered about them by employers (UNID, UH, NNU, CFLU, CWA). In addition, they should be able to influence how employers use their data, especially if it is used to make employment decisions about them or to train an AI system (UNIA, CFLU, TUC, CWA). UNI Global argues that worker consent, by itself, does not provide adequate protection when employers collect worker data, and that additional guardrails are necessary (UNID). These include strict data minimization rules, opportunities for workers to actively opt in or out of data collection, and limits on surveillance without a clear purpose (UNID, NNU, NEA). Other concerns revolve around the privacy of workers and the public they serve (NNU, AFT), the safe storage of worker data (NEA, AFT, UH), and the right to transfer worker data between platforms upon request (UNID).

Human-made employment decisions

The right to have important decisions about workers made by a human, not an algorithm, is critical for worker groups (CWA, UNIA).
For the California Federation of Labor Unions, decisions such as "hiring, discipline, terminations, work quotas, or wage setting" are simply too important to be made solely by an algorithm. NEA argues against employers relying on technology to evaluate or discipline teachers, instead emphasizing the importance of collaborative processes, personalized feedback, and opportunities for growth.

Protection from discrimination and bias

Numerous organizations recognize that digital technologies can manifest bias, especially when used in hiring, promotion, and firing (AFSCME, UH, UNI, NEA, HCAP). Workers therefore should have the right to be protected from any potentially discriminatory impacts of these technologies. Employers should set procurement standards for the technologies they purchase, regularly test for harms, and ban technologies found to be problematic (NEA, UNI, CWA). UNI Global points out that certain business practices are inherently biased, highlighting for example that customer ratings can be a "backdoor to bias and discrimination" (UNIA). For NEA, it is important that the decision-makers who shape how technology is deployed in educational settings are themselves from diverse backgrounds.

Health and safety at work

Labor wants to prioritize workers' rights to health and safety as new digital technologies are introduced (CWA, TUC, NNU). Testing for health and safety harms through impact assessments is important for ensuring these protections (CFLU, NEA). In add



