Tutoring firm settles US agency's first bias lawsuit involving AI software
2023-08-10
[Illustration: AI (Artificial Intelligence) letters and a robot miniature, June 23, 2023. REUTERS/Dado Ruvic/Illustration/File photo]

Aug 10 (Reuters) - A China-based tutoring company has agreed to settle a U.S. government agency's novel lawsuit claiming it used hiring software powered by artificial intelligence to illegally weed out older job applicants. The 2022 lawsuit against iTutorGroup Inc was the first by the U.S. Equal Employment Opportunity Commission (EEOC) involving a company's use of AI to make employment decisions.

The commission, which enforces workplace bias laws, in 2021 launched an initiative to ensure that AI software used by U.S. employers complies with anti-discrimination laws. The EEOC has warned that it will focus enforcement efforts on companies that misuse AI.

ITutorGroup agreed to pay $365,000 to more than 200 job applicants allegedly passed over because of their age, according to a joint filing made in New York federal court on Wednesday. The settlement must be approved by a federal judge.

The company, which provides English-language tutoring to students in China, denied wrongdoing in the settlement. The EEOC had alleged that iTutorGroup in 2020 programmed online recruitment software to screen out women aged 55 or older and men aged 60 or older.

ITutorGroup, a unit of Ping An Insurance Group Co of China (601318.SS), did not immediately respond to a request for comment. An EEOC spokesperson said the agency would not comment until the settlement is approved.

At least 85% of large U.S. employers use AI in some aspect of employment, according to recent surveys. That includes software that screens out job applicants before a human reviews any applications, human resources "chatbots," and programs that conduct performance reviews and make recommendations for promotions. Many worker advocates and policymakers are concerned about the potential for existing biases to be baked into AI software, even unintentionally.
In a pending proposed class action in California federal court, Workday (WDAY.O) is accused of designing hiring software, used by scores of large companies, that screens out Black, disabled and older applicants. Workday has denied wrongdoing. Experts expect an increasing number of lawsuits accusing employers of discriminating through their use of AI software.

Reporting by Daniel Wiessner in Albany, New York; Editing by Alexia Garamfalvi and Andy Sullivan