On December 20, 1976, the United States Supreme Court ruled unconstitutional an Oklahoma law that prohibited the sale of 3.2% beer to males under the age of 21 while allowing its sale to females 18 and older. The Court held that the law violated the Equal Protection Clause of the Fourteenth Amendment because it discriminated against males on the basis of sex. Future Justice Ruth Bader Ginsburg, then with the ACLU, helped shape the challenge in the case, Craig v. Boren, and filed an influential amicus brief. The ruling established "intermediate scrutiny", which requires that a law that discriminates on the basis of sex be substantially related to an important government interest.
THIS WEEK:
- A federal class-action suit alleges fast-food restaurants in Alabama are using incarcerated workers leased from the state's prison system in what it calls "modern-day slavery"
- A new report reveals how well (or not) large language model AI systems can analyze SEC filings
- And Rite Aid will now be barred from using facial-recognition tech in its stores
❌ Exploitation
Fast-Food Chain Gang
Are fast-food chains in Alabama using modern-day slavery?
A federal class-action lawsuit alleges as much. The plaintiffs, who include several currently and formerly incarcerated people as well as the SEIU, the Union of Southern Service Workers, and others, allege that Alabama has "entrapped" its disproportionately Black prison population "in a system of 'convict leasing'" under which prisoners are "forced to work, often for little or no money, for the benefit of the numerous government entities and private businesses that 'employ' them." The scheme has generated $450 million for the Alabama Department of Corrections (ADOC), notes the Alabama Political Reporter.
The suit further claims that the scheme incentivizes the ADOC to keep inmates incarcerated and deny them parole in order to keep generating profits. After detailing the brutal working conditions endured by Lakiera Walker, one of the plaintiffs, who was then denied parole, the suit adds that "there is no reasonable argument that Ms. Walker posed a threat to public safety for more than a decade before she was released, given her history of performing work both inside and outside prison walls without incident." Alabama State Representative Christopher England agrees, telling AL.com, "they work jobs every single day for eight to ten hours unsupervised… So, saying they are a threat to public safety is only a convenient excuse to deny them parole."
As LegalDive details, some prisoners who refused to work, or who even requested time off for mental or physical health reasons, were subjected to disciplinary action by the ADOC. The outlet also notes that prisoners ostensibly paid the $7.25 minimum wage took home only $2.06 an hour after various deductions.
“This lawsuit is a strong first step toward eliminating forced labor in the Alabama prison system, and righting the wrongs from this egregious labor exploitation,” Fred Redmond of the AFL-CIO said. “Fighting to abolish forced labor is a priority of the AFL-CIO and the American labor movement. And we won’t rest until this corrupt, immoral scheme ends for good.”
Governor Kay Ivey and Attorney General Steve Marshall, who are named in the suit as architects of the scheme, have yet to comment publicly on the case.
A History of Forced Labor
In their suit, the plaintiffs cite a similar convict-leasing program used by the state between 1875 and 1928. "This scheme, whereby Black laborers were forced to work for private companies, who in turn paid substantial fees to state and county governments, covered 73% of Alabama’s budget by 1898," the suit continues. "Defendants here have resurrected this practice for their own financial gain, in violation of: federal laws against human trafficking and conspiracies to deny equal protection of the laws; the Alabama Constitution, which was recently amended to outlaw all involuntary servitude; and the Ex Post Facto Clause and the First and Fourteenth Amendments of the U.S. Constitution."
THE VERDICT:
It seems shocking that such practices could still exist in the United States today. For Alabama, which has a history of using forced labor in its prison system, it's an even more egregious scheme. Prison reform and prisoner rights may not be the most politically popular causes, but this is a moral case worth fighting for.
🤖 Artificial intelligence
Is AI Any Good At Regulation?
It seems like an obvious move: train your firm's AI system on SEC filings and then let it perform regulatory tasks. But new research from Patronus AI should give you pause.
When even the most advanced large language model tested (GPT-4) was put to the test, it answered only 79% of questions about an SEC document correctly. In other words: a C+ grade. As CNBC explains, the large language models tended either to refuse to respond or to “hallucinate” an answer with incorrect data points.
"That type of performance rate is just absolutely unacceptable," said Anand Kannappan, Patronus’s co-founder, of the findings. "It has to be much much higher for it to really work in an automated and production-ready way."
The findings cast doubt on both how quickly AI systems can be integrated into firms, and what tasks they’ll perform. Still, it’s not slowing Wall Street down.
In April, Bloomberg announced it was working on its own internal, ChatGPT-style AI model. The company says the model (dubbed BloombergGPT) will help “in improving existing financial NLP (natural language processing) tasks, such as sentiment analysis, named entity recognition, news classification, and question answering, among others.” And fellow financial giant JPMorgan is reportedly working on an internal model too, having filed a trademark application for a product called IndexGPT that may be able to provide investment advice.
But doling out financial advice or judging investor sentiment, while morally dubious, is not the same as performing regulatory and compliance tasks.
"There just is no margin for error that's acceptable, because, especially in regulated industries, even if the model gets the answer wrong 1 out of 20 times, that's still not high enough accuracy," Rebecca Qian, Patronus’s co-founder, said. Kannappan added, “models will continue to get better over time. We're very hopeful that in the long term, a lot of this can be automated. But today, you will definitely need to have at least a human in the loop to help support and guide whatever workflow you have."
A Word of Caution
In a July assessment on implementing AI into regulatory tasks, consulting and accounting firm EY cautioned that "as AI tools start to become viable from a cost and value perspective, compliance professionals should perform top-down and bottom-up assessments of their operating model to proactively identify areas for potential enhancement. Designing a future-state compliance framework with key operational objectives — risk outcomes, cost reduction, process improvement — can help govern the implementation journey."
THE VERDICT:
Silicon Valley is no stranger to the hype cycle. We have been here before with cryptocurrencies, self-driving cars, and even online grocers. AI is clearly a revolutionary technology that will profoundly reshape our world, but for now, it still needs some work. Regulatory compliance departments would be wise to begin thinking about how AI can integrate into their workflows and processes, but they should not put the proverbial cart before the horse and risk creating compliance nightmares.
🔒 Privacy
Rite Aid Faces Backlash Over Tech
A proposed settlement between the Federal Trade Commission and Rite Aid will bar the national pharmacy retailer from using facial recognition technology in its stores for the next five years. The settlement would close out a case centering on conduct that stretches back more than a decade.
According to the FTC, between 2012 and 2020, Rite Aid used facial recognition software in stores in several major cities, without notifying customers, to scan people entering the store and predict who might be likely to shoplift. The FTC says Rite Aid not only falsely identified thousands of people as potential criminals, but also disproportionately targeted Black and Latino customers, who were then followed around the store by employees or security guards and, in some cases, reported to the police. CNBC adds that Rite Aid's technology was more likely to generate false positive shoplifter matches in predominantly Black and Asian neighborhoods than in predominantly white ones.
“This is a groundbreaking case, a major stride for privacy and civil rights, and hopefully just the beginning of a trend,” John Davison of the Electronic Privacy Information Center told The Guardian. “But it’s important to note that Rite Aid isn’t alone. Businesses routinely use unproven algorithms and snake oil surveillance tools to screen consumers, often in secret. The FTC is right to crack down on these practices, and businesses would be wise to take note. Algorithmic lawlessness is not an option any more.”
For its part, Rite Aid issued a statement on its website that the face-spotting tech was a "pilot program the Company deployed in a limited number of stores. Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC’s investigation regarding the Company’s use of the technology began." The retailer added that "we fundamentally disagree with the facial recognition allegations in the [FTC's] complaint."
Clearview
Beyond Rite Aid, facial recognition technology is increasingly used by law enforcement to identify and locate potential criminals. The BBC has found that police across the US have used software from the firm Clearview AI, which maintains a database of "30 billion images scraped from platforms such as Facebook taken without users' permissions", to conduct nearly one million searches. The Miami Police Department said that it uses the software for "every type of crime". Civil liberties groups have put intense pressure on law enforcement agencies to ban the software, citing both its violation of privacy and its poor track record of accurate matches.
THE VERDICT:
It's interesting to contrast the furor over the IP infringement claims sparked by AI training models with the relatively muted response to facial recognition models scraping our photos from the internet without our consent. Still, civil liberties and privacy rights groups continue to fight for protections and regulations, even as widespread awareness of the issue remains elusive.
Be a smarter legal leader
Join 7,000+ subscribers getting the 4-minute monthly newsletter with fresh takes on the legal news and industry trends that matter.