Catholic Review - https://catholicreview.org

Catholic Law conference puts spotlight on Big Tech ethics in the era of AI

WASHINGTON (OSV News) — Artificial intelligence is rapidly changing the way people live and work. But who decides what society is owed by those who develop it, sell it and use it? Especially when it comes to the potential good or harm AI can do?

A daylong gathering on Nov. 14 in Washington at The Catholic University of America Columbus School of Law, also known as Catholic Law, approached those questions with a series of speakers and panels.

Taylor Black, director of AI & Venture Ecosystems in the Office of the Chief Technology Officer at Microsoft, opened the Conference on Corporate Social Responsibility of Big Tech with a morning keynote address posing the question, “What might we ask of Big Tech, through a Catholic lens?”

Black, who was named founding director of Catholic University’s new interdisciplinary Institute for AI & Emerging Technologies in September, said Catholic social teaching can be “super helpful” — but he had a reminder.

“Catholic social teaching does not begin with systems or technologies or institutions. It begins with us, the human person, created in the image and likeness of God — endowed with inherent dignity that no algorithm, market force, or technological system can confer or remove,” he told his audience.

“Instead of asking, ‘What does this technology do?’ we ask, ‘What does this technology do to the human person?’ Instead of asking, ‘How fast can we scale this?’ we ask, ‘Who is harmed, who is helped, and who is left behind?’ Instead of asking, ‘What is legally permissible?’ we ask, ‘What leads to human flourishing and the common good?'” explained Black.

He suggested several areas where the conference could create “real movement,” including building shared frameworks for responsible AI and tech companies; fostering cross-sector accountability structures; and investing in formation, not just training.

“The goal of this conference is not simply to discuss technology,” concluded Black. “It’s to shape the moral ecosystem in which technology is built.”

The first panel of the day, “Big Tech as Facilitator of Exploitation” — moderated by Catholic Law professor Mary Graw Leary — included Danielle Bianculli Pinter, chief legal officer and director of the National Center on Sexual Exploitation Law Center; John Cotton Richmond, president of the Libertas Council and retired U.S. ambassador-at-large to monitor and combat trafficking in persons; and Annick Febrey, co-founder and principal of the Better Trade Collective.

Bianculli Pinter said Big Tech corporate responsibility teams do care, but are ignored by executives. The industry is largely unregulated — and spends major sums of money to avoid regulation — while also enjoying near-blanket immunity, she noted.

“This is a societal crisis, but it’s easily solved,” Bianculli Pinter said. “We need to implement liability.”

Annick Febrey explained how technology is used both to fuel forced labor and to mask it, with recruiters luring workers through promises of good jobs, only for them to find themselves practically enslaved.

“In terms of scale and sectors, there’s estimated to be nearly 28 million people in forced labor,” said Febrey.

“Technology is probably morally neutral,” suggested Cotton Richmond. “It can be used for bad, or it can be used for good.”

He added, “We keep being surprised that we have a human adversary — that there actually is a group of individuals in the world who want to not treat people in a way that shows dignity … but instead tries to commoditize and devalue people for their own illicit profit.”

The mid-morning panel, “Corporate Responsibility and Ethics in the Era of AI” — moderated by Meaghan Pedati, senior counsel at Mars Inc. — included Charles Duan, assistant professor at American University’s Washington College of Law; Adam Eisgrau, senior director of AI, creativity and copyright policy at the Chamber of Progress; Paul Lekas, senior vice president and head of global public policy and government affairs at the Software & Information Industry Association; and Maryann Cusimano Love, chair of Catholic University’s Department of Politics.

“The Catholic Church has actually been engaged for quite a while — trying to engage with industry; engage with users — on these questions of building ethics into the AI that we develop,” said Cusimano Love, a consultant to the Holy See Mission at the United Nations.

The “Rome Call for AI Ethics” was signed by the Vatican, Microsoft, and IBM in 2020. It advocated for transparency, inclusion, responsibility, impartiality, reliability, and security and privacy.

Other institutions have also released AI codes, which — while “soft law” without mandated observance — have a way, Cusimano Love noted, of eventually becoming “hard law” with enforceable compliance.

“Largely, we have consensus on what AI ethics are,” Lekas said. “It’s just a question of how do you put those into practice?”

Eisgrau discussed issues around AI and fair use, an emerging legal question as generative AI is trained on existing, copyrighted works. Will we overprotect, he asked, or, “as the Constitution compels us to do, we continue to promote by striking a legal, and an ethical, and a policy balance?”

“You have to take these high-level considerations of what’s ethical, what is proper, what sort of things do we want to say yes or no to,” explained Duan, “and then turn that into something that a computer system can understand in the form of guardrails.”

At lunch, attendees heard from Rep. Brandon Guffey, R-S.C., whose son encountered a scammer on Instagram and unwittingly became a victim of sexual extortion, an encounter that led the teen to take his own life. Guffey, who sued Instagram and testified in February before the U.S. Senate Committee on the Judiciary, has since become a spokesperson against online crimes.

The final panel of the day was on “Sustainability: Risk Management in Light of Changing Expectations” moderated by John Polanin, director of the Corporate Responsibility and Compliance Program at Catholic Law. Panelists included David Curran, co-chair of the Sustainability Advisory Practice and executive director of the Sustainability and Law Institute; Brian Downing, assistant professor of law at the University of Mississippi School of Law; Erica Lasdon, program director for climate change and environmental justice at the Interfaith Center on Corporate Responsibility; and Kevin Tubbs, retired vice president and chief ethics, compliance and sustainability officer at Oshkosh Corporation.

“The business dynamic drives anything in sustainability,” Curran said. “The laws and regulators are far behind.”

Erica Lasdon noted that while some companies may have initially considered sustainability something of an optional extra, “In operational terms, I think it’s much better understood than it was 20 years ago … that those are material factors.”

While environmental issues have become widely partisan, Tubbs urged corporations to examine them from another angle.

“Let’s get away from the politics,” he advised, “and say, ‘Is it serving the need that we need to serve, in the most efficient way possible?'”

Downing said he was surprised the mainstream media hasn’t focused more intensely on data centers — facilities that manage and store data using physical infrastructure and virtual technology, and which also have an environmental impact.

“When is the shoe going to drop on the data centers — who benefits from data centers?” asked Downing.

The conference was a collaborative effort by Catholic University’s Corporate Responsibility and Compliance Program, Law and Technology Institute and Bakhita Initiative for the Study and Disruption of Modern Slavery.

A video of Taylor Black’s keynote speech can be found at https://youtu.be/uAP2qoLOkWw.


Copyright © 2025 OSV News