A panel of Pennsylvania attorneys speaking on advances in the use of artificial intelligence in criminal justice and surveillance expressed concern over the potential misuse of such technologies, predicting they could result in rights violations on both individual and mass scales.
The discussion took place Friday at the Philadelphia Bench-Bar & Annual Conference at the Borgata Hotel Casino & Spa in Atlantic City. It spanned from the use of police body cameras to consumer security devices, to speculation as to whether a "Minority Report"-style system — a reference to the Philip K. Dick short story and later Steven Spielberg film about a system that can predict crimes — could be implemented in the U.S.
The panel members admitted there is utility to using AI for time-saving measures like summarizing deposition testimony, but they also noted that unfettered use by prosecutors and police without analyzing bias in programming can lead to serious consequences.
"I'm scared as hell because there are major issues," said panelist and criminal defense lawyer Troy H. Wilson of Wilson Law Offices in Philadelphia.
"What I learned early on dealing with these issues is it's not about the computer, it's about the physical person developing the algorithm, it's about the physical person putting his or her biases in the program," Wilson continued.
Chad Marlow, senior policy counsel at the American Civil Liberties Union, said tools like facial and gait recognition can be affected by biased programming in that they often display reduced accuracy in identifying Black, female and elderly people.
Marlow also noted that AI used to summarize police reports and audio captured by body cameras is programmed to find evidence of guilt and may not look for exculpatory evidence, increasing the risk of prosecuting innocent people.
He said the risk is increased when law enforcement uses tools to identify potential suspects of crimes not yet committed, mentioning that such technology is especially dangerous in the hands of authoritarian governments. Marlow added that such a "Minority Report" system already exists in China and that the idea of predicting crimes is based on what amounts to analytical stereotypes of certain populations.
"Predictive policing follows in the garbage in, garbage out data analysis," Marlow said.
Additionally, Marlow said that in all aspects of life, people can exhibit "automation bias," or the belief that an AI-generated analysis is superior to human reasoning simply because it is the product of advanced technology.
"People are just deferring to computers," he said.
Recordings from police body cameras of interactions between officers and community members, whether the incidents they depict result in criminal charges or not, could be used to misinform AI as to what criminal activity looks like, said Catherine Twigg, who is general counsel to the city of Philadelphia's Citizens Police Oversight Commission.
"We're creating a trove of data that might look like it tells you what crime is like in Philadelphia, but it really tells you what policing is like in Philadelphia," Twigg said.
Twigg also expressed concern about what she saw as an overall lack of oversight in the tech industry, with AI being developed faster than the government can regulate it. She also pointed to consumer companies like Ring partnering with law enforcement agencies to allow access to doorbell cameras, turning unsuspecting residents' security devices into part of a mass surveillance apparatus.
--Editing by Lakshna Mehta.
Attys Suspect AI In Police Surveillance Could Lead To Bias
By P.J. D'Annunzio | October 14, 2025, 3:44 PM EDT · Listen to article