AI and machine learning can help FIs avoid risk, but they carry risks of their own.
Mercator Advisory Group has released a new research report that examines the impact of hidden biases in machine learning and artificial intelligence, and how to avoid them.
Published on: August 10, 2020
Author: Tim Sloane
Alternate Point of Contact: Amy Dunckelmann
AI models reflect existing biases unless those biases are explicitly eliminated by the data scientists developing the systems. Constant monitoring of the entire operation is required to detect shifts in model behavior, and training is the remedy for lapses in that vigilance.
Mercator Advisory Group’s latest research report, Tracking Mistakes in AI: Use Vigilance to Avoid Errors, discusses the ways in which data models can deliver biased results and how financial institutions (FIs) can correct for these biases.
“AI solutions can unwittingly go astray,” comments Tim Sloane, the report’s author, Director of Mercator Advisory Group’s Emerging Technology Advisory Service, and its VP of Payments Innovation. “Applying AI to issues that can have large negative social consequences should be avoided. One example is using AI to implement the business plans of social networks such as Facebook, YouTube, and others, as presented in the documentary ‘The Social Dilemma.’ The documentary contends that social networks have optimized AI to drive advertising revenue at the expense of the individual and society. To drive revenue, social networks build psychographic models for each user to predict exactly which content will best engage that user.”
This document contains 15 pages and 3 exhibits.
Companies mentioned in this research note include: The Federal Reserve, ProPublica, The Verge.
Highlights of the research note include: