Accenture to launch new tool to help customers identify and fix unfair bias in AI algorithms


Accenture, a professional services company, will soon launch a new tool aimed at helping its customers find unfair bias in AI algorithms. Once such unfair bias is discovered, reps for the company have told the press, it can be removed.

As scientists and engineers continue to improve AI technology, more companies are using AI-based tools to conduct business. Using AI to process credit applications is becoming routine, for example. But there has been a concern that such applications might have biases built in, which produce results that could be construed as unfair. Such applications might, for example, have a baked-in racial or gender bias, which could skew results. In response to such claims, many large corporations have begun adding bias screeners to their suite of applications. But as reps for Accenture have pointed out, smaller companies likely do not have the resources to do that. The new tool they have developed will be marketed with these companies in mind.

A prototype of the new tool (which some have begun calling the Fairness Tool) is currently being field tested with an unknown partner on credit risk applications. The company has announced that it is planning a soft launch in the near future. It has also announced that the tool will be part of a larger program offered by the company called AI Launchpad. In addition to AI tools, the program also includes ethics and accountability training for employees.

To figure out which sorts of data being used by an AI application might represent bias, the tool uses statistical methods designed to compare predefined variables as fair or unfair. It will also look at other variables that might contain hidden bias. Data that includes income, for example, might represent a hidden bias against women or minorities, even if there is no data specifying gender or race. The tool can also be used to implement changes to a customer's algorithm, hopefully making it less biased. But reps for Accenture note that doing so has been found to cause some algorithms to be less accurate overall.
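The article does not describe Accenture's actual method, but the idea of a proxy variable carrying hidden bias can be sketched in a few lines. The example below is a hypothetical illustration using invented toy data: it shows how a feature such as income can differ sharply between groups (making it a stand-in for a protected attribute the model never sees), and how a simple disparate-impact ratio of approval rates can flag an unfair outcome.

```python
# Hypothetical sketch (not Accenture's tool): detecting a proxy variable
# and measuring disparate impact on toy credit-application data.
from statistics import mean

# Invented applicant records; a model trained on "income" alone never
# sees "gender" directly, yet income correlates with it here.
applicants = [
    {"gender": "F", "income": 42, "approved": 0},
    {"gender": "F", "income": 45, "approved": 0},
    {"gender": "F", "income": 55, "approved": 1},
    {"gender": "M", "income": 60, "approved": 1},
    {"gender": "M", "income": 64, "approved": 1},
    {"gender": "M", "income": 58, "approved": 1},
]

def group_means(rows, group_key, value_key):
    """Mean of value_key within each level of group_key."""
    groups = {}
    for r in rows:
        groups.setdefault(r[group_key], []).append(r[value_key])
    return {g: mean(vals) for g, vals in groups.items()}

def disparate_impact(rows, group_key, outcome_key, protected, reference):
    """Ratio of favorable-outcome rates: protected group vs. reference."""
    rates = group_means(rows, group_key, outcome_key)
    return rates[protected] / rates[reference]

# Income differs by gender, so income can act as a hidden proxy for it.
print(group_means(applicants, "gender", "income"))

# Approval rate for women divided by approval rate for men; ratios far
# below 1.0 (e.g. under the common 0.8 "four-fifths" rule of thumb)
# are often treated as a red flag for unfair bias.
print(disparate_impact(applicants, "gender", "approved", "F", "M"))
```

In this toy data the ratio comes out to 1/3, well under the 0.8 threshold, which is the kind of signal a fairness screener would surface before any fix is attempted.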

More information: newsroom.accenture.com/news/ac … testing-services.htm

© 2018 Tech Xplore

Citation: Accenture to launch new tool to help customers identify and fix unfair bias in AI algorithms (2018, June 13) retrieved 16 December 2018 from https://techxplore.com/news/2018-06-accenture-tool-customers-unfair-bias.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
User comments

Jun 13, 2018
What about data that sets insurance rates based on crime in a given area? Do we apply a Politically Correct Filter?

Jun 13, 2018
"a new tool aimed at helping its customers find unfair bias in AI algorithms"

Can we use that on people, too?

Jun 13, 2018
yeah, this ought to go well. about as well as LBJ's welfare program.

Jun 13, 2018
d_b & z, since the two of you are too incompetent to honestly compete in modern society? Let's ship you both back to which ever old country pesthole your gramps ran away from.
