
A study by a U.S. agency has found that facial recognition technology often performs unevenly based on a person's race, gender or age.

This is the first time the National Institute of Standards and Technology has investigated demographic differences in how face-scanning algorithms are able to identify people.

It comes as lawmakers have raised concerns about biased results in commercial face recognition software, which is increasingly used by law enforcement, airports and a variety of businesses.

The report, published Thursday, was based on tests of algorithms from nearly 100 companies, run against millions of mugshots, visa application photos and other government-held images.