FTC: Make Sure Your AI Algorithms Are Unbiased…Or Else

Written by Matt Milano

The Federal Trade Commission (FTC) has sent a stark warning to companies to ensure their AI algorithms are unbiased…or else.

AI is being adopted across a wide spectrum of industries. Unfortunately, studies repeatedly demonstrate the propensity for AI algorithms to be biased. In many cases, this is the result of the datasets used to train AIs not reflecting the necessary diversity.
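
To put that in concrete terms, here is a minimal sketch (in Python, not drawn from the article or the FTC's guidance) of one way a team might compare a training set's demographic makeup against a reference population; the group labels and reference shares are hypothetical.

```python
# Minimal sketch: compare each group's share of a training set against a
# reference population to spot under-representation. All names and numbers
# below are hypothetical illustrations, not data from the article.
from collections import Counter

def representation_gaps(group_labels, reference_shares):
    """Return each group's share of the training data minus its reference share."""
    counts = Counter(group_labels)
    total = sum(counts.values())
    return {
        group: counts.get(group, 0) / total - expected
        for group, expected in reference_shares.items()
    }

# Hypothetical training rows tagged with a demographic group.
training_groups = ["A"] * 700 + ["B"] * 250 + ["C"] * 50
reference = {"A": 0.60, "B": 0.30, "C": 0.10}

for group, gap in representation_gaps(training_groups, reference).items():
    print(f"group {group}: {gap:+.2%} relative to its reference share")
```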

In a blog post, the FTC addresses this issue:

    Watch out for discriminatory outcomes. Every year, the FTC holds PrivacyCon, a showcase for cutting-edge developments in privacy, data security, and artificial intelligence. During PrivacyCon 2020, researchers presented work showing that algorithms developed for benign purposes like healthcare resource allocation and advertising actually resulted in racial bias. How can you reduce the risk of your company becoming the example of a business whose well-intentioned algorithm perpetuates racial inequity? It’s essential to test your algorithm – both before you use it and periodically after that – to make sure that it doesn’t discriminate on the basis of race, gender, or other protected class.
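
The snippet below illustrates the kind of pre-deployment and periodic testing the FTC describes; it is a sketch rather than the FTC's methodology. It compares favorable-outcome rates across groups and flags any group that falls below a common rule-of-thumb threshold; the group names, decisions, and 80% threshold are assumptions for illustration.

```python
# Minimal sketch of a recurring bias check: compute each group's rate of
# favorable decisions and flag groups falling below a rule-of-thumb threshold.
# Group names, decisions, and the 0.8 threshold are hypothetical.
def selection_rates(outcomes):
    """outcomes maps group -> list of 0/1 decisions; returns favorable rate per group."""
    return {group: sum(d) / len(d) for group, d in outcomes.items()}

def flag_disparate_impact(outcomes, threshold=0.8):
    """Flag groups whose rate is below `threshold` times the highest group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {group: (rate / best) < threshold for group, rate in rates.items()}

# Hypothetical decisions logged per group (1 = favorable outcome).
decisions = {
    "group_x": [1, 1, 0, 1, 1, 0, 1, 1],  # 75% favorable
    "group_y": [1, 0, 0, 0, 1, 0, 0, 1],  # 37.5% favorable
}
print(selection_rates(decisions))
print(flag_disparate_impact(decisions))  # group_y is flagged under this rule of thumb
```

A check like this can be run on the training data before launch and re-run on production decisions on a schedule, which is the cadence the FTC's advice points toward.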

The FTC also warns companies not to overpromise what their AI can do, such as advertising a product as delivering “100% unbiased hiring decisions” when it was built on data that wasn’t truly diverse. The FTC advises companies to be transparent, use independent standards, and be truthful about how they will use customer data.

The FTC warns that companies failing to follow its advice will face the consequences:

    Hold yourself accountable – or be ready for the FTC to do it for you. As we’ve noted, it’s important to hold yourself accountable for your algorithm’s performance. Our recommendations for transparency and independence can help you do just that. But keep in mind that if you don’t hold yourself accountable, the FTC may do it for you. For example, if your algorithm results in credit discrimination against a protected class, you could find yourself facing a complaint alleging violations of the FTC Act and ECOA. Whether caused by a biased algorithm or by human misconduct of the more prosaic variety, the FTC takes allegations of credit discrimination very seriously, as its recent action against Bronx Honda demonstrates.
