The Commissioner argues that algorithms are not neutral, safe harbors around proprietary technologies are inappropriate, and outsourcing liability for algorithmic discrimination will lead to a race to the bottom among vendors.
Federal Trade Commissioner Rohit Chopra outlined concerns with the Department of Housing and Urban Development’s proposed rule on the Fair Housing Act’s discriminatory effects standard, also known as the disparate impact rule, in a comment letter submitted to HUD. According to Chopra, the "proposal appears to fundamentally misunderstand how algorithms, big data, and machine learning work in practice." His comment makes three arguments against HUD’s proposed changes: algorithms are not neutral, and even valid inputs can produce discriminatory results; it is inappropriate to create safe harbors around technologies that are proprietary, opaque, and rapidly evolving; and outsourcing liability for algorithmic discrimination to third parties distorts incentives and could lead to a race to the bottom among vendors.
Algorithms. Chopra stated that HUD’s "proposed safe harbors rest on the false assumption that it is possible for an algorithm to operate free from bias and that neutral inputs produce neutral outputs." As an example, he pointed to HUD’s complaint against Facebook, which alleges that the design of Facebook’s online behavioral advertising platform produces biased results that discriminate against people in protected classes. Companies cannot use protected class statuses such as race, gender, or religion as factors to predict risk or make business decisions, Chopra noted. However, he cautioned, keeping data about protected characteristics out of an algorithm is not enough to prevent it from producing discriminatory predictions: other attributes can serve as substitutes, or proxies, for a protected class when used as algorithmic inputs. The Facebook complaint, Chopra said, clearly outlines how "neutral" inputs can produce pernicious discrimination.
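The proxy effect Chopra describes can be illustrated with a minimal simulation. The sketch below uses entirely hypothetical data (it is not drawn from the HUD complaint or the comment letter): a model never sees the protected class, only a correlated "neutral" attribute such as a neighborhood code, yet its denial rates still diverge sharply between groups.

```python
import random

random.seed(0)

# Hypothetical population: each person has a protected class ("A" or "B")
# and a "neutral" attribute (e.g., a neighborhood flag) that happens to
# correlate strongly (90%) with class membership.
population = []
for _ in range(10_000):
    group = random.choice(["A", "B"])  # protected class; never shown to the model
    if random.random() < 0.9:
        proxy = (group == "B")         # 90% of the time, the flag tracks the group
    else:
        proxy = (group == "A")         # 10% of the time, it does not
    population.append((group, proxy))

# A facially neutral scoring rule: deny whenever the proxy flag is set.
# Race, gender, etc. are not inputs, yet the rule reproduces the bias.
def deny(proxy_flag: bool) -> bool:
    return proxy_flag

denial_rate = {}
for g in ("A", "B"):
    members = [p for p in population if p[0] == g]
    denial_rate[g] = sum(deny(p[1]) for p in members) / len(members)

print(denial_rate)  # group B is denied roughly 9x as often as group A
```

Because the proxy encodes most of the information in the protected attribute, excluding the attribute itself changes almost nothing, which is the core of the "neutral inputs do not produce neutral outputs" argument.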
Safe harbors. The proposal would provide safe harbors to the very technologies at issue in HUD’s own action against Facebook, Chopra noted, a complaint that details the many ways platforms can discriminate by design. He believes the proposed safe harbors would make it too difficult for victims to bring claims and would give defendants multiple ways to justify using algorithms with a discriminatory effect to achieve "legitimate objectives." Safe harbors do not work, he said, if regulators have no visibility into the practice at issue. Algorithms are not only nonpublic, Chopra added; many companies treat them as proprietary trade secrets. "Victims of discriminatory algorithms seldom if ever know they have been victimized."
Outsourcing liability. Shielding firms that purchase algorithms from third-party vendors creates a major loophole in civil rights protections by encouraging both lenders and vendors to ignore discriminatory outcomes, Chopra contends. If landlords are shielded from liability, he says, they have little incentive to vet vendors carefully. The result of this dynamic is predictable, he states: vendors will not compete on algorithmic fairness, leading to a race to the bottom. Chopra is concerned that where there are discriminatory effects, companies and individuals would escape liability by blaming vendor algorithms, while vendors would claim they were not the ones who made the final decision.
Banking and Finance Law Daily: Breaking legal news at your fingertips