Remarks of Director Rohit Chopra at a Joint DOJ, CFPB, and OCC Press Conference on the Trustmark National Bank Enforcement Action
Good morning. It’s a pleasure to join Attorney General Merrick Garland, Assistant Attorney General Kristen Clarke, and Acting Comptroller Michael Hsu.
Owning a home or a small business has long been a vehicle for creating wealth and neighborhood stability in our country. Many of our nation’s past policies, like redlining, excluded too many families and too many neighborhoods from those opportunities. The impact of those policies continues to affect us today.
One 2018 study found the median value of a home in redlined areas is $276,000, nearly $50,000 less than the $325,000 median value of homes in the surrounding areas. The gap demonstrates how the practice – outlawed in 1968 – has had a lasting impact on neighborhoods and the people who live there.
Disparate homeownership rates are a major cause of wealth disparity in this country, with 74% homeownership rates in white communities but only 44% in Black communities.
In terms of how this plays out today – the median wealth of a Black family is $24,000, compared to $188,000 for a white family.
If we allow racist and discriminatory policies to persist, we will not live up to our country’s ideals. We need a fair housing market that is free from old forms of redlining, as well as new digital and algorithmic redlining.
Today’s action against Trustmark involves a form of redlining we have seen for decades: a lender discriminating against prospective applicants and discouraging them from seeking credit in certain neighborhoods on the basis of race. That is unacceptable. The people of Memphis deserve better. The CFPB will continue to do its part to end this form of redlining, but it is an endemic problem that requires all hands on deck, with enforcement from across the federal government, as well as our state, local, and tribal partners.
Trustmark’s conduct was egregious, but at the CFPB, we will also be closely watching for digital redlining, disguised through so-called neutral algorithms, that may reinforce the biases that have long existed.
Technology companies and financial institutions are amassing massive amounts of data and using it to make more and more decisions about our lives, including loan underwriting and advertising.
While machines crunching numbers might seem capable of taking human bias out of the equation, that’s not what is happening.
Findings from academic studies and news reporting raise serious questions about algorithmic bias. For example, a statistical analysis of 2 million mortgage applications found that Black families were 80% more likely than white families with similar financial and credit backgrounds to be denied by an algorithm. Mortgage companies have responded that researchers do not have all the data that feeds into their algorithms, or full knowledge of the algorithms themselves. But that defense illuminates the problem: the algorithms are black boxes behind brick walls. When consumers and regulators do not know how algorithms make decisions, consumers cannot participate in a fair and competitive market free from bias. Algorithms could help remove bias, but black box underwriting algorithms are not creating a level playing field; they only exacerbate the biases fed into them.
Given what we have seen in other contexts, the speed with which banks and lenders are turning lending and advertising decisions over to algorithms is concerning. Too many families were victimized by the robo-signing scandals from the last crisis, and we must not allow robo-discrimination to proliferate in a new crisis.
We should never assume that algorithms will be free of bias. If we want to move toward a society where each of us has equal opportunities, we need to investigate whether discriminatory black box models are undermining that goal.
I am pleased that the CFPB will continue to contribute to the all-of-government mission to root out all forms of redlining, including algorithmic redlining.
Thank you.