ALL DATA IS NOT CREDIT DATA: CLOSING THE GAP BETWEEN THE FAIR HOUSING ACT AND ALGORITHMIC DECISIONMAKING IN THE LENDING INDUSTRY

In June 2015, the Supreme Court decided Texas Department of Housing and Community Affairs v. Inclusive Communities Project, Inc. and held that disparate impact claims are cognizable under the Fair Housing Act. Four years later, in August 2019, the Department of Housing and Urban Development (HUD) published a proposed rule purporting to align the agency’s regulations with the Supreme Court’s interpretation of the Fair Housing Act in Inclusive Communities. The proposed rule, however, is inconsistent with Inclusive Communities and, in practice, effectively allows lenders to circumvent liability for algorithm-based disparate impact. This Note argues that these consequences are the result of a gap in statutory accountability within the Fair Housing Act for algorithm-based discrimination. It then calls for more permanent solutions to this problem, solutions that would prevent HUD, or any other agency under any administration, from interpreting the Fair Housing Act in a manner that contravenes the statute’s history and purpose.

Introduction

When Rachelle Faroul, a thirty-three-year-old Black woman, applied for a loan from Philadelphia Mortgage Advisers in April 2016, she did not foresee any problems. 1 Aaron Glantz & Emmanuel Martinez, For People of Color, Banks Are Shutting the Door to Homeownership, Reveal (Feb. 15, 2018), https://www.revealnews.org/article/for-people-of-color-banks-are-shutting-the-door-to-homeownership [https://perma.cc/GTQ3-N8RY] [hereinafter Glantz & Martinez, Shutting the Door]. She was a Northwestern University graduate, had good credit and a decent amount of savings, and at the time was making approximately $60,000 a year as a computer programming instructor at Rutgers University. 2 Id. But Philadelphia Mortgage Advisers denied her initial loan application on the ground that her contract income was inconsistent. 3 Id. Rachelle persisted: She took a full-time job at the University of Pennsylvania and applied for a loan again, this time with Santander Bank. 4 Id. Santander Bank also denied her application. 5 Id. By that point, Rachelle had been trying to get a mortgage loan for over a year, and the hard inquiries from the various lenders she had approached had lowered her credit score. 6 Id. It was not until Rachelle’s partner, who is half-white and half-Japanese and was then working part-time at a grocery store, agreed to sign on to the loan application that Rachelle was finally approved. 7 Id. While Rachelle spent over a year attempting to get her loan approved, Jonathan Jacobs, another loan applicant, was approved for his loan from TD Bank soon after filling out the paperwork, which took him all of about fifteen minutes. 8 Aaron Glantz & Emmanuel Martinez, Gentrification Became Low-Income Lending Law’s Unintended Consequence, Reveal (Feb. 16, 2018), https://www.revealnews.org/article/gentrification-became-low-income-lending-laws-unintended-consequence [https://perma.cc/BC93-GG5C]. He had almost no savings, a modest income, and a less-than-stellar credit report. 9 Id. But Jonathan is white. 10 Id.

Such stark racial disparities are not unique to Rachelle’s and Jonathan’s stories or to Philadelphia. 11 Philadelphia, however, is notorious for having one of the widest lending disparities among the largest U.S. cities. See Emmanuel Martinez & Aaron Glantz, How Reveal Identified Lending Disparities in Federal Mortgage Data 2 (2018), https://s3-us-west-2.amazonaws.com/revealnews.org/uploads/lending_disparities_whitepaper_180214.pdf [https://perma.cc/PY5C-4E8H] [hereinafter Martinez & Glantz, How Reveal Identified Lending Disparities] (explaining that the authors chose to focus investigation on Philadelphia because “among the largest metro areas, it has one of the widest lending disparities”). In 2018, Reveal from the Center for Investigative Reporting conducted a study of thirty-one million mortgage records covering nearly every time an American sought a conventional mortgage loan in 2015 and 2016. 12 Banks Continue to Deny Home Loans to People of Color, Equal Just. Initiative (Feb. 19, 2018), https://eji.org/news/banks-deny-home-loans-to-people-color [https://perma.cc/TK33-P6CT]. The analysis revealed that, even after controlling for several economic and social factors, Black applicants were almost three times more likely than white applicants to be denied a conventional home purchase loan. 13 Martinez & Glantz, How Reveal Identified Lending Disparities, supra note 11, at 2. Reveal also reported that lenders acknowledged the disparate impact of lending industry practices on people of color but claimed that the racial disparity could be explained by factors the industry has kept hidden from the public, such as credit scores. 14 Glantz & Martinez, Shutting the Door, supra note 1.
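
What “controlling for” other factors means here can be made concrete with a small sketch. The Python snippet below uses synthetic data and invented variable names, not Reveal’s actual dataset or model specification; it simply illustrates how a logistic regression can estimate the association between race and denial while holding other applicant characteristics constant.

    # A toy illustration of "controlling for" covariates; all data synthetic.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 10_000
    black = rng.integers(0, 2, n)       # 1 = Black applicant (hypothetical)
    income = rng.normal(60, 15, n)      # income in $1,000s (control)
    lti = rng.normal(3.0, 0.5, n)       # loan-to-income ratio (control)
    # Assumed data-generating process: denial depends on the controls AND race.
    true_logit = -2.0 - 0.02 * income + 0.4 * lti + 1.1 * black
    denied = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

    # Regress denial on race plus the controls; the race coefficient then
    # measures the disparity that the controls cannot explain away.
    X = sm.add_constant(np.column_stack([black, income, lti]))
    fit = sm.Logit(denied, X).fit(disp=0)
    print("denial odds ratio, Black vs. white:", np.exp(fit.params[1]))
    # Prints a value near exp(1.1) ~ 3: a roughly threefold disparity in the
    # odds of denial that survives the controls.

A disparity that persists after such controls is exactly the kind of statistical residue that disparate impact doctrine is designed to reach.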

Housing discrimination, however, is not a recent problem; in fact, it has a long, sordid history in the United States. Ongoing housing discrimination throughout the twentieth century ultimately prompted Congress to act by passing the Fair Housing Act of 1968 (FHA), which prohibits discrimination in the sale, rental, and financing of housing on the basis of race, color, religion, sex, national origin, disability status, and familial status. 15 Title VIII of the Civil Rights Act of 1968, Pub. L. No. 90-284, 82 Stat. 81 (codified as amended at 42 U.S.C. § 3601); Fair Housing Act, History (Jan. 27, 2010), https://www.history.com/topics/black-history/fair-housing-act [https://perma.cc/7XDW-DAMY] (last updated Sept. 12, 2018). But outlawing only overtly discriminatory practices would not adequately address the country’s long history of housing discrimination. Consequently, courts and government agencies began applying the disparate impact framework first developed in the employment context in Griggs v. Duke Power Co. to housing discrimination claims. 16 See 401 U.S. 424, 431 (1971) (“The [Civil Rights] Act proscribes not only overt discrimination but also practices that are fair in form, but discriminatory in operation.”); Michael G. Allen, Jamie L. Crook & John P. Relman, Assessing HUD’s Disparate Impact Rule: A Practitioner’s Perspective, 49 Harv. C.R.-C.L. L. Rev. 155, 156 (2014) (“Consistent with . . . legislative intent, courts across the country have applied the disparate impact standard in evaluating claims under the FHA, in recognition that ‘[e]ffect, not motivation, is the touchstone because a thoughtless housing practice can be as unfair to minority rights as a willful scheme.’” (quoting Smith v. Anchor Bldg. Corp., 536 F.2d 231, 233 (8th Cir. 1976))).

While all eleven federal circuit courts to consider the question had recognized disparate impact claims under the FHA, 17 Allen et al., supra note 16, at 156 (“Every circuit to consider the question—eleven in all—has held that the FHA prohibits housing practices that have a disparate impact on a protected group, even in the absence of discriminatory intent.”). it was not until 2015 that the Supreme Court of the United States formally held in Texas Department of Housing and Community Affairs v. Inclusive Communities Project, Inc. that disparate impact claims are cognizable under the FHA. 18 135 S. Ct. 2507, 2525 (2015) (“The Court holds that disparate-impact claims are cognizable under the Fair Housing Act upon considering its results-oriented language, the Court’s interpretation of similar language in Title VII [of the Civil Rights Act] and the [Age Discrimination in Employment Act], Congress’ ratification of disparate-impact claims in 1988 . . . [,] and the statutory purpose.”). The holding was largely perceived as a win for fair housing advocates because it acknowledged liability for unintentional or covert discrimination under the FHA. 19 See Maxwell Tani, The Supreme Court Just Granted a Huge Victory to Fair-Housing Advocates, Bus. Insider (June 25, 2015), https://www.businessinsider.com/supreme-court-opinion-in-fair-housing-case-2015-6 [https://perma.cc/YK35-ZL6X].

Recent technological advancements, however, have raised questions about the FHA’s reach. Once thought only a remote possibility, artificial intelligence is now ubiquitous. Among other things, artificial intelligence is used today to estimate a defendant’s likelihood of committing a future crime, 20 Andrea Nishi, Note, Privatizing Sentencing: A Delegation Framework for Recidivism Risk Assessment, 119 Colum. L. Rev. 1671, 1672 (2019) (“Artificial intelligence already plays a significant role in judicial decisionmaking through the widespread use of recidivism risk assessment algorithms in state criminal justice systems. Today, at least twenty states use risk assessment algorithms to predict recidivism in their bail, parole, and sentencing proceedings . . . .”); Derek Thompson, Should We Be Afraid of AI in the Criminal-Justice System?, Atlantic (June 20, 2019), https://www.theatlantic.com/ideas/archive/2019/06/should-we-be-afraid-of-ai-in-the-criminal-justice-system/592084 [https://perma.cc/R65T-RKPT] (describing the experience of a public defense attorney learning that her client “had been deemed a ‘high risk’ for criminal activity” due to a “criminal-sentencing [artificial intelligence]” model). predict what content you want to see on Netflix and on your Facebook Feed, 21 Anne Sraders, What Is Artificial Intelligence? Examples and News in 2019, TheStreet (Jan. 3, 2019), https://www.thestreet.com/technology/what-is-artificial-intelligence-14822076 [https://perma.cc/Z6F7-8DJS]. perform robot-assisted surgery, 22 Robotic Surgery, Mayo Clinic (Feb. 4, 2020), https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac-20394974 [https://perma.cc/XSG6-67UU]. power the spam filter in your inbox, 23 Daniel Faggella, Everyday Examples of Artificial Intelligence and Machine Learning, Emerj, https://emerj.com/ai-sector-overviews/everyday-examples-of-ai [https://perma.cc/5NZG-VRQS] (last updated Apr. 11, 2020). deposit checks through your bank’s smartphone app, 24 Id. and even place students in schools. 25 Dominique Harrison, Civil Rights Violations in the Face of Technological Change, Aspen Inst. (Apr. 22, 2019), https://www.aspeninstitute.org/blog-posts/civil-rights-violations-in-the-face-of-technological-change [https://perma.cc/7BYT-MQGA] (last updated Oct. 22, 2020). The pervasiveness of artificial intelligence is also reshaping the housing market. Fifty years ago, when the Fair Housing Act of 1968 was passed, Congress could not have imagined how technological advances would impact the housing market and the ability of certain groups of people to access it. Today, artificial intelligence technology and big data play an important role in housing access, as landlords and lenders increasingly rely on predictive analytics to evaluate applicants. 26 See, e.g., Fannie Mae, How Will Artificial Intelligence Shape Mortgage Lending? 10 (2018), https://www.fanniemae.com/resources/file/research/mlss/pdf/mlss-artificial-intelligence-100418.pdf [https://perma.cc/9LWX-SWLX] (“[I]n two years almost three-fifths of lenders expect to have adopted some AI/ML applications.”); Douglas Merrill, AI Is Coming to Take Your Mortgage Woes Away, Forbes (Apr. 4, 2019), https://www.forbes.com/sites/douglasmerrill/2019/04/04/ai-is-coming-to-take-your-mortgage-woes-away/#1e38737c7567 [https://perma.cc/L5CV-CMXJ] (highlighting survey results that showed mortgage lenders anticipated drastically increasing their use of artificial intelligence in their businesses over the next two years); Naborly, https://naborly.com [https://perma.cc/E2AX-2BSP] (last visited Aug. 27, 2020) (“Our Applied Artificial Intelligence system learns from and leverages the experience gained from screening thousands of rental applicants and their tenancy outcomes. This helps Naborly’s analysts and customers see patterns of risk that could only . . . be detected by our AI.”); Artificial Intelligence Screening, RealPage, https://www.realpage.com/apartment-marketing/ai-screening [https://perma.cc/39SY-HWHS] (last visited Aug. 27, 2020) (“Using this new leading-edge, AI-based screening algorithm along with behavioral data, RealPage AI Screening precisely analyzes your applicant pool from our proprietary rental history database of over 30M records.”); see also Conn. Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, 369 F. Supp. 3d 362, 367, 371–75 (D. Conn. 2019) (reasoning that because consumer reporting agencies that utilize algorithms to screen tenants can be utilized by housing providers as “an intermediary to take discriminatory and prohibited actions,” such agencies must comply with the FHA when providing tenant screening services to landlords). Here, CoreLogic’s tenant screening product, CrimSAFE, disqualified a disabled man with no criminal convictions from moving in with his mother on the basis of “unspecified criminal records.” Id. at 367–68. More specifically, the lending industry has increasingly relied on big data and algorithmic decisionmaking to evaluate the creditworthiness of consumers. 27 Mikella Hurley & Julius Adebayo, Credit Scoring in the Era of Big Data, 18 Yale J.L. & Tech. 148, 151 (2016) (“Since 2008, lenders have . . . intensified their use of big-data profiling techniques. With increased use of [technology], every consumer leaves behind a digital trail of data that . . . lenders and credit scorers . . . are eagerly scooping up and analyzing as a means to better predict consumer behavior.”). While the use of such predictive techniques by lenders may mitigate credit risk in consumer lending, it is not without its perils. The accuracy of an algorithmic model is only as good as the data inputs used to train it, 28 See What Is Training Data?, Appen (Apr. 14, 2020), https://appen.com/blog/training-data [https://perma.cc/7RAE-BDT7] (“[T]he quality and quantity of your training data has as much to do with the success of your data project as the algorithms themselves.”). and inputs that encode historical patterns of discrimination or a programmer’s implicit biases can produce a discriminatory algorithm that results in unfair lending practices.
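
The mechanism is straightforward to demonstrate in miniature. The sketch below, again in Python with synthetic data and hypothetical variable names, shows how a model trained on historical denial decisions can reproduce a racial disparity even though race is never supplied as an input: a ZIP-code indicator, correlated with race through residential segregation, does the work instead. It illustrates the general phenomenon, not any actual lender’s system.

    # A minimal sketch of proxy discrimination; all data and names invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 20_000
    black = rng.integers(0, 2, n)
    # Residential segregation makes ZIP code correlate with race.
    redlined_zip = rng.binomial(1, np.where(black == 1, 0.8, 0.2))
    income = rng.normal(60, 15, n)
    # Training labels: past denials that tracked the ZIP proxy, so historical
    # bias is baked into the data the model learns from.
    past_logit = -1.5 + 1.2 * redlined_zip - 0.02 * (income - 60)
    past_denied = rng.binomial(1, 1 / (1 + np.exp(-past_logit)))

    # Race is deliberately excluded from the feature matrix ...
    X = np.column_stack([redlined_zip, income])
    model = LogisticRegression().fit(X, past_denied)

    # ... yet the model's denial decisions still split along racial lines.
    pred = model.predict(X)
    print("predicted denial rate, Black applicants:", pred[black == 1].mean())
    print("predicted denial rate, white applicants:", pred[black == 0].mean())

A facially neutral model of this kind is precisely why disparate impact, rather than disparate treatment, supplies the doctrinal hook: no one codes race into the system, yet its effects fall along racial lines all the same.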

Federal housing laws in the United States, however, have failed to catch up to technology. The FHA makes no mention of technology generally or artificial intelligence specifically, and it does not address fair lending violations committed by way of predictive analytics, despite the widespread use of proprietary or third-party algorithmic models in many credit-scoring systems. A recently proposed rule (Proposed Rule) from HUD has exposed this gap in the law. 29 See HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42,854, 42,854 (proposed Aug. 19, 2019) (to be codified at 24 C.F.R. pt. 100) (“This rule proposes to amend HUD’s interpretation of the Fair Housing Act’s disparate impact standard to better reflect the Supreme Court’s 2015 ruling in Texas Department of Housing and Community Affairs v. Inclusive Communities, Inc. . . . .”). HUD maintains that the Proposed Rule, the first federal regulation to directly address disparate impact and algorithms, is aimed at aligning HUD’s regulations with the Court’s interpretation of the FHA in Inclusive Communities. 30 Press Release, HUD, HUD Proposes Revised ‘Disparate Impact’ Rule (Aug. 16, 2019), https://www.hud.gov/press/press_releases_media_advisories/HUD_No_19_122 [https://perma.cc/5S5J-AX28] (“[HUD] today published a proposed rule to amend the HUD interpretation of the Fair Housing Act’s disparate impact standard. The proposed rule as amended would provide more appropriate guidance on what constitutes unlawful disparate impact to better reflect the Supreme Court’s 2015 ruling in [Inclusive Communities].”). In practice, however, the Proposed Rule would allow lenders to circumvent liability for algorithmic discrimination that violates the FHA and other fair lending laws by substantially raising the burden of proof for parties claiming discrimination and by creating seemingly insurmountable defenses for lenders accused of algorithmic disparate impact discrimination.

This Note focuses on the gap in statutory accountability within the FHA for disparate impact discrimination arising from algorithmic decisionmaking in the lending industry. Part I provides a historical overview of disparate impact in credit scoring, including lenders’ present use of nontraditional data and artificial intelligence, highlighting the importance of the disparate impact doctrine as a tool to combat housing discrimination. Part II offers a legal overview of the FHA pre- and post-Inclusive Communities and grounds the need for disparate impact theory as a recourse for algorithm-based discrimination within the broader context of disparate impact litigation generally. This Part then assesses how HUD’s Proposed Rule contravenes that history of disparate impact litigation and frustrates the purpose of the FHA. Part III offers suggestions for closing this gap in the FHA and addresses potential counterarguments to the proposed solutions.