By: Elizabeth Carroll

The Expansion of the FCRA

On September 15, 2023, the Consumer Financial Protection Bureau (CFPB) issued an outline of proposals that would bring data brokers under the umbrella of the Fair Credit Reporting Act (FCRA).[1]  The move comes on the heels of a government-wide effort to regulate the burgeoning use of artificial intelligence (AI) in the financial sector.[2]

Currently, data brokers acquire consumer information and sell that data to third parties.[3]  These third parties use the data to feed AI algorithms for predictive decision-making in, inter alia, credit extensions, employment determinations, and housing applications.[4]  The CFPB has issued numerous publications on the risks that the use of AI poses to consumers, such as bias and illegal discrimination in financial decision-making.[5]

The FCRA protects consumer information by imposing a “permissible purposes” requirement on third-party companies that use information gathered from consumer reports collected by “reporting agencies.”[6]  The information found in these reports (e.g., consumer payment history, income, and criminal records) typically informs the AI algorithms used in the financial sector.[7]  The CFPB’s proposal would define consumer information sold to a third party that uses it for a permissible purpose as a “consumer report,” regardless of whether the data broker knew the third party would use it for that purpose.[8]  All parties handling these reports would then be subject to certain FCRA requirements, including those that ensure the accuracy and confidentiality of the data, those that regulate the handling of disputes over inaccurate information, and those that prohibit the misuse of data.[9]  This proposal responds to CFPB research finding that “[f]amilies living in majority Black and Hispanic neighborhoods are far more likely to have disputes of inaccurate information appear on their credit reports.”[10]

Additionally, the CFPB plans to clarify whether, and to what extent, credit header data constitutes a consumer report under the FCRA definition.[11]  Credit header data includes consumer-identifying data maintained by consumer reporting agencies (e.g., name, Social Security number, current and former addresses, and phone numbers).[12]  By defining credit header data as a consumer report under the FCRA, the rule would limit the ability of those handling the data to sell or disclose it without a permissible purpose.[13]

The imposition of a “permissible purpose” requirement and the expansion of the FCRA’s definition of a consumer report would benefit vulnerable consumers.  Research has found that the sharing of data from consumer reports disproportionately puts the elderly, low-income families, people of color, and military families at risk of privacy, economic, and reputational harms, and exacerbates existing inequalities.[14]  For example, the extension of the FCRA would benefit domestic violence survivors by regulating and reducing the impermissible disclosure of sensitive contact information.[15]

A Law in Practice

Guidance for the expansion of the FCRA can be drawn from New York City’s (NYC) Automated Employment Decision Tools (AEDT) law, which is already in effect.[16]  NYC enacted the AEDT law in response to concerns over discrimination and possible disparate impacts from the use of automated tools in the hiring process.[17]  For example, the Equal Employment Opportunity Commission recently issued guidance warning that the use of algorithmic screening tools could violate the Americans with Disabilities Act (ADA).[18]

One permissible purpose for the use of consumer report data under the FCRA is “employment purposes.”[19]  Under the AEDT law, an employment agency that uses a computer-based tool employing AI to assist with employment decisions must ensure that the tool undergoes a bias audit, post a summary of the audit results on its website, provide notice about the type and source of data the tool uses, and notify job candidates that the tool will be used.[20]  The AEDT requirements closely mirror the FCRA’s transparency, accuracy, and handling requirements.[21]  However, the AEDT law does not require any action if the bias audit indicates a disparate impact.[22]  The extension of the FCRA would take this idea further, subjecting those covered by the new requirements to enforcement by the CFPB, the Federal Trade Commission, and the Department of Transportation.[23]  The CFPB will meet with other federal agencies and stakeholders to discuss the proposals.[24]  Data brokers and consumers alike should expect big changes in the coming months.

[1] See Remarks of CFPB Director Rohit Chopra at White House Roundtable on Protecting Americans from Harmful Data Broker Practices, Consumer Fin. Prot. Bureau (Aug. 15, 2023); see generally Fair Credit Reporting Act, 15 U.S.C. § 1681.

[2] See Consumer Fin. Prot. Bureau, supra note 1.

[3] CFPB Announces Plans to Extend FCRA to Data Brokers Through Rulemaking, Cooley LLP (Aug. 23, 2023).

[4] See id.  

[5] See CFPB Issues Guidance on Credit Denials by Lenders Using Artificial Intelligence, Consumer Fin. Prot. Bureau (Sept. 19, 2023) (stating that AI is expanding the data used for lending decisions, increasing the need to protect consumers against illegal discrimination).

[6] Mercedes Kelley Tunstall, CFPB Previews Upcoming Proposed Rules Under the FCRA That Will Address Artificial Intelligence, Nat’l L. Rev. (Aug. 26, 2023).

[7] See Consumer Fin. Prot. Bureau, supra note 1.

[8] See Caroline Kraczon, The CFPB Moves Ahead with FCRA Rulemaking to Rein in Data Brokers, Elec. Priv. Info. Ctr. (Sept. 28, 2023); see also Small Business Advisory Review Panel for Consumer Reporting Rulemaking, Consumer Fin. Prot. Bureau (Sept. 15, 2023).

[9] 15 U.S.C. § 1681c.

[10] See CFPB Finds Credit Report Disputes Far More Common in Majority Black and Hispanic Neighborhoods, Consumer Fin. Prot. Bureau (Nov. 2, 2023).

[11] Cooley LLP, supra note 3.

[12] See Consumer Fin. Prot. Bureau, supra note 6.

[13] 15 U.S.C. § 1681c.

[14] Emma Woollacott, Watchdog Calls For Crackdown On Data Brokers, Forbes (Aug. 16, 2023).

[15] See Consumer Fin. Prot. Bureau, supra note 1.

[16] N.Y.C. Admin. Code § 20-871 (2021).

[17] See id.

[18] Sara Geoghegan, EEOC and DOJ Put Employers on Notice of Algorithmic Discrimination Risks, Elec. Priv. Info. Ctr. (June 3, 2022).

[19] See 15 U.S.C. § 1681b(a)(3)(B) (defining the use of information for “employment purposes” as a permissible use).

[20] N.Y.C. Admin. Code § 20-871 (2021).

[21] See 15 U.S.C. §§ 1681e, 1681g.

[22] N.Y.C. Admin. Code § 20-871 (2021).

[23] See Consumer Fin. Prot. Bureau, supra note 1.

[24] See Consumer Fin. Prot. Bureau, supra note 6.