On July 25, the House Financial Services Committee’s Task Force on Financial Technology held a hearing entitled “Examining the Use of Alternative Data in Underwriting and Credit Scoring to Expand Access to Credit.” As noted in the committee’s hearing memorandum, credit reporting agencies (CRAs) have started using alternative data to make lending decisions and determine credit scores in order to expand consumer access to credit. The memorandum points to some commonly used alternative data factors, including (i) utility bill payments; (ii) online behavioral data, such as shopping habits; (iii) educational or occupational attainment; and (iv) social network connections. The memorandum notes that while there are potential benefits to using this data, “its use in financial services can also pose risks to protected classes and consumer data privacy.” The committee also presented two draft bills from its members addressing relevant issues: a draft bill from Representative Green (D-TX) that would establish a process for providing additional credit rating information in mortgage lending through a five-year pilot program with the FHA, and a draft bill from Representative Gottheimer (D-NJ) that would amend the FCRA to authorize telecom, utility, or residential lease companies to furnish payment information to CRAs.

During the hearing, a range of witnesses commented on financial institutions’ concerns about using alternative data in credit decisions without clear, coordinated guidance from federal financial regulators. Witnesses also discussed concerns that the use of alternative data could produce outcomes resulting in disparate impacts or violations of fair lending laws, noting that there should be high standards for validating credit models in order to prevent discrimination arising from facially neutral algorithms. One witness argued that while concerns that alternative data and “algorithmic decisioning” can replicate human bias are well founded, the artificial intelligence model their company created “doesn’t result in unlawful disparate impact against protected classes of consumers,” and noted that the traditional use of a consumer’s FICO score is “extremely limited in its ability to predict credit performance because it’s narrow in scope and inherently backward looking.” The key to controlling algorithmic decision making is transparency, another witness argued, stating that if the machine is deciding which credit factors are more important, the lender has “got to be able to put it on a piece of paper and explain to the consumer what was more important,” as legally required for “transparency in lending.”