Hot Topics in Artificial Intelligence, Machine Learning, and Alternative Data in Financial Services
Association of Corporate Counsel, June 22, 2022
Andrew E. Bigart Partner
202.344.4323, [email protected]
Jonathan L. Pompan Partner
202.344.4383, [email protected]
Disclaimer
This presentation is for general informational purposes only and does not represent and is not intended to provide legal advice or opinion and should not be relied on as such. Legal advice can
be provided only in response to specific fact situations. This presentation does not represent any undertaking to keep recipients advised as to all or any
relevant legal developments. ATTORNEY ADVERTISING. Prior results do not guarantee a similar outcome.
2
Today's Discussion
CFPB and Regulator Leadership Set the Agenda
What is AI, Machine Learning, and Alternative Data?
Key Areas of Regulatory Risk
Fair Lending
Marketing / Lead Generation
Anti-money Laundering
Best Practices to Mitigate Risk
3
Setting the Stage: CFPB and Regulator Leadership Focus on Fair Lending and Discrimination in Financial Services
CFPB and Bank Regulators
CFPB
New Director Rohit Chopra and staffing changes at the Bureau
Big announcements: taking on tech, taking on the FDIC, promoting competition, and more
Guidance to staff re: engaging with former employees, and heightened scrutiny
Bank Regulators (FRB, OCC, FDIC, NCUA)
Focus on fintech integrations
Third-party risk management
Fair lending/redlining initiatives
5
FTC, State Attorneys General, & State Regulators
FTC
New Chair Lina Khan
June 16, 2022: Combatting Online Harms Through Innovation Report to Congress
Resurrection of its penalty offense authority? (education/student loans, endorsements and testimonials)
A focus on the growing role of private equity and other investors
State AGs
Debt collection practices
Installment lending
Buy now/pay later and lease-to-own
Privacy and data security
California Department of Financial Protection and Innovation (DFPI): UDAAP authority, active investigations, registration, and more; a "mini-CFPB"
NY Department of Financial Services
6
CFPB and Bank Regulators
In the course of examining banks' and other companies' compliance with consumer protection rules, the CFPB will scrutinize discriminatory conduct that violates the prohibition against UDAAP. The CFPB will closely examine financial institutions' decision-making to ensure that companies are appropriately testing for and eliminating illegal discrimination.
"When a person is denied access to a bank account because of their religion or race, this is unambiguously unfair," said CFPB Director Rohit Chopra. "We will be expanding our anti-discrimination efforts to combat discriminatory practices across the board in consumer finance."
7
AI and ML at the CFPB
8
What is AI, Machine Learning, and Alternative Data?
Introduction to AI and Machine Learning
The mainstream use of artificial intelligence and machine learning in financial services is helping businesses extract actionable insights from large and complex datasets and deliver services.
It comes in the form of deep learning technologies, autonomous processes, or smart robots. Artificial intelligence is making its presence felt everywhere in the connected world:
Chatbots, virtual assistants, and business intelligence bots
Targeted online advertising
Predictive analytics
Voice recognition
Pattern recognition
10
Artificial Intelligence and Machine Learning
Financial institutions are exploring how best to leverage developments in AI and ML for credit decisioning and AML efforts.
AI is a term used to describe various technologies and systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
Machine learning (ML) refers to algorithms that improve their performance through the analysis of patterns in data, rather than through explicit, hand-written rules.
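To make the definitions above concrete, the sketch below (Python with the open-source scikit-learn library) fits a simple model to entirely synthetic applicant data; the features, coefficients, and outcomes are invented for illustration and do not reflect any actual scoring methodology discussed in this presentation.

```python
# Minimal sketch: a model that "learns" a default-risk pattern from synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features: income (in $000s) and utility payments missed last year.
income = rng.normal(60, 20, n)
missed_payments = rng.poisson(1.0, n)

# Synthetic "ground truth": risk rises with missed payments, falls with income.
logit = -1.0 + 0.8 * missed_payments - 0.03 * income
default = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([income, missed_payments])
X_train, X_test, y_train, y_test = train_test_split(X, default, random_state=0)

# The model's performance on held-out data improves as it sees more examples,
# which is the sense in which it "learns" rather than follows hand-written rules.
model = LogisticRegression().fit(X_train, y_train)
print("Held-out accuracy:", round(model.score(X_test, y_test), 3))
```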
11
Alternative Data
Alternative data means information not typically found in the consumer's credit files of the nationwide consumer reporting agencies or customarily provided by consumers as part of applications for credit.
Examples of alternative data include information derived from a customer's social media activities, mobile device data, website data, and online browsing activity.
Financial services industry increasingly using alternative sources of data to streamline and improve credit underwriting.
Prudential banking regulators recognize that alternative data may be used in connection with fraud detection, credit underwriting, and account servicing and management.
12
Alternative Data (cont.)
U.S. banking regulators have recognized that the use of alternative data "may improve the speed and accuracy of credit decisions and may help firms evaluate the creditworthiness of consumers who currently may not obtain credit in the mainstream credit system . . . . These innovations reflect the continuing evolution of automated underwriting and credit score modeling, offering the potential to lower the cost of credit and increase access to credit." https://files.consumerfinance.gov/f/documents/cfpb_interagencystatement_alternative-data.pdf
At the same time, however, the banking regulators have cautioned financial institutions to engage in "responsible use of such data," and to ensure that any such use complies with applicable consumer protection laws and regulations.
13
Fair Lending Considerations
Fair Lending Overview
The use of data in credit decisioning is governed by federal fair and equal lending laws, specifically the Fair Credit Reporting Act ("FCRA") and the Equal Credit Opportunity Act ("ECOA"), and their implementing regulations.
These laws are enforced by the CFPB, the FTC, and through private litigation. In addition, both the CFPB and FTC have general authority to police unfair and deceptive acts and practices (UDAAP/UDAP), and have used that authority when creditors failed to provide accurate information about the data used to make credit decisions, the terms of credit, the factors that may cause changes in rates and repayment, and other credit-related activity that was not disclosed to borrowers at the time of applying for and accepting credit.
This general regulatory background sets the framework through which the regulators and Congress analyze new and alternative sources and types of data and underwriting practices in the marketplace.
15
Fair Lending Overview
The Equal Credit Opportunity Act (ECOA) applies to all creditors and those who, in the ordinary course of business, regularly refer prospective applicants to creditors. Implemented by Regulation B.
Illegal to discriminate against applicant regarding any aspect of a credit transaction:
On the basis of race, color, religion, national origin, sex or marital status, or age (if applicant has capacity to contract)
Because all or part of the applicant's income derives from any public assistance program
Because the applicant has in good faith exercised any right under the Consumer Credit Protection Act
The CFPB has ECOA rulemaking authority and supervises for and enforces compliance. FTC also has enforcement authority.
16
Fair Lending Overview, cont.
Reg. B covers creditor activities before, during, and after the extension of credit.
Information requirements; investigation procedures; standards of creditworthiness; terms of credit; furnishing information about credit; revocation, alteration, or termination of credit; collection procedures.
Reg. B prohibited practices (12 C.F.R. 1002.4):
Discriminating against applicants on a prohibited basis regarding any aspect of a credit transaction.
Making oral/written statements, in advertising or otherwise, to applicants or prospective applicants that would discourage, on a prohibited basis, a reasonable person from making or pursuing an application.
17
Disparate Impact & Disparate Treatment
Two theories of ECOA/Reg. B liability: disparate impact & disparate treatment. Disparate treatment occurs when a creditor treats an applicant differently based on a prohibited basis.
Can be overt/open or be found by comparing treatment of applicants who received different treatment for no discernable reason other than a prohibited basis.
Disparate impact occurs when a creditor employs facially neutral policies or practices that have an adverse effect or impact on a member of a protected class, unless they meet a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact.
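One rough screen sometimes used to spot potential disparate impact is the "four-fifths" adverse impact ratio borrowed from employment analysis; the sketch below uses made-up approval counts and an illustrative threshold, and is not a legal test under ECOA/Reg. B.

```python
# Illustration only: adverse impact ratio for credit approvals (made-up counts).
def approval_rate(approved: int, applications: int) -> float:
    return approved / applications

# Hypothetical outcomes for a control group and a protected-class group.
control_rate = approval_rate(approved=720, applications=1000)    # 72%
protected_rate = approval_rate(approved=450, applications=800)   # ~56%

adverse_impact_ratio = protected_rate / control_rate
print(f"Adverse impact ratio: {adverse_impact_ratio:.2f}")

# A ratio below 0.80 is often treated as a flag for further fair lending review,
# not as proof of an ECOA/Reg. B violation.
if adverse_impact_ratio < 0.80:
    print("Flag for further fair lending review (four-fifths rule of thumb).")
```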
18
Fair Lending Regulatory Concerns
Discrimination and Digital Redlining. The most common concern voiced across government and civil society entities that focus on alternative credit data and AI-driven lending is that such programs will perpetuate existing structural roadblocks to credit for marginalized groups and result in discriminatory outcomes (either intentional or unintentional) that may violate fair lending laws. The overriding concern is that entities will not properly use, calibrate, monitor, or adjust the data sources or algorithms used in new lending platforms to properly control for statistical discrimination (especially discrimination causing a disparate impact on marginalized groups, even through the use of a facially neutral system).
Unfair Data Inclusion. FCRA prohibits certain data elements from being included in consumer reports, and also places requirements on CRAs to ensure a certain level of accuracy in the data used to create consumer reports. Several commenters and government agencies raised concerns that alternative data may be collected and used outside of FCRA's requirements. For example, using social media data (e.g., friend groups and educational institutions) as a proxy for creditworthiness was seen as less reliable and fair compared to the inclusion of rent or utility payments in a credit decision.
Improper Additional Data Uses. A variety of sources noted the potential misuse of alternative credit data outside of the underwriting context. Because alternative credit data can include a range of data points, including Internet activity, social media, and other data typically associated with digital advertising and other uses of such information, some entities are worried that consumers may grant access to information to receive credit but that the same information will be repurposed for other uses without the consumer's knowledge.
Deceptive Terms and Conditions. The sources we reviewed noted that, because consumers may not be aware of the types of data or activity that have a bearing on their credit decisions when a company uses alternative data and AI to make such decisions, they may be unable to address or challenge adverse terms or changes in a credit offer.
19
CFPB Long-Term Actions Artificial Intelligence
"Although use of AI holds the potential to expand credit access to underserved consumers, use of such technologies may also hold risks, including risks of unlawful discrimination and lack of transparency"
"The Bureau recognizes the importance of continuing to monitor the use of AI and is evaluating whether rulemaking, a policy statement, or other Bureau action may become appropriate"
2017: Request for Information Regarding Use of Alternative Data and Modeling Techniques in the Credit Process
2018: Calls for Evidence
2020: Adverse Action Tech Sprint
2020: Request for Information on the Equal Credit Opportunity Act and Regulation B
20
Where Is the CFPB Headed with Its Focus on Tech and Payment Systems?
Focus on payments systems, data harvesting, consumer choice/access restrictions, and more (e.g., EFTA, GLBA, etc.)
Denial of Credit / Access: The CFPB is more closely scrutinizing the use of big data when it is used to deny credit (or access to financial services); it is less likely to object to the use of big data by creditors to reconsider credit applications that would otherwise be denied.
FTC Report, Big Data: A Tool for Inclusion or Exclusion (2016)
FCRA Disparate Impact: One concern with using big data is that it may present fair lending issues if its use causes a disparate impact. When a creditor determines that a big data factor may be leading to a discriminatory impact, the creditor should determine whether: 1) the factor is highly correlated with the discriminatory impact, 2) there is a good basis for continuing to use that factor, and 3) there is a better variable that could be used for the same purpose that does not lead to a discriminatory impact (see the sketch below).
Third-Party Vendor Management Policy, CFPB Bulletin 2012-03 (April 13, 2012): The CFPB will focus on primary providers of financial services and their service providers. If the CFPB believes that service providers are not complying with a consumer financial services law, or are committing a UDAAP violation when interacting with the institution's customers, the CFPB plans to hold both companies accountable. May include exams/investigations.
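As a hypothetical illustration of the first prong of the three-part check above (whether a big data factor is highly correlated with a prohibited basis), the sketch below measures that correlation on synthetic data; the feature, the class labels, and the 0.3 cutoff are all invented for illustration and are not regulatory standards.

```python
# Hypothetical screen: is a candidate underwriting feature strongly correlated
# with membership in a protected class?  Synthetic data; threshold illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Synthetic class membership and a "facially neutral" feature that tracks it plus noise.
protected_class = (rng.random(n) < 0.3).astype(float)
candidate_feature = 0.6 * protected_class + rng.normal(0, 1, n)

corr = np.corrcoef(candidate_feature, protected_class)[0, 1]
print(f"Correlation with protected class: {corr:.2f}")

if abs(corr) > 0.3:  # illustrative cutoff, not a regulatory standard
    print("High correlation: document the business justification and look for a "
          "less discriminatory alternative before relying on this factor.")
```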
21
Combatting Redlining Initiative
Led by Civil Rights Division's Housing and Civil Enforcement Section, partnering with U.S. attorney offices, financial regulatory agencies (incl. CFPB), and state AGs.
Takeaways: Use U.S. attorneys' offices to ensure that fair lending enforcement takes advantage of local expertise on housing markets and credit needs; Extend DOJ's analyses of potential redlining to non-depository institutions that DOJ indicated are originating the majority of mortgage loans; Strengthen DOJ's partnership with financial regulatory agencies to ensure identification and referral of fair lending violations to DOJ; and Increase coordination with state attorneys general on fair lending matters.
All types of loans, and all types of lenders. Director Chopra's comments focused on the use of AI in lending decisions. The CFPB will be "watching for digital redlining," citing what he called "algorithmic bias" and the need for investigation of whether "discriminatory black box models are undermining th[e] goal" of equal opportunity.
22
Combatting Redlining Initiative, Cont.
CFPB has stated repeatedly that racial equity is a priority and fair lending will be key.
DOJ Task Force: New Combatting Redlining Initiative (Oct. 22, 2021)
AG Garland: "We will spare no resource to ensure that federal fair lending laws are vigorously enforced and that financial institutions provide equal opportunity for every American to obtain credit."
The Trustmark National Bank settlement (approved Oct. 27, 2021) was the first under the initiative.
o Consent Order: Create a $3.85 million loan subsidy program for majority-Black and Hispanic neighborhoods in Memphis; open a new lending office in such a neighborhood; $5 million civil penalty
o Broader view of what constitutes redlining.
Several fair lending probes already open, more to come.
"Technology companies and financial institutions are amassing massive
amounts of data and using it to make more and more decisions about our lives,
including loan underwriting and advertising.
While machines crunching numbers might seem capable of taking human bias out of the equation, that's not what
is happening."
Source: Remarks of Director Rohit Chopra at a Joint DOJ, CFPB, and OCC Press Conference on the Trustmark National Bank Enforcement Action (Oct. 22, 2021)
23
CFPB, DOJ, and OCC v. Trustmark National Bank
Complaint alleges that Trustmark violated the Fair Housing Act (FHA), the Equal Credit Opportunity Act (ECOA) and its implementing regulation, Regulation B, and the Consumer Financial Protection Act of 2010 (CFPA).
Allegations
ECOA and Regulation B prohibit creditors from discriminating against applicants and prospective applicants in credit transactions on the basis of characteristics such as race, color, and national origin, including by redlining or engaging in conduct that would discourage on a prohibited basis a prospective applicant from applying for credit.
Avoided locating branches in majority-Black and Hispanic communities
Avoided assigning loan officers to majority-Black and Hispanic communities
Failed to monitor its fair lending compliance
Discouraged applicants and prospective applicants in majority-Black and Hispanic neighborhoods
Enforcement Action
Congress entrusted the Bureau to enforce the CFPA, ECOA, and ECOA's implementing Regulation B. The proposed consent order, if entered by the court, would require Trustmark to:
Invest $3.85 million via a loan subsidy program
Increase physical presence in and outreach to majority-Black and Hispanic neighborhoods
Comply with fair lending requirements
Pay a $5 million penalty to the CFPB, which will credit the $4 million penalty collected by the OCC
24
Lender Innovation: Artificial Intelligence in Underwriting
Using an algorithm, rather than a human, to analyze a variety of factors to more accurately assess credit applicants.
Beware of unwitting discrimination
"Black box" problem: algorithms can't explain a result.
What if algorithm considers a data point that correlates strongly with protected characteristic?
Algorithms could include information that creates biases against certain groups.
AI algorithms can be compatible with ECOA/Reg. B; "a creditor may disclose a reason for a denial even if the relationship of that disclosed factor to predicting creditworthiness may be unclear to the applicant."
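One way a lender might approach adverse action notices for a model-driven denial is to rank each applicant's feature contributions and surface the largest negative drivers as the principal reasons. The sketch below assumes a simple linear score with made-up feature names, weights, and applicant values; production programs typically rely on more rigorous explainability methods plus legal review of the reason codes.

```python
# Simplified, hypothetical adverse-action reason generation for a linear score.
# Contribution of each feature = coefficient * (applicant value - population mean).
import numpy as np

feature_names = ["income", "missed_utility_payments", "months_of_bank_history"]
coefficients = np.array([0.02, -0.90, 0.05])        # made-up model weights
population_means = np.array([60.0, 1.0, 36.0])      # made-up population averages
applicant = np.array([35.0, 4.0, 6.0])              # made-up declined applicant

contributions = coefficients * (applicant - population_means)

# The most negative contributions drove the denial; surface them as principal reasons.
order = np.argsort(contributions)
principal_reasons = [feature_names[i] for i in order if contributions[i] < 0][:2]
print("Principal reasons for adverse action:", principal_reasons)
```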
25
Lender Innovation: Alternative Data in Underwriting
Information not traditionally used by national consumer reporting agencies in calculating a credit score: On-time utility, cable, or mobile phone bill payments; Cash flow data from bank statements; or Data related to consumer behavior on the Internet (e.g., time spent on social media).
Could be even more predictive than traditional data.
Potential to expand credit access to "credit invisibles" and those with low credit scores under the traditional model, who are disproportionately low-income, people of color, women, immigrants, and the elderly.
26
Alternative Data in Underwriting, cont.
How are lenders using it?
Creating proprietary blends of alternative data points to assess creditworthiness and underwrite traditionally risky borrowers
o Lending Club Corp.
o Prosper Marketplace Inc.
o Upstart Network Inc.
"Second Chance" or "Second Look" programs: alternative data considered only when a FICO score either doesn't exist or is not satisfactory to obtain credit
o Sunrise Banks NA
27
Alternative Data in Underwriting, cont.
BUT beware of unwitting discrimination: use of certain data points could yield disparate impacts for protected classes, even if an algorithm is facially neutral
Time spent on social media: younger consumers
Typos and grammatical mistakes: immigrants and non-native English speakers
Zip codes: people of color, immigrants
28
CFPB Acts to Protect the Public from Black-Box Credit Models Using Complex Algorithms
CFPB confirmed that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms.
The CFPB published a Consumer Financial Protection Circular to remind the public, including those responsible for enforcing federal consumer financial protection law, of creditors' adverse action notice requirements under the Equal Credit Opportunity Act (ECOA). Federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors. Creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new.
29
Data Use / Marketing / Lead Generation
Section 1033 Consumer-Authorized Financial Data Sharing and Aggregation
Section 1033 of the Dodd-Frank Wall Street Reform and Consumer Protection Act provides, among other things, that subject to rules prescribed by the CFPB, a consumer financial services provider must make available to a consumer information in the control or possession of the provider concerning the consumer financial product or service that the consumer obtained from the provider.
November 22, 2016: Request for Information
November 18, 2017: Principles Statement
February 26, 2020: Symposium
October 22, 2020: Advance Notice of Proposed Rulemaking issued to solicit comments and information to assist the Bureau in developing regulations to implement section 1033.
February 4, 2021: ANPR comments closed (99 comments received)
July 9, 2021: Executive Order (EO) encourages CFPB to commence rulemaking under section 1033.
Present: Pre-Rule Stage. Final Rule?
31
Section 1071 Business Lending Data (Regulation B)
Section 1071 of the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amended the Equal Credit Opportunity Act (ECOA) to require, subject to rules prescribed by the Bureau, financial institutions to report information concerning credit applications made by women-owned, minority-owned, and small businesses. https://www.consumerfinance.gov/1071-rule/
May 15, 2017: Request for Information
September 15, 2020: SBREFA Outline
December 15, 2020: SBREFA Report
September 1, 2021: NPRM issued
CFPB proposes to require covered financial institutions to collect and report to the Bureau data on applications for credit for small businesses, including those that are owned by women or minorities.
NPRM addresses CFPB's approach to privacy interests and the publication of section 1071 data; shielding certain demographic data from underwriters and other persons; recordkeeping requirements; enforcement provisions; and the proposed rule's effective and compliance dates.
Comments were due by January 6, 2022. Final Rule?
32
Anti-money Laundering
AI and ML for AML Purposes
When it comes to AML, current systems tend to be rule-based and in many cases manual, requiring review of various data points against established requirements and rules (both with respect to identifying customers and for monitoring suspicious activity).
AI and ML can be used to improve and monitor the determinations made by employees engaged in performing various compliance functions, including customer onboarding and transaction monitoring and investigations.
Increasingly, financial institutions are using AI to confirm customer identity by collecting and comparing traditional, publicly available data used for verification (name or address) with data from non-traditional sources, such as an access device's IP address or biometric data.
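To make the contrast between rule-based and AI-assisted monitoring concrete, the sketch below flags synthetic transactions first with a fixed-dollar rule and then with an off-the-shelf anomaly detector (scikit-learn's IsolationForest); the data, threshold, and contamination rate are invented, and this is a toy comparison rather than a compliant BSA/AML monitoring program.

```python
# Illustration: rule-based vs. ML-assisted transaction monitoring on synthetic data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
amounts = rng.lognormal(mean=4.0, sigma=1.0, size=1000)   # synthetic dollar amounts
daily_count = rng.poisson(3, size=1000)                   # synthetic txns/day per customer
X = np.column_stack([amounts, daily_count])

# Rule-based: flag anything over a fixed dollar threshold.
rule_flags = amounts > 10_000

# ML-assisted: flag statistically unusual combinations of amount and frequency.
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
ml_flags = model.predict(X) == -1   # -1 marks anomalies

print("Rule-based alerts:", int(rule_flags.sum()))
print("Anomaly-model alerts:", int(ml_flags.sum()))
print("Alerts only the model caught:", int((ml_flags & ~rule_flags).sum()))
```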
34
Regulatory Developments
The United States has implemented a comprehensive AML framework through the Bank Secrecy Act (BSA) and FinCEN's implementing regulations that requires financial institutions to verify the identity of their customers, perform risk-based due diligence on their customers, and screen customers against U.S. economic sanctions programs.
The use of AI/ML by a financial institution for customer identification, verification, and related purposes appears consistent with this legal framework, provided that the financial institution implements these technologies consistent with regulatory expectations.
As explained by a former Director of FinCEN in 2019, U.S. federal regulators are "committed to working with industry on ways in which technological advances with respect to identity can fit within our current regulatory framework or may lead to changes in our regulations."
35
Regulatory Developments
Private industry and U.S. policy makers have begun to take steps to gain a better understanding of how AI and ML are used, and some of the potential advantages and disadvantages that they present.
In 2018, FinCEN and the banking regulators issued a joint statement recognizing the use of AI by financial institutions in their AML programs and noting AI's potential to improve compliance and efficiency. The joint statement explained that the regulators "welcome these types of innovative approaches to further efforts to protect the financial system against illicit financial activity" and recognize that "these types of innovative approaches can maximize utilization of banks' BSA/AML compliance resources."
In 2019, the U.S. Chamber of Commerce released recommended AI policy principles that encouraged policy makers to take "flexible risk-based approaches based on use cases, rather than prescriptive requirements when governing the development, deployment, and use of AI technologies."
In February 2021, the Treasury Department held a policy roundtable for public and private industry experts to discuss the interplay between digital identity, AML, and anti-fraud activities.
In 2021, the federal banking regulators issued a Request for Information (RFI) seeking industry input on the use of AI and ML for operational and regulatory compliance purposes. Numerous comments were submitted by trade associations, companies, and consumer advocates.
FinCEN established an Innovation Hours Program that serves as a forum for policymakers and industry to discuss innovative products and services, such as AI and digital identity.
36
Best Practices
Developing Products and Services that Use AI, ML & Alternative Data Key Considerations
Incorporation of fair lending principles into Compliance Management System
Designing use of Alternative Data and AI/ML consistent with regulatory expectations
Transparency
Security
Accuracy
Monitoring and Revision
Oversight
Third-Party Risk Management
38
Developing Products and Services that Use AI, ML & Alternative Data Key Considerations
Transparency: Concerns around transparency in the use of alternative data for credit decisioning focus on two areas: first, that consumers should understand what data will be used to make decisions about them; and second, that consumers understand how those decisions will be made using the identified data.
What Information Is Used. Inform consumers about the types of data that will be used in the credit decisioning process, regardless of whether that data is traditional data or alternative data.
How Information Is Used. Inform consumers how the information will be used. Various lenders that use AI and alternative data in their decisioning process include disclosures and information about their products and methods. In addition to being clear about how the underwriting process occurs, companies should also make clear if additional data may be used after granting credit to alter the terms.
39
Developing Products and Services that Use AI, ML & Alternative Data Key Considerations (cont.)
Security: General agreement that reasonable cybersecurity protections should be in place for such platforms, although the standards may vary depending on the federal and state legal overlay. The FTC has brought several enforcement actions against creditors that failed to maintain the security of the credit information they maintained about consumers. Any lending platform should include reasonable cybersecurity protections for the data used in the platform.
Accuracy: Concerns have been raised about the ability of users of alternative data to ensure the accuracy of that data and the ability of consumers to correct inaccurate information. Entities that plan to leverage alternative data and AI should assess their data sets, identify potential gaps or inaccuracies in the data, and calibrate the uses and models of that data to account for those potential inaccuracies. Companies using alternative data should include procedures to account for inaccuracies or gaps in their data sets, and account for such possibilities as the products continue to be developed.
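A data-quality assessment of the kind described above might begin with simple completeness and range checks; the sketch below (Python with pandas) runs such checks on a made-up alternative data extract, and the field names and thresholds are hypothetical.

```python
# Hypothetical data-quality checks on an alternative data extract (made-up fields).
import pandas as pd

df = pd.DataFrame({
    "monthly_rent_paid": [1200, None, 950, 1100, -50],      # negative value is suspect
    "utility_on_time_pct": [0.98, 0.75, None, 1.4, 0.60],   # values above 1.0 are out of range
})

report = {
    "missing_rate": df.isna().mean(),                          # share of missing values per field
    "negative_rent_rows": int((df["monthly_rent_paid"] < 0).sum()),
    "pct_out_of_range_rows": int((df["utility_on_time_pct"] > 1.0).sum()),
}
print(report)
```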
40
Developing Products and Services that Use AI, ML & Alternative Data Key Considerations (cont.)
Monitoring and Revision: One of the most consistent concerns is the potential for alternative data to lead to discriminatory practices (either intentionally or unintentionally) that could negatively impact minorities and other marginalized communities in the credit marketplace. In order to control for such unintended consequences:
Monitor the outputs of credit models that leverage any data to make decisions about credit applicants. This monitoring should seek to identify potential discriminatory outcomes from credit decisions, such as biases against certain racial, ethnic, gender, or other marginalized groups. By identifying potential disparate impacts as they arise, entities engaged in AI lending programs can seek to correct those limitations before they become systemic. A model governance and monitoring program can help achieve the positive goals of an AI and alternative data fueled lending program while avoiding negative outcomes.
Update the model and data sources to correct any identified bias. If the monitoring program identifies potential sources and/or outcomes that indicate discriminatory bias, an entity using an AI and alternative data lending program should determine if modifications to its model or data sources could correct that bias. Internal testing and revisions to AI models can help avoid real-world unintended negative outcomes.
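A monitoring program along these lines might, among other things, track the ratio of approval rates between a protected-class group and a control group across successive decision batches and trigger model review when the ratio deteriorates; the monthly figures and the 0.80 trigger in the sketch below are invented for illustration only.

```python
# Illustrative ongoing monitoring: approval-rate ratio between a protected-class
# group and a control group, month over month (made-up numbers).
monthly_rates = {
    # month: (protected-class approval rate, control-group approval rate)
    "2022-01": (0.58, 0.70),
    "2022-02": (0.55, 0.71),
    "2022-03": (0.49, 0.72),
}

REVIEW_TRIGGER = 0.80  # illustrative threshold, not a legal standard

for month, (protected, control) in monthly_rates.items():
    ratio = protected / control
    status = "review model and data sources" if ratio < REVIEW_TRIGGER else "ok"
    print(f"{month}: ratio={ratio:.2f} -> {status}")
```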
41
Developing Products and Services that Use AI, ML & Alternative Data Key Considerations (cont.)
Oversight: Engage independent oversight of data collection and use practices to increase objectivity and prevent unintended outcomes. For example, part of the CFPB's reasoning for granting a No Action Letter addressing ECOA to Upstart was the company's agreement to enter into a "Model Risk Assessment Plan" with the CFPB. Upstart also entered into a similar agreement with the NAACP to engage in two-year reviews of fair lending compliance of Upstart's model and alternative data by a civil rights law firm. This type of third-party oversight is stated to be "a model for companies like Upstart to protect borrowers from the discriminatory effects of these practices and ensure lenders cannot repackage age-old discrimination that has locked Black borrowers out of consumer credit markets."
42
Third-Party Risk Management
The federal bank regulatory agencies have requested public comment on proposed guidance designed to help banking organizations manage risks associated with third-party relationships, including relationships with financial technology-focused entities. The proposed guidance is intended to assist banking organizations in identifying and addressing the risks associated with third-party relationships and responds to industry feedback requesting alignment among the agencies with respect to third-party risk management guidance.
"Third-party relationships can include relationships with entities such as vendors, financial technology (fintech) companies, affiliates, and the banking organization's holding company."
July 19, 2021 NPRM issued
September 17, 2021 Public comment period closed
43
CFPB Enforcement by the Numbers (through 2020, excluding 19 actions brought in 2021)
$12.9 billion in consumer relief: monetary compensation, principal reductions, canceled debts, and other consumer relief ordered as a result of enforcement actions.
175 million people eligible for relief: estimated consumers or consumer accounts eligible to receive relief from enforcement actions.
$1.6 billion in penalties: civil money penalties ordered as a result of enforcement actions.
[Charts: Actions by Year; Relief by Year]
Source: https://www.consumerfinance.gov/enforcement/enforcement-by-the-numbers/
44
Developing Products and Services that Use Alternative Data Additional Considerations
Beware of Pitfalls:
Lack of AI Implementation Traceability and Records
Introducing Program Bias into Decision Making
Data Sourcing and Violation of Personal Privacy
Black Box Algorithms and Lack of Transparency
Legal and Regulatory Arbitrage and Risk
45
Meet Our Firm
Venable is a network of trusted advisors serving businesses, organizations, and individuals in many of the most important aspects of their work. Our capabilities span virtually every industry and all areas of regulatory and government affairs, corporate and business law, intellectual property, and complex litigation. With more than 850 professionals delivering services around the world, we help clients connect quickly and effectively to the experience and insights they need to achieve their most pressing objectives.
850+ Professionals
Attorneys and advisors
4 Divisions
Business, Regulatory, IP, Litigation
120+ Years
A history of strategic growth
10 Offices
CA | DC | DE | IL | MD | NY | VA
46
For Additional Information
For an index of articles and presentations on consumer financial services related topics by Venable attorneys, see www.Venable.com/cfs.
47
Questions?
Jonathan L. Pompan Venable LLP Partner 202.344.4383 [email protected]
Andrew E. Bigart Venable LLP Partner 202.344.4323 [email protected]
48
© 2022 Venable LLP. This document is published by the law firm Venable LLP. It is not intended to provide legal advice or opinion. Such advice may only be given when related to specific fact situations that Venable has accepted an engagement as counsel to address.