CFPB Issues New Guidance on AI in Mortgage Lending
January 28, 2026 · Source: CFPB
Key Points of the Guidance
The Consumer Financial Protection Bureau (CFPB) has released interpretive guidance clarifying how existing fair lending laws apply to AI and algorithmic decision-making in mortgage lending. The guidance emphasizes that lenders using AI tools for underwriting, pricing, or marketing are still fully responsible for compliance with the Equal Credit Opportunity Act (ECOA) and Fair Housing Act. Using a third-party AI model does not shield lenders from liability for discriminatory outcomes.
Adverse Action Notice Requirements
The guidance specifically addresses adverse action notices under ECOA. When an AI model contributes to a denial or to unfavorable terms, lenders must provide specific and accurate reasons — not vague references to algorithmic scores. The CFPB rejected the argument that AI model complexity makes it impossible to provide specific adverse action reasons, stating that if a lender cannot explain why an application was denied, it should not use that model for lending decisions.
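To make that requirement concrete, the sketch below shows one way a lender's tooling might translate a model-driven denial into specific reason statements rather than a raw score. It is a minimal illustration assuming a simple, interpretable logistic model; the feature names, reason wording, and contribution method are hypothetical and are not drawn from the CFPB guidance.

```python
# Hypothetical sketch: deriving specific adverse action reasons from an
# interpretable scoring model instead of citing an opaque "algorithmic score".
# Feature names, reason text, and the contribution method are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative applicant features (assumed): higher values favor approval.
FEATURES = ["credit_history_months", "income_to_debt_ratio", "ltv_headroom"]
REASONS = {
    "credit_history_months": "Length of credit history",
    "income_to_debt_ratio": "Income insufficient relative to debt obligations",
    "ltv_headroom": "Loan amount too high relative to property value",
}

# Synthetic training data standing in for historical underwriting outcomes.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 3))
y_train = (X_train.sum(axis=1) + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)

def adverse_action_reasons(applicant: np.ndarray, top_n: int = 2) -> list[str]:
    """Return the features that pushed this applicant furthest below the
    training-population average score, as human-readable reason statements."""
    # Per-feature contribution relative to the average applicant:
    # coefficient * (applicant value - population mean).
    contributions = model.coef_[0] * (applicant - X_train.mean(axis=0))
    worst = np.argsort(contributions)[:top_n]  # most negative contributions first
    return [REASONS[FEATURES[i]] for i in worst]

# Example: a denied applicant receives concrete reasons, not a raw score.
denied_applicant = np.array([-1.2, -0.8, 0.3])
if model.predict(denied_applicant.reshape(1, -1))[0] == 0:
    print(adverse_action_reasons(denied_applicant))
```

With a more complex model, the same pattern would require a post-hoc attribution method, but the output a compliance team needs is the same: a ranked, human-readable list of the factors that actually drove the unfavorable decision.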
What This Means for Mortgage Technology
For mortgage technology vendors, this guidance reinforces the need for explainable AI models with clear audit trails. Lenders evaluating AI-powered underwriting or pricing tools should ensure the vendor can demonstrate model explainability, fair lending testing results, and the ability to generate compliant adverse action notices. The guidance does not prohibit AI use — it clarifies that AI tools must meet the same standards as traditional lending processes.
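As one example of what "fair lending testing results" might look like in practice, the sketch below computes the adverse impact ratio, a common disparate-impact screen in which each group's approval rate is compared against the highest-approving group. The 0.8 threshold (the "four-fifths rule") and the sample data are illustrative assumptions, not requirements stated in the guidance.

```python
# Hypothetical sketch of one common fair lending screen: the adverse impact
# ratio. A ratio below roughly 0.8 is often treated as a flag for further
# review; the threshold and sample outcomes here are illustrative only.
from collections import defaultdict

def adverse_impact_ratios(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group_label, approved) pairs from model-driven outcomes."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    rates = {g: approvals[g] / totals[g] for g in totals}
    reference = max(rates.values())  # highest observed approval rate
    return {g: rate / reference for g, rate in rates.items()}

# Example: group "B" falls below the 0.8 screen and would warrant review.
sample = ([("A", True)] * 80 + [("A", False)] * 20 +
          [("B", True)] * 55 + [("B", False)] * 45)
print(adverse_impact_ratios(sample))  # {'A': 1.0, 'B': ~0.69}
```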