
The History of Artificial Intelligence in Mortgage

November 21, 2024

The evolution of artificial intelligence (AI) in mortgage is really just the history of technology in mortgage. In earlier posts and articles, I’ve defined artificial intelligence, surveyed the prevailing AI technologies in the mortgage industry today, provided example AI use cases across the mortgage lifecycle, and described the process of bringing generative AI (genAI) into a mortgage organization. As a refresher, the four main AI subspecialties used in mortgage are machine learning, computer vision, expert systems, and natural language processing.

The Great Depression of the 1930s revealed the flaws in the US mortgage market and marked the beginning of the mortgage industry as we know it today. Making and managing a loan was paper-based, time-intensive, and manual. For historical context, the telephone was only just beginning to be used for customer service during this period.

What follows is a chronology of how artificial intelligence technologies evolved in mortgage.

Machine Learning

Machine learning (ML) is the field of artificial intelligence that enables computers to learn and improve from patterns observed in large datasets without needing to be explicitly programmed.

Predating machine learning by many decades, the first credit scoring algorithms emerged in the 1950s from what would become the Fair Isaac Corporation (FICO). Bill Fair and Earl Isaac used data analytics and predictive modeling to help businesses make more objective and consistent credit decisions. Credit bureaus digitized consumer records in the 1960s and 1970s, but it wasn’t until 1989 that a truly universal credit score arrived and the FICO score was born.

In the 1980s and 1990s, we saw foundational automated valuation models (AVMs) emerge. AVMs were initially rule-based systems relying on statistical models like linear regression. By the early 2000s, data availability improved significantly, with access to large datasets from the Multiple Listing Service (MLS) and public records data. AVMs took advantage of this by implementing more advanced statistical techniques, such as decision trees, to improve property valuation accuracy.
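To make the early, statistics-driven AVM concrete, here is a toy sketch: a simple linear regression of sale price on square footage, fit with the closed-form least-squares solution. The comparable sales and the single square-footage feature are illustrative assumptions; real AVMs use many features and far larger datasets.

```python
# Toy automated valuation model (AVM): simple linear regression of
# sale price on square footage, fit by closed-form least squares.
# The comparable sales below are made-up illustration data.

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical comparable sales: (square feet, sale price)
comps = [(1200, 240_000), (1500, 295_000), (1800, 355_000), (2100, 410_000)]
slope, intercept = fit_linear([c[0] for c in comps], [c[1] for c in comps])

def estimate_value(sqft):
    """Estimate a property's value from its square footage."""
    return slope * sqft + intercept

print(round(estimate_value(1650)))  # 325000
```

Later AVMs replaced this kind of single-equation model with decision trees and ensemble methods, but the core idea, predicting value from observed sales data, is the same.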

After the financial crisis in 2008, lenders and financial institutions started exploring data-driven solutions to improve accuracy and reduce risk, including an increased focus on predictive analytics. This is also when machine learning entered mortgage, applied to improve risk modeling and decision-making. Widespread digitization in real estate created vast amounts of data from both structured and unstructured sources, which dramatically improved the scale and accuracy of AVMs.

Computer Vision

Computer vision is the field of artificial intelligence that enables computers to interpret and understand visual information from the world, such as images and videos. Computer vision is also a key component in robotics and autonomous vehicles.

IBM released the first commercially successful optical character recognition (OCR) system in 1965. The banking industry used it to process checks alongside magnetic ink character recognition (MICR). OCR technology converted printed or typed text from scanned documents, images, or photos into machine-readable text, and Fannie Mae and Freddie Mac used it at scale in the 1990s to underwrite mortgages.

Intelligent character recognition (ICR) was patented in 1993 by Joseph Corcoran. His invention incorporated neural networks to improve text recognition and adapt to variations in handwriting over time. This enabled banks to accurately process handwritten checks and sort mail, major innovations that also streamlined mortgage lending. These systems gradually improved with advancements in machine learning and computational power. In the 2000s, deep learning allowed ICR systems to process increasingly complex handwritten inputs, such as cursive writing and mixed-font documents.

Intelligent character recognition, an evolution of optical character recognition, uses more advanced machine learning including neural networks.

Within the past ten years, we have seen new applications of machine learning in computer vision, including models that interpret photographs and drone imagery to detect property damage during the appraisal or inspection process. We are also starting to see generative technology used to enhance image quality and enable more complex use cases.

Expert Systems

Expert systems are computer programs designed to mimic human decision-making by applying predefined rules to solve specific problems.
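A minimal sketch shows how such a system works: each rule is a named check against the application, and the decision is assembled from whichever rules fail. The thresholds and rule names below are illustrative assumptions, not actual agency guidelines or the rules of any real system.

```python
# Toy rules-based underwriting check in the spirit of early mortgage
# expert systems. Thresholds and rule names are illustrative only.

RULES = [
    ("credit_score", lambda app: app["fico"] >= 620,
     "FICO below minimum of 620"),
    ("dti", lambda app: app["debt_to_income"] <= 0.43,
     "Debt-to-income ratio above 43%"),
    ("ltv", lambda app: app["loan_amount"] / app["property_value"] <= 0.97,
     "Loan-to-value ratio above 97%"),
]

def underwrite(application):
    """Apply each rule in order; collect the reasons for any failures."""
    findings = [msg for name, check, msg in RULES if not check(application)]
    return ("approve" if not findings else "refer"), findings

decision, findings = underwrite({
    "fico": 700, "debt_to_income": 0.35,
    "loan_amount": 285_000, "property_value": 300_000,
})
print(decision)  # approve
```

Production systems like the ones described below encoded on the order of a thousand such rules, but the pattern is the same: explicit, auditable conditions rather than learned parameters.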

Mainframes for processing financial transactions hit banking and mortgage in the 1960s, representing both early use of expert systems in mortgage and a major innovation in payment processing at scale. Countrywide Home Loans invented the first rules-based underwriting system with its Countrywide Loan Underwriting Expert System (CLUES). CLUES implemented about 1,000 business rules and was deployed to more than 300 branches. By early 1994, CLUES was processing 35% of Countrywide's loan underwriting volume.

Given a mandate to take $1,000 out of the cost to originate a loan (about $3,500 in total at the time), Fannie Mae created their own rules-based underwriting system. The FICO score came out in 1989, and Fannie Mae’s clever contribution was to add credit evaluation to the system. With 1,700 business rules and a credit underwriting adjunct, Fannie Mae released Desktop Underwriter (DU) in 1995, a massive innovation in mortgage. Some would argue this was the single most significant technological innovation in mortgage ever.

In the 2000s, and especially after the financial crisis, we saw expert systems integrate more and more rules and increasingly become compliance systems. Now under conservatorship, Fannie Mae and Freddie Mac continued to increase their focus on income, property, assets, and credit (IPAC). The complexity of the regulatory environment skyrocketed, and point-solution expert systems grew out of gaps in core platforms.

Natural Language Processing

Natural language processing (NLP) is the field of artificial intelligence that enables computers to understand, interpret, and generate human language. NLP is closely tied with computer vision technologies like OCR and ICR, as well as with interactive voice response systems that use speech recognition.

It’s difficult to place the exact timing of NLP in mortgage as it has generally evolved with both machine learning and computer vision. NLP as a field moved away from rules-based approaches in the 1990s and embraced statistical and probabilistic methods. I assume that as OCR became mainstream and widely adopted by Fannie Mae and Freddie Mac, they extended solutions with modern NLP to determine meaning and create actionable insights.

Sentiment analysis emerged as a field in the 1990s, and NLP-powered sentiment analysis emerged as an innovative call center technology at that time, allowing machines to determine how a customer call was going in near real-time. Combined with automatic speech recognition (ASR), traditional call recording solutions became more intelligent with NLP, enabling meaning to be derived from text generated from speech.
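Early NLP-powered sentiment analysis was often this simple in spirit: count positive and negative cue words in the transcript and score the call. The sketch below assumes tiny, made-up word lists; real systems used much larger lexicons and, later, statistical models.

```python
# Minimal lexicon-based sentiment scorer of the kind early call-center
# analytics used. The word lists are illustrative, not a real lexicon.

POSITIVE = {"thanks", "great", "helpful", "resolved", "happy"}
NEGATIVE = {"frustrated", "angry", "unacceptable", "cancel", "complaint"}

def sentiment(transcript):
    """Return a score in [-1, 1]; below zero suggests an unhappy caller."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment("I am frustrated and angry"))  # -1.0
```

Fed with text produced by ASR, a running score like this could flag a deteriorating call for supervisor intervention in near real time.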

The Miracle of What Happened Next

In 1998 the convolutional neural network (CNN) was invented, and in 2009 the first graphics processing unit (GPU) was used for deep learning. This major technological leap was enabled by video games, as GPUs had previously been used only to render high-quality video game imagery. Then in 2017, the groundbreaking paper “Attention is All You Need” was published by eight computer scientists working at Google, introducing the transformer architecture and the mechanism of self-attention.

There is no single inventor of generative AI and it's remarkable how so many different sparks came together to enable generative AI to catch fire.

Combined with the massive gains in accelerated computing, this gave us mass availability of generative AI (genAI) in 2022 with the release of ChatGPT.

GenAI is a type of artificial intelligence that creates new content, such as text, images, or code, by learning patterns from existing data.

Penetration of Generative AI in Mortgage Today

While confusion persists around the distinction between AI and genAI, successful applications like operator chatbots leveraging retrieval-augmented generation have emerged. AI-assisted code generation has become routine, and large language models (LLMs) are being used in sentiment analysis and customer interaction analysis. GenAI-based tools are gaining traction in OCR, ICR, and marketing content generation, though adoption remains cautious, particularly for customer-facing roles. Mixed policy approaches reflect this hesitance, with some enterprises fully embracing tools like ChatGPT while others impose outright bans, highlighting cultural and educational barriers.
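The retrieval-augmented generation pattern behind those chatbots can be sketched in a few lines: retrieve the knowledge-base passage most relevant to the question, then assemble a grounded prompt for an LLM. Everything here is an assumption for illustration: the similarity measure is plain word overlap (real systems use embeddings), the knowledge base is made up, and the final LLM call is omitted.

```python
# Bare-bones sketch of retrieval-augmented generation (RAG): find the
# most relevant passage, then build a grounded prompt. Similarity here
# is simple word overlap; a real system would use embeddings and then
# send the prompt to an actual language model.

def tokenize(text):
    return {w.strip(".,?!").lower() for w in text.split()}

def overlap(a, b):
    wa, wb = tokenize(a), tokenize(b)
    return len(wa & wb) / len(wa | wb)

# Hypothetical lender policy snippets, standing in for a document store.
KNOWLEDGE_BASE = [
    "Borrowers must document two years of employment history.",
    "A minimum down payment of 3 percent applies to this program.",
    "Escrow accounts are required when the loan-to-value exceeds 80 percent.",
]

def build_prompt(question):
    """Retrieve the best-matching passage and wrap it in a prompt."""
    best = max(KNOWLEDGE_BASE, key=lambda doc: overlap(question, doc))
    return f"Answer using only this context:\n{best}\n\nQuestion: {question}"

print(build_prompt("What is the minimum down payment?"))
```

Grounding the model in retrieved passages like this is what lets operator chatbots answer from an institution's own guidelines rather than from the model's general training data.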

Despite these advancements, confusion about genAI's use cases, often conflated with traditional AI or expert systems, further complicates adoption. Discussions about AI guardrails are becoming more prevalent, reflecting the need for clearer guidance and responsible use. However, truly innovative applications of genAI in industries like mortgage remain scarce, likely due to restrictive policies, a chilling effect from regulatory fears, and insufficient investment in organizational research and development (R&D) labs. This underscores the need for more targeted exploration and education to unlock genAI’s potential while addressing these challenges.
