
Urban Institute & FHLBanks: Harnessing Artificial Intelligence for Equity in Mortgage Finance


RESEARCH REPORT

Harnessing Artificial Intelligence for Equity in Mortgage Finance

Michael Neal, Linna Zhu, Caitlin Young, Vanessa Gail Perry, Matthew Pruitt

November 2023

HOUSING FINANCE POLICY CENTER

ABOUT THE URBAN INSTITUTE

The Urban Institute is a nonprofit research organization that provides data and evidence to help advance upward mobility and equity. We are a trusted source for changemakers who seek to strengthen decisionmaking, create inclusive economic growth, and improve the well-being of families and communities. For more than 50 years, Urban has delivered facts that inspire solutions, and this remains our charge today.

ABOUT THE FEDERAL HOME LOAN BANK OF SAN FRANCISCO

The Federal Home Loan Bank of San Francisco is a member-driven cooperative helping local lenders in Arizona, California, and Nevada build strong communities, create opportunity, and change lives for the better. The tools and resources we provide to our member financial institutions (commercial banks, credit unions, industrial loan companies, savings institutions, insurance companies, and community development financial institutions) foster homeownership, expand access to quality housing, seed or sustain small businesses, and revitalize whole neighborhoods. Together with our members and other partners, we are making the communities we serve more vibrant and resilient.

Copyright November 2023, Urban Institute. Permission is granted for reproduction of this file, with attribution to the Urban Institute. Cover image by Tim Meko.

Contents

Acknowledgments
Executive Summary
Harnessing Artificial Intelligence
The Potential of AI in Mortgage Finance
AI throughout the Mortgage Life Cycle
How Can AI Embed Discrimination into the Mortgage System?
The AI Ecosystem
Recommendations to Advance Racial Equity through AI
Notes
References
About the Authors
Statement of Independence

Acknowledgments

This report was funded by the Federal Home Loan Bank of San Francisco. We are grateful to them and to all our funders, who make it possible for Urban to advance its mission. The views expressed are those of the authors and should not be attributed to the Urban Institute, its trustees, or its funders. Funders do not determine research findings or the insights and recommendations of Urban experts. Further information on the Urban Institute's funding principles is available at urban.org/fundingprinciples.

The authors are grateful for the time and insights of the organizations and individuals with whom we spoke in developing this report. In addition to those profiled in the report, we acknowledge participants CoreLogic, the Federal Housing Finance Agency, the Mortgage Industry Standards Maintenance Organization, Steve O'Conner and the Mortgage Bankers Association, Upstart, F. Dan Siciliano, Peter Hull, David Arnold, Michael Akinwumi, Will Dobbie, Peter Bergmen, Ed Sivak, and Kareem Saleh for their contributions to this piece. We also thank David Hinson for his thoughtful and careful copyedits.
Executive Summary

Artificial intelligence (AI) is poised to transform the mortgage industry. The potential efficiency gains are significant, as is the promise of AI to overcome human biases and the challenges of evaluating nontraditional financial profiles, more common for mortgage applicants of color. But reports across various economic sectors illustrate AI's potential to perpetuate racial inequities.1 Without challenging or correcting for the underlying causes of bias in the data they use, AI models can simply perpetuate and embed racial inequality on a larger scale.2 In this report, we highlight current use of AI and its potential to achieve both efficiency and racial equity at several key steps of the mortgage process. In addition, based on interviews with industry stakeholders, we describe the mortgage ecosystem and the role various stakeholder groups play in influencing AI adoption and penetration.

The analysis in this report is based on a qualitative research approach. The data collection included nearly 50 interviews with key stakeholders in the mortgage ecosystem. This includes staff members in the federal government, financial technology (fintech) companies, mortgage lenders, consumer advocates, and research organizations. Over these interviews, several key trends consistently arose:

- There is no clear agreement on the definition of "artificial intelligence."
- Firms largely use AI to pursue efficiency, often through machine learning.
- AI could amplify existing racial disparities or create new ones.
- The ability of AI to improve racial equity can be undermined by the data used to train the algorithm, not just by the algorithm itself.
- Some of the most promising AI-based underwriting models are also the most controversial, such as explicitly incorporating race up front in underwriting models.

Within the mortgage process, AI is being used in marketing, underwriting, property valuations, and fraud detection and is beginning to be incorporated into servicing. In terms of adopters, AI has been used by the government-sponsored enterprises (GSEs), large mortgage lenders, and fintech firms. Adoption rates appear lower among smaller and mission-oriented lenders, such as minority depository institutions (MDIs) and community development financial institutions (CDFIs).

Based on our interviews and our understanding of AI use in the mortgage finance ecosystem, we recommend actions in the following areas:

- Intentional design. Designers of AI models should determine whether the inputs and the underlying data that models learn from reflect the housing market's disparate racial impacts. Prioritizing equity must happen from the start, not as an afterthought. MDIs and CDFIs provide a blueprint for detecting the underlying bias in a model's data, and policy should move toward a requirement that firms should debias AI models to correct these disparities.
- Pilot programs and innovation. The GSEs and Ginnie Mae should use pilot programs to test how AI models affect industry and consumer outcomes, particularly around data use and mortgage servicing.
- Increased regulatory guidance. Federal regulators must improve trust and equity in AI in the mortgage finance industry through interagency coordination and by setting clear guidelines for its application. Federal regulators should also clarify the rights of
consumers to access private data sources, especially those from protected classes.

Harnessing Artificial Intelligence

Framing the Efficiency and Racial Equity Dimensions of Artificial Intelligence

One of the benefits of AI is its ability to integrate and analyze a broad range of data to maximize predictive accuracy and other outcomes, such as equity. In theory, AI should help close racial inequities in mortgage outcomes. To motivate this intuition, we start with an idea in economics known as the production possibilities curve.3 In its simplest form, a production possibilities curve illustrates the trade-offs an economy must make to achieve production of a desired mix of products or services. Allocating more resources to the production of one item means there are fewer remaining resources to produce the other item.

FIGURE 1
Production Possibilities Curve
Source: The Balance Money.
[Figure: a curve trading wine production against cotton production; points beyond the curve are labeled "impossible," points inside it "inefficient," and points on it "efficient."]

Equity Accelerator for Homeownership

This report is part of a series, commissioned by the Federal Home Loan Bank of San Francisco, that examines how innovations within the mortgage finance system can help narrow Black-white homeownership gaps. Homeownership is the primary way many US households build wealth. But because of historical racism and its ongoing legacies, the path to homeownership for Black households is rife with structural barriers, and, even once obtained, homeownership's benefits are not equitably distributed. To address some of the persistent racial disparities in homeownership and wealth, the Federal Home Loan Bank of San Francisco has partnered with the Urban Institute to launch a research and product development initiative called the Racial Equity Accelerator for Homeownership. The accelerator hosts several research workstreams investigating methods for facilitating and sustaining Black homeownership:

- incorporating alternative data into mortgage underwriting
- mitigating the impact of student loan debt on Black homeownership
- innovating loss mitigation strategies to help households sustain homeownership during times of stress
- using artificial intelligence and advancing technologies that can overcome mortgage lending biases (the focus of this report)

Although the accelerator focuses on Black homeownership, many of the barriers Black households face apply to other households of color, and the solutions to reduce the Black-white homeownership gap can help other households who struggle to become homeowners and build wealth. Addressing the Black-white homeownership gap is essential to achieving racial equity and ensuring all households have access to homeownership. Historically, the mortgage finance system purposefully excluded Black households from homeownership through racist practices such as redlining, and the legacies of these practices persist. To undo the effects of explicit historical racism in housing, an equally explicit commitment must be made to address racial homeownership disparities. Without such a commitment, homeownership and wealth gaps will widen. Rooting out systemic racism is a complicated process that will require sustained collaboration between many actors in the housing finance system, and the policy and practice changes proposed in this report
may improve Black homeownership only at the margins. But our research in this area is promising, and if the housing finance system can rally the necessary political will, we may be able to make tangible improvements for hundreds of thousands of Black households.

In a simple and traditional example, an economy chooses between two items, cotton and wine. The inputs needed to produce cotton cannot be used to produce wine, and conversely, if the inputs are used to produce wine, they cannot be used to produce cotton. In extreme situations, if all resources are used to produce cotton, no wine can be produced, and similarly, if all resources are used to produce wine, no cotton can be produced. Figure 1 illustrates the production choices in this hypothetical economy, with each choice represented by a point on the chart. Any point along the production possibilities curve is considered "productively efficient," which means the economy is using all its productive resources as efficiently as possible. The combination of goods the economy produces can be influenced by preferences or market demand.

How Can a Production Possibilities Curve Relate to AI?

In the previous example of a simple economy, the maximum output of cotton and wine and the continuum of trade-offs associated with their production were easily quantified. Though less readily quantifiable, policy analysis has demonstrated a similar trade-off between efficiency and equality (Okun 2015). Efficiency refers to the allocation of resources that maximizes overall productivity. Equality exists when intersections of social identities, residence in marginalized communities, or experiences with oppressive systems do not determine opportunities, access to resources, and outcomes in life. Achieving equality requires acknowledging, addressing, and dismantling systemic biases in mind-sets, practices, and policies (Venkateswaran et al. 2023).

Actions that boost efficiency often come at the expense of greater equality, while steps taken to improve equality often limit efficiency. Policies that promote efficiency often prioritize market mechanisms and individual incentives, which can lead to unequal outcomes. Moreover, efficiency-driven policies may overlook externalities and social costs that can negatively affect vulnerable populations. If market preferences or market demand desired only equality, the industry would be less efficient. Conversely, if the industry chooses to focus solely on efficiency, the trade-off would be less equitable outcomes than are otherwise possible. Policymakers may put their finger on the scale in favor of societal priorities. Finding the right balance between efficiency and
equality involves complex considerations. But production theory posits that innovative techniques could move the trade-off curve outward. In other words, a more innovative process can boost the production of cotton and wine simultaneously and therefore raise production overall. For example, the development of machines to churn wine may free up human resources that can be applied to cotton production.

The same is true for the trade-off between efficiency and equality in mortgage finance (figure 2). For example, use of an automated valuation model (AVM) to determine home values, in place of a human appraiser, can improve both equality and efficiency. AVMs may improve efficiency by reducing the amount of time required to estimate a property's value. And eliminating the human appraiser may remove instances of conscious or unconscious racial bias.

FIGURE 2
The Equality-Efficiency Trade-Off and the Effect of Innovation
The production possibilities curve moves out because of innovation
[Figure: axes are efficiency and equality; innovation shifts the curve outward, making a point that is both more efficient and more equitable attainable.]

In this report, we approach AI as a trade-off-curve-shifting innovation. On one hand, AI can boost efficiency and accuracy while reducing labor costs. In addition, AI provides greater modeling flexibility that can account for the experiences of people of color that traditional models may omit or misunderstand. As a result, one may hypothesize that AI use, as an innovative technique, should boost both efficiency and equality. On the other hand, experiences in other sectors suggest that AI carries the risk of codifying and amplifying the inequities embedded in the system today. To apply this theoretical model to the realities of the mortgage system, we conducted interviews with stakeholders and reviewed the latest reports on AI.
The first step in each interview was to determine how key stakeholders defined AI.

Defining AI

In interviews with nearly 50 housing finance stakeholders, we heard many definitions of AI. Notably, there was a lack of universal agreement about what AI actually is (box 1). A central step in assessing AI's impact is to first define it. For example, AI as a "replication of human decisionmaking" was a less prevalent definition, used by representatives of the larger financial institutions we spoke to. Other stakeholders offered less specific definitions of AI that were indistinguishable from automation more broadly.

BOX 1
Definitions of AI, According to Our Interviewees

Our research included interviews with housing policy stakeholders, including mortgage lenders, consumer advocates, credit union executives, academics, tech equity experts, and fintech leaders. The most common definition we heard focused on AI's ability to mimic human behavior. Another definition focused on leveraging AI models to improve organizational efficiency. As interviewees noted, these definitions may be too broad to be useful:

1. Interviewees focused on AI's ability to mimic human decisionmaking:

"AI is some automated decisionmaking rule or any database decision rule." – Academics studying AI issues

"Any system that tries to replicate similar decisionmaking that humans do, that is, AI and automated statistical models and machine learning models, are instances of AI. For example, computer vision is AI because it tries to visualize similar objects that humans see. First, there is a type of AI that learns from experience, and the second does not need experience to learn; these are called machine learning and zero-shot learning, respectively." – Tech equity expert

"AI is the ability to create computer programs that mimic human decisionmaking and algorithms that will allow you to be more precise in understanding the variables and their relationships, akin to machine learning." – Financial services trade association executive

"At the risk of generalization, AI is math that attempts to replicate human intelligence. For example, statistical models attempt to classify risks like humans do. Generative AI attempts to generate creative outputs like text, images, and audio, like humans do. Computer vision attempts to see things like humans do, etc." – Tech equity experts

2. Interviewees working in the industry emphasized AI's ability to improve individual or organizational efficiency:

"AI is technology that helps do something more at scale." – Financial services trade association executive

"AI is a
tool that is improving efficiency by looking at the data and trying to assist with process and decisionmaking." – Credit union executive

"AI is leveraging of technology to create efficiencies within one's operations." – Credit union executive

3. Interviewees who noted how common definitions of AI could be too broad:

"There is no standard definition; it can include statistical, mathematical models or automation. A lot of the processes used by creditors that are called AI may not truly fit into that box. We are trying to understand that distinction." – Consumer rights advocates

"Several pieces of state legislation have mentioned automated decision systems, but that definition could be too broad (e.g., a stoplight would fall under that definition). So, 'replicates human decisionmaking' may be too broad." – Mortgage lender

For this report, we develop and offer the following working definitions. We build off Stanford University's John McCarthy, who defines AI as "the science and engineering of making intelligent machines, especially intelligent computer programs."4 As IBM states, "It is related to the task of using computers to understand human intelligence."5 In its simplest form, AI combines computer science and robust datasets to enable problem solving. It also encompasses subfields of machine learning, deep learning, and natural language processing that are frequently mentioned in conjunction with AI (Black Knight, n.d.). These disciplines are composed of AI algorithms that seek to create expert systems to make predictions or classifications based on input data. Deep learning uses artificial neural networks that loosely simulate the human brain to identify patterns in data or to predict outcomes. Machine learning focuses on the development of computer algorithms that can learn and perform tasks without specific additional programming. Machine learning, increasingly being employed in algorithms throughout the mortgage industry, is a branch of AI and computer science that focuses on the use of data and algorithms to imitate the way humans learn, gradually improving its accuracy.
It is important to distinguish between automation and AI, which are often conflated. Automation can significantly reduce the human effort needed to complete certain routine tasks, such as employment verification, property appraisals, and certain types of marketing. But automation without AI is limited to the preprogrammed rules implemented by the developer of the automated process and relies on their "thinking" for its effectiveness.

Literature Review

Emerging research has helped shed light on the implications of AI for the mortgage industry. Recent reports show how machine learning models can predict delays in the mortgage origination process (Brahma et al. 2021), improve forecasting of mortgage delinquency (Azhar Ali et al. 2021), increase the efficiency of fraud detection (van Zetten, Ramackers, and Hoos 2022), and improve the efficiency and accuracy of AVMs (Steurer, Hill, and Pfeifer 2021).

At the same time, AI may pose risks to communities of color. Research suggests that machine learning models, which also use alternative data, can result in larger biases in mortgage underwriting by uncovering sophisticated ethnicity and race proxies invisible to statistical models (Zhang, Khalili, and Liu 2019). Additional research confirms this conclusion while illustrating that adopting fairness techniques may result in worse outcomes for both Black and white mortgage applicants (Zou and Khern-am-nuai 2022). In addition, analyses of AVMs suggest that they may produce greater relative error in majority-Black neighborhoods compared with majority-white ones (Neal et al. 2020).

A growing body of research has begun to develop a template for assessing uses of AI in mortgage lending. Through its Tech Equity Initiative, the National Fair Housing Alliance (NFHA) has developed a template that can conduct a critical analysis of an algorithmic system to identify its assumptions and limitations and can produce appropriate recommendations to mitigate consumer fairness and privacy risks (Akinwumi, Rice, and Sharma 2022). Recent research suggests that fair lending analyses of machine learning models must focus on outcomes as opposed to inputs (Gillis 2021). FinRegLab's work also informs the discussion of AI in the credit industry more broadly, with implications for policymakers (Bailey et al. 2023; Cochran et al. 2021).
Theory and research suggest that AI, often using machine learning, can improve efficiency within the mortgage process. But racial bias may undermine its use in practical settings. And federal regulators have acknowledged the potential downside risk.6 Some federal agencies have begun to outline expectations of AI and its implications for fairness in mortgage lending.7

The rest of this report delves deeper into the applications of AI. The next section describes the dynamics boosting the adoption of AI. We then identify key areas of the mortgage application process and describe the implications of AI for efficiency and racial equity at each stage of production, and we offer a case study of how these principles apply to valuation. Next, we examine how key actors in the mortgage ecosystem use AI. We end with policy recommendations to strengthen racial equity in the mortgage market without sacrificing efficiency.

The Potential of AI in Mortgage Finance

In this section, we discuss AI's potential to drive greater efficiency and the promises and risks related to equity.

AI and Efficiency

Potential efficiency gains are a major factor driving AI deployment in the mortgage system. Given the vast amounts of information required
throughout the mortgage life cycle, and the industry's cost structure, this is not surprising. Loan production costs are high and have risen in recent years. Before the recent refinance wave, between the first quarter of 2009 and the first quarter of 2019, mortgage production costs more than doubled, rising from $3,700 to $9,300 per loan. Over this same period, inflation, measured as the change in the Consumer Price Index, rose only 19 percent. More recently, as of the fourth quarter of 2022, the average cost was $12,450 per loan.8

FIGURE 3
Mortgage Loan Production Expenses, Fourth Quarter of 2017 through Fourth Quarter of 2022
Source: Mortgage Bankers Association. Used with permission.

The high fixed costs of mortgage production, excess capacity during periods of low loan production volume, and inefficient loan manufacturing processes all contribute to rising production costs. The increased costs of regulatory compliance with rules such as the 2010 Dodd-Frank Act and the False Claims Act also play a role (Freddie Mac Single-Family 2021; McCargo 2017).

Macroeconomic fluctuations magnify these structural challenges. In response to the broader economic cycle between 2019 and 2021, the Federal Reserve cut the federal funds rate and, with the onset of the COVID-19 pandemic in 2020, further eased policy. In response, mortgage demand soared in 2020 and 2021, driven largely by refinance loans. Refinances often cost less to produce than purchase loans. But tightening monetary policy over the past two years, largely in response to elevated inflation rates and solid labor market conditions, has boosted mortgage interest rates. Mortgage applications have responded predictably, with applications for purchase loans and refinances both declining, especially refinance applications. These trends have dragged down volumes and profit margins, as lenders take time to shed excess capacity.

The mortgage industry's characteristic cycles of ramping up and then ratcheting back based on economic conditions mean efficiency is a key focus for most lenders. With lenders looking for ways to establish leaner operating processes to mitigate this risk, increased automation through AI can be one long-term solution.

AI and Racial Inequities in the Housing and Mortgage Markets

By eliminating human bias in decisionmaking, AI also holds great potential to address the racial homeownership gap. It
must first, however, overcome the biases and inequities already embedded into the data it analyzes. And policymakers and the mortgage industry must reckon with historical and present-day barriers that lock would-be homebuyers of color out of the market altogether.

Several factors limit success for people of color throughout the entire mortgage cycle. These disparities are driven by factors that appear to be race neutral but in fact reflect and proxy for the results of a long history of racial discrimination.9 This discrimination, rooted in structural racism, has manifested in less access to homeownership and its intergenerational wealth-building effects, in the lack of access to mainstream banking and credit-building mechanisms, in community-level disinvestment, and in broader disparities in education, employment, and income.

Historical discrimination such as redlining limited access to mortgages for people of color, and restrictive racial covenants systematically excluded people of color from owning properties in specific, usually higher-opportunity, areas (Gerken et al. 2023; Santucci 2019). Predatory lending often targeted households of color with mortgages they could not afford, setting them up for foreclosure, wiping away their equity, and ruining their credit profiles in the 2008 foreclosure crisis (Rugh and Massey 2010). The impact of these racist actions, combined with instances of outright destruction of Black property and wealth, has led people of color to mistrust financial institutions.10

Racial discrimination also exists in other economic sectors and affects a household's ability to qualify for a mortgage. For example, Black workers are often the first fired during an economic recession. In addition, historical experiences, combined with fewer resources channeled to students of color, have limited access to education for people of color, lowering their incomes and making it more difficult to achieve homeownership.11 Black and Hispanic workers typically receive less income, even for the same level of education. Higher
denial rates, particularly for Black and Hispanic applicants, have persisted (figure 4).12 A larger proportion of mortgage applicants of color have low, or even missing, credit scores. In addition, renters of color are more likely to be cost burdened, increasing their reliance on debt. This results in less savings needed to purchase a home and higher debt-to-income ratios. In addition, inadequacy of property (i.e., collateral) may hinder applicants of color from qualifying for a mortgage, especially when vying for lower-price, more affordable homes or homes in neighborhoods where race-based valuation bias leads to undervaluation.

FIGURE 4
Denial Rates, by Race, Ethnicity, and Application Type
Source: 2021 Home Mortgage Disclosure Act data.
Note: For owner-occupied, primary-residence applications only.
It has also set applicants of color up for subtle forms of discrimination in the mortgage application stage through process and documentation. Increased reliance on easily documentable and easy-to-assess factors, such as a 3-digit FICO score and a consistent salary documented via a W-2, has sidelined applicants who, while fully creditworthy, have a profile that cannot easily feed into an automated, rules-based system. When they are approved for a home loan, Black and Hispanic homebuyers are charged higher rates and fees for mortgage loans and higher mortgage insurance premiums because of risk-based pricing based on down payment amounts (a marker of personal and family wealth) and credit scores.

Research also points to the potential for racial bias in home appraisals (Narragon et al. 2021; PAVE 2022). Property valuation has generated significant equity concerns. Evidence indicates that borrowers of color are more likely to have their property undervalued by appraisers relative to white borrowers. The potential for appraisal bias may be at least partly rooted in racial prejudices appraisers hold. Moreover, structural racism affects the values of the properties owned by households of color or in neighborhoods of color. Appraisal error and bias have systematically resulted in either under- or overvaluations (Neal et al. 2020). In addition, industrial sites were often placed near neighborhoods of color, further depressing home values. And a lack of neighborhood investment restrained home values in communities of color. Because of local property tax assessment policies, homeowners of color also pay more for property
taxes as a proportion of their home values (Avenancio-León and Howard 2022). Taken together, homeowners of color accumulate less equity than white homeowners.

These racial inequities are also reflected in racial disparities in refinancing. Homeowners of color are less likely than white homeowners to refinance.13 Low collateral values, high debt positions, and the timing of a home purchase are key factors that reduce refinancing for mortgaged homeowners of color. In turn, this leaves homeowners of color with housing costs higher than those for homeowners who can refinance, costs that reduce households' ability to tap any wealth gained.

In these ways and more, racial disparities in homeownership, once the result of mechanisms that were legal, have been embedded and compounded. Today, the Black-white homeownership gap is worse than it was in 1968 (Goodman and Zhu 2021). In the pursuit of efficiency, will AI embed and reinforce this cycle, or can stakeholders use AI to counteract the racial homeownership gap?

AI and Its Implications for Efficiency and Equity

The tension between equity and efficiency competing for the same resources sets up a trade-off curve that describes the limits of that system across various combinations of efficiency and equity; you can move along the production possibilities curve and adjust its performance to be more efficient or more equitable, but you cannot increase both at the same time. In the next section, we examine what we learned from interviewees and experts about how AI is being deployed across the mortgage process and the implications for efficiency and equity.

AI throughout the Mortgage Life Cycle

Here, we outline some of the ways AI is increasingly being used in the mortgage process, from marketing
106、 all the way through to closing and servicing.Marketing and Loan Production Selection(Sorting)AI tools allow lenders to launch hyperindividualized marketing campaigns.This corresponds with recent technology trends in the mortgage processsuch as those that track the saving and spending habits of bank

account holders, which can be useful for identifying potential clients and tailoring offers to them. AI tools can also be useful for identifying patterns in the data to determine whether existing borrowers are looking to refinance their loan. Marketers can also use third-party data to target people at certain life stages, such as newly married people or new parents, who may be more likely to apply for a mortgage. AI tools also analyze customer demographics, behaviors, and sentiments about lenders and the broader market and craft more targeted and appealing content. Additionally, AI helps organizations develop content to meet their search engine optimization goals by determining what keywords can attract more customers.

AI can also be used to facilitate interaction with prospective applicants, such as through chatbots.14 As a result, an AI-based chatbot functions as a first-response tool that greets, engages with, and serves customers in a familiar way.

Although AI can allow for more targeted outreach and steering of potential borrowers, lenders using AI need to be careful to ensure their practices do not create disparate treatment or impact. One of the marketing experts we spoke to provided an example of an organization that tried to target “sports enthusiasts.” This ostensibly race-neutral marketing category could disproportionately exclude people of color depending on the sport targeted (white people make up a larger share of golf fans, for instance, than of fans of sports such as football or basketball). In such ways, a bias that was historically based on factors such as cultural affinity or, for example, living in neighborhoods without bank branches, can become baked into an algorithm. Because lending is so highly monitored,
lenders need to ensure that their marketing campaigns do not disparately isolate or exclude certain demographics. Moreover, lenders have previously marketed lower-quality products to people of color than to white counterparts, such as when financial institutions targeted subprime loans to Black and Hispanic households. Lenders should learn from previous actions to minimize the financial risks for communities of color.

In terms of sorting borrowers and matching them to loan products, AI could also play a role in suggesting or steering applicants toward certain loan products, such as an adjustable-rate loan versus a fixed-rate loan, or toward a certain channel, such as Federal Housing Administration (FHA), Veterans Administration (VA), or Fannie Mae- or Freddie Mac-backed loans; bank-held portfolio loans; or private investors. There is a long and well-documented history of racialized sorting, where borrowers of color have been steered to costlier options, such as loans from finance companies, FHA-backed loans, and subprime loans (Bayer, Ferreira, and Ross 2018). AI could replicate such steering by presenting borrowers options that maximize lender profits, or it could result in borrowers being matched with the product and channel that is optimal for them.

Underwriting and Pricing

Underwriting is a multistep process by which a mortgage lender verifies a potential borrower's income, assets, credit history, debt, and property details to issue final approval on a loan application. The underwriting process usually takes three to six weeks.15 This risk assessment process often also determines the price charged for the loan. Underwriting rules can depend on intended mortgage execution (e.g., FHA versus VA versus GSE). Mortgage underwriting tools can use AI models in their process: they can automate distinct tasks, such as document review, and be employed to meet several underwriting parameters.16

In terms of equity, AI has been cited as a mechanism for potentially improving access to mortgage credit for people of color. Over the past several decades, lenders have relied on traditional credit measures for assessing a potential borrower's ability to qualify. But these traditional measures tend to disproportionately affect people of color and often fail to include predictors of mortgage performance, such as on-time rental payments (Choi et al. 2022). Interestingly, many
of these predictors were once relied upon when underwriting was more of a manual, human judgment-centered process.

In reckoning with this problem, lenders and regulators have slowly begun to adopt “alternative” data, such as on-time rental, telecommunications, and utility payments, to extend access to mortgage credit to those who have been historically excluded. Research shows that these data points are both strong predictors of mortgage payments and could reduce racial disparities in mortgage lending17 with certain precautions (Wu 2022).
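
The supplementation idea can be made concrete with a minimal sketch. Everything below is invented for illustration: the weights, the blending scheme, and the names `on_time_rate` and `blended_score` are hypothetical and are not drawn from any actual underwriting model.

```python
# Minimal sketch (hypothetical weights and blending scheme): supplementing a
# thin traditional credit file with on-time rental and utility payment data.

def on_time_rate(history):
    """Share of payments in a 1 (on time) / 0 (missed) history made on time."""
    return sum(history) / len(history)

def blended_score(tradeline_score, rent_history, utility_history):
    """Blend a traditional score (0-1, or None when the file is unscorable)
    with alternative payment data."""
    alt = 0.7 * on_time_rate(rent_history) + 0.3 * on_time_rate(utility_history)
    if tradeline_score is None:
        return alt  # thin-file applicant: rely on alternative data alone
    return 0.5 * tradeline_score + 0.5 * alt

# A thin-file applicant with no usable tradelines, 23 of 24 rent payments
# on time, and a year of on-time utility payments:
rent = [1] * 23 + [0]
utilities = [1] * 12
score = blended_score(None, rent, utilities)
```

In practice, the blend weights would themselves be estimated from loan performance data and validated for fair lending compliance rather than fixed by hand, as they are in this sketch.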

And this is certainly an area where AI can be of use. First, AI will expand the use of data, which can help a lender improve underwriting outcomes for people of color. And by incorporating alternative data sources, the benefits for applicants of color could be further improved. But second, AI can also better analyze traditional tradeline data. For example, Yolanda McGill of Zest AI notes that the fintech company has employed AI and machine learning models to identify potential borrowers' financial situations and ability to access credit.18 In this way, AI can help extend mortgage credit to borrowers whose credit scores or histories would have disqualified them under traditional underwriting standards. By incorporating more data that capture the ability of people of color to qualify for a mortgage, mortgage underwriting can become more equitable.

Applying AI and machine learning to typical credit bureau tradeline data to pull more variables from it provides a clearer picture of thin-file borrowers. Separately, alternative data can supply additional data points that are also useful for scoring thin-file borrowers. Obtaining more data points on a borrower even using only normal tradeline data is possible with machine learning and can help lenders reach creditworthy borrowers they would otherwise have overlooked. But including traditional forms of information for assessing the ability to repay a mortgage in an AI-based model has proven inferior to combining alternative measures of credit, such as
rental payments, into AI-based models.19

Some of the most promising AI-based underwriting models are also the most controversial because they incorporate race or ethnicity into the underwriting model. Including data on race allows these models to produce more equitable outcomes for people of color, reducing their likelihood of denial relative to white applicants (Gillis 2021). But by including explicit data on protected classes, they could conflict with current federal law unless they are part of a special purpose credit program under the Equal Credit Opportunity Act (ECOA). But there may be methods
for incorporating protected class status. An interviewee working at a fintech company noted that protected class information ought to be used during model training, such that the algorithm can learn to set the relative weights on variables in ways that make its decisions more sensitive to the credit characteristics of protected class applicants. That is, consciousness of protected status can be used so that models are built to be fair from the beginning and thus achieve more equitable outcomes without the need for protected status information to be used as an input at decision time.20 But the federal finance regulators have never issued guidance on whether this use of protected class status must be reconciled with the ECOA.

Many fintech companies, often the early-stage creators or adopters in an innovation system, argue that mitigating racial disparities requires revisiting prohibitions on the use of protected status, which may have made sense at one point but may have outlived their usefulness. At the same time, these fintech companies continue to look for ways to build credit strategies that are sensitive to protected classes while keeping their products aligned with federal regulations and increasing the likelihood that borrowers and lenders will adopt them. But the results may not diverge from those of models that avoid regulatory violations yet still produce disparate outcomes. This challenge raises the need for other race-conscious approaches that can reduce racial inequities in underwriting.

Employment, Income, and Asset Verification

Critical to the mortgage process is income and asset verification. Generally, the borrower submits W-2 forms, bank statements, tax returns, and pay stubs to verify income and employment. Some lenders ask for contracts to verify the borrower's employment status. Similarly, lenders use borrowers' bank and investment account records to verify their assets.21 These verifications help lenders evaluate whether the borrower can reliably pay their monthly mortgage.

Income and asset documentation may consist of physical papers that an underwriter needs to scan, but AI, specifically machine learning processes, can efficiently extract and verify all the information in these documents. These applications can scan huge volumes of case files, extract relevant information, and pass it on to the underwriter in summary form. Use of AI in employment and income verification is more straightforward and presents fewer equity concerns than other parts of mortgage underwriting. There also exists the potential to advance equity, as improvements in these verification technologies may make it easier for lenders to assess the ability to qualify among people with multiple or inconsistent income streams, such as gig workers, who are more likely to be people of color and whose income has traditionally been difficult to assess.

Appraisals

In the underwriting process, the determination of a home's value is critical. First, it
gives the underwriter a sense of whether the price agreed upon by the buyer and seller is an accurate representation of the home's true value. Second, it provides the underwriter a sense of the risk being taken if the loan is originated. A home appraisal is a process through which a real estate appraiser determines a home's fair market value. Separately, collateral issues, such as an appraisal that comes in below the sales price or the mortgage requested, could undermine the mortgage application and result in the potential borrower being denied a mortgage. Appraisals are also often used by local governments to determine property taxes.

Traditionally, a human appraiser estimated property values. A traditional appraisal also involves a visit to inspect and measure the home. From this in-person visit, an appraiser can assign a value by comparing the home's features, age, and condition with similar homes in the area and what they sold for.

A more efficient and faster process of valuing homes has emerged: automated valuation models. AVMs incorporate large amounts of historical data on home sales to estimate the subject property's value. In an increasing share of loan origination transactions, AVMs are also accepted instead of appraisals. For GSE mortgage originations, AVM use is largely confined to refinance loans with loan-to-value (LTV) ratios below 80 percent. But AVMs are used for some purchase originations where the LTV ratio is 90 percent or higher. AVMs can determine property values in seconds. This facilitates the mortgage industry by increasing refinances and helping consumers discover the price of a home they may wish to purchase.22 And during the pandemic, appraisal waivers helped sustain the industry when in-person activities were
discouraged.

AVMs incorporate AI to quicken estimates and to increase accuracy. And with machine learning, the AVM can use large amounts of data more efficiently. Before the use of AI, creating a highly accurate AVM involved developing a base model. But that statistical model would not have the same level of accuracy in all markets. And creating multiple models for accurate use in different geographies unwinds the benefits of automation. Incorporating machine learning, through neural networks and cloud computing power, helps AVMs perform complex calculations that more accurately capture how humans think. As a result, one model can be written that can adapt to all the nuances of different markets.

AVMs are increasingly being used to remove the human potential for bias. AI and machine learning can improve the accuracy of AVMs through more flexible modeling and can better absorb and interpret data. In addition, AI can also be used to test an AVM for its potential for bias. The results of a machine learning-based testing model should provide more precise estimates of the racial inequities in property valuation. But in addition to
the algorithms in place, an AVM's performance also relies on the data it is fed. And input data that are racially biased will produce racially biased results. This is a huge issue for AVM developers and users because of the long-standing history of racism and racial disparities in appraisal values. The sales comparison approach used by human appraisers, in which appraisal values are based in part on those of comparable properties, has baked in racial inequity by pulling forward historic undervaluation over time. As a result, an AVM relying on these data will likely experience the same problem and can perpetuate racial discrimination.

Fintech companies are innovating in the AVM space. They are now able to design a model, based on AI, that can be used to estimate a home's value more quickly, using more data drawn from a broader range of sources. But the AVM is still producing an estimate of the property
focused on what the housing market will bear. One interviewee pointed out that an AVM does not try to incorporate the impacts of systemic racism on a home's value; it just knows what others have paid for similar properties.23 Comparing purchases of similar properties and factoring in the broader context that undervalues homes in neighborhoods of color are two different approaches.

Charu Singh of Just Value Inc. leads a property technology and financial solutions firm focused on undervaluations. Singh points out that “AI is not the first nor most impactful tool to solve” systemic appraisal bias, noting
that the housing industry and tech experts “have not yet built the foundation to allow AI models to reflect anything back to us other than the bias already present in the data and market.”24 Singh also points out that society's actions have driven systemic undervaluation, that AI can amplify those actions, and that communities of color must “lead the way” to identify equitable and transparent AI models.25 Although AI use in AVMs can produce greater accuracy, it will be important to further define what the industry means by accuracy. If accuracy simply corresponds with a home's ultimate sale price, it is not clear that a comparable-sales method improved with AI-based AVMs will reduce racial inequities.

Fraud Detection

There are several types of fraud, and it may be committed by borrowers, lenders, appraisers, or people involved in the real estate transaction. Income fraud, where an applicant falsifies their income, is a critical concern and can be perpetrated through means ranging from doctored pay stubs or W-2s to longer-term falsification schemes involving the use of false information with “seasoned” bank account information. Meanwhile, asset rental fraud occurs when loan applicants
borrow or rent other people's assets to make themselves appear more qualified for mortgage financing. Equity skimming occurs when investors use straw buyers, people who purchase property on behalf of another person. Homeowners may also fall victim to foreclosure relief scams. In this type of mortgage fraud, homeowners who are at risk of defaulting on their loans or whose homes are in foreclosure are misled into believing they can save their homes by putting the property in the name of a third-party investor. Scammers often use false or stolen identities to commit mortgage fraud. This takes place
when the scammer obtains financing by using an unknowing victim's financial information, including Social Security numbers, stolen pay stubs, and falsified employment verification forms, thereby obtaining a fraudulent mortgage on a property they do not own or occupy. False appraisals are another common way scammers commit mortgage fraud. Appraisers may commit appraisal fraud on their own or with the help of other professionals, including a builder or a mortgage banker.

There are several ways to avoid mortgage fraud. They include using an attorney to thoroughly review all legal paperwork; checking the
references and referrals of all participating parties, including real estate brokers and loan officers; researching and verifying the property's title history; reviewing all final loan documents to ensure all information is accurate; and researching and reviewing property tax assessments to verify the actual assessed value.

Automated computer systems that use statistical modeling and analysis to fight online fraud are mature. The software reviews previous cases, looks for statistical relationships among many data variables, and then uses these patterns to flag suspect cases for review by internal antifraud teams. Increasingly, AI is being implemented in these automated systems for greater speed and accuracy. For example, Resistant.AI trains the application, and in turn, the AI system processes information, receives ongoing corrections, and learns from them.26 By training the application, an AI process uses historical data to help uncover important characteristics and features of a particular dataset. As decision logic has evolved, complex pattern recognition has become a requirement, and employing machine learning algorithms can accelerate the use of data. Through machine learning techniques, a
lender can better understand the probability that a mortgage application contains fraudulent information and which part of the application is likely to be misrepresented. For example, machine learning-based fraud detection software would be able to improve risk predictions by relying on large volumes of diverse, granular, and high-quality data. In addition, machine learning can quantify any potential fraud risk, capturing both fraud and nonfraud evidence. A consistent and ongoing feedback loop allows the technology to continuously learn, creating stronger predictions while identifying new and emerging fraud behavior trends. Machine learning can improve on the inefficiency associated with an underwriter manually reviewing applications. Relying on machine learning technology can help underwriters decide whether a loan needs further investigation for potential fraud.

The potential for racial bias may exist in fraud detection software as well. For example, if the training data used to develop the AI-based algorithm are biased, the model could produce biased outputs, predicting that a person of color is more likely to commit fraud. In turn, applicants of color may be more likely to be turned down for loans. To date, fraud detection in mortgage lending is an area for potential research. There is little empirical analysis describing fraud detection models and their potential for racial bias. This research would bring greater transparency
to an opaque part of the mortgage process.

Servicing

In loan servicing, the lender collects the principal, interest, and escrow payments from the borrower over the life of the loan and passes those payments to the loan funders (investors). There are two distinct aspects of mortgage servicing: first, managing performing loans is a high-scale, high-efficiency process that is fairly automated; second, managing nonperforming loans, from their first delinquency through reperformance, restructuring, foreclosure, or some other resolution, is more complex, labor intensive, and costly.

Mortgage lenders can streamline loan servicing workflows using AI-powered support automation platforms.27 These tools alleviate repetitive manual processes and provide solutions to lenders, underwriters, and borrowers. By speeding up loan servicing processes, lenders can deliver better, more tailored customer service to borrowers. But nonperforming loan servicing has been less automated than earlier stages of the mortgage life cycle, such as underwriting. And the lack of robust data suggests that AI may be less useful in mortgage servicing because there is less information to train a model on. This lack of data often extends to race and ethnicity as well. Datasets such as those collected under the Home Mortgage Disclosure Act provide this information at origination, but there is no requirement that those race or ethnicity data be maintained
as the loan moves to servicing.

Federal Regulations Promoting Equity in the Mortgage Process

Federal fair housing laws seek to eliminate discrimination in the mortgage industry, and these all have implications for AI. Most notably, the ECOA, enacted in 1974, prohibits discrimination in any aspect of a credit transaction on several factors, known as protected classes (Federal Reserve, n.d.). Under the ECOA, it is illegal for creditors to discriminate against applicants based on race, sex, age, national origin, or marital status or because an applicant receives public assistance. Creditors are also prohibited
from asking about marital status (with some exceptions) and from asking an applicant whether they plan to have children or additional children, though they can ask about the number, ages, and financial obligations relating to all existing children (Wu 2021).

Within the Civil Rights Act, passed in 1968, Titles VIII and IX are commonly known as the Fair Housing Act because they established new antidiscrimination laws regarding housing. The 1968 Fair Housing Act expanded on previous laws by prohibiting discrimination concerning the sale, rental, and financing of housing based on race or color, religion, national origin, sex, family status, or disability (Federal Reserve, n.d.). The prohibitions codified by the Fair Housing Act cover discrimination in all aspects of residential real estate-related transactions, including purchasing real estate loans or appraising residential real estate (Federal Reserve, n.d.).

Because both the Fair Housing Act and the ECOA apply to mortgage lending, lenders may not discriminate in mortgage lending on the basis of any of the prohibited factors listed. Under both laws, the following activities are illegal if performed on the basis of a protected class:

- failing to provide information or services or providing different information or services relating to any aspect of the lending process, including credit availability, application procedures, and lending standards
- discouraging or selectively encouraging applicants, with respect to inquiries about or applications for credit
- refusing to extend credit or using different standards in determining whether to extend credit
- varying the terms of credit offered, including the amount, interest rate, duration, and type of loan
- using different standards to evaluate collateral
- treating a borrower differently in servicing a loan or invoking default remedies
- using different standards for pooling or packaging a loan in the secondary market (Federal Reserve, n.d.)

Regulation around AI is still in its early stages, and much of the discussion in the mortgage space has centered more heavily on AI's use in underwriting and appraisals. Additional oversight is necessary to ensure that AI tools are not driving racial disparities in access to mortgage credit throughout the process.

How Can AI Embed Discrimination into the Mortgage System?

In the previous section, we covered some of the equity issues raised by AI use at different stages of the mortgage process. But more broadly speaking, our research highlighted several ways AI can embed discrimination or systemic racism in the mortgage system across many different processes: through biased data, biased models, and biased standards. This section uses an example of AVM error and equity to illustrate how bias affects racial disparities in the mortgage system.

Biased Data

Across our interviews, many industry stakeholders underscored the fact that any AI or machine learning tool is only as good as the data that go into it. Any data resulting from racist practices will inform inequitable outcomes under newer AI or machine learning models. This issue is not exclusive to the mortgage industry; equity-minded people are grappling with it across AI's different uses. One researcher highlighted an issue with AI-based algorithms that set bail in the criminal justice system. People of color, especially Black people, are overrepresented in the criminal justice system and are more likely to have been arrested, despite committing crimes at rates similar to those of white people. This overrepresentation in the data leads to higher bail being set for Black people, despite the algorithm being ostensibly race neutral. Similarly, the history of racism and racial disparities in the credit and housing finance systems creates the same problem for AI users relying on data from these industries. Systemic and individual appraiser racism have led to undervalued homes in many communities of color. So an AI-based AVM may reduce racial bias to some degree by removing the human appraiser, but it can still perpetuate systemic racism by, for example, using undervalued homes as comparisons when assigning a value to a property in a neighborhood of color.

Another issue with AVMs relying on biased data is that their efficiency means they have the power not only to reinforce systemic racism but to amplify it as well. AI and machine learning models have the capacity to do the work of a human but faster,
meaning that lenders can use them to significantly increase loan volume. And that pace will only increase as AI technology improves. But the cost of this type of efficiency is that if the industry fails to understand or grapple with the racism baked into much of the data, it will simply further entrench that racism and worsen existing disparities.

Moreover, the ability of AI to reduce racial bias could improve if full demographic information were available for every loan, because this information would allow lenders to test, after the fact, for racial bias in lending-related systems. Right now, lenders do not always have full demographic information for every loan because providing it is an optional part of the loan application.

In addition, there are privacy concerns for consumers whose data are used in AI modeling. For instance, advancing the use of cash flow data in underwriting requires increased use of consumer-permissioned data, particularly from bank accounts. But if the use of this type of data is not regulated and monitored, there are significant data privacy concerns for consumers. And these concerns may be necessarily heightened for consumers of color, who have an understandably greater distrust of the financial system because of past and ongoing harms.

Biased Models

Another racial equity issue that can arise with AI use is whether the models themselves are biased. Given the restrictions around fair lending, it is unlikely AI developers would deliberately build models that disproportionately harm people of color. But the ways some AI models learn and evolve on their own mean that if developers and users are not careful about the parameters they set, AI models can still create inequitable outcomes by race.

Fair lending standards are understood to prevent lenders from using race (or other protected statuses) as inputs in their lending algorithms.28 But many other variables can serve as proxies for race, especially as data quality and granularity improve. Perhaps a lender hoping to market to potential borrowers trains an AI-based marketing tool on past borrowers' characteristics. Perhaps these past borrowers skew disproportionately white compared with the population, as white households are more likely to achieve homeownership. Even if the data on which the AI is trained do not include information on race, the model, as it learns, may identify proxies for race that result in a preference for potential white borrowers over borrowers of color, thereby amplifying the impact of biased data.

This is one reason some stakeholders advocate for including protected statuses as model inputs. To correct for historical discrimination, lenders could use race in their models as a means of debiasing their outputs. But for this solution to avoid being harmful, it will require strong regulatory supervision of lenders' use of race variables.

Another risk associated with AI models is a lack of transparency and explainability, sometimes referred to as a “black box.” The increasing complexity of AI models may mean that humans have a hard time understanding what the model is doing or the relationship between the variables that go into it and the results it generates.
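
The contrast with a model simple enough to explain itself makes the concern concrete. In the toy linear score below, the inputs that pulled an applicant below the approval threshold can be read off directly; the weights, threshold, and feature names are invented for illustration only, and complex AI models do not admit this simple decomposition.

```python
# Toy illustration (invented weights and threshold): a transparent linear
# score whose denial reasons can be read directly from the per-feature
# contributions. Black-box models offer no such direct readout.

WEIGHTS = {
    "on_time_payment_rate": 2.0,   # higher is better
    "debt_to_income": -3.0,        # a higher DTI lowers the score
    "years_of_credit_history": 0.1,
}
APPROVE_THRESHOLD = 1.0

def score(applicant):
    """Weighted sum of the applicant's features."""
    return sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)

def denial_reasons(applicant, top_n=2):
    """Rank features by how strongly they pulled the score down."""
    contributions = {k: WEIGHTS[k] * applicant[k] for k in WEIGHTS}
    return sorted(contributions, key=contributions.get)[:top_n]

applicant = {
    "on_time_payment_rate": 0.8,
    "debt_to_income": 0.55,
    "years_of_credit_history": 3,
}
decision = "approve" if score(applicant) >= APPROVE_THRESHOLD else "deny"
reasons = denial_reasons(applicant)
```

Here `denial_reasons` returns the features with the lowest contributions, the kind of adverse-action reasons a lender must be able to supply; explaining a complex model's output in this way is far harder, which is the transparency problem described above.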

205、This is especially true for AI-based systems because they can generate different decisions even for identical circumstances.Under the ECOA,lenders are required to be able to explain the reasons for denial.But this may become increasingly complicated as the models become increasingly complicated.A la

206、ck of transparency could harm consumers and reduce their ability to dispute what they perceive as an unfair or arbitrary decision.This could disproportionately affect borrowers of color,who are more likely to be denied a mortgage.Finally,because some AI models are designed to learn,there can be an i

207、nherent risk of model drift29 that needs to be closely monitored.A model that begins by offering racially equitable evaluations and outcomes can change to become inequitable or discriminatory.A lack of careful monitoring poses a risk to consumers of color.Discriminatory and Inequitable Standards Bey

208、ond more specific AI-related concerns,developers and users also need to consider the end goal of their AI use.Throughout the mortgage process,certain standards applied to applicants and borrowers have been shown to have racially inequitable outcomes.Given the efficiency gains and amplification power

of AI, mortgage industry stakeholders should be thoughtful regarding where they plug in AI tools. A good example of discriminatory or inequitable standards is the traditional debt-to-income ratio and credit score requirements used in the underwriting process. Recently, the mortgage industry has begun to reckon with its history of systemic racism and how that racism affects what may seem, at first glance, to be reasonable markers of one's ability to qualify for a mortgage. Using automation or AI simply to speed up this process or to extract income and debt information from application documents would increase efficiency but would perpetuate an inequitable standard. The inequities baked into debt-to-income ratios and credit scores are one reason the use of alternative data in underwriting has gained

steam, because AI-driven models can process more complex and nonstandardized information than the rules-based systems currently in use. If it accelerates that trend, AI will not be perpetuating a racially biased status quo but will be used to extend access to credit to people who are disadvantaged by traditional metrics. Even so, underlying this improvement is the fact that even though people of color have lower incomes and greater job instability, they may still be less likely to make the on-time rent payments that can sometimes substitute for credit scores. In this regard, AI can be used both to make progress on advancing racial equity in the mortgage space and to encourage the industry to think deeper about underlying disparities and innovative ways to remedy them. In the next section, we extend the discussion of AVMs with a case study to illustrate how these biases can be baked into AI and how steps can be taken using AI to improve both accuracy and equity. We discuss how AI balances the accuracy-equity trade-off in AVMs and how models that focus on being more precise might skew to exclude or undermine communities of color.

Case Study: Integrating Equality into the Design of Machine Learning Algorithms by Balancing Accuracy and Equity in AVMs

To demonstrate how an AI-based innovation can increase or mitigate bias, we will delve into AVMs. Using an AVM instead of a traditional appraisal can yield significant efficiencies and removes the bias risk arising from human judgment. Nevertheless, prior research has found AVMs to be less accurate, as a percentage of value, in neighborhoods of color (Neal et al. 2020). To avoid embedding valuation bias in AVMs, the federal government has recommended rulemaking to address this potential bias (PAVE 2022). As building AI into AVMs can improve their performance and efficiency and expand their use, this case study sheds light on how regulators can think about setting performance standards for equity in this product.30

Measuring Accuracy

An AVM produces estimates that can differ from the property's actual market value because of limitations of the algorithms, omitted variables, and other factors. To measure the accuracy of an AVM's estimates, the forecast standard deviation (FSD) is used to represent the probability that a particular AVM estimate falls within a statistical range of the subject property's actual market value. For example, if an AVM produces an estimated value of $200,000 with an FSD of 5 percent, there is a 68 percent probability (1 standard deviation) that the subject property's actual market value falls between $190,000 and $210,000. The lower the FSD, the more accurate an AVM is.

Measuring Equity

The simplest measure of fairness in machine learning is statistical parity. Because we are concerned about algorithmic discrimination against people of color in the form of less accuracy, we want a model that results in the same average FSD for both people of color and white

people. But the goal of statistical parity in outcomes overlooks disparities in inputs, factors that affect the accuracy of an AVM's performance. Previous findings have shown that properties in neighborhoods with more homogeneous housing stock, fewer distressed sales, and higher median household incomes tend to have greater AVM accuracy, that is, lower FSDs. Properties owned by white households are more likely to be in neighborhoods with factors contributing to lower FSDs (Neal et al. 2020). This means that even if we managed to find a perfect AVM (that is, one that takes the data input of any property, regardless of the neighborhood's housing characteristics, and always minimizes the FSD), statistical parity forces us to make difficult trade-offs. This is because the actual prediction difficulty of the two groups differs, but fairness requires us to achieve equal FSDs. Here again, we face a trade-off curve.

Fairness versus Accuracy

To illustrate the trade-off between fairness and accuracy in the context of AVMs, we can use a simple example. Assume two models: model 1 has higher accuracy overall (a lower FSD score), but it has a lower FSD score in majority-white neighborhoods than in majority-Black neighborhoods. Model 2 has a higher FSD score overall, though still below the satisfactory threshold, and the accuracy of its outcomes is less correlated with neighborhood racial composition. If our goal is pure accuracy, we would select the model with the lowest FSD, which is model 1, but then we violate the fairness requirement. Moving from model 1 to model 2 indicates that improving fairness will degrade accuracy, and vice versa. But why not build separate models for the two types of neighborhoods? This more complex hybrid model would increase both fairness and accuracy.
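The quantities above, the FSD, the value band it implies, and the statistical-parity gap between groups, can be made concrete in a short sketch (all numbers are hypothetical illustrations, not real AVM output):

```python
def fsd_band(estimate, fsd):
    """One-standard-deviation (68 percent) band implied by a forecast
    standard deviation expressed as a fraction of the estimate."""
    return estimate * (1 - fsd), estimate * (1 + fsd)

# The example above: a $200,000 estimate with a 5 percent FSD.
low, high = fsd_band(200_000, 0.05)  # about $190,000 to $210,000

# Hypothetical average FSDs by neighborhood group for the two models
# described above (model 1: more accurate overall but less fair).
models = {
    "model_1": {"majority_white": 0.04, "majority_Black": 0.10},
    "model_2": {"majority_white": 0.07, "majority_Black": 0.08},
}

def overall_error(fsds):
    """Average FSD across groups; lower means more accurate."""
    return sum(fsds.values()) / len(fsds)

def unfairness(fsds):
    """Statistical-parity gap: difference in average FSD between groups."""
    return abs(fsds["majority_white"] - fsds["majority_Black"])

best_by_accuracy = min(models, key=lambda m: overall_error(models[m]))
best_by_fairness = min(models, key=lambda m: unfairness(models[m]))
# Optimizing on accuracy alone selects model_1 (average FSD 0.07 vs 0.075),
# even though its racial gap (0.06) is six times model_2's (0.01).
```

Selecting purely on error picks the less equitable model, while routing each property to the group-specific best model would require race as an explicit input, which is exactly the tension the hybrid-model idea raises.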

This hybrid model, though, now explicitly uses race as an input, as the first step is to pick which model to use based on race. The example illustrates several points raised in the preceding section. First, hidden bias in the data is a critical source of AVM error. The training data used in the AVM algorithms (i.e., home prices of comparable sales) can contain all kinds of hidden bias, and building complex models from such data can both amplify these biases and introduce new ones. Second, the algorithms employed by the AVM may contribute to AVM error. Without specifying a nondiscriminatory fairness factor in the algorithms, the optimal AVMs would maximize predictive accuracy. When maximizing accuracy across multiple racial groups, an algorithm will naturally maximize for the majority population, at the expense of others, as the majority contributes more to the model's accuracy. Thus, building the optimal model for predictive accuracy without specifying a fairness criterion often results in demonstrable unfairness to some applicants and borrowers. The sensible response to this tension between accuracy and fairness is to acknowledge it and then directly quantify and manage it so that we can figure out a solution to reduce it. This means including racial and ethnic information in the AVM's inputs can lead to more equitable outcomes.

How Can We Optimize This Accuracy-Fairness Trade-Off?

FIRST, ESTABLISH THE TRADE-OFF FRONTIER OF ACCURACY AND FAIRNESS ACROSS VARIOUS MODELS

For this first step, we measure

the FSD and the unfairness score of each model's output. Each point in figure 5 corresponds to a different AVM. Here, we have drawn a curve connecting a set of models that outperform the other models. We should consider all models on the curve to be reasonable candidates for optimal performance. Moving from model 1 to model 2, we gain a greater degree of fairness at the cost of reduced accuracy, as we get a greater error score. This yields the frontier of “best” performance between error and unfairness.

FIGURE 5
The Trade-Off Frontier of Accuracy and Fairness
URBAN INSTITUTE

NEXT, ENCODE FAIRNESS INTO ETHICAL MACHINE LEARNING ALGORITHMS

The trade-off frontier is silent about which point we should choose along it, as that is a matter of judgment about the relative importance of accuracy and fairness. We could assign a single numerical objective that takes a weighted combination of the error and unfairness scores. The weights ascribe a “penalty” to losing accuracy or fairness. Interestingly, this optimization decisionmaking process relies on human judgment, including how to define the notion of fairness and equity in the first place, which groups of individuals to protect, and how to assign penalty weights in consideration of the relative importance of accuracy and fairness. The accuracy-fairness trade-offs illustrate the complexities and the nuanced nature of algorithmic fairness. Therefore, the design of ethical algorithms becomes an exercise in understanding which fairness criteria are most relevant in a context and then balancing these considerations. Without intentionally addressing these equity concerns, when such models become the basis for widely deployed services such as appraisal tools, the bias

can be further propagated and even amplified by their reach and scale. It may be practically impossible to solve the bias in the data inputs, as the data, such as incomes and home prices, already bake in the impact of historic racism. But it is possible to think about how to integrate fairness and racial equity into the design of ethical algorithms and into the testing of algorithms for bias. We believe federal regulators are positioned to set these governing principles because of their authority and their ability to set consistent standards on a large scale. In other words, thinking about an algorithm for building machine learning models that can avoid amplifying, or can reduce, racial bias requires dedicated governmental resources focused on transparently assessing the trade-offs that can emerge in principled rulemaking.

The AI Ecosystem

In the previous section, we assessed AI use and its implications for both efficiency and racial equity through the mortgage application process. In this section, we look at AI adoption from the point of view of the various entities involved throughout the process. Many key actors drive the pace and forms of adoption, including fintech companies, lenders, researchers, community advocates, the federal government, and the GSEs.

Technology Adoption Life Cycle

Broadly

speaking, our qualitative evidence suggests the diffusion of AI parallels the technology adoption life cycle. According to this framework, a small proportion of AI users are innovators and early adopters. In our research, fintech companies and the largest lenders appeared to belong to these two groups, respectively.

FIGURE 6
Diffusion of Innovation Curve
Categories: innovators, early adopters, early majority, late majority, laggards; “the chasm” falls between the early adopters and the early majority.
URBAN INSTITUTE
Source: Adapted from Matt S. Smith, “Models for Predicting the Future: Geoffrey Moore's Crossing the Chasm,” SmithHouse blog, February 21, 2018, https:/

There appeared to be a gulf between early adopters and other adopters, the distribution to the right of the chasm in figure 6. This may be attributable to a lack of clarity on documenting an AI process or on regulations governing appropriate AI-based decisionmaking. But it may also reflect the high up-front costs associated with adopting AI. Below, we look at the state of adoption within each category.

Fintech Companies

Fintech companies are a key source of innovation. They often develop financial technology to improve and automate the delivery

and use of financial services. That said, the mortgage industry is heavily regulated, and at times the diffusion of innovation can be restrained by the regulatory environment. Fintech companies are pushing the bounds of the evolution of AI as it pertains to underwriting. One interviewee who works with fintech companies noted that lenders have begun “moving from conventional credit scoring techniques to more sophisticated ones powered by alternative data and AI.”31 Additionally, Zest AI, a mission-oriented fintech company, works with banks, fintech companies, and credit unions to adapt ethical AI practices into their underwriting, pricing, and other decisionmaking systems.32

Mortgage Lenders

The high cost of adopting AI partly reflects the need for new technological infrastructure that can run AI-based algorithms. In addition, labor costs are elevated because of the need to hire data scientists and other technology specialists who can build and execute AI-based models. Further, regulatory costs may also be high, reflecting a degree of uncertainty about sanctioned uses of AI. Given the financial and regulatory costs of AI, smaller lenders appear further behind in AI adoption. But the lack of AI penetration among these lenders may also reflect their relationship banking style of operating, which is geared toward achieving their mission. Moreover, some MDIs expressed skepticism of AI's benefits for communities of color, rooted in the history of structural racism. One interviewee with experience at a CDFI noted that larger financial institutions probably incorporate AI more often than smaller, community-based institutions. Additionally, the interviewee noted that few such institutions look beyond traditional models, preferring a hands-on approach: “These lenders are not going to implement anything that walks them backwards on their mission, even if it improves their efficiency.”33 Such financial institutions may be more comfortable adopting AI for income verification, particularly for easy approval decisions, which would free up their resources for more challenging cases that require a hands-on approach.34

Moreover, mission-oriented financial institutions can serve lower-income communities and communities of color through high-touch activities as a way to separate themselves from larger lenders

using AI to drive efficiency and grow markets.35 Additionally, Nicole Elam of the National Bankers Association highlights the importance of MDIs improving their tech efficiency while maintaining relationship banking models.36 Elam notes that MDIs have not found a scalable, relationship-driven way to incorporate new technologies into their initiatives to help community members become mortgage ready. And there may be a high degree of skepticism, rooted in the history of structural racism that undermined communities of color and MDIs. Some MDIs' skepticism may reflect the view that AI mimics human behavior, and human behaviors are affected by structural racism. So MDI lenders hope to see more transparency in how AI is developed, where the data come from, how the data are being refined, and what racial equity lens is being built in.37 As AI's use expands across the financial landscape, some mission-oriented lenders have begun exploring AI options. Gary Perez, president and CEO of USC Credit Union, which is deploying AI in small-dollar, nonmortgage consumer lending, notes that incorporating AI-assisted credit underwriting services will enable the credit union to better support the majority-Black and majority-Hispanic neighborhoods surrounding the University of Southern California. Perez notes that the small organization's community charter expansion, as well as its CDFI and MDI designations, are new, limiting its ability to do empirically based relationship banking. As a result, USC Credit Union hopes to use AI to responsibly expand access to community members who would otherwise be denied.38

Civil and Consumer Rights Groups

Civil and consumer rights groups generally expressed skepticism about whether AI will be a force for racial equity. Keisha Deonarine of the NAACP pointed out, “The credit system was not created with them in mind. And this raises the importance of humans assessing the information provided to see whether there are any potential risks.”39 Chi Chi Wu and Odette Williamson of the National Consumer Law Center note that “AI has systematically undermined people of color” in other areas of the economy, such that it breeds a degree of skepticism and fear. These feelings “could also be heightened by skepticism people of color have with the financial system in the past,” and Wu and Williamson note that companies need to develop transparent, nondiscriminatory systems to build trust

with communities of color.40 They are cautious of new models, indicating that nothing prevents a newer model or approach from being more discriminatory. But they also feel AI could improve the current system.41

NFHA is playing a leading role in tech equity. Critically, NFHA has developed an approach for auditing algorithms (Akinwumi, Rice, and Sharma 2022). NFHA posits that many issues AI is creating are not generated by AI itself. Rather, AI mirrors issues in the underlying data. NFHA's Tech Equity Initiative is grounded in the hope that intentional AI design could help people of color gain access to housing and economic opportunities, and if businesses can see how AI can help them achieve equity while still achieving profit targets, they are more likely to be on board.42 Through this work, NFHA has identified five key goals43 to address technological bias:

1. developing solutions for removing bias from the technologies that shape our lives
2. increasing transparency and explainability for AI tools
3. advancing research to help reduce bias in tech
4. developing policies that promote more effective oversight for AI tools
5. supporting efforts to increase diversity, equity, and inclusion in the tech field

Incorporating these goals has led to important leadership on topics intersecting with racial equity and racial bias. Importantly, NFHA has helped deepen the industry's understanding of machine learning in credit underwriting models,

algorithmic bias in housing-related transactions, and a fair lending policy agenda for federal financial regulators (Akinwumi et al. 2021).44 NFHA is also helping guide federal policy related to key areas of mortgage lending, including AVMs, which increasingly rely on AI.

Researchers

Systematic analysis of AI-based systems is critical for establishing their potential and their limits. Given their independence, researchers can provide added transparency, particularly for a black-box technology in the private sector. Research has found mixed results on whether automated underwriting discriminates. Some research has suggested that fintech algorithms discriminate in mortgage underwriting, but “40% less than face-to-face lenders” (Bartlett et al. 2020, 4). Updates to this research find the rate differences between nonfintech and fintech lending to be negligible (Bartlett et al. 2022). AI can also be used to audit the findings from a standard statistical model (Zhu, Neal, and Young 2022). In this case, when using an AI-based method, the extent of error in AVMs was modestly lower than the results produced by an ordinary least squares regression (Zhu, Neal, and Young 2022). FinRegLab has also produced reports targeting policymaking in AI in financial services broadly. The organization started with a market context report to understand what kinds of lenders are using machine learning, what kinds of challenges they need to solve, and what policy questions require more clarity around regulatory guidance (Cochran et al. 2021). Bailey and coauthors (2023) provided important information to the industry on explainability and fairness in machine learning for credit underwriting. Additionally, FinRegLab evaluated tools provided by seven companies that provide platforms and services to help lenders develop and manage machine learning models responsibly, including for bias-related risk. They hope this work sheds light on the changes needed to support the fair and responsible use of machine learning models.45

The Secondary Market and Guarantors

One major force

is the secondary market. Many lenders sell a significant portion of their loans to Fannie Mae and Freddie Mac to boost lender capacity, increase liquidity, and offload some risks. Loans securitized by the GSEs are a sizeable portion of the one-to-four-family residential market (Goodman et al. 2023). But to sell to the GSEs, a lender must meet Fannie Mae and Freddie Mac's underwriting standards regarding the types of loans they will buy. Given the role the GSEs play in the secondary market, their use of AI could significantly affect AI's adoption throughout the rest of the industry. Per a Federal Housing Finance Agency (FHFA) report, both GSEs are taking a “cautious approach” to adopting artificial intelligence and machine learning and remain in the early stages of adoption. And although this report suggests that neither GSE uses AI directly in underwriting, Freddie Mac plans to incorporate an AI credit scoring solution in its automated underwriting system (OIG 2022).

In addition to the FHFA and the GSEs, the US Department of Housing and Urban Development (HUD) plays an important role in the mortgage process. And given the disproportionate concentration of borrowers of color in FHA mortgages, HUD's influence on AI could have a stronger short-term impact on outcomes for applicants of color. The FHA has implemented its Technology Open to Approved Lenders Mortgage Scorecard algorithm for its automated underwriting, but HUD may still have steps to take to modernize its systems.46 Use of AI will also require internal updates to HUD's systems. Updating with AI-based systems, however, will affect lenders because FHA lenders would have to ensure their systems are up to date to communicate with HUD's. And HUD plans to assess whether loans it supports through all of its programs violate the Fair Housing Act.47

In sum, the adoption of AI by secondary market agencies is moving slower than the development of AI tools and technology. Caution is not a bad thing. But this slow-moving process also means lenders are wary about using AI tools that could advance equity, such as underwriting tools that go beyond traditional credit scoring, because the loans still have to be approved by and managed by rules these agencies set. And lenders may wait until these rules are developed or sufficiently clarified before expanding AI adoption. Because more than 80

percent of mortgages are backed by the federal government or the federally chartered GSEs, they set the rules of engagement. The FHA and the GSEs could encourage more lenders to adopt equity-advancing AI and ensure that the necessary oversight infrastructure exists to prevent the entrenching or exacerbating of racial inequality through these tools.

Federal Regulators

Through the Fair Housing Act and the ECOA, the federal government has issued rules that protect historically marginalized communities, including people of color, and uses its authority to enforce these rules. In addition, the federal government, through both public commentary and the use of its own systems, directly influences the role of AI in the mortgage application process. The Biden administration and financial regulators have taken steps to promote transparent, cautious use of AI that promotes both efficiency and equality in the

housing industry. Partly rooted in the administration's growing interest in AI, agencies such as the Consumer Financial Protection Bureau (CFPB) are identifying the tools needed to support sustainable and equitable AI use.48 To further its understanding of the risks associated with AI and to craft better policies, the CFPB has called for pilots that illustrate the successes of automation and where AI guidance will be needed.49

The federal government can also provide guidance to the industry by outlining its thinking on AI more broadly. The most notable releases are the AI Bill of Rights and President Biden's executive order of October 30, 2023, which identified key principles for automated systems. The principles include safe and effective systems, algorithmic discrimination protections, data privacy, notice and explanation, and human alternatives, consideration, and fallback.50 President Biden's executive order directs federal agencies to develop protocols for safe use of AI and requires developers to share their models with the National Institute of Standards and Technology (NIST) for approval. The executive order also provides resources that help developers and users avoid algorithmic bias, and the US Department of Justice will advise landlords and recipients of federal funding on how to avoid fair housing violations when using AI technology and other algorithmic models.51 Recently, the Federal Reserve also indicated that the ECOA must evolve with the realities of AI.52

Agencies created to oversee the mortgage industry have also published guidance on AI. Before the release of the report mentioned in the previous section, the FHFA issued an advisory bulletin to the GSEs in February 2022 that offered risk management guidance regarding the use of AI and machine learning.53 The bulletin outlined four areas of heightened risk: (1) model risks, including lack of transparency (or “black box risk”) and model performance degradation; (2) data risks, including relying on unrepresentative or unsuitable data sources; (3) other operational risks, particularly insufficient information technology infrastructure, lack of information security, reliance on third-party providers, and potential disruptions to business continuity; and (4) regulatory and compliance risks, including lack of compliance with consumer protection, fair lending, privacy, and employment discrimination laws and regulations.

Recommendations to Advance Racial Equity through AI

AI has the potential to advance racial equity, but it also has the potential to harm communities of color if not used properly. In other sectors, AI has been shown to bias outcomes against people and communities of color. This is a moment to make sure the benefits of AI are equitable in the housing finance system, which underpins homeownership opportunity. Based on our conversations with mortgage industry stakeholders, we have developed several

297、 recommendations for policymakers,regulators,developers,and users to ensure that AI equitably expands access to mortgage services.We recommend three areas of focus:intentional design for equity,carefully studied pilot programs,and regulatory guidance.Design with Intention The key to intentional desi

298、gn is making equity a top priority rather than an afterthought.Experts in this area advocate for speculation on potential harms ahead of development or implementation,rather than playing catch-up after the product or program is already in motion.54 And documenting inputs,processes,and results can br

299、ing transparency to the innovative process.Many interviewees also highlighted the necessity of intentional design for creating equitable AI.This approach combines the AI tool in question with the power of human insight and thoughtfulness about potential unintended benefits or harms.This type of desi

300、gn is key for creating AI that prioritizes fairness and learns to create more equitable outcomes.Intentional design requires thoughtfulness about the following:the training data being used and what biases need to be accounted for whether outcomes reflect structural inequities and what can be done to

301、 remedy them how to prevent model drift over time how to make the AI or machine learning tool more transparent to users what the end goal of the AI use is and whether the human process it is replacing is fair or needs improvement.HARNESSING ARTIFICIAL INTELLIGENCE FOR EQUITY IN MORTGAGE FINANCE 39 G

302、iven the role federal agencies play in mortgage lending,intentional design of AI pilots could start there.But there is also an opportunity for effective intentional design to occur among fintech companies creating algorithmic tools as well as early adopters of these tools.Another way to advance inte

303、ntional design is to increase diversity among the designers of these tools and their users across the mortgage industry.Black and Hispanic people are significantly underrepresented in the tech sector and in leadership positions across industries that rely on tech.On these counts of intentional desig

304、n,smaller lenders,CDFIs,and MDIs can serve as modelsand their data used to teach the AI-based modelsfor what it looks like to prioritize equity.Many of the interviewees we spoke to from these types of organizations had diverse leadership staff with significant representation from the communities the

305、y served.They reported that this level of individual representation helped shape their mission and lending habits,as lived experience could help inform the decisionmaking and priorities of their leaders.People with lived experiences of racism and inequality are well equipped to understand how certai

306、n AI initiatives may entrench or worsen systemic racism.Therefore,it is important to increase diversity across these sectors to mitigate potential harms and center equity.Another method of intentional design is adversarial debiasing,a technique used to identify and reduce bias in credit decisionmaki

307、ng tools.Zest AI pioneered the approach and has patented related applications of adversarial debiasing in its credit models.According to McGill,adversarial debiasing pits two machine learning models against each other during the model training process.55 The first underwriting model predicts creditw

308、orthiness,but the second adversary one predicts the race or other potentially protected class attributes based solely on the risk scores of the first model.56 Competition in this game forces the underwriting model to increase parity between scores from protected and unprotected classes so that the a

309、dversary cannot accurately predict race based on risk score,resulting in a final model that is accurate and fair.57 Pilot Programs The federal governmentGinnie Mae and the GSEs,in particularcan also use pilot programs to determine the effectiveness of AI tools and address equity concerns at a smalle

310、r scale before the industry implements these tools more broadly.40 HARNESSING ARTIFICIAL INTELLIGENCE FOR EQUITY IN MORTGAGE FINANCE For example,Ginnie Maes Office of Enterprise Risk has launched a series of machine learning and AI model pilots,using different approaches to explore new ways of measu

311、ring and analyzing data.58 One AI algorithm was deployed to reduce the probability of false negatives and false positives when identifying issuers that may pose enhanced risk to the program but may slip through the cracks of traditional methods of risk identification.A potential new pilot could be u

312、sed by the GSEs or the FHA to test the use of AI in mortgage servicing.These agencies do not service loans themselves,but mortgage servicing is an integral part of ensuring that investors,including those that own agency mortgage-backed securities,receive their payments from the mortgage payments mad

313、e by homeowners.As these agencies bear the credit risk of the loans,they set the rules for how servicers manage nonperforming loans.An AI-based algorithm that projects the likelihood of borrower delinquency and identifies the best risk management process could be valuable.The GSEs or the FHA could d

314、eploy the pilot and test it against current processes to determine delinquency and resulting management.Measuring how the model performs against current systems would ultimately improve accuracy and guard against racial bias.This process could continue until the algorithm was deemed ready for use.Pi

315、lots can be useful for incorporating AI throughout more stages of the mortgage process and identifying key frictions:Why might lenders be hesitant to adopt equity-advancing AI tools?What incentives can advance the use of these tools?How well do consumers understand AI use in the mortgage process,and

what are their salient concerns? What are the unintended consequences of these tools, and how can they be mitigated?

Federal Home Loan Banks (FHLBs) play a crucial role in providing liquidity and funding to financial institutions, which affects the availability and affordability of mortgage loans. Given the potential of AI to address the racial homeownership gap, it is essential for FHLBs to closely monitor its development. AI innovations can introduce new lending models, streamline processes, and improve access to credit for underserved communities. By monitoring the AI landscape, FHLBs can identify emerging technologies and partnerships that could advance fair lending practices and expand homeownership opportunities for underrepresented groups. They can collaborate with AI start-ups, their members and other financial institutions, and regulatory bodies to ensure that innovative solutions promote inclusive lending and reduce disparities in homeownership.

Furthermore, monitoring AI in the context of closing the racial homeownership gap allows FHLBs to assess and address potential risks and challenges. FHLBs can engage with AI companies to promote transparency, ethical use of data, and bias-free algorithms. They can also work with regulatory agencies to establish guidelines and standards that foster responsible AI practices, emphasizing fair lending and equal access to credit. By monitoring AI developments, FHLBs can leverage these

innovations to create a more equitable housing finance system and contribute to closing the racial homeownership gap.59

Increase Regulatory Guidance for Use of AI in the Financial Services Sector

Federal actors have two vital roles that will fundamentally set the course for AI in mortgage finance: (1) regulatory guidance and standard setting and (2) advancing adoption via the FHA, Fannie Mae, and Freddie Mac. Lack of clear regulatory standards governing the use of AI across the mortgage industry was a major point of concern for nearly every stakeholder with whom we spoke. There is a lack of clear standards from the FHFA, the GSEs, and HUD about the federal government's expectations regarding the use of AI. One interviewee captured a sentiment many stakeholders shared: "Anything that creates more certainty and safety from the regulatory community would help" both industry and consumer stakeholders.60

Additionally, the CFPB has a role to play in this space. The CFPB is currently engaged in rulemaking on Section 1033 of the Dodd-Frank Wall Street Reform and Consumer Protection Act. Section 1033 addresses consumer-permissioned data, specifically requiring financial institutions to make account data available to their consumers. It also directs the CFPB to set standards for the development and format of these data.61 It will be key for the CFPB to clearly delineate "which data elements consumers have a right to access, what the standards are for private companies accessing and transferring data, and how several federal consumer finance laws should be applied to consumer data transfers" (Choi et al. 2022, 39).

Regulation in the AI space is also important to clarify standards for lenders partnering with third-party vendors (e.g., fintech companies) offering AI tools. First, data privacy and security standards are essential for protecting sensitive consumer information as it is transferred from one party to another. Consumers can experience significant harms if their data are not handled adequately. In addition, the government, lenders, and these vendors may all have differing incentives for AI use that can

affect outcomes, so it is up to federal regulators to ensure that consumers, particularly the most vulnerable, are protected.

For instance, take the residential property technology sector, which primarily provides digital tools and platforms that landlords rely on to interface with their clients. A recent research report on property technology companies found that venture capital funding creates an incentive structure that encourages these firms to prioritize investor profits over the best interests of tenants and potential homebuyers (TechEquity Collaborative 2023). Fintech companies, even those whose mission is to advance equity, are necessarily profit motivated as well. Thus, it is up to regulators to ensure that profits do not come at the expense of equitable and affordable housing access.

Because AI is a topic that cuts across the economy, other corners of the federal government may be critical areas of AI policy setting. Through the AI Bill of Rights, the current administration has already taken an important step of raising the topic of AI and advocating for addressing the potential for racial bias. Congress has also voiced its interest in addressing AI.62 In addition, the depository supervisors have indicated an interest in AI and its implications for their industry. The supervisors include the US Department of the Treasury, the Federal Deposit Insurance Corporation, the Federal Reserve, the Office of the Comptroller of the Currency, and the National Credit Union Administration.63 And NIST has taken critical steps on AI (Axelrod et al. 2022). Specifically, NIST, a federal agency, developed its framework for voluntary use and to improve the ability to incorporate trustworthiness considerations into the design, development, use, and evaluation of AI products, services, and systems. NIST notes in its framework the importance of addressing
