
Fair Housing Alert: Hidden Flaws in ChatGPT, Bard, Bing, and Other Generative AI Products May Be Discriminatory

Like other real estate businesses, you may be using ChatGPT, Bard, Bing, and other generative AI products, a.k.a. chatbots, for marketing purposes, such as developing advertising strategies, analyzing housing markets, and generating property listings, ads, social media posts, and other marketing content. Just recognize that for all their potential benefits, chatbots contain flaws that make them risky to use for marketing and advertising.

Among these flaws is the possibility of hidden bias. Explanation: The data and algorithms built into chatbots may incorporate the subtle prejudices of the humans who create them. They can also learn prejudice from the data on which they're trained. For example, in 2018, Amazon stopped using an AI-based recruitment program after discovering that its algorithm was biased against women. The model vetted candidates by observing patterns in resumes submitted to the company over a 10-year period. Because most of those resumes came from men, the AI taught itself to prefer male over female candidates.

Discriminatory content. Don't use the content that chatbots generate for advertising and marketing unless and until somebody at your company with knowledge of fair housing laws carefully vets it to ensure that it contains no hidden prejudices or biases.

Discriminatory placement. Beware of relying on chatbots to decide where to advertise. Explanation: Historically, landlords have perpetuated segregation by deliberately advertising only in publications or outlets that the minority groups they sought to exclude are known not to use. This is a critical compliance issue because HUD and the courts interpret discriminatory advertising as including the selection of advertising media or locations that deny particular segments of the housing market information about housing opportunities based on a protected characteristic. Examples include strategically placing billboard ads in predominantly white neighborhoods and running ads in local newspapers read mostly by a white audience. Because chatbots use sophisticated algorithms to target highly specific audiences, relying on them significantly increases the risk of an inadvertently exclusionary ad placement strategy.

Bottom line: Make a deliberate decision about whether you want your employees to use ChatGPT and other chatbots and, if so, for which applications. Then set out a written policy that clearly explains the banned and permitted uses and any safeguards that apply to the latter. Also include language addressing algorithmic discrimination in your property's fair housing and nondiscrimination policies. Ask your attorney about adapting this model language for your policy:

Model Language 

Use of Chatbots for Marketing Purposes. Employees must be aware that Chatbot data and algorithms may contain hidden prejudices or biases or be based on stereotypes about people of certain races, sexes, ages, religions, or other classes protected under discrimination laws. Accordingly, employees may not use Chatbots for purposes of recruiting, marketing, advertising, promoting, or tenant selection unless and until ABC Landlord's legal counsel vets the applications and tools relying on Chatbot data and verifies that they are fully compliant with applicable federal and state antidiscrimination laws and will not have the indirect effect of discriminating against the groups or individuals those laws are designed to protect.

Using ChatGPT for marketing is just one of several common practices that may constitute indirect and unintentional discrimination.