Are you using AI in real estate? Or is AI being used to judge you for a home? Then keep reading…
Mary Louis, a Black woman, led a groundbreaking lawsuit against SafeRent Solutions after an algorithm denied her apartment application. The AI system was found to discriminate based on her “source of income” (a.k.a. how you plan to pay for your home), which is a protected class in some places.
“SafeRent’s defense attorneys argued in a motion to dismiss that the company shouldn’t be held liable for discrimination because SafeRent wasn’t making the final decision on whether to accept or deny a tenant. The service would screen applicants, score them and submit a report, but leave it to landlords or management companies to accept or deny a tenant.” Source: Fast Company
In other words, landlords and property managers paid for a service to help them rate prospective tenants, but the company didn’t want to be blamed for the service it provided. Make it make sense.
A $2.3M settlement was reached, marking a win for fair housing protections based on “source of income” and setting one of the first legal precedents for AI accountability in housing.
And that’s Fair Housing 101 (aka “How Not to Get Got in Real Estate”)
With housing costs up, understanding affordable and fair housing is more important than ever!
It’s time to have The Talk. Not the birds and the bees talk, but The Talk about buying and selling your home “without getting got.”
Want a complimentary 🎁 copy of The Talk: Fair Housing Edition? Comment “TALK” for a DM w/ the info. Or visit [Link] (link’s also in my IG bio).
In the meantime, we need more Fair Housing DECODERs©, a.k.a. educated, proactive advocates (appraisers, lenders, REALTORS®, and other community members) for fair and affordable housing (and lending).
Comment “DECODER” for a DM w/ the info and get a 🎁 complimentary (a.k.a. NO FEE) download of the Fair Housing DECODER©️ workbook. Or visit [Link].
#LearnWithDrLee #FairHousingDecoder
#fairhousing #fyp #reels #memes
#massachusetts
#AI