The Newsroom

Landlords Beware: Navigating Potential Biases in Tenant Screening Algorithms

Machine learning tools powered by Artificial Intelligence (“AI”) are changing the landscape of real estate.* For home buyers, AI may provide assistance in various steps of the mortgage process and assist them with finding an ideal home. AI also offers promising opportunities for the commercial real estate sector, as algorithms enable property owners to predict property values, identify investment opportunities, and optimize property management and maintenance. Companies with proprietary tenant screening algorithms additionally claim to streamline background checks; however, housing providers should be forewarned of growing concerns that such algorithm-based screening products create and perpetuate discriminatory housing practices.

A pending class action lawsuit brought in the U.S. District Court for the District of Massachusetts illustrates the potential hazards of applying AI technology to the tenant review process. Plaintiffs in Louis et al. v. SafeRent et al. allege that the algorithms of a tenant screening company, SafeRent Solutions, LLC, violate the Fair Housing Act (“FHA”). Plaintiffs claim the algorithms are unlawful because they disproportionately assign low scores to Black and Hispanic rental applicants who use federally funded housing vouchers to pay the vast majority of their rent, causing them to be denied housing. Plaintiffs further assert SafeRent’s algorithm has a disparate impact based on race and source of income, in violation of federal and state laws.

Statements from government agencies indicate that housing providers that utilize such algorithms during the screening process may also be held liable for violating laws such as the FHA. The Department of Justice (“DOJ”) and Department of Housing and Urban Development (“HUD”), which filed a joint statement of interest in Louis et al. v. SafeRent et al., have explained that “housing providers…that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities.”

HUD’s general counsel has also stated that housing providers “must ensure that all policies that exclude people from housing opportunities, whether based on algorithm or otherwise, do not have an unjustified disparate impact because of race, national origin or another protected characteristic.”

Indeed, it has long been known that algorithms are susceptible to the explicit and implicit biases of those who create them. For housing providers, this issue is complicated by the fact that algorithms generated by tenant screening companies are generally unregulated, proprietary, and not shared with the public. Housing providers should accordingly proceed with caution before employing new, algorithm-based tenant screening tools to streamline their tenant review processes. As some government authorities have indicated, housing providers may be held liable under state and federal housing discrimination laws for (even inadvertently) employing algorithms that have a discriminatory impact.
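To make the idea of disparate impact concrete, the sketch below shows one simple way a housing provider might spot-check a screening tool's outcomes across applicant groups. The data, group labels, and the 0.8 ("four-fifths") benchmark are illustrative assumptions only; they are not a legal standard, and passing such a check does not establish compliance with the FHA or state law.

```python
# Hypothetical audit sketch: compare approval rates produced by a tenant
# screening tool across two applicant groups. All data and the 0.8
# threshold are illustrative assumptions, not legal tests.

def approval_rate(outcomes):
    """Fraction of applicants approved (True) in a list of outcomes."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(group, reference):
    """Ratio of one group's approval rate to a reference group's rate.
    A ratio well below ~0.8 is often treated as a rough red flag."""
    return approval_rate(group) / approval_rate(reference)

# Illustrative screening outcomes (True = approved, False = denied)
group_a = [True, False, False, True, False, False, False, False]  # 2/8
group_b = [True, True, True, False, True, True, False, True]      # 6/8

ratio = adverse_impact_ratio(group_a, group_b)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.25 / 0.75 ≈ 0.33
if ratio < 0.8:
    print("Potential disparate impact -- review the screening criteria.")
```

A check like this only looks at outcomes, not at the proprietary algorithm itself, which is precisely why the opacity of third-party screening products makes meaningful review difficult.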

*“Machine learning” has been defined as “a subfield of artificial intelligence, which is broadly defined as the capability of a machine to imitate intelligent human behavior.”

By Amanda Grannis
