Tenant screening has become a billion-dollar industry, with thousands of companies promising landlords rapid, data-driven decisions about prospective tenants. However, as Shelterforce has revealed, these tools often rely on error-prone, biased algorithms that exclude tenants, particularly Black, Latino, low-income, and voucher-holding tenants, based on factors that have little to do with their actual ability to pay rent.
This article follows the story of Mary Lewis, a Massachusetts renter who was denied housing because of a low algorithmic score from SafeRent Solutions, despite having stable employment and strong references. Her experience highlights the broader problems with tenant screening: opaque standards, inaccurate data, and the limited recourse available to those unfairly rejected.
Despite federal guidance issued under the Biden administration to curb discriminatory practices, the current Trump administration has rolled back these protections and weakened oversight at HUD and the CFPB. In response, state and local governments are stepping in, enacting laws to seal eviction records, requiring transparency in screening decisions, and implementing fair-chance housing policies.
Litigation has also gained traction, with recent legal victories forcing screening companies like SafeRent to reform their practices. Advocates liken these fights to the long struggle that produced credit reporting regulations in the 1970s, and see tenant screening as the next frontier in consumer protection and housing equity.