Here’s Why Background Checks On Dating Apps Promote A False Sense Of Safety

Written as a collaboration by the #open team, including Amanda Wilson (#open’s co-founder), Gabrielle Alexa Noel, and Rose Bern. This blog is part of a 3-part series on background checks and data privacy.

Newly proposed laws in Connecticut and Utah seek to require dating apps, like #open, to disclose whether or not we perform background checks. Connecticut Senate Bill SB5 will soon go into effect, and while Utah House Bill HB352 did not pass, Rep. Angela Romero, who authored the bill, plans to reintroduce it.

Background Checks Don’t Always Provide Accurate Conviction Data

While these bills are dressed up as attempts to protect women, they actually pressure dating apps into implementing background-check tools, which experts warn will negatively impact user privacy and security. Tinder has already started using Garbo, a tool that scrapes data from various government databases and attempts to match records across them, often producing inaccurate or incomplete results.

Garbo is not regulated under the Fair Credit Reporting Act the way traditional background-check companies are, so it does not guarantee individuals the right to view their own records or correct inaccuracies. This means that users are being told to base safety decisions on potentially untrue, uncontestable conviction data.
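Since inaccurate matching is the core problem here, it helps to see how easily it can happen. The sketch below is a hypothetical illustration written by us for this post; it is not Garbo’s actual code or matching method. It shows how stitching scraped court records together on little more than a similar name and a matching birth year can attach a stranger’s conviction to the wrong person.

```python
# Hypothetical illustration only -- not how Garbo (or any real background
# check) actually works. Naive name-plus-birth-year matching across scraped
# court records can attribute a stranger's conviction to an innocent user.
from difflib import SequenceMatcher


def name_similarity(a: str, b: str) -> float:
    """Rough string similarity between two names, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


# Scraped "court records" with sparse identifying details (made-up data).
court_records = [
    {"name": "Jon A. Smith", "birth_year": 1988, "offense": "felony assault"},
    {"name": "Maria Lopez", "birth_year": 1990, "offense": "misdemeanor trespass"},
]

# The dating-app user being "checked" (also made up).
user = {"name": "John Smith", "birth_year": 1988}

# A careless rule: a similar-enough name plus the same birth year counts as a hit.
for record in court_records:
    close_name = name_similarity(user["name"], record["name"]) > 0.8
    if close_name and user["birth_year"] == record["birth_year"]:
        print(f"Flagged: {user['name']} matched to {record['name']} "
              f"({record['offense']})")
```

Run this and “John Smith” is flagged for “Jon A. Smith’s” felony. And because tools like Garbo sit outside the FCRA, the person who gets flagged has no guaranteed way to see or contest the record that follows them around.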

Safety Means Actually Listening To Dating App Users And Protecting Their Privacy

Tinder’s use of Garbo shifts the responsibility of vetting potential dates onto consumers rather than pushing companies to meaningfully address harm. According to a 16-month investigation by Columbia Journalism Investigations, many users reported instances of sexual violence to Tinder/Match Group only to see their attackers remain active on the platform. Why would we entrust conviction records to dating service providers when they won’t act on the reports they already receive?

#open is fiercely protective of users’ data and believes in our members’ right to privacy and anonymity. For instance, we will never sell your data to advertisers or any other third party. We will also never trade or give your data to third parties without explicit disclosure, except where required by law. However, it is clear that other dating apps do not take similar precautions. In 2014, Match Group, which owns a number of online dating services, including OkCupid, Tinder, and Hinge, came under scrutiny for experimenting on OkCupid users and publishing an analysis of the findings.

Christian Rudder, who was president of OkCupid at the time, addressed the controversy, saying, “If you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”

He later wrote the book Dataclysm: Who We Are (When We Think No One’s Looking) by mining the OkCupid dataset and analyzing the patterns he observed. Technically, everyone consents to this sort of research and analysis when they sign up for the app; it’s buried in the user agreement. Even worse, the same dataset was used to train facial recognition A.I. built by a company called Clarifai, which has had unfettered access to OkCupid’s user database because OkCupid’s founders invested in the company’s technology. Clarifai founder Matt Zeiler stated he would sell the technology to “foreign governments, military operations, and police departments provided the circumstances were right.”

We know this technology has already been deployed by malicious actors. One facial recognition database developed in the United States was linked to the monitoring and detention of Uighur Muslims, an ethnic minority in China who are the victims of systematic violence and genocide. Sharing user data this way is an egregious violation of privacy and data confidentiality on its own, and it’s even worse that the resulting technology can be used for ethnic profiling and other nefarious aims.

There Are Other Dating App Privacy Violations Worth Noting

In 2016, a flaw in the Tinder app enabled Spanish users to see their matches’ exact location, even when those matches had them blocked. And that same year, alt-right researchers Emil Kirkegaard and Julius Daugbjerg Bjerrekær published data on 70,000 OkCupid users, including their sexual preferences, turn-ons, usernames, and whether or not they use drugs. Research shows that many security flaws still exist on these platforms.

Download #open, and connect with individuals and couples today.

Hacking, which happens more frequently than one might think, further jeopardizes user safety. After the infamous Ashley Madison hack exposed 33 million accounts in 2015, a number of victims reportedly died by suicide. That same year, a separate hack exposed 3.9 million AdultFriendFinder accounts to the public, including users’ sexual preferences, dates of birth, email addresses, and their intent to engage in extramarital affairs. Victims immediately became targets of spam emails and blackmail schemes as a result.

Background Checks, And The U.S. Criminal Punishment System Overall, Are Distorted By Discrimination

If dating apps rely on background checks, they’ll acquire even more sensitive data they can exploit. People with prior convictions, who are disproportionately Black and Brown due to overpolicing and systemic bias, already wrestle with gross inequality. After all, mass incarceration is a voter suppression tactic; in October 2020, it was estimated that 5.1 million citizens were disenfranchised in that year’s presidential election due to felony convictions. Furthermore, the criminal punishment system entrenches poverty, worsens economic disparities, and criminalizes victims who act in self-defense. Tying a person’s conviction history to their dating profile functions as a form of digital redlining, excluding them from popular online communities.

Finally, considering that domestic and sexual violence are vastly underreported and rarely result in conviction, it’s unlikely that background checks would protect dating app users. According to RAINN, for every 1,000 sexual assaults, only 25 end in a felony conviction for the perpetrator (a rate of 2.5 percent). By telling users to base their decision-making on a person’s conviction history, dating apps wrongly imply that involvement in the criminal punishment system is a relevant predictor of future harmful behavior.

Dating apps need to implement safety strategies that actually work to protect users; for the most part, strategies built around background checks simply pretend to.
