UPDATE: 27/4/2021 

A statement from a Match Group spokesperson has revealed that the company has not yet agreed to implement the changes.

“We recognise we have an important role to play in helping prevent sexual assault and harassment in communities around the world. We are committed to ongoing discussions and collaboration with global partners in law enforcement and with leading sexual assault organizations like RAINN to help make our platforms and communities safer,” the statement read.

“While members of our safety team are in conversations with police departments and advocacy groups to identify potential collaborative efforts, Match Group and our brands have not agreed to implement the NSW Police proposal.”

NSW Police has proposed using artificial intelligence to flag potential perpetrators of sexual violence on dating apps.

Under the world-first proposal, Match Group, the parent company of Tinder and Hinge, would streamline reports of sexual assault from its dating app users by creating a “portal” that NSW Police could access.
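Neither NSW Police nor Match Group has publicly detailed how such a portal would work. Purely as a sketch of the kind of service described, a report-intake system might pair a user-facing submission endpoint with a police-facing retrieval endpoint. The framework (FastAPI), routes, and field names below are all assumptions for illustration, not anything confirmed by either party:

```python
from datetime import datetime, timezone
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# In-memory store for this sketch only; a real portal would need
# audited, access-controlled storage.
REPORTS: dict[str, dict] = {}


class AssaultReport(BaseModel):
    # Hypothetical fields; the actual proposal specifies none of these.
    reporter_id: str
    accused_profile_id: str
    description: str


@app.post("/reports")
def submit_report(report: AssaultReport) -> dict:
    """User-facing intake: store the report under a new case ID."""
    case_id = str(uuid4())
    REPORTS[case_id] = {
        **report.model_dump(),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }
    return {"case_id": case_id}


@app.get("/police/reports/{case_id}")
def fetch_report(case_id: str) -> dict:
    """Police-facing view of a single case."""
    if case_id not in REPORTS:
        raise HTTPException(status_code=404, detail="unknown case id")
    return REPORTS[case_id]
```

In practice, the police-facing side would also need authentication, audit logging and survivor-consent controls, none of which the proposal has publicly specified.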

In a statement (via triple j Hack), Detective Superintendent Stacey Maloney said dating apps should work with police when sexual violence has been reported.

“If they hold information that is suggestive an offence has been committed, they have a responsibility in my view to pass that on,” Maloney said.

The announcement of the plan comes off the back of a 2020 report by triple j Hack and Four Corners, which covered Tinder’s failure to adequately aid survivors of sexual assault.

Since the investigation, both Match Group and fellow dating app giant Bumble have announced a number of safety changes to protect the wellbeing of users.

Maloney mentioned the possibility of Tinder and Hinge developing artificially intelligent systems to detect “red flags” from messages that could indicate sexual violence.

“It’s looking at what type of behaviour those users would exhibit and, if we can, pick up on that throughout the course of them being on those apps,” she said.

“In the event something does occur, it’s in existence.”
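Maloney did not specify how such detection would work. As a minimal illustration of the general idea only, a pattern-based message flagger might look like the sketch below; the “red flag” patterns and the escalation threshold are invented for illustration, and a production system would presumably rely on trained models and human review rather than hand-written rules:

```python
import re

# Illustrative patterns only; a real system would be built on vetted
# data with input from safety experts, not hard-coded phrases.
RED_FLAG_PATTERNS = [
    r"\bsend (me )?(nudes|pics)\b",
    r"\bwhere do you live\b",
    r"\byou owe me\b",
]


def flag_message(text: str) -> list[str]:
    """Return the patterns a message matches, if any."""
    lowered = text.lower()
    return [p for p in RED_FLAG_PATTERNS if re.search(p, lowered)]


def review_conversation(messages: list[str], threshold: int = 2) -> bool:
    """Escalate a conversation for human review once enough messages
    have been flagged (the threshold here is arbitrary)."""
    flagged = sum(1 for m in messages if flag_message(m))
    return flagged >= threshold
```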

Meanwhile, Queensland University of Technology’s Dr Rosalie Gillett told triple j Hack that artificial intelligence is unlikely to catch every kind of concerning behaviour.

“Dating apps’ automated systems might be able to detect overt abuse, such as threats to a person’s physical safety, but there’s a good chance that they won’t identify more normalised and everyday content and behaviours,” Gillett said.

“Automated systems are only as useful as the data that are used to develop them. This means that Match Group will need to consider what data it uses to train its models. An automated system designed to detect overt abuse will only ever be able to detect overt abuse.”
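Gillett’s point about training data can be made concrete with a toy scorer: a model built only from overtly abusive examples has no signal for words it has never seen. The tiny dataset and scoring scheme below are invented purely for illustration:

```python
from collections import Counter

# Invented training data: only overt abuse is labelled positive.
ABUSIVE = ["i will hurt you", "send pics or else", "i know where you live"]
BENIGN = ["hey how was your day", "want to grab coffee", "nice profile"]


def word_weights(positive: list[str], negative: list[str]) -> dict[str, int]:
    """Crude per-word score: how much more often a word appears in
    abusive examples than in benign ones."""
    pos, neg = Counter(), Counter()
    for text in positive:
        pos.update(text.split())
    for text in negative:
        neg.update(text.split())
    return {w: pos[w] - neg[w] for w in set(pos) | set(neg)}


WEIGHTS = word_weights(ABUSIVE, BENIGN)


def score(message: str) -> int:
    # Words never seen in training contribute nothing, which is the
    # blind spot Gillett describes: the model cannot weigh behaviour
    # outside its training data.
    return sum(WEIGHTS.get(w, 0) for w in message.lower().split())


print(score("i will hurt you"))  # 6: strong overlap with overt-abuse examples
print(score("not going to take no for an answer"))  # -1: no signal at all
```

The second message is coercive in an everyday, normalised way, but it scores at or below zero simply because nothing like it appears in the training data.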

For more on this topic, check out the Health & Wellness Observer.
