What Spotify and Tinder aren’t telling us

Our online and real-world lives are increasingly influenced by algorithmic recommendations based on data gathered about our behavior by companies that are often reluctant to tell us what data they’re gathering and how they are using it.

Researchers at the University of Auckland have endeavored to find out more about how these algorithms work by analysing the legal documents – Terms of Use and Privacy Policies – of Spotify and Tinder. The research, published in the Journal of the Royal Society of New Zealand, was carried out by Dr Fabio Morreale of the School of Music, and Matt Bartlett and Gauri Prabhakar of the School of Law.

The companies that gather and use our data (usually for their own financial gain) are notably resistant to academic scrutiny, the researchers found. “Despite their powerful influence, there is little concrete detail about how exactly these algorithms work, so we had to use creative ways to find out,” says Dr Morreale.

The team looked at the legal documents of Tinder and Spotify because both platforms are grounded in recommendation algorithms that nudge users either to listen to specific songs or to romantically match with another user. “They have been largely overlooked compared to bigger tech companies such as Facebook, Google and TikTok, which have faced more scrutiny,” he says. “People might think they’re more benign, but they are still highly influential.”

The researchers analysed various iterations of the legal documents over the past decade. Companies are increasingly required to let users know what data is being collected, yet the length and language of the legal documents could not be described as user-friendly.

“They tend toward the legalistic and vague, inhibiting the ability of outsiders to properly scrutinise the companies’ algorithms and their relationship with users. It makes it difficult for academic researchers and certainly for the average user,” says Dr Morreale.

Their research did, however, reveal several insights. Spotify’s Privacy Policies, for instance, show that the company collects much more personal information than it did in its early years, including new types of data.

“In the 2012 iteration of its Privacy Policy, Spotify’s data practices only included basic information: the songs a user plays, playlists a user creates, and basic personal information such as the user’s email address, password, age, gender, and location,” says Dr Morreale. After several iterations of the Privacy Policy, the existing 2021 policy allows the company to collect users’ photos, location data, voice data, background sound data, and other types of personal information.

Spotify’s Terms of Use have also evolved, and now state that “the content you view, including its selection and placement, may be influenced by commercial considerations, including agreements with third parties”. This provides ample room for the company to legally highlight content to a specific user based on a commercial agreement, says Dr Morreale.

“Spotify promises that the ‘playlist is crafted just for you, based on the music you already love’, but Spotify’s Terms of Use detail how an algorithm could be influenced by factors extrinsic to the user, like commercial deals with artists and labels.”

“In their recommendations (and playlists for that matter) Spotify is also likely to be pushing artists from labels that hold Spotify shares – this is anti-competitive, and we should know about it.”
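
Neither the study nor Spotify’s documents spell out how such commercial influence would actually be applied, so the following is only a hypothetical sketch of how a “commercial consideration” could be folded into a recommendation ranking. Every field name, function and weight here is an assumption made for illustration; it does not describe Spotify’s real system.

```python
# Hypothetical sketch only: how a commercial boost *could* influence ranking.
# Nothing here reflects Spotify's actual algorithm; all names and weights are invented.

from dataclasses import dataclass


@dataclass
class Track:
    title: str
    relevance: float            # how well the track fits the user's taste (0-1)
    commercially_boosted: bool  # e.g. covered by a third-party agreement


def rank(tracks: list[Track], boost: float = 0.15) -> list[Track]:
    """Order tracks by taste relevance plus an optional commercial boost."""
    def score(track: Track) -> float:
        return track.relevance + (boost if track.commercially_boosted else 0.0)
    return sorted(tracks, key=score, reverse=True)


if __name__ == "__main__":
    playlist = rank([
        Track("Indie deep cut", relevance=0.80, commercially_boosted=False),
        Track("Label-promoted single", relevance=0.70, commercially_boosted=True),
    ])
    # The boosted track (0.70 + 0.15) can outrank a better personal fit (0.80).
    print([track.title for track in playlist])
```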

And probably contrary to most users’ perceptions, the dating app Tinder is “one big algorithm”, says Matt Bartlett. “Tinder has previously stated that it matched people based on ‘desirability scores’ calculated by an algorithm. I don’t think users fully understand or know about how Tinder’s algorithm works, and Tinder goes out of its way not to tell us.”
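
The study could not determine how such a score is calculated. Purely as an illustration of what a “desirability score” might look like in principle, here is a minimal Elo-style rating sketch of the kind reporting has sometimes associated with dating apps; the function names, the k value and the scoring scheme itself are assumptions for explanatory purposes, not Tinder’s actual algorithm.

```python
# Illustrative only: a generic Elo-style rating update, NOT Tinder's real method.

def expected_win(rating_a: float, rating_b: float) -> float:
    """Expected chance that profile A is preferred (swiped right) over profile B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))


def update_scores(score_a: float, score_b: float, a_was_liked: bool,
                  k: float = 32.0) -> tuple[float, float]:
    """Move both scores toward the observed outcome; k controls how fast they change."""
    outcome_a = 1.0 if a_was_liked else 0.0
    new_a = score_a + k * (outcome_a - expected_win(score_a, score_b))
    new_b = score_b + k * ((1.0 - outcome_a) - expected_win(score_b, score_a))
    return new_a, new_b


if __name__ == "__main__":
    # A lower-scored profile (1200) is liked against the odds, so its score rises sharply.
    print(update_scores(1200.0, 1500.0, a_was_liked=True))
```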

“That’s not to say that this is an evil thing – the problem is that they’re not transparent about how the matching occurs. In my opinion, the Terms of Use should specify that.”

While the researchers were unable to fully identify how the platforms’ algorithms function, their research highlighted that very problem: the companies aren’t transparent about their collection of our data or how they are using it.

“With these powerful digital platforms possessing considerable influence in contemporary society, their users and society at large deserve more clarity as to how recommendation algorithms are functioning,” says Dr Morreale. “It’s crazy that we can’t find out; I think in the future we’re going to look back and see this as the Wild West of big tech.”

Story Credit: University of Auckland/Newswise

Photo Credit: Sara Kurfeß/Unsplash