
On April 24, President Joe Biden signed a foreign aid package that included a security provision: a ban on the popular China-based social media app TikTok unless its parent company sells the app to an American firm within nine months. With over 148 million monthly users in the United States, the bite-sized video app has been criticized for its highly addictive nature.

Nick Seaver, assistant professor in the Department of Anthropology, teaches the seminar Technologies of Enchantment, which examines how algorithms are designed to charm and entrap users. He is the author of the 2022 book "Computing Taste: Algorithms and the Makers of Music Recommendation."

Tufts Now spoke with Seaver about the potential TikTok ban and whether fears about the app's addictiveness are warranted.

A lot has been made about TikTok's powerful algorithm. You have studied algorithms on other apps, especially music ones. How does the TikTok algorithm differ?

I come to this as someone who's studied the developers of recommender algorithms for many years; my first book is about the makers of music recommendation systems and how they think about their work.

So far as we know, TikTok doesn't use any especially uncommon techniques. The basic concept is called "collaborative filtering" and has been around for about 30 years—see what users engage with, find similar users, and use them to generate recommendations for each other.
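As a rough illustration of that idea (not a description of TikTok's actual system), a minimal user-based collaborative filter can be sketched in a few lines: build a user-item engagement matrix, find similar users with cosine similarity, and rank unseen items by what those neighbors engaged with. The data and numbers below are invented.

```python
# Minimal user-based collaborative filtering sketch (illustrative only).
# Rows = users, columns = items; entries are engagement signals (e.g., 1 = watched to the end).
import numpy as np

engagement = np.array([
    [1, 1, 0, 0, 1],   # user 0
    [1, 0, 1, 0, 1],   # user 1
    [0, 0, 1, 1, 0],   # user 2
], dtype=float)

def recommend(user_idx, matrix, top_k=2):
    # Cosine similarity between the target user and every other user.
    norms = np.linalg.norm(matrix, axis=1, keepdims=True)
    normalized = matrix / np.clip(norms, 1e-9, None)
    similarity = normalized @ normalized[user_idx]
    similarity[user_idx] = 0.0  # ignore self-similarity

    # Score each item by how much similar users engaged with it,
    # then hide items the target user has already seen.
    scores = similarity @ matrix
    scores[matrix[user_idx] > 0] = -np.inf
    return np.argsort(scores)[::-1][:top_k]

print(recommend(0, engagement))  # unseen items for user 0, ranked by similar users' engagement
```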

What seems most distinct is TikTok's format—short, looping videos that require a user to manually advance to the next one. This is a great setup for collecting data about what people watch, since there's no question about what's in focus on screen, as there might be on a feed, for instance. And it can be very engaging, thanks in no small part to the creativity of people producing videos on the platform.

One challenge here is that we don't have a good way of knowing just what's happening inside the "black box" that creates the recommendations, and in the absence of trust, that opens up a lot of speculation.

Why is the discourse around TikTok so different from discussions about other social media apps?

I think there are a few concerns in play here. The first is the idea that TikTok has a uniquely addicting algorithm. But as many critical algorithm studies scholars have argued, the word "algorithm" is often a way to talk about other things, like the business model of a company or our ideas about the motivations of the people who work there.

That leads to another concern, probably the dominant one at the moment, which is the worry that TikTok represents an intrusion of the Chinese government into the minds of American youth. That's clearly animating the legislative situation, as the TikTok bill that just became law seems most likely to require a sale of U.S. operations to an American company, not the dismantling of the platform itself.

We have seen occasional legislative efforts to constrain "persuasive design" features, like autoplay or algorithmic personalization, especially in relation to young people, but this has only taken off with TikTok because of the specter of Chinese involvement, which suggests to me that geopolitical concerns are the main issue here.

You are now writing a book on attention spans and their social contexts. How is technology changing the way we view attention?

My main interest in attention is how we use it to talk about so many different concerns. We talk about "attention spans" as if they're an obvious thing to worry about, but the history of the concept suggests that it only really emerged in the late 1800s, as educators started trying to apply psychological concepts to their everyday concerns, like keeping students in line in the classroom.

My argument is that our many ways of thinking about attention are related to the many ways we have for measuring it. Platforms like TikTok, for instance, are known to optimize for engagement—basically, how long users spend paying attention to their product—and algorithmic recommendations use finer-grained measures of attention, like how long you spend watching a video, as a proxy for how much a user likes a certain kind of media.
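To make that proxy idea concrete, here is a hypothetical sketch of how watch time might be converted into an implicit preference score that a recommender could train on. The completion-ratio formula and the rewatch bonus are assumptions for illustration, not TikTok's actual weighting.

```python
# Hypothetical illustration: using watch time as an implicit "liking" signal.
# The completion ratio (time watched / video length), capped at 1.0 per view,
# stands in for an explicit rating in a recommender's training data.
def implicit_preference(watch_seconds: float, video_seconds: float, rewatches: int = 0) -> float:
    completion = min(watch_seconds / video_seconds, 1.0)
    # Rewatching a short looping video is treated as extra evidence of interest (assumed weight).
    return completion + 0.5 * rewatches

print(implicit_preference(12.0, 15.0))               # mostly watched once -> 0.8
print(implicit_preference(30.0, 15.0, rewatches=1))  # looped through twice -> 1.5
```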

Other measures of attention, like cameras in cars that check whether you're looking at the road or trick questions on online surveys that test how closely you're reading, treat attention as different kinds of things.

People have been worrying about the effects of media on our ability to pay attention for as long as we've been talking about "attention" at all. So this feels like a new situation, but it is very much not.

That doesn't mean that critics of social media algorithms' effect on attention are wrong, but it does mean we might want to be skeptical about claims to unprecedented novelty.

How do critics of social media algorithms use or misuse ideas about attention?

My main concern is that many of these critiques are depoliticizing: they take legitimate political concerns and imagine that they're basically the result of people being hypnotized by algorithms.

This kind of critique takes actual political disputes and imagines that they only exist because of media technologies and human psychological vulnerability, rather than real disagreement. Social studies of media call into question the idea that people simply absorb media messages uncritically, as though they were injected straight into the brain.

At the end of the day, I think that much popular talk about attention isn't really about the brain at all. When we talk about attention, we're often talking about how we think people ought to live or what we want to do with our time, but imagining that these opinions come from biological or psychologically inevitable features of human nature.

Citation: Q&A: What's behind the potential ban on TikTok? (2024, May 1) retrieved 1 May 2024 from https://techxplore.com/news/2024-05-qa-potential-tiktok.html
