
What TikTok’s U.S. Spin-off Means for Its Algorithm and Content Moderation


Rachel Feltman: For Scientific American’s Science Quickly, I’m Rachel Feltman.

TikTok’s algorithm, which shapes what more than a billion users see, has developed an almost mystical reputation for figuring out what people want to watch. Those powers aren’t actually magical, but they do matter. An algorithm as widely used as TikTok’s can have a huge impact on our culture by determining what information people receive and how.

As TikTok prepares to spin off a U.S.-only version of the app with majority-American ownership, plenty of questions loom about how the platform—and its almighty algorithm—might change. Will new investors reshape what kinds of content are promoted or suppressed?


Here to break down what we know about the highly anticipated TikTok sale and what it might mean for the platform’s future is Kelley Cotter, an assistant professor in the Department of Human-Centered Computing and Social Informatics at Pennsylvania State University.

Thank you so much for coming on to chat today.

Kelley Cotter: Of course, I’m glad to be here, and thanks for inviting me.

Feltman: So would you start by telling us a little bit about your background—you know, what kind of research you do?

Cotter: So I investigate all kinds of things to do with the social and ethical implications of digital technologies, and I particularly focus, usually, on algorithms and AI—and perhaps more specifically on social media algorithms—and some of my core interests are in how people learn about and make sense of these technologies, how they imagine them and what they think they might make possible.

And then I have a book that’s under contract right now with Oxford University Press on critical algorithmic literacy, so one of the things I’m interested in is understanding how what we know about algorithms can help us govern them in a more bottom-up fashion. And also thinking about our understanding of platforms and the practices we have around them as kind of contextual insights that we have.

Feltman: What do you think is lacking in most people’s understanding of the algorithms that power the social media they use?

Cotter: So when I started researching this maybe almost 10 years ago there was still a large portion of the population who weren’t even really aware that these processes existed to sort of sort and filter content online. Now I think that has changed quite a bit, where there’s—probably most people have some awareness of these processes happening. They have some awareness that what they see in their feeds isn’t everything that they could possibly see. And I think they also have a basic understanding of how that works, so they know that this depends upon their activity on the sites: the things that they engage with, the things they watch, the things they share, the things they comment on, all that kind of stuff.

I think anything higher level than that, maybe the more complex technical understanding, is more out of reach, but also, the ways that people are aware of the impacts or consequences of algorithms is also limited. So people are often aware of the ways—of their own encounters with algorithms because we learn a lot about them through our own experiences. But there’s not sort of a broad understanding of the ways algorithms might be reshaping different broader societal processes.

Feltman: Mm. So you recently wrote a piece for the Conversation about the TikTok sale and how it relates to the kind of infamous TikTok algorithm. To start us off what do we know about the TikTok sale? What’s going on there?

Cotter: So we have some details at this point, not a full picture, but we have some details. So we know that the deal is going to create a new U.S.-only app, spun off from the original app; that American companies will hold majority ownership, about 80 percent; and that less than 20 percent will be held by Chinese investors, including ByteDance, the parent company of TikTok.

And the main driver of creating this deal originally had to do with concerns about the app being under Chinese control. And one of the key focal points was the algorithm because there were concerns about the ways that the algorithm could be manipulated to shape the content that users see in their feeds in ways that U.S. lawmakers found concerning. So the algorithm, then, would be licensed to this new American company, and they would retrain it and rebuild it for the U.S.-only app.

Feltman: Yeah, and why is the fate of TikTok’s algorithm such a big part of this conversation, you know, even now that it wouldn’t be in the hands of a foreign power?

Cotter: The algorithm is at the heart of everything that TikTok does. So every social media platform really revolves around the functions that their algorithms perform. So algorithms are designed to tailor content to user preferences, so they’re designed to make users’ experiences meaningful and valuable; that’s sort of the goal. But it also means that they play a central role in shaping sort of the culture by the ways that they make certain kinds of content visible or less visible.

So they sort and filter content for folks and then also enforce some of the community guidelines that social media companies set to make sure that the content that people see in their feeds isn't excessively gory or doesn't promote violence—and, historically, there was concern about minimizing misinformation. So there are different ways that it's supposed to optimize feeds to lift up the best content, and the best content for the individual user.
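To make that sorting-and-filtering idea concrete, here is a minimal sketch in Python of a feed pipeline that first drops content flagged under community guidelines and then ranks what remains for a particular user. This is not TikTok's actual code; the function names and structure are illustrative assumptions.

```python
# Illustrative sketch of a moderation-then-ranking feed pipeline (not TikTok's code).
# `violates_guidelines` and `predict_interest` stand in for whatever moderation
# classifiers and interest models a real platform would use.

def build_feed(candidate_videos, user, violates_guidelines, predict_interest):
    """Drop content flagged by moderation, then sort the rest by predicted interest."""
    allowed = [video for video in candidate_videos if not violates_guidelines(video)]
    return sorted(allowed, key=lambda video: predict_interest(user, video), reverse=True)
```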

Feltman: As someone who’s studied social media algorithms for nearly a decade what’s unique about the one that powers the TikTok “For You” page, both, actually, algorithmically and maybe in the ways people feel that it works, if that makes sense?

Cotter: Yeah, the TikTok algorithm is perceived to be especially good at tailoring content for users. There's kind of a popular conception of it as knowing people better than they know themselves. And some of my research with colleagues has investigated those kinds of beliefs and the ways that they converge in this really curious mixture of spiritual beliefs and conspiracy theorizing, where there's sometimes a perception that what people see in their feeds is somehow sort of, like, cosmically destined for them; it's meant for them specifically. So there are perceptions of the algorithm as being very powerful and good at its intended purpose.

In some ways, in many ways, the algorithm isn’t especially different from other social media algorithms. It’s sort of designed in the same way, where the goal is to keep users on the site and keep them coming back. That’s sort of what it’s optimized for. And it also, like other social media algorithms, relies on signals from people’s behavior on the site—again, the things that they like, the things they comment on, things they share, these sort of signals of interest.

One Wall Street Journal investigation suggested that watch time on TikTok is an especially strong signal of interest used by the algorithm to rank content. One reason why the TikTok algorithm might be potentially better at tailoring content is the nature of the short video format, where it’s easier to get a read on what interests people based on the length of time that they spend watching any given piece of content versus any other thing.
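As a rough illustration of how heavily weighting watch time might play out, here is a toy interest score of the kind a `predict_interest` function like the one sketched above could compute. The signal names and weights are made up for the example; a real recommender would learn them from data rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Per-user, per-video signals of interest (illustrative, not TikTok's schema)."""
    watch_fraction: float  # share of the video the user watched, 0.0 to 1.0
    liked: bool
    shared: bool
    commented: bool

def interest_score(signals: EngagementSignals) -> float:
    """Toy score in which watch time dominates and other signals add smaller boosts."""
    return (
        3.0 * signals.watch_fraction
        + 1.0 * signals.liked
        + 0.8 * signals.shared
        + 0.5 * signals.commented
    )

# A video watched nearly to the end outranks one that was liked but abandoned early.
print(interest_score(EngagementSignals(0.9, liked=False, shared=False, commented=False)))  # 2.7
print(interest_score(EngagementSignals(0.2, liked=True, shared=False, commented=False)))   # 1.6
```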

It also has other, like, unique features that promote more connections between creators and users. So we get, like, the Stitch function, where people will respond to different videos; they'll splice in a video from another creator and respond to it with their own video. There's sounds, where people can use similar sounds to kind of create memes and different conversations or promote similar ideas about things. So there are ways that connections across users are facilitated by the platform features that could be helpful for understanding user preferences.

But it’s not entirely clear why it is, at least perceived as, especially good at tailoring content. We have some information about how it works, but it’s hard to know any given one reason why it might be especially good.

Feltman: So given what we know about the proposed buyers for TikTok and the potency of the TikTok algorithm what are the implications if the sale goes through?

Cotter: Yeah, because the algorithm is so central to life on the platform, to what it is, it matters whose hands it’s in because it will directly, again, shape what the platform looks like—what this new American app will look like.

So the proposed investors, or the [investors] that have been shared, are sort of known entities. Oracle, of course, is a big one, and they've maintained the data for TikTok in the U.S. for a [few] years now, so that one sort of followed from that established relationship. But I think a lot of the concern around the investors that have been named is that they all seem to have ties to the Trump administration, to be more conservative-leaning in their views, and this has the potential to change the sort of ideological slant of the platform if the investors decide that they want to tweak the algorithm in some ways or tweak the community guidelines for this new app in ways that might change what's considered acceptable or unacceptable speech.

So maybe one important thing to note is: earlier on, when we were still in conversations about trying to bring about legislation to ban TikTok, a concern from lawmakers, particularly Republican lawmakers, was that there was greater visibility of Palestinian hashtags on TikTok over Israeli hashtags; supposedly there's some sort of lopsidedness in the content there. So with an owner that has a strong ideological point of view and has the will to make that a part of the app, it is possible, through tweaking the algorithm, to sort of reshape the overall composition of content on the platform.

So this one doesn't have to do with the ownership but with the new app itself, because it's going to be American users only—they say that global content will still be visible on the platform, but the users of this app will be American. So if this new algorithm, as licensed from ByteDance, is retrained on U.S.-only users, then American values, preferences and behaviors will inform how the algorithm curates content on the site, and we might expect to see some subtle shifts just by nature of the different dataset it's being built on.

And if users perceive the new app to be in the hands of Trump allies or of investors who are more conservative-leaning in their viewpoints, and have concerns that those investors might exert influence on the content in the app, we might expect to see some users leave the app. So it could result in a situation where the app is composed not only of people based in the U.S. but of only a subset of American users, particularly ones that perhaps might be right-leaning, which would also, again, have a very big impact on the kinds of content that you see there.

So ultimately, the new app might look drastically different than it does right now, depending on what happens with decisions made by the investors, decisions by users, by who stays and who goes, and all that.

Feltman: Well, thank you so much for coming on to talk through this with us. We’ll definitely be reaching out to chat more if this sale goes through.

Cotter: Yeah, I’d be happy to chat more. Thanks again for having me.

Feltman: That’s all for today’s episode. We’ll be back on Friday to find out how Halloween treats can play tricks with our gut microbes.

Science Quickly is produced by me, Rachel Feltman, along with Fonda Mwangi and Jeff DelViscio. This episode was edited by Alex Sugiura. Shayna Posses and Aaron Shattuck fact-check our show. Our theme music was composed by Dominic Smith. Subscribe to Scientific American for more up-to-date and in-depth science news.

For Scientific American, this is Rachel Feltman. See you next time!
