Filterworld

Photo by Antonio Batinić (Pexels)

I know what you are going to see when you open Facebook, Twitter, or Instagram. 

You are going to see what someone wants you to see, based on what you seem to want to see.

“[There] are equations that measure what you’re doing, surveil the data of all the users on these platforms and then try to predict what each person is most likely to engage with,” explains New Yorker writer Kyle Chayka. “So rather than having this neat, ordered feed, you have this feed that’s constantly trying to guess what you’re going to click on, what you’re going to read, what you’re going to watch or listen to.”

Chayka calls it “filterworld” and defines it as the “vast, interlocking, and yet diffuse network of algorithms that influence our lives today—one that has had a particularly dramatic impact on culture and the ways it is distributed and consumed.” It’s all about engagement. The measure of success is how many likes, how many saves on TikTok, how many streams on Spotify you can get. Chayka argues that “all this machine-guided curation has made us docile consumers and flattened our likes and tastes.”

Indeed, as Megan Garber writes in The Atlantic, “algorithms can influence our tastes so thoroughly that, in a meaningful way, they are our tastes, collapsing desire and identity.” She adds:

Users talk about [algorithms], typically, as mere mathematical equations: blunt, objective, value free. They seem to be straightforward. They seem to be innocent. They are neither. In the name of imposing order, they impose themselves on us.

But there’s more to it than that. They also, ironically, isolate us from the very community they seem to promise. As Chayka notes:

These digital platforms and feeds, they kind of promise a great communal experience, like we’re connecting with all the other TikTok users or all of the other Instagram users, but I think they’re actually kind of atomizing our experiences, because we can never tell what other people are seeing in their own feeds. We don’t have a sense of how many other people are fans of the same thing that we are fans of, or even if they’re seeing the same piece of culture that we’re seeing, or experiencing an album or a TV show, in the same way.

I think we can take the concern a step further. What these algorithms create is what University of Chicago professor Cass Sunstein has called the “Daily Me”—a self-created world where we see only the sports highlights of our favorite team, read only the issues that address our interests and engage with only the op-ed pieces with which we agree. The highly lauded personalization of information protects us from exposure to anything that might challenge our thinking or make us uncomfortable. Unchecked, we begin to follow only the echo of our own voice; or, even worse, the voice of someone else.

Chayka writes, “In Filterworld, it becomes increasingly difficult to trust yourself or know who ‘you’ are in the perceptions of algorithmic recommendations.” Which means, as Garber notes,

“… it also becomes difficult to trust anything at all.”

James Emery White


Sources

Kyle Chayka, Filterworld.

Megan Garber, “The Uncanniest Influencers on the Internet,” The Atlantic, January 17, 2024.

Tonya Mosley, “How Social Media Algorithms ‘Flatten’ Our Culture By Making Decisions for Us,” NPR, January 17, 2024.

Cass Sunstein, Republic.com.