This is going to be a messy post, so brace yourself.
“It’s not true that life is one damn thing after another; it’s one damn thing over and over.” ― Edna St. Vincent Millay
Everything is cyclical.
Fear and greed. Hope and despair. Creation and destruction. Normal hairstyle vs. hair that looks like it was chewed by rats, untorn jeans vs. ripped jeans, market cap vs. equal weight, smartphones vs. dumb phones, vinyl vs. digital, real vs. virtual, full chaddi vs. half chaddi, techno-optimism vs. techno-despair: the more things change, the more they remain the same.
I say this because there’s something peculiar going on with the feeds from which I get most of my information. I’m an information addict. If I don’t have a steady stream of information injected into my brain, I start tweaking and fidgeting like a drug addict. Given my addiction, I’m always on the lookout for new apps and platforms that give me a good high.
While most people doom-scroll on Instagram, Facebook, and TikTok, I scroll listlessly on YouTube and Twitter. Mind you, I’m not swiping shorts like a zombie on YouTube, nor am I rage tweeting on Twitter like a lunatic. I’m looking for links. On YouTube, I’m looking for good, long-form videos to add to the default “Watch Later” playlist YouTube so helpfully provides. I save them for later, and later becomes never. #SedLife
On Twitter, I’m looking for links to good articles, research papers, and threads. Like my Watch Later playlist on YouTube, I keep bookmarking these links or adding them to my Omnivore feed or Workflowy inbox. And like YouTube, all these apps are black holes into which most of the links I save embark on a one-way trip. I consume more information than most people, but I never get to everything I save for later. I’m starting to think I’m not consuming so much as getting consumed.
Why am I saying all this?
As I mentioned in the previous post (archive), the Substack app is starting to grow on me. It’s become one of my go-to apps to get my fix of new information, and I’m happy because I keep finding some wonderful writers and writing. But in the last month or so, there’s something peculiar going on with my “Explore” feed, which is a mix of posts from writers that I follow as well as those recommended by the Substack algorithm. I’ve been seeing an increasing number of posts and publications expressing disenchantment with technology and advocating some form of disconnection.
It started with this post (archive), which advocates switching to flip-phones, walking to grocery stores, writing by hand, reading, letting kids loose, and finding faith. In case you are wondering if I’m dumb enough to miss the obvious, no, I didn’t find the first piece on the Substack explore feed, which could’ve been a signal to the algorithm to recommend similar posts. I discovered this post in a weekly Substack email digest. Soon, as the end of 2023 approached, I saw a steady trickle of other posts.
Despite all the lofty techno-optimism manifestos (archive), tech companies have never been more unpopular. The fact that so many people search for “digital detox” is a sign of the times. Hell, there are people charging lakhs to let others stay on farms, touch that disgusting stuff called soil, and inhale pungent cow dung smells infused with extra methane.
Many of these posts use words such as “resist” and “rebel” when talking about reducing our dependence on technology. The fact that simple acts like reducing screen time, reading a physical book, writing with a pen, talking to people face-to-face, gardening, and letting kids run loose instead of helicopter parenting are described as revolutionary acts feels weird to me. I mean, the vibe against tech decisively soured after the media switched (archive) from being techno-optimistic to techno-adversarial as Trump was elected. But I hadn’t thought about how bad and widespread this techno-disillusionment was.
As I was pondering the peculiarities of Substack algorithmic recommendations, this interview with Kyle Chayka popped up on my Pocket Casts feed. Kyle is an author and a staff writer at The New Yorker. He writes about how technology affects us, and I’ve long been a fan of his writing.
The reason Kyle was loitering around on my podcast feed was because his new book, Filterworld: How Algorithms Flattened Culture, was just published. I haven’t read it, but I heard him talk about the book on a couple of podcasts. Most discussions about algorithms are all or nothing. They are portrayed as either making our lives better or leading us off a cliff. Kyle, however, has a nuanced take on them.
My own view is that algorithms aren’t all bad. In fact, some physicists argue that our universe runs on an algorithm—the fundamental laws of nature as we know them. Since we are a part of the universe, that means human beings also function based on the laws of physics that, as theoretical physicist Brian Greene put it, fit on a t-shirt:
Heather Berlin: What does physics have to say about free will?
Brian Greene: Well, it’s not definite because we don’t fully know the laws of physics, but the laws of things that we currently have at our disposal have no opportunity for intercession by human will. I mean, we are a collection of particles governed by laws that you can write down and fit on a t-shirt, and those laws don’t at any point in the evolution of the particles say, “Hey, can you like tell me now what to do, person?” They just determine the future based upon what things were like in the past.
An expanded version of the deeply disturbing idea that we’re just a dumb lump of particles dancing to the dull tunes of physics:
Brian Greene: Yeah, you know, we are special; life is special in some ways, right? We can think, we can feel, we can react in various ways to the environment.
But one of the beauties of thinking about life and consciousness within the cosmological unfolding is that we see deep continuities, right? Just as a star is a collection of particles that is fully governed by the ironclad laws of physics, we each are collections of particles that are fully governed by the ironclad laws of physics, too.
Now, our organization is more exquisite compared to the organization, say, in a tabletop, or even in a star. And that allows us to realize behaviors that are unavailable to the table or to the star. And that certainly is welcome. That’s a good thing.
But the bottom-line message is we are all bags of particles governed by the laws of physics. And this was a statement that I made on “The Late Show with Stephen Colbert.” It just sort of came out in a heated conversation, to which he responded hey, that’s a great pickup line. And it is, actually. You should try it out.
But that’s what I mean by life is physics orchestrated. We are physics that is manifested in a highly coordinated manner. And that’s really the only distinguishing characteristic of us compared to the inanimate objects in the world.
Some extend that metaphor to the human brain and describe it as a computer (archive), but the idea is contentious (archive), to say the least. There may be some limited truth to the notion of a rule-based brain. The German psychologist Gerd Gigerenzer argues that we use simple rules or heuristics (archive) to navigate many aspects of life. If we’re playing cricket and we have to catch a ball, we don’t solve physics equations; we just fix our gaze on the ball and run towards it.
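The gaze heuristic is concrete enough to simulate. Here’s a toy sketch of my own (not from Gigerenzer; the speeds, gravity, and time step are arbitrary assumptions): a fielder who simply keeps the gaze angle to the ball constant ends up where the ball lands, without ever solving the projectile equations.

```python
# Toy simulation of the gaze heuristic: the fielder never computes the
# ball's trajectory. They lock their gaze on the airborne ball and keep
# moving so the gaze angle stays constant -- and arrive where it lands.

G = 9.8                # gravity, m/s^2 (assumed values throughout)
VX, VY = 10.0, 14.0    # ball's initial horizontal/vertical speed, m/s
DT = 0.01              # simulation time step, s

def ball(t):
    """Ball position (x, y) under simple projectile motion, no air drag."""
    return VX * t, VY * t - 0.5 * G * t * t

def run_fielder(t0=0.5, start=0.0):
    """Fielder locks gaze at time t0 and keeps its tangent constant."""
    x0, y0 = ball(t0)
    c = y0 / (x0 - start)          # tangent of the initial gaze angle
    t, fielder = t0, start
    while True:
        t += DT
        xb, yb = ball(t)
        if yb <= 0:                # ball has landed
            return fielder
        fielder = xb - yb / c      # step that keeps the gaze angle constant

caught_at = run_fielder()
landing = VX * (2 * VY / G)        # true landing spot, for comparison only
print(round(caught_at, 2), round(landing, 2))  # the two nearly coincide
```

The point of the sketch is that one dumb rule, applied every instant, substitutes for the whole physics calculation, which is exactly Gigerenzer’s claim about heuristics.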
But anyway, coming to the point about algorithms, they’re deeply enmeshed in our world. From music recommendations to medical diagnoses, traffic lights, stock markets, human resources, policing, and social media, algorithms are the invisible interns, managers, bosses, and overlords that surreptitiously make choices for us, about us, and shepherd us in directions we may not want to go. The problem is that we are not consciously thinking about the impact of algorithmic recommendations.
Algorithms are also an inevitable result of the world of information overload in which we live. They help us sort through the informational landfills and cesspools and make sense of the world. The simple reality is that we need them in many cases. The question is, have we become wholly reliant on them to the point of losing our agency?
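To make the feedback loop concrete, here’s a toy sketch of my own (it assumes nothing about any real platform’s ranking): a feed that always recommends whatever is closest to your click history, paired with a user who always clicks the top recommendation, quickly collapses onto a narrow band of the catalog. It’s the filter bubble in miniature.

```python
# Toy filter bubble: a "feed" that recommends the catalog items closest
# to the average of your past clicks, and a "user" who always clicks the
# top recommendation. The loop narrows what you see with every click.

catalog = [i / 10 for i in range(11)]       # items on a 0.0-1.0 "taste" axis

def recommend(history, k=3):
    """Return the k items nearest the average of everything clicked so far."""
    center = sum(history) / len(history)
    return sorted(catalog, key=lambda item: abs(item - center))[:k]

history = [0.2, 0.8]                        # the user starts with broad tastes
for _ in range(5):
    history.append(recommend(history)[0])   # always click the top pick

print(recommend(history))                   # everything served is near 0.5 now
```

The user’s initial clicks spanned most of the taste axis, but after a few rounds the feed only serves items within a tenth of it; nothing in the loop ever pushes the recommendations back outward.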
A few of my takeaways from Kyle Chayka’s interviews:
Surprise is dead
We no longer actively search for the things we want to consume; we’ve become passive consumers, dependent on a feed to deliver the things we supposedly like straight into our brains. This passive consumption has killed surprise and serendipity. We’ve forgotten the joy of discovering books, movies, articles, and music through catalogs, blogs, newspapers, conversations with friends, and happy accidents. Now that all information about all things is available at the click of a button, everything is previsualized and predestined; there’s no surprise in anything.
Algorithms are ruining the real world
Algorithms are reconfiguring the real world as well. Kyle mentioned an instance of Google Maps recommending a less crowded road that ran through a small town to avoid traffic. While drivers saved time by avoiding traffic, the quality of life in that small town was ruined. It’s the same with Airbnbs, coffee shops, bookstores, and restaurants. All of these establishments are optimizing themselves for an Instagram aesthetic because Instagram has become a default source of recommendations. If people don’t like a shop, they won’t post about it on Instagram, and if they don’t post, the shop loses business. The end result is that all these spaces have the same generic, bright, and sterile layouts devoid of character.
Algorithms have become the invisible gods that determine the fate of large swathes of society. They demand a virgin sacrifice, and we must worship them and pander to them. Airbnb owners are afraid that they won’t attract customers if their homes don’t look like those that are getting good business. People on social media spend anxious hours trying to use a hundred gimmicks to get those likes and shares; otherwise, what’s the meaning of life?
Be your own tastemaker
Kyle recommends being intentional about what you consume and being your own tastemaker. That means not being a passive and vegetative consumer of culture and information. Before algorithms, we used to read things, experiment, and rely on friends to discover what was good. Though it was painful, there was a certain pleasure to the activity of finding what you enjoyed. Algorithms have destroyed that joy.
This is all the more important in an era when everything is generic because everyone is trying to pander to the algorithms in the hope that they will look kindly upon them. If you are what you consume, then generic slop begets generic views, opinions, and personalities.
An excerpt of the book from a reading:
Algorithmic recommendations are addictive because they are always subtly confirming your own cultural, political, and social biases, warping your surroundings into a mere image of yourself while doing the same for everyone else. This had made me anxious – the possibility that my view of my own life, lived as it was through the internet, was a fiction formed by the feeds. So much of my perception of what my friends were up to on a given day, what was going on in various cities, which news stories mattered, even the weather, was dictated by what I saw on automated apps. What’s more, those feeds were all increasingly fractured and flawed, presenting posts from days ago as if they had just happened. Ultimately, my sense of self was beholden to the responses I got from my invisible audiences, whose attention was algorithmically mediated too. I don’t know that anyone else who has spent years of their life on digital platforms can be totally sure who they are without them. A fear took hold – had I passively consumed what I was interested in, had I given up my agency to figure out what was truly meaningful to me.
None of what he said was a revelation, and all these things have been at the back of my head for a long time. But hearing the same things from him triggered my brain to think about all these aspects again. My brain has been trying to process some of these things for the last couple of days, but I have more questions than answers.
Listening to these conversations made me think about my own information consumption patterns. I spend hours on Twitter scrolling aimlessly, trying to find useful things. Have I deluded myself into thinking that Twitter is useful because it helps me find insightful things, when it’s actually harming me?
I like to think I’m a discerning consumer of information, immune to the algorithmic wiles of platforms. But given that they keep recommending things that agree with my tastes and interests, have I deluded myself into thinking that I’m not stuck in a filter bubble?
Have I deluded myself into thinking that, if I don’t check these apps frequently, I might miss out on something useful, and that would be bad?
Have I deluded myself into thinking that the way I consume information is good, and that consuming so much of it makes me smart?
All these techno-pessimist books and articles are topical. Given that there’s a lot of anxiety among people about their relationship with technology, are they pandering to people and taking advantage of their anxieties? Is this all hyped-up current thing nonsense?
These are some questions I’ve been pondering for a few days now.
As I was writing this post, I came across this post by writer J.E. Petersen that made me feel seen:
21 Signs You Might be a Digital Dope Addict
Check all that apply:
- On average, you engage in passive scrolling at least once a day
- You check your phone (or some other screen) within minutes of waking up
- You check your phone (or some other screen) within minutes of going to sleep
- You check your phone for no reason
- You have a hard time sitting still (or standing) for more than a minute or so without checking your phone
- You use your screen to relax
- “Downtime” and “screentime” are functionally equivalent
- You often get distracted away from some task on your phone or computer and lose at least half an hour before you realize what happened
- You hunt for new notifications when you’re bored
- You feel anxiety when you don’t have access to your phone because you left it at home, or even just in the other room, or it’s dead
I’ve only quoted a part of the list, but I check the boxes for pretty much all 21 things. Even as I was writing this, I was consciously keeping track of the number of times I picked up my phone and checked notifications on Twitter, LinkedIn, etc., and yes, I have a problem.
As you can see, this post is an unfiltered stream of consciousness. That’s because I haven’t spent enough time reflecting on my own information choices. As I alluded to earlier, I have a whole lot of questions and few answers. The one thing that seems clear to me is that unless we are intentional about what we let into our brains, we’ll all be generic representations of the algorithms that feed us the slop we crave. In fact, being intentional about what I consume on the internet is one reason why I started this blog. Progress has been slow, but I’m taking it one step at a time.
As I finished writing, I realized that this post is a continuation of the previous one.
Articles that inspired and informed this post
In order to resist the dehumanizing effect of the Machine, it is necessary to draw a line. To create a frame around what defines our humanness. We need to fiercely protect these lines that the Machine continually tries to encroach upon. Yet, once the frame is defined, it is much easier to defend it.
This post is not intended as a lament, but as a starting point for rehabilitating attentional ferals of the digital age, whether they be young or old. All of us who use digital devices are affected by the easy lure of hyper attention, and if our aim is — as Peco suggests, “to be anchored to our core meanings in life and situate technology’s proper place in the order of things” — then it is up to us to train, grow, and reestablish deep attention.
I need to get back in the habit of reading. I need even more to get away from the madness that is screen addiction. It may be the lamest of all addictions. At least junkies get to be high. My addiction just lets me roll my eyes a lot and feel superior or check out mentally while contributing nothing, learning nothing and remembering nothing.
I feel like the lab rat in one of those experiments where they give it a choice between cocaine infused water or a rewarding social life.
But everything has changed.
Now, every image you see, every sound you hear, and every article you read in this virtual world has to be questioned. Is this real? Is this the output of a physical human’s creativity? Does this beautiful mountain really exist? Has this song really been sung? Are these the perfectly written thoughts of a writer? Or are these all the products of the machine gone wild? Who can really know…
Welcome to the age of inauthenticity.
Meanwhile, I think there’s a much more pressing problem whereby AI is already starting to degrade the entire user experience of the web, rendering once-reliable content resources as completely useless. This is a problem that’s currently causing web users harm while simultaneously destroying the utility of the world’s largest tech platforms.
We think another part of the puzzle is that extremists tend to dominate social media conversations, while more moderate individuals barely speak up. For instance, studies have shown that 0.1% of users are responsible for 80% of the misinformation spread on X, formerly known as Twitter, and a similar pattern has been found for toxic Reddit comments.
The Plight of the Modern Creator is This… Make content; get eyeballs; get paid; double down, invest in better equipment, better editing, better whatever; consume consume consume; turn that consumption into more content; get more eyeballs; ask those eyeballs to share your content with other eyeballs; click to subscribe; click to receive notifications; click to follow me on Twitter; cl
This is obviously something that’s been happening for over a decade now—people documenting events instead of being in the moment. But it’s not just events anymore. People document everything now. Every mundane moment of their lives. What they wear. What they eat. What they buy. And as well as ordinary things people now feel the need to document profoundly personal moments, from health scares to mental breakdowns to their first time seeing a baby after it’s born:
I offer some high-level advice, which is to develop and pursue offline habits that demonstrate to your child that there is a wonderful world worth exploring beyond our phones. Ask yourself: Do you spend time outside, reading books, talking to friends, playing music, doing art, and engaging in activities and hobbies that pull you away from the screen for significant amounts of time—in other words, doing the very things that you’d like to see your kid doing more of? (You might want to read The Profound Pleasure of Physical Tasks.)
Our young man may not even see a need to stop his virtual binging, as society offers him little in the way of a moral or ethical compass for technology. We live in an ethos of digital whateverism: if anyone has a screen, they can do anything, anywhere, anytime. The electronic Oreos are everywhere. So the motivational pickpockets go to work on our young man, stealing away each flicker of intrinsic motivation as it arises in him, slowly bankrupting him of the possibility of discovering what he might actually want out of life.
Boys are in trouble. Many have withdrawn from the real world, where they could develop the skills needed to become competent, successful, and loving men. Instead, many have been lured into an ever more appealing virtual world in which desires for adventure and for sex can be satisfied, at least superficially, without doing anything that would prepare them for later success in work, love, and marriage.
Once in a while, we all do something disgusting. So throw your phone away, hold your nose, touch, or maybe even open those disgusting things called books. Maybe bear with the stench and read a page or two.