Essay

Quitting the Modern-Day Cigarette

And the app I built to do it.

Cigarette companies knew their product was addictive and harmful. They sold it anyway because it was enormously profitable, marketing it as cool, as a lifestyle, as freedom. It took decades of research, public pressure, and regulation before society started pushing back.

I think social media is in the same chapter of that story, and I think most of us already know it. We just haven't done anything about it yet.

What Made Me Notice

I started paying attention to this when I noticed it in myself. I'd catch myself having a knee-jerk reaction to a headline before I'd even read the article, or feel a flash of anger at a stranger's opinion in a comment section and carry it with me for an hour. These weren't thoughts I was choosing to have. They were being fed to me, and I was swallowing them without thinking.

Once I saw it in myself, I started seeing it everywhere. Friends and family members across the political spectrum getting more rigid, more certain, less willing to consider that the other side might have a point. People who used to be able to disagree over a beer now seem to genuinely believe the other side is acting in bad faith. It used to be a joke that political parties were like sports teams; that comparison feels quaint now.

I'm not pointing fingers — we're all just a product of the digital environment we've been in for over a decade.

The Machine Behind the Curtain

Social media companies have one goal: engagement. If they can keep you on Facebook for eight minutes instead of six, they can show you more ads, and more ads means more revenue. So they optimize for attention, and decades of behavioral research have shown what captures attention best: content that provokes a strong emotional reaction. Outrage, fear, and disgust are the engagement jackpot, while nuanced, thoughtful content simply doesn't compete.

Creators have learned this too. If you're trying to make a living on these platforms, the algorithm rewards consistency and volume, which means rage bait, engagement bait, and hot takes with no room for nuance are the rational strategies in an irrational system. Most creators aren't bad people. They're responding to incentives.

Then short-form video poured gasoline on the fire. TikTok and Reels compressed the attention window from ten minutes down to sixty seconds or less, meaning creators now need to hook you in the first two seconds or you're gone. There's no room for "on the other hand" in a format that punishes anything except instant emotional impact.

The result is that we've been herded into algorithmic silos, not because anyone sat in a boardroom and decided to polarize the country, but because polarization is profitable. The machine doesn't care about truth or nuance, only about your next click.

Engineered Addiction

These platforms are engineered to be addictive in ways that go far beyond just having appealing content. The variable reward loops, the infinite scroll, the notifications tuned to pull you back in: none of this is accidental. Millions of dollars in R&D go into optimizing every pixel of these apps to squeeze another ten seconds of attention out of you, in much the same way tobacco companies spent decades refining their product to maximize dependency.

And just like cigarettes, the people most vulnerable are the youngest users: kids and teens who haven't developed the critical thinking skills to question why a confident-sounding stranger on their phone is telling them what to believe. They trust the platform the way earlier generations trusted the evening news, except the evening news wasn't engineered by an algorithm that gets smarter at manipulating them with every scroll.

The Apps Are the Weapon

Here's something most people don't think about: the native app is the most dangerous version of any social media platform. When you download Instagram or TikTok, you're handing over enormous control. These apps track every interaction — from what you pause on and skip to how long you hover and what makes you tap — and all of that data feeds back into the algorithm, making it better and better at keeping you hooked.

But here's what surprised me: almost every major social network also maintains a fully functional mobile website, and those web versions are dramatically limited in how much data they can collect and how effectively they can manipulate your behavior. The tracking is weaker, the addictive hooks are blunted, and while the content is the same, the manipulation dial is turned way down.

So I tried an experiment. I deleted every social media app from my phone and only used the web versions. It helped, but it wasn't enough. The algorithmic feed was still there, still quietly steering what I saw and thought. The silo was less airtight, but it was still a silo.

So I Built Something

That's why I created Scrolless.

I didn't want to quit social media entirely, because there's genuine value in staying connected with people. But I wanted to strip out the parts that were designed to manipulate me, and the existing tools didn't frame the problem the way I saw it.

Most app blockers and screen time tools ask: What if I reduced my usage? That's the wrong question. I don't have a time problem, I have an algorithm problem. The question I wanted to answer was: What if I never used recommendation algorithms at all?

That's the core idea behind Scrolless. By default, it blocks surfaces that are purely algorithmic — things like instagram.com/reels, x.com, or reddit.com's front page and r/popular. These are pages where an algorithm decides what you see, and they're where the manipulation happens.
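
The idea is simple enough to sketch in a few lines. This is an illustration of the concept, not Scrolless's actual code — the real rule list is configurable, and the extension works through Safari's blocking machinery rather than a script like this:

```javascript
// Illustrative only: a short list of regexes covering purely algorithmic
// surfaces, and a check run against a page URL before it's allowed to load.
const blockedSurfaces = [
  /^https?:\/\/(www\.)?instagram\.com\/reels(\/|$)/, // Reels feed
  /^https?:\/\/(www\.)?x\.com(\/|$)/,                // all of x.com
  /^https?:\/\/(www\.)?reddit\.com\/?$/,             // Reddit's front page
  /^https?:\/\/(www\.)?reddit\.com\/r\/popular(\/|$)/,
];

function isBlocked(url) {
  return blockedSurfaces.some((rule) => rule.test(url));
}
```

Note what falls through: a specific subreddit or a specific profile page doesn't match any rule, while the purely algorithmic surfaces do. That's the intent-based distinction in code form.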

But it doesn't block everything, because I still find enormous value in individual content. I can go to a specific subreddit I've chosen to follow, read a specific Reddit post I found through a web search, or visit someone's Instagram profile directly. The content itself isn't the enemy; the algorithmic curation of it is. The difference is intent: when I navigate to a specific subreddit or a specific person's page, I made that choice — but when I open an algorithmic feed, the platform makes thousands of choices for me, each one optimized to keep me scrolling.

There's also a second problem that blocking alone doesn't solve. These platforms inject algorithms onto pages worth keeping, like your actual Facebook timeline or the goddamn Instagram search page (look at it, it's insane). Suggested posts, sponsored content, "recommended for you" slots wedged between posts from people you actually follow. Over time, these injections can make up more of your feed than the content you subscribed to in the first place. Scrolless has a Feed Cleaner mode that strips all of that out, leaving only posts from accounts you chose to follow. It's the difference between a feed that serves you and a feed that serves the platform's engagement metrics.
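
To make that concrete, here's a toy sketch of the filtering logic. The marker strings and the shape of a "feed item" are stand-ins — real platforms label injected posts in messier, frequently changing ways, and the actual extension operates on live DOM nodes:

```javascript
// Illustrative only: hide any feed item the platform injected,
// based on hypothetical marker labels, keeping only followed content.
const INJECTED_MARKERS = ["Suggested for you", "Sponsored", "Recommended for you"];

function cleanFeed(feedItems) {
  // feedItems: [{ text: string, hidden: boolean }]
  for (const item of feedItems) {
    if (INJECTED_MARKERS.some((marker) => item.text.includes(marker))) {
      item.hidden = true; // in a real content script: node.style.display = "none"
    }
  }
  return feedItems;
}
```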

The whole thing is fully customizable. You can block sites or subdomains using regex rules, so you can dial it in to match exactly how you want to interact with the web. Everyone's relationship with these platforms is different, and the tool should reflect that.

It's a Safari extension for iOS and macOS: a one-time $2.99 purchase with no subscription and no data collection. That felt important, given that the whole point is escaping platforms that treat your attention as a revenue stream.
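
For the curious, Safari has a native, declarative content-blocker format where each rule pairs a `url-filter` regex with an action, and that's the kind of thing regex rules ultimately express. A couple of rules look roughly like this (illustrative, not Scrolless's exact rule set):

```json
[
  {
    "trigger": { "url-filter": "^https?://www\\.instagram\\.com/reels" },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": "reddit\\.com/r/popular" },
    "action": { "type": "block" }
  }
]
```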

If that resonates with you, give Scrolless a try. And if it doesn't, at the very least consider deleting the apps and using the web versions. It's a small step, but you'll be surprised how much it changes the experience.


Scrolless is available for Safari on iPhone, iPad, and Mac. $2.99, one-time purchase. Download on the App Store.