SK Nag

In a world drowning in data and digital choices, it seems ironic that the platforms designed to broaden our horizons are increasingly boxing us into narrower and narrower corridors of content. Whether you’re a creator trying to reach new audiences or a user just trying to find something refreshing, there’s a growing sense that artificial intelligence (AI), particularly the algorithms powering social media and search engines, is predicting our preferences too well for comfort. And in doing so, it is getting human nature terribly wrong.
The promise of AI in content curation was to help us navigate the internet’s vastness. But increasingly, it has become a filter bubble factory, replacing discovery with déjà vu. We are being shown what we are likely to watch, buy, or read, not what we might like to explore. AI’s over-reliance on historical data and engagement metrics is slowly killing serendipity—the thrill of stumbling upon the unexpected.
The Tyranny of Prediction
At the heart of this problem lies the logic of over-prediction. Algorithms optimize for watch time, click-through rate (CTR), and past behavior. If a user often watches food videos, AI assumes that’s all they want. If someone skips a slow-paced but insightful video once, similar content gets buried forever.
This logic ignores the fluid, experimental, and often irrational nature of human curiosity. We click things because we’re bored, intrigued, nostalgic, or just accidentally. We change tastes, try new ideas, or develop fresh interests on whims. But algorithms, bound by narrow data patterns, are allergic to such ambiguity. They box us in, then take our confined behavior as proof of preference.
This has disastrous implications for content creators too. Videos, blog posts, or art that don’t “perform” in the first few hours are algorithmically deprioritized. There’s no second chance. No room for sleeper hits. Innovation, by its very nature, takes time to be appreciated—but platforms punish anything that isn’t instantly viral.
Feedback Loops That Kill Freshness
This is where AI’s predictive power turns self-defeating. When content is shown to few people because its early engagement is low, it never gets the chance to accumulate engagement, and the algorithm reads that absence as confirmation of irrelevance. This feedback loop discourages experimentation and pushes creators toward proven formats. In essence, it penalizes risk, creativity, and depth.
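The loop described above can be made concrete with a toy simulation. This is a minimal sketch under illustrative assumptions, not a model of any real platform: twenty items have identical true appeal, each gets one warm-up impression, and afterwards a ranker greedily shows whichever item has the highest observed click-through rate. Items unlucky in the warm-up round are never shown again.

```python
import random

random.seed(0)

# Illustrative assumptions: 20 items, all with the SAME true appeal,
# ranked purely by observed click-through rate (CTR).
N_ITEMS = 20
TRUE_APPEAL = 0.1   # identical click probability for every item
ROUNDS = 2000

clicks = [0] * N_ITEMS
impressions = [0] * N_ITEMS

# Warm-up: each item is shown exactly once.
for i in range(N_ITEMS):
    impressions[i] = 1
    if random.random() < TRUE_APPEAL:
        clicks[i] = 1

# Main loop: always show the item with the best observed CTR.
for _ in range(ROUNDS):
    best = max(range(N_ITEMS), key=lambda i: clicks[i] / impressions[i])
    impressions[best] += 1
    if random.random() < TRUE_APPEAL:
        clicks[best] += 1

# Count items that ever got a second impression.
shown = sum(1 for n in impressions if n > 1)
print(f"Items shown again after {ROUNDS} rounds: {shown} of {N_ITEMS}")
```

Despite every item being equally appealing, only the handful that got a lucky click in the warm-up round keep receiving impressions; the rest sit at zero observed CTR forever, exactly the "confirmation of irrelevance" the article describes.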
In a country like India—where regional voices, multilingual content, and emerging creators are on the rise—this feedback loop is especially damaging. A Marathi vlogger trying something new, or a Bengali poet exploring spoken word, may never reach their audience because the algorithm didn’t recognize the content as “engaging enough” based on prior patterns. The AI doesn’t understand cultural nuance, emotional storytelling, or evolving interest—it just sees numbers.
The Cost of a Caged Feed
The result? An internet that feels increasingly stale. You’re not imagining it—your social media feeds, YouTube suggestions, and Google Discover cards are slowly becoming echoes of what you’ve already seen. There’s less surprise, less learning, less freshness. Platforms are failing in their original promise: to expand access, not restrict it.
This narrowing of experience doesn’t just affect personal growth or artistic exposure. It has larger social consequences. People stay trapped in ideological silos, shoppers keep buying the same brands, and audiences never hear voices from the margins. AI isn’t just making us predictable—it’s making us passive.
Time for a Human Reset
It’s not that AI is inherently flawed—it’s that its current implementation is too narrow. Platforms need to design with human curiosity in mind. Here are three ideas to start with:
- Build for Exploration, Not Just Engagement: Include modes that intentionally disrupt patterns, such as a “curiosity feed” or a random-discovery feature.
- Rethink Metrics: Instead of only CTR and watch time, factor in diversity of engagement, topic spread, and even content age.
- Give Content a Longer Runway: Instead of killing underperforming content within hours, allow algorithms to test it with broader audience segments over time.
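The three ideas above share one mechanism: the ranking score should reward underexposed content, not just proven engagement. As a minimal sketch (the function name, weights, and bonus formula are all hypothetical choices, loosely inspired by upper-confidence-bound bandit strategies, not any platform’s actual ranking), a score could blend observed CTR with a bonus that shrinks as an item accumulates impressions:

```python
import math

def ranking_score(ctr: float, impressions: int, total_impressions: int,
                  explore_weight: float = 0.3) -> float:
    """Hypothetical blend of engagement and exploration (UCB-inspired).

    ctr               -- observed click-through rate of the item
    impressions       -- times this item has been shown
    total_impressions -- total shows across the whole catalogue
    explore_weight    -- how much the platform values discovery
    """
    exploit = ctr
    # Bonus is large for rarely shown items and decays with exposure,
    # giving new or niche content a longer runway.
    explore = math.sqrt(math.log(total_impressions + 1) / (impressions + 1))
    return (1 - explore_weight) * exploit + explore_weight * explore

# A brand-new item with no track record can outscore a proven performer:
fresh = ranking_score(ctr=0.0, impressions=0, total_impressions=10_000)
proven = ranking_score(ctr=0.12, impressions=5_000, total_impressions=10_000)
print(f"fresh={fresh:.3f} proven={proven:.3f}")
```

The design choice that matters is the decaying bonus: it guarantees every item is sampled occasionally, which is precisely the “second chance” and “longer runway” the list above calls for, while still letting genuinely engaging content win in the long run.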
In an era where AI can compose music, write essays, and even mimic voices, the one thing it still cannot do is want something new. That’s uniquely human. And if we don’t build platforms that honor that impulse, we’ll be left with a digital ecosystem that’s efficient but uninspiring—like being fed our favorite meal every day until we forget we ever had a taste for variety.
As India becomes increasingly digital, this is more than a design flaw. It’s a cultural risk. Algorithms must be taught to value uncertainty, experiment with exposure, and—most importantly—trust that humans are capable of liking something they’ve never seen before.
Because the one thing that keeps society alive isn’t just data—it’s discovery.
(Author is Political & Economic Analyst. The views expressed are personal opinion of the author.)


