Thairath Online

Algorithms Keep Pushing What We ‘Might Like’ Until We Question Whether It’s Our True Preference. How Do We Preserve Our Identity When Algorithms Know (or Judge) Us Too Well?

Everyday Life · 23 Apr 2026 01:32 GMT+7

Have you noticed that when we listen to music or watch movies, online platforms often suggest content we ‘might like,’ almost as if they know us? This sometimes leads us to question ourselves: are these truly our genuine preferences, or merely what the algorithm feeds us? Are we losing the ability to discover what we like on our own?

This feeling may indicate we are experiencing Algorithmic Anxiety—a deep sense that our power to choose is being intruded upon by algorithms, leading us to doubt whether, amid carefully arranged streams of information, there remains any space for our authentic selves.

The psychological mechanisms of algorithmic influence.

The key principles that give algorithms influence over our thoughts stem from fundamental human tendencies: confirmation bias and the psychological principle of familiarity.

Confirmation bias is the tendency to seek out and favor information that confirms one's existing beliefs. The human brain is wired to prefer consistency with what it already believes and tends to ignore or dismiss conflicting information to reduce the mental effort of analysis. Algorithms amplify this bias by continuously feeding content that seems to 'fit' our preferences.

The algorithm keeps supplying this content more and more, creating an Echo Chamber that misleads us into thinking what we see is the entire truth, causing us to lose opportunities to encounter diverse perspectives.

At the same time, the psychological principle of familiarity, or the Mere-Exposure Effect studied by Robert Zajonc, explains that humans tend to like or feel positively toward things simply because they encounter them frequently. The more we see something, the more we feel safe and automatically like it.

When algorithms repeatedly display the same type of content, they not only show us what we like but also train our brains to 'learn to like' what the system wants us to like. This process blurs the line between tastes arising from internal desires and those shaped by the algorithm to the point where they are almost indistinguishable.

Losing power over oneself.

According to the Self-Determination Theory by Deci and Ryan, humans feel meaningful and happy when their needs for Autonomy, Competence, and Relatedness are met.

When algorithms interfere with our mental autonomy by filtering and selecting all content, our freedom to choose and decide is immediately diminished. We do not select content with genuine intention but pick from menus prepared by the system.

This can lead to a psychological state called decision paralysis, in which one struggles to make choices or, in some cases, slides into mental apathy. When we grow accustomed to receiving pre-filtered information, we lose enthusiasm for exploring new things independently. That apathy is a signal that we are surrendering our sense of 'self' to nothing more than the statistical record an algorithm keeps of us.

Redefining preferences in the age of artificial intelligence.

In the AI era, our preferences may be divided into two types: tastes influenced by algorithms and tastes arising from exploring new experiences.

Accepting that algorithms are tools reflecting only past preferences, not defining future potential, is key to resetting our mindset. We must understand these systems are mirrors reflecting who we were or what we liked. To preserve our identity, we need to push beyond that reflection by shifting perspectives and seeking preferences beyond algorithmic calculation.

Reclaiming power for ourselves.

If you worry about losing your identity and preferences, try these five practical approaches to regain control over your decisions.

1. Create a space beyond the algorithm.

By seeking out content the algorithm has not recommended, or choosing sources outside your usual circle of familiarity, you train your brain to step out of its habits and affirm that the decision-making power still rests with you.

2. Practice mindful media selection.

Conscious information consumption is crucial. Before clicking anything, pause briefly to consider whether it truly reflects your desire or is influenced by the system. This questioning helps you regain control over algorithms and creates psychological distance between you and technology.

3. Clear your data history.

Regularly deleting usage history or resetting personal data cleans your digital space, reducing system influence on your tastes and opening opportunities for new experiences.

4. Seek information independently.

Instead of waiting for algorithms to suggest what you might like, try searching for what genuinely interests you on various platforms or choose channels yourself. This breaks the cycle of being guided and lets you decide what information enters your mind.

5. Manage data access permissions.

Use privacy control tools available on platforms, such as disabling 'personalized recommendations' or limiting tracking permissions. Doing so reduces behavioral data collection that algorithms use to suggest content, making your online space more neutral and less prone to repetitive content.

As long as we keep asking whether our preferences come from algorithms, and keep wondering whether we still have likes of our own, our 'true self' remains active and ready to explore the world on its own terms.

Our true preferences may not be limited to algorithm suggestions but often hide in books we haven't read, places we haven't visited, or ideas we haven't heard before.

Reclaiming power and freedom of thought may start with small decisions every time we engage with the digital world. Choosing not to follow trends blindly or seeking ideas that challenge our beliefs helps us become the ultimate masters of our minds and is a good start to building an identity no algorithm can control.
