Steve Zafeiriou (b. 1998, Thessaloniki, GR) is a New Media Artist, Technologist, and Founder of Saphire Labs. His practice investigates how technology can influence, shape, and occasionally distort the ways individuals perceive the external world. By employing generative algorithms, electronic circuits, and interactive installations, he examines human behavior in relation to the illusory qualities of perceived reality, inviting observers to reconsider their assumptions and interpretations.

The Illusion of Choice, Generative Artwork by Steve Zafeiriou

The Myth of Pure Choice (2025): Why the “Illusion of Choice” Shapes Modern Decision Making

Your choices are more scripted than you think.

You feel like you’re picking your own tools, your own content, your own path.

You scroll, you click, you subscribe.

You call it “intuition”, “taste”, or “what I’m into right now”.

Meanwhile, defaults, cognitive biases, and algorithms are quietly drawing the map you think you’re exploring.

This is the problem of our time:

You live in systems that sell you empowerment through infinite options, while the real game is happening in the framework:

how those options are laid out, ranked, and framed.

Unless you change how you think about choice, you’ll keep confusing feeling free with actually being free.

The illusion of choice happens when you feel autonomous while your decisions are steered by cognitive biases and social algorithms that shape what you see, in what order, and with how much friction.

Across psychology, behavioral economics, AI ethics, and marketing, the pattern is consistent:

  1. Expanding choice sets or personalizing options doesn’t automatically increase autonomy.
  2. Choice architecture channels behavior without removing formal options.
  3. The result is a comfortable story of “self-determined action” built on structurally constrained decision spaces.

You still act. You still choose.

But much of the stage was set before you ever arrived.

Illusion of Choice: Steve Zafeiriou wearing an Emotiv EEG brain-computer interface headset, demonstrating neural activity tracking for interactive art and human attention research.

What Is the “Illusion of Choice”?

The illusion of choice is the gap between how free you feel and how free you actually are inside a given system.

On the surface, you see menus, toggles, feeds, and product grids.

Underneath, you’re moving through predetermined paths:

  1. interfaces that foreground some actions and bury others,
  2. markets where a few players define the shelf,
  3. algorithms that curate and rank an overwhelming world into a finite feed.

This is where choice architecture comes in.

Behavioral economics uses this term to describe how environments shape behavior by design rather than by force.

A supermarket layout, a search results page, or the settings screen of your favorite app doesn’t delete options.

It makes some plausible and others practically invisible.

A few familiar structures:

  1. Subscription platforms with auto-renewal already checked.
  2. Mobile apps that hide privacy settings behind several taps.
  3. Recommendation feeds that surface a thin slice of all possible content.

The environment, the interface, and your own heuristics (mental shortcuts like “pick the top result” or “go with the default”) work together as a hidden system.

These systems can genuinely help, reducing friction, lowering complexity, making navigation easier.

But when they lack transparency or alignment with your interests, they quietly narrow your real power while preserving the feeling of control.

We’ll call this tension “engineered autonomy”:

You feel in charge, but your options are already defined.

Abstract interactive-art visual exploring how perception differs from reality, used as the cover image for a 2025 article on cognitive bias and predictive processing.
Illusion of Choice: interactive Contemporary Performance, exploring how perception differs from reality, developed by Steve Zafeiriou, 2023.

The Psychology Behind the Illusion of Choice

Choice Overload: When “More” Becomes a Trap

More options don’t always mean more freedom; past a certain point, they become a tax on your cognition.

Choice overload describes the paradox where adding options leads to more stress, more avoidance, and less satisfaction.

Misuraca et al. (2024) synthesize research showing that large choice sets often undermine decision quality, especially when:

  1. your preferences are unfocused,
  2. options are hard to compare, or
  3. the stakes feel high but unclear.

Three ways this shows up in your life:

  1. You freeze on streaming platforms because the catalog is massive but your energy is low.
  2. You bounce from tab to tab when shopping, unable to commit and mentally calculating dozens of variables.
  3. You delay decisions entirely because “there’s more research to do”.

In theory, more options mean more possible futures.

In practice, you have limited time, attention, and energy.

In FOMO-heavy environments, this transforms into regret, paralysis, and endless second-guessing.

For designers and marketers, the lesson is brutal:

“We offer everything” often translates to “We exhaust everyone.”

Choice without structure becomes noise.

Entropy and the Sweet Spot of Choice

There is an optimal range of choice where diversity meets clarity. Beyond it, satisfaction breaks down.

Asl et al. (2024) propose an entropy-based model to formalize this.

Entropy here equals informational richness in a choice set:

  1. With too little entropy, everything feels the same (boring and predictable).
  2. With too much entropy, everything feels chaotic (overwhelming and noisy).

Satisfaction tends to follow an inverted-U curve:

  1. It rises as you move from too few to a healthy variety,
  2. peaks at a “sweet spot”, and
  3. then falls as options explode past what your cognition can meaningfully process.
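
To make the inverted-U concrete, here’s a minimal Python sketch. It uses Shannon entropy as the richness measure and a quadratic falloff around an arbitrary two-bit sweet spot; both are illustrative stand-ins, not the actual model from Asl et al. (2024).

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy (in bits) of a choice set's selection probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def satisfaction(entropy_bits, sweet_spot=2.0, width=2.5):
    """Hypothetical inverted-U: peaks at the sweet spot, falls off on both sides."""
    return max(0.0, 1.0 - ((entropy_bits - sweet_spot) / width) ** 2)

# Three menus: near-identical options, healthy variety, overwhelming sprawl.
menus = {
    "too uniform":  [0.97, 0.01, 0.01, 0.01],  # one option dominates
    "sweet spot":   [0.25] * 4,                # ~2 bits of real variety
    "overwhelming": [1 / 64] * 64,             # 6 bits: too much to process
}
for name, probs in menus.items():
    h = shannon_entropy(probs)
    print(f"{name:>12}: entropy={h:.2f} bits, satisfaction={satisfaction(h):.2f}")
```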

If you’ve ever curated a playlist, designed an interactive installation, or built a generative art system, you’ve felt this:

  1. Too few states and the system feels dead.
  2. Too many states and it feels random, ungraspable.

This points to a clear design principle:

Don’t worship abundance. Engineer a sweet spot. Enough options to feel alive; enough structure to feel navigable.

Illusion of Choice: Generative Blockchain Art collection, developed by Steve Zafeiriou based on behavioral responses.

Defaults, Behavioral Signals, and Invisible Constraints

Why Defaults Quietly Run Your Life

If you don’t design your own defaults, someone else already has.

Defaults leverage status quo bias:

The tendency to stick with preselected paths to save time and mental effort.

Most people read a default as:

“This is normal. This is recommended. This is safe.”

Concrete examples:

  1. Privacy settings defaulting to maximum sharing.
  2. Cookie banners with “accept all” highlighted/selected.
  3. Subscriptions rolling over silently unless you actively cancel.

Notice the pattern: nothing is forbidden.

Your power isn’t removed on paper. But the landscape is tilted so that doing nothing becomes the default choice.

For busy, overloaded humans, inaction wins by default.
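
A toy simulation makes the tilt visible. Every number here is an assumption for illustration: each simulated user keeps the default unless an alternative beats it by more than the friction of switching.

```python
import random

random.seed(42)

def share_keeping_default(n_users=100_000, switching_friction=0.3):
    """Toy status quo model: a user abandons the default only when the
    alternative beats it by more than the friction cost of switching."""
    kept = 0
    for _ in range(n_users):
        value_of_default = random.random()      # perceived value, 0..1
        value_of_alternative = random.random()
        if value_of_alternative - switching_friction <= value_of_default:
            kept += 1                           # inaction wins
    return kept / n_users

for friction in (0.0, 0.2, 0.4):
    share = share_keeping_default(switching_friction=friction)
    print(f"friction={friction:.1f}: {share:.0%} of users never leave the default")
```

Even at zero friction, half the simulated users stay put. A few extra taps make inaction the dominant outcome.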

Behavioral Shaping by Algorithms

Contemporary algorithmic systems move beyond generic bias patterns by modeling your unique sensitivities.

Algorithmic nudging combines:

  1. behavioral science (how humans predictably misjudge risk, value, and effort), and
  2. adaptive models that update in real time based on your behavior.

*Algorithmic nudging is the use of algorithms to subtly influence user behavior and guide choices in digital environments without removing options.

Schmauder et al. (2023) outline how these systems can exploit biases in personalized environments.

Luo (2024) shows that seemingly minimal signals, like rating badges or “most popular” tags in e-commerce, can reliably shift what people buy without changing the underlying option set.

Recommender systems watch micro-signals (techniques I also use in my interactive art installations):

  1. how long you hover over a card,
  2. how far you scroll,
  3. what you skip but don’t quite ignore.

Then they adapt.

Then they test again.

Then they adapt again.

Over time, you get a feedback loop:

  1. The system identifies behavioral patterns and vulnerabilities.
  2. It optimizes interventions for engagement, conversion, or retention.
  3. You feel like you’re just “following your preferences.”
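
Here’s a deliberately minimal sketch of that loop, assuming a hidden click-probability “taste” and simple weight reinforcement; real recommenders are vastly more complex, but the narrowing dynamic is the same.

```python
import random

random.seed(7)

GENRES = ["ambient", "techno", "jazz", "classical", "noise"]

def simulate_feed(rounds=500, reinforcement=0.2):
    """Minimal recommender loop: recommend in proportion to learned weights,
    then reinforce whatever earns a click. The catalog narrows on its own."""
    weights = {g: 1.0 for g in GENRES}
    taste = {g: random.random() for g in GENRES}    # the user's hidden taste
    for _ in range(rounds):
        shown = random.choices(GENRES, weights=[weights[g] for g in GENRES])[0]
        clicked = random.random() < taste[shown]    # noisy engagement signal
        if clicked:
            weights[shown] += reinforcement         # double down on what worked
    total = sum(weights.values())
    return {g: weights[g] / total for g in GENRES}

exposure = simulate_feed()
for genre, share in sorted(exposure.items(), key=lambda kv: -kv[1]):
    print(f"{genre:>10}: {share:.0%} of the feed")
```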
Illusion of Choice: Nostalgie World, an interactive installation investigating mental health disorders in the age of technology, exhibited at MATAROA AWARDS 2025, awarded “People’s Choice” at Doncaster Art Fair 2023.

Ethical Boundaries: Prompt vs. Manipulation

The difference between a prompt and manipulation is whose goals are being served, and how visible the guidance is.

Three ethical criteria often used:

  1. Transparency: Is the guidance understandable and discoverable?
  2. Alignment: Does it support the user’s stated or reasonable interests?
  3. Accountability: Can someone be held responsible for harms or exploitation?

Prompts that encourage safety, health, or sustainability, and are openly communicated, tend to be accepted as legitimate choice architecture.

Hidden or exploitative prompts that primarily serve platform or advertiser goals create a power imbalance:

  1. They preserve the appearance of neutral choice,
  2. while systematically shifting your behavior toward their metrics.

At scale, this isn’t just about User Experience.

It’s governance of attention and behavior.

Illusion of Choice: Installation view of ‘Dark Tales’ featuring an AI-agent chat interface projected within an immersive gallery setting, exploring existential philosophy and literature.

How Algorithms Create the Feeling of Autonomy

Personalized Feeds & Filtered Worlds

Personalization shrinks the visible world while making it feel tailor-made just for you.

Lu (2024) and Joseph (2025) show how personalized systems can reduce the actual diversity of options you encounter, even as they increase subjective relevance.

Joseph calls this the “algorithmic self”:

An identity co-authored by AI systems that model your patterns and feed them back to you as “you”.

In my own work with generative and interactive systems, I see it play out like this:

  1. You interact with a system.
  2. The system trains on your behavior, building a distilled profile of your taste.
  3. It returns content that fits that profile.
  4. You internalize that loop as “this is who I am; this is what I like”.

Over time, the system doesn’t just mirror your preferences.

It shapes them.

Your “self” becomes partially defined by what the algorithm learned to predict and reward.
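
Extending the earlier sketch with an assumed mere-exposure effect, where taste itself drifts toward whatever the feed over-represents, shows how the mirror becomes a sculptor:

```python
def drift_taste(taste, feed_share, rate=0.05):
    """Assumed mere-exposure update: taste for each genre moves a small
    step toward its share of the feed, so exposure reshapes preference."""
    return {g: t + rate * (feed_share[g] - t) for g, t in taste.items()}

# A feed that is 70% techno pulls a lukewarm techno taste upward.
taste = {"techno": 0.40, "jazz": 0.35, "ambient": 0.25}
feed  = {"techno": 0.70, "jazz": 0.20, "ambient": 0.10}
for _ in range(50):
    taste = drift_taste(taste, feed)
print({g: round(t, 2) for g, t in taste.items()})
# taste has converged most of the way toward the feed's distribution
```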

Illusion of Choice: algorithmic semantic analysis of refugee stories, investigating generational trauma and memory. Interactive art installation “Synthetic Memories” by Steve Zafeiriou, exhibited at MOMus Museum of Contemporary Art, Thessaloniki, GR, 2025.

Personalization as a Soft Constraint

Soft constraints don’t block options; they make some paths so smooth that others may as well not exist.

Think of:

  1. Auto-curated music playlists or art feeds.
  2. Algorithmic news feeds.
  3. Product suggestions at the top of any marketplace page.

These are all soft constraints. They influence:

  1. what appears first,
  2. what’s easiest to click,
  3. what’s buried behind extra steps.

Because the system continually tunes itself to your past behavior, it encourages more of the same.

Predictability increases. Feedback loops form.

Exploration shrinks.

You’re still allowed to roam. You just rarely do.

Illusion of Choice: Waveshare ESP32-S3 1.69-inch display module running custom digital artwork.

When More Choice Isn’t More Freedom

In modern interfaces, you face “choice compression”: formally infinite options, practically narrowed decisions.

Here’s how it plays out:

  1. The nominal number of options explodes (more products, more content, more creators, more everything).
  2. Interfaces highlight a tiny subset as the “main functionality”.
  3. Ranking algorithms spotlight some items and effectively hide others.
  4. Defaults and algorithmic nudges steer you toward the most likely outcomes.

The visible world collapses into a thin band of highly probable alternatives.
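
A back-of-the-envelope position-bias model illustrates the compression; the 1/rank falloff is a common simplification, not a measured law:

```python
# Position-bias sketch: assume attention falls off roughly as 1/rank.
catalog_size = 10_000
attention = [1 / rank for rank in range(1, catalog_size + 1)]
total = sum(attention)
top10_share = sum(attention[:10]) / total
print(f"top 10 of {catalog_size:,} items capture {top10_share:.0%} of attention")
# -> roughly 30%, from 0.1% of the catalog
```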

This is common across:

  1. digital markets driven by recommendation based distribution,
  2. cultural platforms where a handful of artifacts go viral,
  3. creative ecosystems where algorithms become the new curators.

The big shift: You don’t live in a world of scarcity anymore.

You live in a world where attention is scarce and visibility is algorithmically rationed.

If this hits home, I wrote a deeper essay on the Psychology of Interactivity and attracting Gen Z.

Illusion of Choice: GeoVision’s interactive controller, which explores cultural interpretations. Interactive Art Installation developed by Steve Zafeiriou.

How to Recognize the Illusion of Choice in Daily Life

You can’t fix what you don’t see; so start spotting the structures.

Use these quick diagnostics when navigating any digital system:

  1. Watch the defaults
    • What comes preselected?
    • What happens if you do nothing?
  2. Study the real estate
    • Does recommended content dominate the screen?
    • Are alternative paths visually minimized or hidden?
  3. Track friction
    • Which actions take one tap versus five?
    • What’s easy to start, hard to stop, or tedious to change?
  4. Follow the incentives
    • Does the platform benefit more from your wellbeing or your engagement?
    • Where do ads or commercial interests intersect with “recommendations”?

You’ll see this in:

  1. endless autoplay video queues,
  2. checkout flows that push “recommended bundles”,
  3. e-commerce pages where “recommended for you” overshadows neutral browsing.
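
If you want to operationalize these diagnostics, a hypothetical audit scorecard might look like this; every field name here is my own assumption, not a standard metric:

```python
from dataclasses import dataclass

@dataclass
class InterfaceAudit:
    """Rough scorecard for the four diagnostics; all fields are illustrative."""
    preselected_options: int          # watch the defaults
    recommended_screen_share: float   # study the real estate (0..1)
    taps_to_start: int                # track friction: how easy to begin
    taps_to_stop: int                 # track friction: how hard to leave
    ad_driven: bool                   # follow the incentives

    def friction_asymmetry(self) -> int:
        """Positive = quitting is harder than starting: a classic tilt."""
        return self.taps_to_stop - self.taps_to_start

streaming_app = InterfaceAudit(
    preselected_options=3, recommended_screen_share=0.8,
    taps_to_start=1, taps_to_stop=6, ad_driven=True,
)
print("friction asymmetry:", streaming_app.friction_asymmetry())  # 5 taps harder to leave
```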

Recognizing these patterns doesn’t magically free you.

But it breaks the spell.

You shift out of passive observation, start noticing the mechanics behind the game, and become far more selective with where you place your intent.

Illusion of Choice: real-time heart rate and blood oxygen (SpO2) readings from a DFRobot MAX30102 sensor, visualized for Arduino health-monitoring and biometric sensor projects.

How to Reclaim More Real Autonomy

You won’t escape systems. But you can negotiate with them.

Think of autonomy in our age as a three-layer practice:

  1. Cognitive strategies (how you decide),
  2. Technological strategies (how you configure systems),
  3. Structural strategies (how systems and policies evolve).

Cognitive Strategies: Upgrade Your “Choice Hygiene”

If your mind is overloaded, no interface setting will save you.

Three practical moves:

  1. Slow down key decisions
    • Add artificial friction: wait 24 hours before large purchases, disable late-night shopping, or batch decisions into specific windows. This counters impulse and gives the system less power over long-term outcomes.
  2. Create personal defaults (the most powerful)
    • Decide your privacy settings, notification rules, or content filters. Treat these as “user-authored defaults” that override platform defaults wherever possible.
  3. Limit decision scope upfront
    • Define constraints before you dive in (“I’ll compare 3 options, not 30”). This shrinks the entropy to a level your cognition can handle.

Choice hygiene means proactively cleaning and structuring your decision environment instead of walking into a mess and trying to think clearly.
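
As a sketch of “artificial friction”, a self-imposed cooling-off rule could be as simple as this; the 24-hour window is the arbitrary personal rule from the list above:

```python
from datetime import datetime, timedelta

COOLING_OFF = timedelta(hours=24)   # self-imposed rule, not a platform feature

def can_purchase(added_to_cart_at: datetime, now: datetime) -> bool:
    """Artificial friction: allow a large purchase only after the window passes."""
    return now - added_to_cart_at >= COOLING_OFF

added = datetime(2025, 1, 10, 22, 30)                    # late-night impulse
print(can_purchase(added, datetime(2025, 1, 11, 9, 0)))  # False: still cooling off
print(can_purchase(added, datetime(2025, 1, 12, 9, 0)))  # True: decide with fresh eyes
```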

Illusion of Choice: using machine learning and prediction algorithms in a projection mapping performance setup (MediaPipe body tracking).

Technological Strategies: Bend the Tools Back Toward You

If you can’t leave the system, reconfigure it.

Some simple, high leverage actions:

  1. Disable or tame autoplay
    • Stopping infinite feeds and autorun queues gives you back natural stopping points.
  2. Favor chronological or less personalized views when possible
    • A raw timeline can be more cognitively demanding, but it exposes more of the real landscape.
  3. Audit recommendations and tracking settings
    • Use “Why am I seeing this?” explanations where available.

These are not perfect shields, but they move you toward permissionless leverage over your own attention, instead of letting every platform act as your uninvited copilot.

Organizational and Policy Level Approaches

You have two priorities if you design systems:

People and Product.

If you don’t build the product, you still live inside someone else’s.

Ethical choice architecture in organizations leans on three principles:

  1. Transparent defaults: Users should know what “normal” is and how to change it.
  2. Minimal viable choice sets: Enough options to respect autonomy, not so many that you induce paralysis.
  3. Human-centric technology: Interventions that are clearly beneficial to users, not just to metrics.
Illusion of Choice: GeoVision, Interactive Art Installation developed by Steve Zafeiriou.

Conclusion

Modern systems promise you infinite choice, but your real autonomy sits at the intersection of cognitive limits, interface design, and algorithmic curation.

As digital infrastructures increasingly mediate culture, markets, and creativity, understanding how your decisions are scaffolded isn’t optional anymore; it’s part of being an influential human in 2025.

Seeing the illusion of choice doesn’t make you powerless.

It makes you precise.

You stop mistaking every feeling of “I chose this” for a clean expression of free will and start asking: 

Who built this path, and why does it feel so natural to walk it?

From there, the next step is yours:

Audit one interface you use today, spot the defaults and nudges, and decide which of them you’re willing to accept, and which you’ll override.

Frequently Asked Questions (FAQ)

What is the illusion of choice in psychology?

It’s the discrepancy between how free you feel and how free you actually are, once you account for defaults, nudges, design structures, and recommendation systems that steer decisions without overt coercion.

Do more options always create more freedom?

No. Beyond a certain threshold, additional options often increase stress, avoidance, and regret, undermining both decision quality and satisfaction.

How do algorithms affect free choice?

Algorithms filter, rank, and prioritize information, subtly influencing you toward particular outcomes while maintaining the sense that you’re just “following your own preferences”.

Are nudges manipulative?

They become manipulative when they’re hidden, misaligned with your interests, or deployed primarily for platform or advertiser gain rather than user wellbeing.

How can I avoid the illusion of choice online?

Disable autoplay where possible, adjust privacy and personalization defaults, favor chronological feeds, and periodically audit recommendation settings to keep your environment aligned with your actual goals.

Is the illusion of choice always bad?

Not necessarily. Thoughtful constraints can reduce cognitive overload and support better decisions; the problem arises when those constraints are opaque, exploitative, or impossible to negotiate.
