Steve Zafeiriou (b. 1998, Thessaloniki, GR) is a New Media Artist, Technologist, and Founder of Saphire Labs. His practice investigates how technology can influence, shape, and occasionally distort the ways individuals perceive the external world. By employing generative algorithms, electronic circuits, and interactive installations, he examines human behavior in relation to the illusory qualities of perceived reality, inviting observers to reconsider their assumptions and interpretations.


Your Perception Is Not Your Reality (2025): Why Interactive Art Is the Safest Space to Test Belief

Most people confuse how reality feels with how reality is.

That’s the trap.

Your brain takes a messy stream of incomplete sensory data, runs it through a lifetime of assumptions, biases, and bodily states, and then hands you a polished experience that says, “This is the world.”

You rarely see the editing process. You just see the final export and call it “truth.”

In contemporary cognitive science, perception isn’t treated as a passive recording of reality but as a predictive, constructive process (Sprevak, 2023).

Predictive processing models describe the brain as a hierarchical prediction engine that continuously minimises the mismatch between expected and actual sensory input (Ficco et al., 2021; Walsh et al., 2020).

What you “see” is, in large part, the brain’s best guess about what’s out there, constrained by prior experience and current context (Hohwy, 2025; McGovern & Otten, 2024).

Interactive art is one of the few places where that invisible process gets dragged into the spotlight.

Step into a responsive installation and you can feel your sense of agency, certainty, and meaning being negotiated in real time.

The room moves. You move. The system reacts. You react.

Suddenly, you’re not just “in” a space — you’re aware that your body, your prior beliefs, and the system are co-authoring your experience.

I’ve spent years building these kinds of environments.

Not just as “cool tech art,” but as deliberate, controlled systems where people can watch their own cognition glitch, adapt, and reorganize itself, without the real-world cost of being wrong.

Nostalgie World: Interactive installation exhibited at MATAROA AWARDS 2025

Work on agency in human–computer interaction shows that exactly these timing and feedback relationships, when and how a system responds to what you do, are central to whether people feel in control (Limerick et al., 2014; Yu et al., 2024; Dutta, 2025).

This essay explores two core claims:

  1. Your perception is not your reality.
  2. Interactive art is one of the safest, most revealing laboratories for testing belief, cognitive bias, and human–computer interaction (HCI).

Drawing from my own installations and research on predictive processing, embodied cognition, and multisensory integration, we’ll look at:

  1. How the brain actually constructs experience.
  2. Why controlled interactive environments make those mechanics visible.
  3. How institutions can treat these systems as tools for epistemic inquiry, not just entertainment.

You can think of all of this as a high-agency sandbox for testing what you think is true.

Perception is not reality because your brain doesn’t show you the world as it is.

It shows you its best guess based on limited sensory cues, emotional state, and past experience, in line with predictive processing and Bayesian “brain as prediction machine” frameworks (Sprevak, 2023; Walsh et al., 2020; Ficco et al., 2021; Hohwy, 2025).

Interactive art makes this construction process obvious by putting you inside responsive, low-risk environments where your expectations shape what the system does.

You see your beliefs influence the “world,” watch them fail, and then update, all without social, financial, or physical consequences.

You get to observe your own cognition in motion, instead of mistaking it for “how things just are.”

Interactive art installation titled 'Synthetic Memories' by Steve Zafeiriou, showcasing a digital memory network on a vertical screen connected to a curated set of vintage images and a handheld interface.

What Does “Perception Is Not Reality” Really Mean?

Let’s move from slogan to mechanism.

When psychologists say perception is not reality, they’re not playing word games.

They’re pointing to how your nervous system is wired.

Your brain is not a camera. It’s a prediction engine (Sprevak, 2023; Walsh et al., 2020).

Predictive processing models describe perception and cognition as hierarchical prediction:

The brain tries to minimise prediction error by comparing incoming sensory data to internal models and updating those models when mismatches get too large (Ficco et al., 2021; Piekarski, 2021).

Empirical and meta-analytic work suggests this predictive architecture shows up across multiple brain systems (Walsh et al., 2020; Ficco et al., 2021).

Most of the time, this works beautifully.

You walk, talk, cross the street, send messages, navigate rooms, with minimal conscious effort. But the trade-off is brutal:

You feel certain long before you are accurate.

Predictive frameworks are increasingly used to explain not just low-level perception but higher-level belief, prejudice, and social interpretation; including how priors shape what we “see” in other people (McGovern & Otten, 2024).

Immersive digital art scene illustrating the idea that perception is not reality, showing shifting light and sensory distortion inside an interactive installation.

The Brain as a Prediction Engine

Here’s the basic loop:

  1. Your brain predicts what’s happening (“That’s a face”, “That’s my reflection”, “This sound is behind me”).
  2. Incoming sensory data is compared against that prediction.
  3. Any mismatch (prediction error) leads to an update, suppression, or reinterpretation (Walsh et al., 2020; Sprevak, 2023).

Your sensory systems don’t give you a full-resolution model of reality.

They give you fragments.

Your mind fills the gaps with priors, past experiences, beliefs, narratives, and bodily states (Hohwy, 2025; Piekarski, 2021).
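
To make that loop concrete, here is a deliberately toy Python sketch of a single-level predictive update, assuming one scalar feature (say, perceived brightness), made-up precision values, and a crude surprise threshold; it illustrates the logic described above, not any real neural implementation.

```python
# Toy sketch of a single-level predictive loop: one scalar "feature"
# (e.g. perceived brightness), a precision-weighted update, and a crude
# surprise threshold. Small errors are effectively suppressed; large,
# reliable errors revise the internal model. All numbers are invented.

def predictive_loop(sensory_stream, belief=0.5, prior_precision=4.0,
                    sensory_precision=1.0, surprise_threshold=0.05):
    for observation in sensory_stream:
        prediction = belief                      # 1. predict what's happening
        error = observation - prediction         # 2. compare with sensory data
        # Precision-weighted gain: how much this mismatch is allowed to matter.
        gain = sensory_precision / (prior_precision + sensory_precision)
        if abs(error) < surprise_threshold:
            update = 0.0                         # 3a. small mismatch: keep the model
        else:
            update = gain * error                # 3b. large mismatch: revise the model
        belief += update
        print(f"saw {observation:.2f}, expected {prediction:.2f}, "
              f"updated belief {belief:.2f}")
    return belief

# The "world" drifts away from the prior, and the belief slowly follows.
predictive_loop([0.52, 0.55, 0.70, 0.90, 0.88])
```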

This is why you can:

  1. Completely miss a gorilla walking through a scene (inattentional blindness) or fail to notice large changes in a visual display (change blindness) (Simons & Levin, 1997; Shapiro, 2000).
  2. Only notice evidence that confirms your existing beliefs; the classic confirmation bias, where you seek, interpret, and remember information in ways that reinforce what you already think (Nickerson, 1998; Berthet, 2022; Wikipedia, 2024).
  3. Have your perception shifted by subtle bodily cues like posture, gesture, and movement — as embodied cognition accounts of perception emphasise (Stanford Encyclopedia of Philosophy, 2021; Noë, 2004).

All of that is your prediction engine at work.

We’ll label this whole dynamic “constructed perception”.

DIY motion capture system utilizing an ESP32 microcontroller and an MPU6050 sensor, designed for real-time movement tracking and inertial measurement applications.

Change Blindness, Confirmation Bias, and Embodied Cognition

A few key findings that illustrate the gap between perception and reality:

  1. Change blindness: People can fail to notice large, obvious changes in a scene when their attention is elsewhere. Work by Simons and Levin (1997) helped establish this as a core attentional phenomenon. Later studies have explored change blindness in both lab and real-world environments, including museum contexts where artefacts change in front of visitors (Shapiro, 2000; Attwood et al., 2018; Gunnell et al., 2019; Andermane et al., 2019).
  2. Confirmation bias: As Nickerson (1998) argued, confirmation bias is a “ubiquitous phenomenon in many guises,” where people preferentially seek, interpret, and recall evidence that supports prior beliefs. More recent reviews show this isn’t just a weird lab effect — it shows up in professional decision-making and risk assessment as well (Berthet, 2022; Wikipedia, 2024).
  3. Embodied cognition: On embodied and enactive accounts, perception is not just something that happens in the head but a skilful, bodily activity. The Stanford Encyclopedia of Philosophy summarises this as the idea that cognitive processes are deeply dependent on the body’s interactions with the world (Stanford Encyclopedia of Philosophy, 2021). Noë (2004) famously argues that seeing is a form of sensorimotor exploration rather than passive intake.

These aren’t edge cases. These are core features of how the system runs.

We’ll label this cluster of limitations and shortcuts “the illusion of obviousness”: the sense that “things are just this way” when really, they’re just this way for your nervous system, in this context.

Why Feeling Gets Mistaken for Fact

The problem is simple and painful:

  1. Subjective experience arrives instantly.
  2. Reflective evaluation is slow, effortful, and optional.

Kahneman’s System 1 / System 2 model is one popular way of carving this up: fast, intuitive, automatic processes versus slower, deliberative, analytic ones (Kahneman, 2011).

System 1 outputs tend to arrive as felt truths; they feel like direct perception, even when they’re heavily biased.

By the time you question what you’re seeing or feeling, the brain has already delivered a fully rendered experience with a little internal label that says: this is real.

You end up mistaking:

  1. Speed for truth.
  2. Familiarity for accuracy.
  3. Emotional intensity for evidence.

Hoffman’s “interface theory of perception” pushes this further:

Perception is framed as a fitness-optimised user interface, not a transparent window onto objective reality (Hoffman, 2019; Hoffman et al., n.d.).

On this view, what you experience is closer to a desktop icon than the underlying circuitry; useful, but not literally true.

That’s how you get trapped in the illusion of obviousness.


How the Brain Forms “Best Guesses” and How Art Breaks Them

Your perceptual system is confronted with ambiguous input 24/7.

It has to collapse that ambiguity into something actionable.

So it does this:

  1. Draws from priors: “What has this pattern usually meant?” (Ficco et al., 2021; McGovern & Otten, 2024)
  2. Picks the most plausible story: “Given my history, this is probably what’s happening.”
  3. Treats that story as reality until something forces an update (Hohwy, 2025; Walsh et al., 2020).

Recent philosophical and empirical work in predictive processing emphasises that this is a general strategy for dealing with uncertainty:

Minimise prediction error relative to internal generative models, even in social and value-laden contexts (Hohwy, 2025; McGovern & Otten, 2024).

Efficient? Yes. Accurate? Often enough to survive. Stable? Not at all, especially when you step into an environment designed to break those priors.

Interactive artworks deliberately leverage this vulnerability by introducing:

  1. Slight delays where you expect instant response (Limerick et al., 2014; Yu et al., 2024).
  2. Non-linear mappings between your behavior and the system output.
  3. Mismatched cues between vision, sound, and movement, exploiting what we know about multisensory integration (Stein, 2020; Cornelio et al., 2021; Newell, 2023).

The goal isn’t to “trick” you for fun.

It’s to create a controlled perceptual mismatch that exposes how fragile those “best guesses” really are.
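
As a concrete illustration of the first two ingredients, here is a minimal, purely hypothetical Python sketch of a non-linear gesture-to-light mapping with a short added delay; the curve, drift term, and timings are invented for the example, not taken from any specific installation.

```python
import math
import time

# Hypothetical mapping from gesture "intensity" (0.0-1.0) to a light level.
# A square-root curve compresses the gap between small and large movements,
# a slow time-based drift means the same gesture gets a different answer a
# minute later, and a short delay makes causality ambiguous.

def light_level(gesture_intensity: float) -> float:
    drift = 0.2 * math.sin(time.time() / 7.0)
    return max(0.0, min(1.0, 0.8 * math.sqrt(gesture_intensity) + drift))

def respond(gesture_intensity: float, delay_s: float = 0.4) -> float:
    time.sleep(delay_s)            # the expectation of an instant response breaks here
    return light_level(gesture_intensity)

# A 9x difference in gesture size collapses to roughly a 3x difference in output.
print(respond(0.1), respond(0.9))
```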

When your internal model of “how this should work” shatters, you get a clear view of the prediction engine underneath.


Why Interactive Art Is Uniquely Suited to Explore This Gap

Most environments punish you for being wrong.

Jobs, social media, public speaking, relationships: they all carry social, financial, or reputational cost.

So you cling to certainty.

You avoid experiments.

You resist uncertainty because it threatens your identity and safety.

Interactive art is different.

It’s a bounded system. The stakes are low.

You’re allowed to play, fail, and re-interpret without blowing up your life.

Controlled Environments and Safe Cognitive Risk

An interactive installation is a highly constrained world:

  1. Clear physical boundaries.
  2. Defined rules, even if hidden.
  3. Limited time, context, and consequence.

Inside those constraints, you get permissionless cognitive risk:

  1. You can test weird hypotheses about “what makes this react.”
  2. You can be wrong repeatedly in front of strangers and nobody cares.
  3. You can explore uncertainty for its own sake, not for status or survival.

Attwood et al. (2018) show how change blindness can be elicited in real museum environments, where visitors still treat the context as contemplative and safe.

That’s the same psychological safety interactive art leans on:

You know (even unconsciously) that “this is just art”, so your nervous system relaxes. Curiosity goes up. Defensiveness goes down.

This is rare. And powerful.

Technology Arts: Developing Sensorify by Steve Zafeiriou

Real-Time Feedback Loops Expose Hidden Assumptions

Interactive installations are basically feedback machines:

  1. You move → the system responds.
  2. You speak → the system shifts.
  3. You change posture → the light changes.

HCI and agency research shows that timing, predictability, and feedback structure are core determinants of whether people feel like they’re the ones causing outcomes (Limerick et al., 2014; Yu et al., 2024; Cornelio et al., 2022; Zanatto et al., 2024; Dutta, 2025).

When that loop behaves as expected, your brain stays on autopilot. When it behaves unexpectedly, you’re forced to confront your assumptions:

  1. “Why did it react to that but not this?”
  2. “Is it listening to my voice, my movement, my proximity?”
  3. “Am I in control or is it doing its own thing?”

That’s the sweet spot.

This is where you become aware of your internal model of causality: what you believe leads to what.

You get to see, in real time, where that model is wrong.

Embodied Interaction Bypasses Pure Intellect

Reading about bias engages your intellect.

Walking through a room that reacts to your micro-movements engages your entire sensorimotor system.

When:

  1. Touch sensors respond to your fingertips,
  2. Motion tracking follows your gait,
  3. Mixed reality overlays track your position,

…you start relying on tacit knowledge; body-level intuition, implicit beliefs, and predictive shortcuts that never show up in a verbal explanation.

This is where embodied cognition hits home.

Instead of saying, “Here’s the right way to use this interface”, the installation says:

“Here’s a system. Move. See what your body thinks is true.”

Close-up of ESP32-S3 1.69 inch display powered on, showing a graphics demo on TFT screen with visible wiring and tools in background

Case Studies from Interactive Installations

Let’s ground all this theory in three real systems I’ve actually built, places where people didn’t just read about perception but walked straight into it:

  1. Dark Tales
  2. Sensorify
  3. Nostalgie World

Each one targets a different fault line between prediction and reality: narrative, sensation, and memory.

Why Interactive Art Works

So why does all of this work so well?

Because interactive art plugs directly into a few core psychological dynamics.

Illusions of Control

Humans are wired to overestimate their influence on events.

Langer’s classic “illusion of control” work and later reviews in domains like gambling show people experience agency even in largely random situations (Langer, 1975; Clark, 2021).

Interactive systems can amplify or disrupt this by:

  1. Over-rewarding tiny, incidental gestures.
  2. Ignoring obvious, deliberate actions.
  3. Mapping inputs to outputs in non-linear ways.

When a piece sometimes reacts dramatically to a micro-movement and sometimes “ignores” a big gesture, your sense of agency gets scrambled.
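
One way to build that scrambling, sketched here with invented probabilities rather than values from any real piece, is a response policy that deliberately decouples effort from effect:

```python
import random

# Illustrative response policy: tiny, incidental gestures are sometimes
# rewarded dramatically, while big, deliberate gestures are often ignored.
# The thresholds and probabilities are arbitrary placeholders.

def system_response(gesture_magnitude: float) -> str:
    if gesture_magnitude < 0.2:
        return "dramatic" if random.random() < 0.6 else "none"
    return "subtle" if random.random() < 0.3 else "none"

for magnitude in (0.05, 0.05, 0.8, 0.9):
    print(magnitude, "->", system_response(magnitude))
```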

You start to realize:

“My feeling of control is not reliable data.”

This is painful at first. Then liberating.

Because once you see that illusion clearly in an art space, you’re more likely to question it in daily life.

Live AV performance of the ‘Qualia’ collaboration featuring NTH Dance Company dancers, immersive audio-visual projection mapping and kinetic light choreography

The Body as Perceptual Interface

In many installations, your body is the interface:

  • How you stand shapes what the system does.
  • How fast you move changes what you hear or see.
  • Your reach, gaze, and orientation become inputs.

This makes embodied cognition non-abstract.

Perception becomes something you do, not just something that happens to you (Noë, 2004; Stanford Encyclopedia of Philosophy, 2021).

The body stops being a passive “vessel” and becomes an active co-creator of experience.

Play as Cognitive Softener

The moment a space feels like a game, your defenses drop.

You’re more willing to:

  1. Try stupid ideas.
  2. Test bizarre hypotheses.
  3. Admit “I was wrong about how this works”… without shame.

This is crucial, because belief revision in real life is often tied to ego and identity.

In an installation, it’s just part of the experience.

MAX30102 Heart Rate and SpO2 Results on Display – Real-time heart rate monitoring and blood oxygen (SpO2) levels recorded using the DFRobot MAX30102 sensor, visualized for Arduino health monitoring and biometric sensor projects.

Interactive Art as a Lab for Human–Computer Interaction

From an HCI perspective, interactive installations are basically live experiments disguised as experiences.

They’re informal, but rich in data:

  1. How do people behave when there are no instructions?
  2. Where do they assume the “controls” are?
  3. How much delay can they tolerate before calling the system “broken”?
  4. Which cues (sound, light, vibration, motion) do they trust first?

Nam and Nitsche (2014) explicitly frame interactive installations as performative systems that can inspire and inform HCI research, treating audience behavior as a probe into interaction paradigms.

Installations as Live HCI Experiments

Because visitors walk in with unfiltered, everyday behavior, installations offer:

  1. Ecologically valid reactions (not lab-primed behavior).
  2. Unscripted navigation patterns.
  3. Real-time improvisation under uncertainty.

You see how people actually interpret ambiguous systems without onboarding, tooltips, or help menus.

Research on sense of agency in human–computer interaction, emerging technologies, and human–AI systems gives a formal vocabulary for what you observe:

How factors like automation level, transparency, and feedback dynamics shift perceived authorship (Limerick et al., 2014; Yu et al., 2024; Cornelio et al., 2022; Legaspi et al., 2024; Zanatto et al., 2024).

Side-by-side comparison of an Arduino Nano and Ultrasonic Sensor setup detecting hand motion, paired with dynamic fluid-like visuals in TouchDesigner, showcasing real-time interaction and generative art.

Testing Interface Assumptions Through Play

Every interactive piece is, in some sense, an interface test:

  1. What do visitors assume “counts” as input?
  2. How do they interpret feedback loops they don’t fully understand?
  3. When do they give up? When do they persist?

By watching how people experiment with mappings (“If I do X, will it do Y?”), you get a direct look at their internal models of digital systems.

This is especially relevant for:

  1. Emerging AI interfaces.
  2. Mixed reality environments.
  3. Reactive architectural systems (Cornelio et al., 2021; Sklar, 2025).

Data Opportunities (and Constraints)

With motion tracking, proximity sensing, and reaction logging, it’s possible to capture:

  1. Navigation heatmaps.
  2. Timing and latency tolerance.
  3. Common exploration strategies.
  4. Points of confusion or delight.

All of this parallels how change blindness demonstrations are used to recalibrate risk perception in contexts like driver awareness courses (Gunnell et al., 2019).

The difference is that interactive art couples that recalibration with aesthetic and reflective framing.
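
As a sketch of what consent-aware capture could look like, here is a minimal, hypothetical logger that bins positions into a coarse heatmap and records response latencies; the class name, grid size, and fields are invented for illustration, and nothing is stored without an explicit consent flag.

```python
from collections import Counter
import time

# Minimal, hypothetical interaction logger: coarse spatial bins plus
# response-latency samples. No identities, no raw video; nothing is
# recorded unless the session has been explicitly consented.

class InteractionLog:
    def __init__(self, consented: bool, grid_size_m: float = 0.5):
        self.consented = consented
        self.grid_size_m = grid_size_m      # metres per heatmap cell
        self.heatmap = Counter()            # (cell_x, cell_y) -> visit count
        self.latencies = []                 # seconds between action and response

    def log_position(self, x_m: float, y_m: float):
        if self.consented:
            cell = (int(x_m / self.grid_size_m), int(y_m / self.grid_size_m))
            self.heatmap[cell] += 1

    def log_latency(self, action_time: float, response_time: float):
        if self.consented:
            self.latencies.append(response_time - action_time)

# Usage: aggregate, anonymous data only.
log = InteractionLog(consented=True)
log.log_position(1.3, 2.1)
now = time.time()
log.log_latency(now, now + 0.42)
print(dict(log.heatmap), log.latencies)
```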

But there’s a line that must not be crossed.

Ethics: Insight, Not Surveillance

Any time you’re logging data on human behavior, you enter ethical territory.

Key constraints:

  1. Consent: Visitors should know what’s being captured.
  2. Clarity: Explain how data will be used (research, iteration, not profiling).
  3. Privacy: Avoid identifiable, sensitive data where possible.
  4. Interpretation discipline: Don’t pretend playful exploration is a clinical diagnosis.

The rule is simple:

Data must serve insight and better design, not surveillance or control.

Implications for Museums, Universities, and Innovation Labs

If you run an institution that claims to “challenge assumptions” or “expand minds,” interactive art isn’t just a nice-to-have.

Designing Environments with Cognitive Friction + Safety

The ideal installation for belief testing combines:

  1. Safety: No social, financial, or permanent cost to being wrong.
  2. Cognitive friction: Enough unpredictability to force model updates.
  3. Clear boundaries: People know where the “game” starts and ends.

You want visitors to feel:

  1. Comfortable enough to explore.
  2. Challenged enough to question their default path of perception.

Real-world museum work on change blindness shows how exhibition settings can host powerful demonstrations of how much we miss, without making people feel attacked or defective (Attwood et al., 2018).


Facilitated Reflection: Turning Experience into Insight

Raw experience is powerful, but it’s not automatically meaningful.

Institutions can amplify impact using:

  1. Short guided debriefs.
  2. Reflection prompts (“What did you assume at first?”, “When did you realize you were wrong?”).
  3. Post-interaction materials (sketch cards, digital follow-ups).

Simple micro-interviews or sketch-based reflection can reveal:

  1. How people narrate their own perceptual errors.
  2. What they learned about their assumptions.
  3. How they might apply that insight elsewhere.

Qualitative methods often reveal more than dashboards of metrics.

Measuring Cognitive Shift

Instead of obsessing over “engagement time” and “foot traffic”, measure:

  1. Changes in explanation before vs after the piece.
  2. The richness of language people use about their own mind.
  3. Their willingness to entertain uncertainty.

This shifts the focus from “How long did they stay?” to “How deeply did they see themselves?”
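
One crude, illustrative way to operationalise this, assuming you collect short written explanations before and after a visit, is to count simple markers of hedging and self-reflection; the word lists below are invented placeholders, not a validated instrument.

```python
# Crude proxy for "cognitive shift": compare how often visitors hedge
# ("maybe", "might") and talk about their own assumptions ("assumed",
# "noticed") before vs after the piece. Word lists are illustrative only.

HEDGES = {"maybe", "perhaps", "might", "possibly", "unsure"}
REFLECTION_WORDS = {"assumed", "expected", "noticed", "realized", "believed"}

def markers(text: str) -> dict:
    words = [w.strip(".,!?") for w in text.lower().split()]
    return {
        "hedges": sum(w in HEDGES for w in words),
        "reflection": sum(w in REFLECTION_WORDS for w in words),
    }

before = "It reacts to sound. Simple."
after = "I assumed it tracked my voice, but maybe it noticed my movement instead."
print("before:", markers(before), "after:", markers(after))
```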

Sensor data to TouchDesigner: Steve Zafeiriou presenting the DIY motion capture controller, transferring real-time data to TouchDesigner and controlling a 3D model.

Prototyping Future Technology

Interactive installations are prototypes for tomorrow’s interfaces:

  1. How will people behave in fully reactive environments?
  2. What kinds of delays, glitches, or unpredictability feel insightful vs broken?
  3. How will people understand AI systems they can’t fully explain?

Cornelio et al. (2021) explicitly tie multisensory integration research to technological advances and immersive systems, making installations a perfect testbed for how people will relate to future human–computer integrations.

By treating exhibitions as test beds, museums and labs can see early how humans adapt to new digital spaces; before those spaces show up in offices, homes, and cities.

How to Build Art That Reveals Perception

If you’re an artist, designer, or researcher, here’s a simple framework for building installations that expose the perception-reality gap.

1. Engineer Uncertainty and Feedback

Create systems that:

  1. React in ways that are responsive but not obvious.
  2. Introduce subtle mismatches between expectation and output.
  3. Make people ask: “What is this actually responding to?”

Examples:

  1. Non-linear mappings between gesture and visual output.
  2. Slight delays that make causality ambiguous.
  3. Multisensory conflicts (sound vs sight vs vibration).

This makes predictive processing visible.

People feel their own “best guesses” breaking.
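
For the multisensory-conflict example, one minimal approach, sketched below with invented timings, is to answer the same visitor action in two modalities at slightly different, jittered delays, so sight and sound disagree about when the response happened.

```python
import random
import threading

# Hypothetical cross-modal conflict: one trigger drives both a visual and an
# audio response, but the audio arrives after a jittered delay, so the two
# senses give conflicting evidence about what caused what.

def trigger(visual_cue, audio_cue):
    visual_cue()                                   # immediate visual response
    audio_delay_s = random.uniform(0.2, 1.2)       # jittered audio offset
    threading.Timer(audio_delay_s, audio_cue).start()

trigger(lambda: print("light pulse"), lambda: print("tone"))
```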

2. Design Safe Failure

Failure should be:

  1. Reversible.
  2. Non-humiliating.
  3. Free from personal or social penalty.

When the cost of being wrong is low, people shift from avoidance to exploration.

They tinker. They iterate. They become researchers of their own perception.

3. Use Multisensory Integration

The more modalities you combine, the louder the construction process becomes:

  1. Light + sound + haptics.
  2. Motion + spatial audio + projection.
  3. Mixed reality overlays + physical props.

4. Document and Iterate Rigorously

Treat your installation like a living lab:

  1. Record (with consent) how people move, where they hesitate, what they say.
  2. Capture qualitative comments and sketches.
  3. Look for patterns of misinterpretation, delight, frustration, or breakthrough.

Then:

  1. Adjust mappings.
  2. Refine constraints.
  3. Tune friction levels.

You’re not just making a “piece.”

You’re building a system for revealing minds to themselves, which aligns with how predictive processing and agency research model the relationship between action, feedback, and belief (Dutta, 2025; Limerick et al., 2014; Yu et al., 2024; Cornelio et al., 2022).

TouchDesigner-based digital visualization of a memory network with interconnected floating images and labeled nodes, representing synthetic memory reconstruction in a dark, immersive interface.

What We Learn When Art Shows Us Our Minds

Interactive art proves something uncomfortable and freeing:

  1. Your perception is constructed, not downloaded (Sprevak, 2023; Walsh et al., 2020; Ficco et al., 2021; Hohwy, 2025).
  2. It is partial, not complete (Simons & Levin, 1997; Attwood et al., 2018; Andermane et al., 2019).
  3. It is negotiable, not fixed (McGovern & Otten, 2024; Noë, 2004; Newell, 2023).

When you step into a responsive system that doesn’t behave how you expect, you get to see:

  1. Beliefs forming.
  2. Beliefs failing.
  3. Beliefs reforming in real time.

These installations become mirrors: they don’t just show you “a world”; they show you how you build a world inside your head and body.

For institutions, the takeaway is blunt:

If you want to change how people think, don’t just give them information. Put them in environments where they can safely test what they believe.

When perception becomes visible, understanding has a chance.

Conclusion

Interactive art doesn’t tell you what reality is. It doesn’t hand you a replacement belief system.

Instead, it reveals the architecture of perception:

  1. How your brain predicts.
  2. How your body negotiates meaning.
  3. How your expectations sculpt the systems you interact with.

By inviting audiences into responsive, embodied encounters, these works let people:

  1. Witness their own minds in action.
  2. Experience the fragility of certainty.
  3. Develop a more flexible, high-agency relationship with their beliefs.

For museums, universities, and innovation labs, these installations aren’t just exhibitions.

They’re cognitive laboratories that show you how humans handle uncertainty, ambiguity, and non-obvious feedback (Attwood et al., 2018; Cornelio et al., 2021; Nam & Nitsche, 2014; Sklar, 2025).

As we design future interactive systems, such as AI tools, mixed reality environments, and responsive architecture, acknowledging the fluidity of perception isn’t optional.

It’s the baseline for building responsible, insightful experiences.

The systems aren’t neutral. Our minds aren’t either.

What we do with that knowledge is the real game.
