Steve Zafeiriou is a New Media Artist, Technologist, and Founder of Saphire Labs. His work explores the boundary between perception and reality, researching the ways technology influences, shapes, and distorts our understanding of the world around us. Through a practice grounded in generative algorithms, electronic circuits, and interactive installations, Steve investigates human behavior and the illusory nature of reality, encouraging viewers to rethink their understanding of the world around them.


Sensorify

Current version: v2.0
MATAROA AWARDS 2024, Sensorify v2.0

Description

The theme of the work revolves around a future scenario in which human communication takes place entirely through the internet. This communication expands to include human sensations and emotions, which are digitally transmitted to their recipients.

The theme reflects a dystopian vision of the future in which physical contact has been replaced by digitized data.

The installation presents the digitization of emotions as they are recorded in brain activity through an EEG headset. The digital representation of the emotions is projected on one screen, while the digitization process is displayed on the other.


A Sony a6400 4K camera is used to capture a collection of high-resolution photographs of a subject's face from multiple angles, mapping every contour, wrinkle, and feature.

These images are then processed using photogrammetry to create a three-dimensional point cloud, a dense cluster of data points that replicates the unique physical landscape of the face in digital space. This point cloud serves as a visual representation of the subject, which can be digitally manipulated and visualized, making it an integral part of the installation’s exploration of digitized human emotions and interaction.
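Purely as an illustration of this stage, the sketch below shows how a photogrammetry-derived point cloud (for example, one exported from COLMAP as a PLY file) could be loaded and rendered as points with Three.js, one of the technologies used in the piece. The file name and parameter values are assumptions, not the installation's actual assets or code.

```javascript
import * as THREE from 'three';
import { PLYLoader } from 'three/examples/jsm/loaders/PLYLoader.js';

// Minimal scene for displaying the scanned face as a point cloud.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  45, window.innerWidth / window.innerHeight, 0.1, 100
);
camera.position.set(0, 0, 2);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Load a point cloud exported from the photogrammetry step
// (the file name is a placeholder, not the project's actual asset).
new PLYLoader().load('face_pointcloud.ply', (geometry) => {
  geometry.center(); // center the face at the origin
  const material = new THREE.PointsMaterial({ size: 0.005, vertexColors: true });
  scene.add(new THREE.Points(geometry, material));
});

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

Rendering the cloud as individual points, rather than a reconstructed mesh, preserves the "dense cluster of data points" quality described above.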

Using data from an EEG headset, TouchDesigner applies changes to the point cloud in real time, reflecting the subject's changing emotions and reactions. The resulting visualizations are then projected on the screen, offering an immediate view into the digitized representation of human emotion. This interplay between the physical and the digital, made possible by photogrammetry and the capabilities of TouchDesigner, is what gives the artwork its impact.
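The installation itself performs this step in TouchDesigner; purely as a conceptual sketch, the snippet below shows how a normalized emotion value taken from an EEG stream could drive a similar real-time distortion of the point cloud from the previous example. The metric, scaling constants, and function name are illustrative assumptions, not the project's actual network.

```javascript
// Cache of each point cloud's original vertex positions.
const baseCache = new WeakMap();

/**
 * Displace a THREE.Points geometry in proportion to a normalized
 * emotion value in [0, 1] (e.g. an engagement or excitement metric
 * from the EEG stream). Call once per frame with a timestamp.
 */
function distortPointCloud(points, emotion, time) {
  const pos = points.geometry.getAttribute('position');
  if (!baseCache.has(points)) baseCache.set(points, pos.array.slice());
  const base = baseCache.get(points);

  for (let i = 0; i < pos.count; i++) {
    // Per-vertex jitter, scaled by the current emotion level.
    const jitter = Math.sin(time * 0.001 + i * 12.9898) * 0.02 * emotion;
    pos.setXYZ(
      i,
      base[i * 3] + jitter,
      base[i * 3 + 1] + jitter,
      base[i * 3 + 2] + jitter
    );
  }
  pos.needsUpdate = true;
}
```

In the render loop, one would call distortPointCloud(points, latestEmotionValue, performance.now()) each frame, with latestEmotionValue refreshed whenever a new EEG sample arrives.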

Mediums

Title: Sensorify
Year of Creation: 2023
Technologies: TouchDesigner, JavaScript, React, Three.js, Photoshop, COLMAP (photogrammetry), Emotiv Brain-Computer Interface (EEG)
Type: Interactive Installation
Facial Models & EEG Recording: Irene Dimakopouloou

*This installation was developed under the Inspire Project 2023 at the MOMus Museum of Contemporary Art.
