This project stretched Tarkovsky’s “Solaris” to a length of 24 hours and presented it on two screens: on one, the film played forward as usual; on the other, it played backward. I developed a program that read the pixel data of each frame of the film and transformed it into music, all in real time.
The technique I used was similar to the one Artemiev employed for the original soundtrack: transforming images into sound. For instance, changes in color brightness were mapped directly to changes in musical pitch, creating an immediate connection between light and sound.
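To illustrate the kind of mapping described here, below is a minimal sketch of a brightness-to-pitch sonification. It is an assumption, not the installation's actual software: the file name, pitch range, tone length, and the libraries used (OpenCV for frame capture, NumPy for the signal math, sounddevice for playback) are all placeholders chosen for the example.

```python
# A minimal sketch of mapping frame brightness to musical pitch.
# NOTE: this is an illustrative assumption, not the project's actual code.
import cv2
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100
TONE_SECONDS = 0.04                   # roughly one short tone per frame at 25 fps
LOW_FREQ, HIGH_FREQ = 110.0, 880.0    # hypothetical pitch range (A2..A5)

def brightness_to_frequency(frame):
    """Map mean frame brightness (0..255) to a frequency in [LOW_FREQ, HIGH_FREQ]."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    brightness = gray.mean() / 255.0
    # Exponential mapping so equal brightness steps sound like equal pitch steps.
    return LOW_FREQ * (HIGH_FREQ / LOW_FREQ) ** brightness

def tone(frequency, seconds=TONE_SECONDS):
    """Generate a short sine tone for one frame."""
    t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return (0.2 * np.sin(2 * np.pi * frequency * t)).astype(np.float32)

def sonify(video_path):
    """Read the film frame by frame and play one tone per frame."""
    capture = cv2.VideoCapture(video_path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        sd.play(tone(brightness_to_frequency(frame)), SAMPLE_RATE, blocking=True)
    capture.release()

if __name__ == "__main__":
    sonify("solaris.mp4")  # hypothetical file name
```

In the installation itself, the mapping ran live against the projected frames rather than a file, and a 24-hour stretch means each tone lingers far longer than in this sketch.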
As Almira Ousmanova described it, our goal with “24h Solaris” was to create a situation of slowed-down viewing and reading, a truly hypnotic immersion into the fabric of the film, allowing for a profound exploration of its space and time.
Co-author Natalia Nenarokomova also emphasized the importance of perception in this project. The installation took the film beyond the traditional cinema experience and transformed the viewer from a passive observer into an active participant in the gallery space. This shift allowed viewers to engage with the form of the film and the captured visuals, perceiving both the film and reality differently.
Here is what it sounds like: