Project Paperclip is a photographic exhibit that uses augmented reality in the form of real-time processed soundscapes. The concept aims to transport visitors into a state that offers a different interpretation of the 16 photographs on display at the exhibit.
In Project Paperclip, Nuno Serrão has not only captured the moments exhibited in the photographs, he also co-created the interactive ambient soundscapes that accompany them, extending the channel of communication with the public beyond the optic nerve to include the auditory canal.
The experience is unique each time it is activated, as the app's algorithm processes variables from the visitor's surroundings in real time: the time of day, the ambient noise level in the room, the visitor's voice, and their movement and position, among many others.
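The exhibit's app is not open source, so the snippet below is purely a hypothetical sketch of the idea described above: mapping real-time inputs (time of day, room noise, visitor movement) to parameters of a generative soundscape. All names and the specific mappings are assumptions for illustration only.

```python
def soundscape_params(hour, noise_level, motion):
    """Hypothetical mapping of real-time inputs to soundscape parameters.

    hour: local hour of day (0-23).
    noise_level, motion: normalised sensor readings in the range 0.0-1.0.
    """
    # Treat the time of day as a brightness cue: 1.0 at noon, 0.0 at midnight.
    night_factor = 1.0 - abs(hour - 12) / 12.0
    return {
        # A stand-in for a filter cutoff: darker hours give a darker timbre.
        "brightness": round(0.3 + 0.7 * night_factor, 2),
        # Raise the output level so the soundscape stays audible over room noise.
        "volume": round(min(1.0, 0.4 + 0.6 * noise_level), 2),
        # Let the tempo (in bpm) follow how much the visitor is moving.
        "tempo": round(60 + 60 * motion),
    }

# Example: a visitor at 10 pm, in a fairly quiet room, walking slowly.
params = soundscape_params(hour=22, noise_level=0.25, motion=0.5)
print(params)
```

In a real implementation these inputs would come from the device clock, microphone and motion sensors, and the parameters would drive an audio engine rather than a dictionary.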
Augmented reality, as it is usually understood, uses a digital interface to build a bridge between our universe and the digital one, creating a mixed environment in real time in which the distinction between the two realities is reduced. In this exhibition, this is achieved with an iPhone, headphones and software available on the Apple App Store.
To make full use of this exhibition, visitors with an iPhone 3 or later should download the Project Paperclip application from the Apple App Store. Once equipped with headphones (the better their quality, the more immersive the simulation), launch the app and follow the instructions to activate the reactive soundscapes. The process is simple: point the iPhone's camera at a photograph's QR code, and once it has been scanned, the soundscape created for that photograph is unlocked.
Concept & Photography by Nuno Serrão
Development and augmented sound by Yuli Levtov and Ragnar Hrafnkelsson (Reactify Music)
Soundscapes by Alexandre Gonçalves (alexnoise) and Nuno Serrão
Interface design by urbanistasdigitais.pt
Special thanks to Andreia, Alex, Yuli and Marco