P004 → Parse-String





Objects and people can serve as sources of information to which attention can be paid.
Attention can be shifted between them, resulting in different information being received.
This information is manifested abstractly through strings on the body with varying colours and intensities.





We live in a multitasking society that often requires us to process multiple sources of information simultaneously, making it difficult to focus on a single task. Our attention is increasingly challenged by the fast pace of modern life. In our project, objects and people serve as sources of information to which attention can be paid, or that can demand attentiveness. Attention can be shifted between them, resulting in different information being received. This information is manifested abstractly through strings on the body with varying colours and intensities.
There is a growing amount of data and information available to us at all times. We are constantly bombarded with information from a variety of sources, and it can be difficult to focus on any one thing. Parse-String aims to visualise the act of “paying attention”. In our wearable, information is mapped to velocity, colour, patterns, and physical restriction. The wearer's position relative to the different information sources is tracked with UWB receivers installed in the room; they serve as the source for the different parameters and as an index of attention span.
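As an illustration of how such a mapping could work, here is a minimal sketch in Python. The function name, the blue-to-red colour blend, and the 5 m range are assumptions chosen for illustration, not the project's actual implementation:

```python
# Hypothetical sketch: map the wearer's distance to an information source
# (from UWB tracking) to a colour and intensity for the strings.
# Names and ranges are illustrative assumptions, not the real system.

def attention_to_params(distance_m, max_range_m=5.0):
    """Closer sources claim more attention: full intensity and a warm
    colour at 0 m, fading to cool and dim at max_range_m."""
    # Normalise attention: 1.0 on top of the source, 0.0 at the edge of range.
    attention = max(0.0, 1.0 - min(distance_m, max_range_m) / max_range_m)
    # Blend from cool blue (low attention) to warm red (high attention).
    r = int(255 * attention)
    b = int(255 * (1.0 - attention))
    return {"colour": (r, 0, b), "intensity": attention}
```

A per-source value like this could then drive the string actuators and LED intensities, with attention shifting as the wearer moves between sources.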





We imagine our work as an interplay between a fictional and a real scenario. We wanted to work with the materialisation of non-tangible data and thus make the invisible visible.
Furthermore, we connect strings to the body in order to limit its movements, but also to underline the direction of movement in space. At the same time, the costume turns the body into an abstract figure, capable of describing data through geometry and movement.





Context:
Data-driven wearable from a 6-week Embodied Interaction course

Technology:
UWB tags for real-time position tracking
Stepper-motor- and pump-driven mechanics
MQTT-based software network for subsystem control
Data-driven generative Max patch for sound
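A minimal sketch of how an MQTT-based network like the one above could coordinate the subsystems. The topic scheme, payload fields, and toy speed mapping are assumptions for illustration, not the project's actual protocol:

```python
import json

# Hypothetical sketch: fan one UWB position update out to per-subsystem
# MQTT topics (motors, pump, sound). Topic names and payloads are assumed.

def route_position(tag_id, x, y):
    """Return (topic, payload) pairs for one position update.

    A paho-mqtt client could then send each with client.publish(topic, payload).
    """
    # Toy mapping: distance from the room origin, clamped to 0..1.
    speed = min(1.0, (x ** 2 + y ** 2) ** 0.5 / 5.0)
    return [
        (f"parsestring/motor/{tag_id}", json.dumps({"speed": speed})),
        (f"parsestring/pump/{tag_id}", json.dumps({"on": speed > 0.5})),
        (f"parsestring/sound/{tag_id}", json.dumps({"amp": speed})),
    ]
```

Keeping each subsystem on its own topic lets the motor controller, pump driver, and Max patch subscribe independently to only the messages they need.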

Date:
December 2022

Mentors:
Dr. Joëlle Bitton
David Wollschlegel

Credits:
Concept: S.Beti, E.Bonorva, J.Tillich, J.Reck, F.Willi
Research: S.Beti, E.Bonorva, J.Tillich, J.Reck, F.Willi
Engineering: J.Reck & S.Beti
Sound: J.Reck
Textile: F.Willi
Visual Communication: E.Bonorva
Video: J.Reck & S.Beti
Programming: J.Reck & J.Tillich