TY - GEN
T1 - Innermost Echoes
T2 - 18th International Conference on Tangible, Embedded, and Embodied Interaction, TEI 2024
AU - Hynds, Danny
AU - Chernyshov, George
AU - Zheng, Dingding
AU - Uyama, Aoi
AU - Li, Juling
AU - Matsumoto, Kozue
AU - Pogorzhelskiy, Michael
AU - Kunze, Kai
AU - Ward, Jamie A.
AU - Minamizawa, Kouta
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2024/2/11
Y1 - 2024/2/11
N2 - In this paper, we propose a method that uses musical artifacts and physiological data to create a new form of live music experience rooted in the physiology of the performers and audience members. By capturing physiological data (namely Electrodermal Activity (EDA) and Heart Rate Variability (HRV)) and applying it to musical artifacts including a robotic koto (a traditional 13-string Japanese instrument fitted with solenoids and linear actuators), a Eurorack synthesizer, and Max/MSP software, we aim to develop a new form of semi-improvisational and significantly indeterminate performance practice. This practice has since evolved into a multi-modal methodology that honors improvisational performance traditions and uses physiological data to offer both performers and audiences an ever-changing and intimate experience. In our first exploratory phase, we focused on developing a means of controlling a bespoke robotic koto in conjunction with a Eurorack synthesizer system, with Max/MSP software processing the incoming data. We integrated physiological data to infuse a more directly human element into this artifact system. This allows a significant portion of the decision-making to be driven directly by the incoming physiological data in real time, thereby affording a sense of performativity within this non-living system. Our aim is to continue developing this method to strike a novel balance between intentionality and impromptu performative results.
AB - In this paper, we propose a method that uses musical artifacts and physiological data to create a new form of live music experience rooted in the physiology of the performers and audience members. By capturing physiological data (namely Electrodermal Activity (EDA) and Heart Rate Variability (HRV)) and applying it to musical artifacts including a robotic koto (a traditional 13-string Japanese instrument fitted with solenoids and linear actuators), a Eurorack synthesizer, and Max/MSP software, we aim to develop a new form of semi-improvisational and significantly indeterminate performance practice. This practice has since evolved into a multi-modal methodology that honors improvisational performance traditions and uses physiological data to offer both performers and audiences an ever-changing and intimate experience. In our first exploratory phase, we focused on developing a means of controlling a bespoke robotic koto in conjunction with a Eurorack synthesizer system, with Max/MSP software processing the incoming data. We integrated physiological data to infuse a more directly human element into this artifact system. This allows a significant portion of the decision-making to be driven directly by the incoming physiological data in real time, thereby affording a sense of performativity within this non-living system. Our aim is to continue developing this method to strike a novel balance between intentionality and impromptu performative results.
KW - improvisation
KW - liveness
KW - music composition
KW - physiological sensing
KW - public art
KW - sonic art
UR - https://www.scopus.com/pages/publications/85185217556
UR - https://www.scopus.com/inward/citedby.url?scp=85185217556&partnerID=8YFLogxK
U2 - 10.1145/3623509.3633356
DO - 10.1145/3623509.3633356
M3 - Conference contribution
AN - SCOPUS:85185217556
T3 - ACM International Conference Proceeding Series
BT - TEI 2024 - Proceedings of the 18th International Conference on Tangible, Embedded, and Embodied Interaction
PB - Association for Computing Machinery
Y2 - 11 February 2024 through 14 February 2024
ER -