The pandemic has been a disaster of unimaginable proportions. Making art and music during such a time, while others are suffering and enduring great hardship, can seem futile. Yet music and art are a great comfort to many, perhaps to no one more than the musicians themselves, for whom social interaction plays an indelible role in music-making. Using the NINJAM server set up at York to synchronise two geographically distant modular synth setups, Ben Eyes and Jethro Bagust explore how streams of found audio, real-time modular synthesis, stochastic compositional processes and video (courtesy of Lynette Quek) can be merged online to create a real-time audiovisual miasma. The piece was recorded live in one take after several distanced rehearsals.
Jethro writes: 'The instrument played by Jethro is populated with numerous chance elements that are linked to musical parameters. These elements of uncertainty blur the distinction between the roles of performer, composer, and audience, because we are all hearing the music for the first time. Improvising with indeterminate instruments such as this, which defer note-by-note production to algorithms, is akin to steering an animal: you can point it in a particular direction but cannot precisely know its behaviour. There is a tension between the human and the machine; the player must listen and react, responding to the system at an indirect meta-level. The pre-recorded audio sources are from John Cage and Morton Feldman, "In Conversation, Radio Happening I of V", recorded at WBAI, New York City, 1966–67. Ben's own setup is based around a custom Max/MSP patch linked to a modular synth that allows real-time interaction with musical sequences and rhythms. Influenced by dub and techno, sound sources in the system are filtered, delayed and reverberated live in the mix to create musical form and progression.'
Jethro Bagust: / jethro-bagust
Ben Eyes: www.beneyes.co.uk
Lynette Quek: lynettequek.site/