After yesterday's sorting out of the projections, I have been rethinking the space quite a lot. There is a lot of bleed from one screen to another, so I really need to block off all corners of the screens so that the back projections have a clean, framed image on the white sheets.
The array is configured around where I have placed the speakers. The full array is set up on the outside – this will carry the Ambisonic field recordings, with the MIDI Sprout controlling all aspects of the outer parts of the installation. In the middle, the four speakers will be separated so that each speaker is assigned to a plant, each plant with a piezo mic detecting its vibrations.
The last bit of tech I needed to sort out was how the piezo mics would trigger the prerecorded MIDI Sprout data. Throughout the semester I had been working with Max quite a bit, and it was only a few days before we broke up for Christmas that I suddenly understood how Max4Live worked! I realised it just made total sense to do everything through Ableton and Max4Live instead of two separate things – Ableton doing the Ambisonic material and Max doing all the plant noises and the visual aspect.
As my laptop wouldn't recognise any interfaces, it was frustrating not being able to work on this over the Christmas period. But because I knew how everything was going to work, and largely how it would all sound, I knew it would be fine to get it all done during this week of prep and install.
I had to get some help on how to get audio tracks to trigger MIDI tracks in Ableton, but my research suggested M4L.listofabstractions would likely be what I was looking for. Now each audio track (for the piezos) carries a Max4Live patch set to trigger a particular track – in this case, within the Ableton set itself, it is the next MIDI track along from the audio input. They are paired that way to make it clear which audio input is triggering which MIDI track.
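The core idea behind that kind of patch – turning an incoming piezo signal into a trigger – is an envelope follower plus a threshold. Here is a minimal sketch of that logic in Python (purely illustrative; it is not the actual Max4Live patch, and the threshold and decay values are made-up assumptions):

```python
def envelope(samples, decay=0.99):
    """Peak envelope follower: track the rectified signal with a decaying peak."""
    env = 0.0
    out = []
    for s in samples:
        rectified = abs(s)           # full-wave rectify the input
        env = max(rectified, env * decay)  # jump up on peaks, decay otherwise
        out.append(env)
    return out

def triggers(samples, threshold=0.3, decay=0.99):
    """Return the sample indices where the envelope first crosses the threshold."""
    fired = []
    above = False
    for i, e in enumerate(envelope(samples, decay)):
        if e >= threshold and not above:
            fired.append(i)          # fire once per burst, on the rising edge
            above = True
        elif e < threshold:
            above = False            # re-arm once the envelope falls back down
    return fired

# Example: silence, one vibration burst, then silence again
signal = [0.0] * 10 + [0.8, -0.6, 0.5] + [0.0] * 10
print(triggers(signal))  # → [10]
```

The rising-edge check is what keeps a single knock from firing the MIDI track repeatedly: the trigger only re-arms once the envelope has decayed back below the threshold.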
I had also connected up the Arduino, but realised there wasn't really anything for it to do… Still, it was interesting seeing how it all works, and I think because I got a bit impatient at not being able to do anything over the Christmas period, I started to work on sensors. I'd like to carry this on into next semester, though, to see how sensors, lights and triggers could work in that space. Currently there is enough going on in the installation that a sensor – however interesting – would complicate things and feel a bit out of place.
So I've made quite a few adjustments to the space. I've made two corridors that flow into the installation to give the audience a direction to walk in and out. This also blocks out any bleed from the projectors. It's the first time I've used black sheets in an installation; although they look a bit makeshift with the lights on, I was surprised by how effective they are when the lights are off!
I sewed up the edges of the sheets to neaten it all up. While doing this, I think it was the first time I thought to myself that this was a proper installation, that it was now becoming a real thing. I only lightly sewed them so that you can't see the stitches on the other side of the screen. This was super effective and really brought the whole space together.
I have blocked off the back space now. It just made sense, as it was a kind of void that wasn't really doing anything, and it had been on my mind for the last couple of days, so I knew I just had to do it. Now the space has an entrance and an exit to direct the audience more clearly through the installation.
The last thing was to finish taping up the edges to make the installation safe.