Director and dramaturg on Define Your Journey, Kate Lovell, writes about her experience testing out painting with sound.
I joked with Jo-anne about wishing I didn’t have to wear my director’s hat and keep us focused on our meeting agenda. Instead I wanted to don my jester’s hat and let us play all day with the new digital instrument that our creative technologist, Charles, had been working on. I hadn’t yet had a chance to play, but Jo had, and so she already understood its mesmeric quality.
With most of the logistics talked out in our cross-continental meeting, I was delighted when Charles suggested we have a go with the digital instrument he had been coding. The instrument makes it possible to paint with sound via a website Charles has created. When I clicked on the link Charles shared in our Zoom chat, I thought it would be just me playing solo. But then as I painted trails, leaving both visual and sound echoes, Charles joined in the painting too. I hadn’t realised that it would be possible to play with someone else, and Charles explained there could be more than two players, even a group jam is possible.
As I was playing the instrument, something unexpected happened within me. As I focused on the jamming session, the rest of the room, the meeting and the world fell away. Suddenly everything was about just me and Charles painting a sound picture together. Charles in Canada, me in the UK, but both of us connecting via this opportunity to jam with sound and painting. At first, I tried to keep out of Charles’ way; I didn’t want to spoil or interfere with his painting and sound making. But then he explained that when the paintings crossed over and interacted, it would affect both the quality of the sound and the visual look of the painting.
I followed trails, bounced polka dot chimes, swirled around, swept away from the other paint trail, and back again, converging and departing, meeting and leaving, all of the journey swooping across the screen, layering up like real paints on a canvas. I was incredibly moved by the experience. Being able to connect with someone after such a time of separation and distance, and to interact playfully, creatively, with someone thousands of miles away in physical distance, felt profound.
I’m incredibly excited about the opportunities for meaningful interaction that the sound painting tool will open up in Define Your Journey. I hope you enjoy Charles’ fascinating video about the sound painting, as well as his work on sound-responsive lanterns for Define Your Journey, the live theatre experience. A full transcript of Charles’ video is also available below.
Full transcript of Charles’ video:
Digital Instrument Development for Define your Journey
Hi, in this video, I’d like to talk about our process of developing some of the custom digital instruments for Define your Journey: how we started out by creating lanterns and instruments for live, in-person performances in a previous version of the project, and how we’ve adapted as everything moved to an online space.
My name is Charles Matthews, and I’m working as a creative technologist on the project. And just to audio describe myself: I’m a white man in his late 30s, sporting a beard and glasses, and I’m sitting on a very poorly lit rooftop here in Montreal — which is, unfortunately, quite windy as well.
But I’m out here because I’m hoping to give something of a practical demonstration of these lanterns and how they connect. So as I’m speaking, this one is lighting up in blue and purple, responding to my voice with a row of LEDs.
This is not intended to be a technical demo; rather, I’d like to describe a bit more broadly how some of this works. In particular, I’d quite like to highlight how we’ve noticed that addressing accessibility and other technical issues keeps crossing over with creative decisions.
And I’d also like to talk a little bit at the end about how we’ve been able to extend this indirectly as an open source project — or certainly with the intention of making this an open source project — by sharing our code and ideas with another organisation that I’m involved with, called Blurring the Boundaries Arts, here in Montreal. And in doing so, share how we’ve been trying out some of these ideas in practice, as Blurring the Boundaries have been meeting up with people online.
Now, let’s rewind for a moment. Unfolding on the screen now is a video clip of some of the circuit boards that I was just playing with, this time inside paper lanterns hanging from trees in St Viateur Park in Montreal at night. The lanterns are swaying gently in the wind as the camera moves through them. And they’re responding to the sound of Jo-anne Cox’s cello with different colours, depending upon the note she plays. The texture is changing as the echoes on the cello pulsate, as well. There are oranges, pinks, blues, and these are reflected faintly on the leaves of the trees. This piece represents the Dragon’s Cave in Define your Journey, and it also appeared in the scratch performance in London the year before.
This was just after the start of lockdown, and on an alternative timeline we probably would have been sitting in a rehearsal studio in London doing this. Pretty much every collaboration was becoming remote at that time, and it presented us with quite an exciting challenge to work on this long distance. The sound was coming from a clip that Jo-anne had sent me over email, and we knew that it was going to take some time to do this live. Having said that, it was quite magical to see and hear it coming to life outside.
As well as responding to the sound of Jo-anne’s electric cello, we could use a microphone that we would pass around to participants in a workshop, for example. We also started work on an iPad app — really in its early stages, but something that could be used to choose the colours shown on the lanterns. In this next section, we’re going to see a really early version of this, and I’ll keep the sound of the other piece running underneath just for a bit of musical context.
So as the trees and the lanterns fade out, we return to the roof, this time in split screen. I’m holding an iPad on the left, and on the right-hand side there’s a single lantern, again gently swaying in the wind.
Along the screen, I’ve got columns, which will probably be familiar to anybody who’s used ThumbJam before, but in different colours. And if I slide my finger along the screen, I can play different notes. Depending on how high up my finger lands on the screen, I’m also changing the volume of those notes, and the screen responds by getting lighter and darker. On the other side of the roof, I’ve got a lantern set up to reflect these changes. Now there are a few different places we can go with this. And it’s worth mentioning that we’ve been working on some instruments for physical performance that we’ll have to cover in another video. These ones use a proximity sensor, various other sensors and switches. This is nothing particularly new. But what’s nice about it is that they use the same boards as the lanterns. So they’ve got the lights already, and they’ve got the MicroBit for radio control.
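The column interface described above can be sketched in a few lines of JavaScript. This is a hypothetical illustration, not the project’s actual code: it assumes the screen is divided into equal-width columns, each assigned a note, with vertical position setting the volume.

```javascript
// Hypothetical sketch of a column-based touch instrument, assuming
// equal-width columns each mapped to one note, and vertical position
// mapped to volume (higher on screen = louder).

const NOTES = [60, 62, 64, 67, 69]; // MIDI note numbers, one per column

// Map a touch x position (0..width) to the note for that column.
function noteForX(x, width) {
  const column = Math.min(
    NOTES.length - 1,
    Math.floor((x / width) * NOTES.length)
  );
  return NOTES[column];
}

// Map a touch y position (0..height) to a volume between 0 and 1,
// with the top of the screen as the loudest point.
function volumeForY(y, height) {
  return 1 - y / height;
}
```

In a p5.js sketch, these two functions would be called from a touch or mouse handler, with the results sent to an oscillator and mirrored out to the lantern.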
By doing this, we’ve already got spares, and we’re testing the equipment out so that when we’re ready to try physical performance in the same space again, it won’t be such a big leap.
But what I’d really like to focus on now is how things have progressed from that iPad app into playing online. The app was a way of setting the colours for different notes, but it wasn’t really where we wanted it to be in terms of customising things — and passing an iPad around in a workshop isn’t quite the same as accessing something through a browser in your own time. As we’ve moved development onto the web, as well as the custom interfaces that we made for the iPad, we’ve had the opportunity to play around with existing elements such as the colour pickers built into the web browser. One of the advantages of using these is that they’re compatible with screen readers, voice control and other interfaces — as long as we don’t close those options off — in contrast to a lot of the more specialist software that I might use in this context.
So I’m going to use a colour picker built into Google Chrome. I’ve got three boxes on the screen, one, two, and three. And those correspond with lanterns behind me. As I click on number one, it brings up a gradient from black to red. So I’ll drag the mouse pointer up to the right hand side of this colour picker to create a red lantern. Number two, I’ll do the same. Moving over to red, and then there’s a slider in the middle that I’ll use to move this over to pink and then blue. Finally, for number three, I’ll do the same thing once more: I’ll move number three over to yellow.
Underneath the colour pickers, there’s a start button. And if I click this, then lantern number one should start responding to my voice. So with the colour that I’ve chosen, the intensity of the sound is now changing the intensity of the lantern. Now I’m not trying to suggest that this is an inherently accessible interface, and it might well not be the best way to set the colour for the lanterns. But it is fairly easy in terms of code to swap out for other options. And that’s one of the most important things about access: first of all, having the options, and not closing them off.
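The behaviour just described — a colour chosen from a browser picker, then brightened and dimmed by the voice — can be sketched as a couple of small functions. This is a hypothetical illustration under assumed names, not the project’s actual code: it assumes the colour arrives as a `#rrggbb` hex string from an `<input type="color">` element, and that the sound level is already normalised to a value between 0 and 1.

```javascript
// Hypothetical sketch of scaling a chosen colour by sound intensity.
// Assumes the colour comes from an <input type="color"> picker as a
// "#rrggbb" hex string, and amplitude is a level between 0 and 1.

// Parse a "#rrggbb" hex string into [r, g, b] components.
function parseHex(hex) {
  return [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
}

// Scale each channel by the current amplitude, so the lantern
// brightens and dims with the voice.
function lanternColour(hex, amplitude) {
  return parseHex(hex).map((c) => Math.round(c * amplitude));
}
```

The resulting `[r, g, b]` values would then be sent over radio to the LEDs. Keeping the picker as a standard HTML element is what preserves the screen reader and voice control compatibility mentioned above.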
This is where we start crossing over with the meetups that Blurring the Boundaries have been hosting. As I’ve been speaking, a video has been crossfading in with myself and Gift Tshuma on a Zoom call, playing with something that’s not a million miles off what we were just looking at.
We’ve been trying to set up some sort of regular session to get people together to play music online. And over the last few weeks, we’ve just been making a point of trying things out but not necessarily finished. In this case, it’s just the two of us. And almost by accident, this was the first time that we got to control the lanterns over the internet while making music. We wanted a very clear interface for this. So on the right hand side of the screen, we had a gradient from blue to red, much like the colour picker. We’re moving our mouse pointers around this to make the sounds not in any particular scale.
Sounds are coming through the website rather than over the call, but we’ve kept the video on so that we can communicate more generally. So there’s a direct connection between what’s happening on the screen and the lantern at my end of the call, shown at the bottom. We’ve been trying out so many different ways of playing online — webcam tracking, mouse pointers, touch screens, anything we can get our hands on, really. And yet, this gave us a different focal point. It’s really exciting, and it actually helped us engage a few more people by giving the sessions a bit more of a visual feel.
So in the final clip I’m going to play, we’ll see something quite different, moving away from the lanterns and into painting. It’s another split screen situation: on the left, we have the first person’s perspective; on the right, we have an audience perspective.
At the moment, there are other people on the call playing, but there’s nothing going on in the first person’s perspective. So both screens are the same, aside from the traces left earlier in the session on the audience side.
We wanted to explore a situation where, if you’re drawing and playing, your sounds and your visuals will actually be quite distinct — quite easy to tell apart, especially if there are ten other people on the call. This actually makes it harder to interact directly, because you don’t know exactly what other people are going to be seeing and hearing. Some people found that quite counterintuitive. It’s something that we need to think about if we’re going to move forward with an idea like this: how to set up the expectations.
(The first person perspective comes to life as the person recording the call starts to play.)
For something we hope will become an open source project, it’s been quite interesting to start exchanging ideas indirectly like this. Being able to make a copy or branch of the project at Blurring the Boundaries… To take it in our own direction, without worrying that we’re going to change things in the main project, while things we make can be integrated back if they fit.
Here (returning to the on-screen painting), the improvisation is kind of coming to a peak. We have a few people playing at once and lines are starting to emerge between the different players. Ideally it should feel like software is encouraging people to play together.
This has been a really different direction from how we started out. So it’s going to be interesting to see this, perhaps, combined with the lanterns down the line.
Thanks to Arts Council England for supporting the further R&D of Define your Journey. Define your Journey is supported by Disability Arts Online, Drake Music, Together! 2012 and Blurring the Boundaries. Previous support from Help Musicians UK’s Do it Differently Fund enabled the design and development of the radio controlled lighting which contributed to this project.
Blurring the Boundaries Arts’ online meetings are part of the Canadian Accessible Musical Instrument Network (CAMIN), which is supported in part by funding from the Social Sciences and Humanities Research Council. Thanks to the University of Calgary and VibraFusionLab.
For more information on Define Your Journey and the team behind it, visit cello.joannesonia.live
The online interactive elements in this video were created using the p5.js library. The radio controlled elements of the lanterns are enabled by the MicroBit.
Blurring the Boundaries will be launching public activities soon: watch blurringtheboundaries.org/ for announcements.
Follow Jo-anne Cox