New media and electronic music: quakes and creations

caleb k

The second in a series of Digital Seminars at Metro Screen was based on the questions, “Interactive new media and electronic music have developed within the same historical zone. Will they become a hybrid? Will live music become more like an installation or an interactive performance or distributed netevent?”

It’s that same old question of computers and art, of what’s new or somehow different from what has gone before. Technology and art/music have always gone hand in hand. At the centre of all art/music is technology in some form, whether a brush/computer or a violin bow/laptop. Technology is always there. That said, the topic was rarely addressed by the speakers, but a number of interesting angles were opened up.

Toby (Kazumichi) Grime demonstrated his Electronic Sound Remixers project. He gave examples of analogue sound production equipment and the equivalent electronic interfaces available, explained why he felt these interfaces were clumsy and difficult to use, and then demonstrated his solution in the form of his Director-based application. The interface allows Grime to play and mix his music more interactively, both live and in the studio. The Remixers open up possibilities for a more visual and graphic interface for the non-linear creation of Grime’s audio.

Scott Horscroft spoke about his use of new 3D sound imaging systems in installation and sound works. His projects have a huge presence, using the processed sound of installed wind devices such as fans and air conditioners, as well as prerecorded sounds, to create the audio. Interaction within the system is limited to timing and triggers; the sound within the work is only partially live. Horscroft’s installations sit well beyond the inhibiting term ‘hybrid.’

Wade Marynowsky talked about his Interactive Keyboard, which allows for triggered images and sound at the touch of a key. Image stills flash in sequence and are attached to sounds. This combination works well when played in a live environment. Sounds converge, are joined to images and played out in real time. The piece firmly etches itself on the viewer’s brain like a scrambled MTV playback. Marynowsky may want to infect all computers with sound bites, but what he has done is infect brains with sound/image viruses.

David Rogers explained his use of a massive ex-museum earthquake machine. The links with the seminar topic were tenuous at best and Rogers had no interest in addressing them. There is noise in the quake system and the work of Triclops International is, by all accounts, extremely noisy…but electronic music? However, the audience seemed attached to Rogers’ work, speaking at length with him during question time and engaging much more with him than with the other sound artists. Audio gets lost, again, when words and critical engagement are needed.

New Media and Electronic Music, Metro Screen, Sydney Film Centre, Paddington Town Hall, February 14.

RealTime issue #36 April-May 2000 pg.

© Caleb K; for permission to reproduce apply to realtime@realtimearts.net

1 April 2000