6 Questions to Composer Bruno Degazio
The film work of composer Bruno Degazio (b. 1958) includes the sound design for the Oscar-nominated documentary film The Fires of Kuwait and the music for the all-digital, six-channel soundtracks of the IMAX films Titanica, Flight of the Aquanaut and CyberWorld 3D. His many concert works for traditional, electronic and mixed media have been performed in North America, Europe and Asia.
As a researcher in the field of algorithmic composition he has presented papers and musical works at leading international conferences in Toronto, New York, London, The Hague, Cologne, Tokyo and Hong Kong. He was a founding member of the Toronto new music ensemble SOUND PRESSURE and of the Canadian Electroacoustic Community. He has written on his research into automated composition using fractals, genetic algorithms and techniques of computer-assisted composition. He teaches Sound Design in the Classical Animation Program of Sheridan College, Canada.
 Briefly describe your musical / sound art background and education, formal and informal.
I spent five years at the University of Toronto, from 1976 to 1981, receiving a Bachelor’s degree in 1980 and a Master’s degree in Composition in 1981. It was here that I first learned how to use modular analog synthesizers and the techniques of tape recording and manipulation.
After that, however, my real education began. I created soundtracks for experimental film-makers, TV commercials, documentary films and eventually feature films. I had to learn much more complex recording techniques involving multi-channel perforated magnetic film, large mixing consoles with “gapless punch-in”, and complex layering of sonic materials. Early on I worked for experimental film-maker Bruce Elder, whose technique of filmmaking involved the complex layering of many strands of material, both visual and sonic. I remember one section of his film Illuminated Texts where we had 200 tracks of sound effects, narration, and synth-generated music running simultaneously.
Later I worked for IMAX, where I was very lucky to be able to work with some of the most brilliant minds, both technical and creative, in the country. Way back in 1984 they invented a digital audio playback system for their theatres. Because they wanted to be able to advertise an “all-digital soundtrack”, we became involved very early in PCM digital recording and editing. Because they wanted each film to sound unique, they were also generous in allowing a lot of R&D time for the sound designers. So I was able to explore a lot of techniques that wouldn’t have been possible otherwise.
The IMAX soundtracks were very complex, though not in the chaotic way of the Elder film. IMAX had a “Surround” sound type system years before anyone else, so we were very careful to show it to good advantage in those films. We therefore created very rich soundscapes, sometimes recording in quad, Ambisonic or binaural formats to fill out the sonic space. On some of these films we had up to 400 tracks of material, which of course had to be pre-mixed in stages down to the 6‑track master for the theatre. I also had the opportunity to work closely with great sound designers like Peter Thillaye and Ben Burtt, guys who had grown up in the tradition of feature film sound design and really knew how to put together a film soundtrack.
For all these reasons I consider my years with IMAX to be the third phase of my musical/sound-art education. The fourth phase is under way currently, as I’m in the 2nd year of a doctorate in composition at the University of Toronto. I want to explore instrumental composition again, perhaps connecting it with some of my past interests like chaos theory and art as simulation of nature.
 Could you briefly describe your current musical activities, private, within the community, and public?
My main activities for the past few years have centered on the development of an algorithmic composition system that I call the Transformation Engine. It is a software version of the techniques of instrumental composition that I carried out by hand in the past. I’ve presented updates on this software at the International Computer Music Conference and at the Toronto Electroacoustic Symposium. Anyone interested in finding out more can check out [my Sheridan College page].
For the past five years I’ve also had a weekly gig at the Church of Our Lady of Lourdes in Toronto, where I perform on the Electronic Wind Instrument (EWI) with a choir directed by Maria Dimanche. I play the Yamaha WX5, and for a synthesizer I use the Yamaha VL70 (though software instrument plugins have gotten so good that I’d like to try my laptop next year). This is of course mainly traditional church music, though with frequent opportunities to improvise.
My most recent composition projects have been spin-offs of my teaching work in the Animation program at Sheridan College. I have done many film scores for animated films by my colleagues and students. I usually use the Transformation Engine software, though sometimes it’s easier just to improvise on the EWI. These soundtracks are electroacoustic in the broad sense of being a combination of electronic and acoustic sounds, though musically they’re usually in traditional or popular styles, as needed by the film.
 Please briefly describe your uses of technologies in your creative life. You may want to include a short description of the equipment and software / services you use (number of computers, phones, scanners, Facebook, Skype etc.).
I currently use two computers regularly — a MacBook Pro laptop for general use and a much more powerful 8‑core Mac Pro desktop as a studio computer. The studio computer has to be powerful enough to run my Transformation Engine software as well as a full orchestra of plugins. I use Plogue Bidule to host the plugins, but occasionally I have to revert to Logic Pro when Bidule isn’t behaving itself. In general, though, Bidule is much more efficient than Logic, and is more appropriate for my purposes, since I don’t use any of the sequencing functions of Logic.
In my teaching work I constantly use PowerPoint and Keynote, as well as numerous video formats, when I lecture on film sound design, music history, and animation production methodology. I also teach Pro Tools and Adobe Premiere in a computer lab context. Sheridan College has fantastic technology facilities and keeps very up to date on software and hardware. For their animation work our students use Maya, ToonBoom and Flipbook, but, not being an animator, I don’t teach those programs myself.
I sometimes use the computer language Forth for demonstration purposes in my classes. For example, I’ve written programs to demonstrate Karplus-Strong synthesis, Physical Modeling, and audio-rate Chaos. And of course the Transformation Engine is written entirely in Forth.
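[Editor’s note: Degazio’s classroom demonstrations are written in Forth; for readers unfamiliar with the Karplus-Strong technique he mentions, the following is a minimal Python sketch of the basic plucked-string algorithm, not his code. A noise burst fills a delay line whose length sets the pitch, and a two-point average acts as the low-pass filter that makes the tone decay like a real string.]

```python
import random

def karplus_strong(frequency, sample_rate=44100, duration=1.0, decay=0.996):
    """Generate a plucked-string tone with the basic Karplus-Strong algorithm."""
    n = int(sample_rate / frequency)                      # delay-line length sets the pitch
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]   # noise-burst excitation
    out = []
    for _ in range(int(sample_rate * duration)):
        out.append(buf[0])
        # Averaging the two oldest samples is a simple low-pass filter,
        # so the high partials die away faster, as on a real string.
        new_sample = decay * 0.5 * (buf[0] + buf[1])
        buf = buf[1:] + [new_sample]   # a collections.deque would be faster; a list keeps the sketch simple
    return out

samples = karplus_strong(220.0, duration=0.5)   # half a second of a 220 Hz "pluck"
```

Writing `samples` to a WAV file (for example with the standard-library `wave` module) produces a recognizably string-like tone.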
As I mentioned above, I use a Yamaha WX5 Electronic Wind Instrument and VL70 synthesizer to perform traditional church music.
This past year I’ve discovered Dropbox as a fantastic tool for collaboration. Dropbox makes it very easy to trade files up to 2 GB with people. It’s free and works very smoothly. I’ve done soundtracks for four short animations this summer, three of which were done almost entirely via Dropbox. The fourth one was with National Film Board animation legend Kaj Pindal, now 83 years old, who preferred to meet face-to-face.
 How do you feel that the use of these technologies has contributed to those areas of your creative life where you employ them? You may also wish to comment on those that you don’t use (and the reasons).
Technology has always been central to my creative activities. The use of the computer has now become so pervasive that it’s hard to see where it could stop. Wigner’s essay “The Unreasonable Effectiveness of Mathematics in the Natural Sciences” is relevant here — this mysterious fact that mathematics (and hence the computer) seems to be an extremely effective tool for working with anything. Timothy Leary said, “The PC is the LSD of the ’Nineties.” For me, creatively, the computer is a sort of magnifying mirror on my material. It allows me to look at it from points of view that would otherwise have been impossible.
How many people do you know who compose by sitting at a piano with a pencil and a sheet of music paper? I gave that up years ago (though I still sometimes improvise thematic material on my EWI). For me, composition is an exploratory activity, like sculpture, where the artist works to discover the potential (a beautiful figure) inherent in his raw material (a shapeless rock). The computer, especially the Transformation Engine software, turns the process of composition into an interactive exploration of rhythm, contour, harmony and timbre. It allows me to work out compositional detail in a way that would have been possible in the past only with repeated rehearsal.
An alchemical motto says — “Let Nature be your guide”. Nature is the great teacher. We are embedded in Nature and cannot escape that fact. Human Culture, the other great fact of our lives, is a sort of blossom on the tree of Nature — beautiful, but only a small part of a greater whole. So my æsthetic derives from Art as a simulation of Nature. Hence the use of the computer to reproduce processes of Nature — chaotic processes like weather patterns and other fractals, and predictable processes like the motion of the planets. Technology, as a human artifact and a product of Culture, is related to Nature as a part to the whole. Another saying of the alchemists — “What Nature begins, man completes” — seems to imply that such creative activity is central to what it means to be human. (Apologies for the gender-specific language — these guys were living in the 16th century after all.)
In this vein, I’m currently researching the simulation of ocean waves, because I’d like to be able to reproduce musically those entrancing but unpredictable rhythms of surf breaking against a shoreline. It’s a hard problem, but there’s been a good deal of work done on it in the computer graphics field. You can even buy a plugin for Maya that generates realistic waves. But I would like a simplified model that I can use to drive musical parameters within the Transformation Engine.
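[Editor’s note: the simplified model Degazio has in mind is not specified. As a rough illustration of the general idea, a sum-of-sines height function — a standard simplification used in computer graphics water rendering — can be sampled and mapped onto a musical parameter. Everything in this Python sketch, including the component values and the velocity mapping, is hypothetical and is not the Transformation Engine’s actual model.]

```python
import math

def wave_height(t, components):
    """Sum-of-sines water-surface height at time t.
    components: list of (amplitude, frequency_hz, phase) triples."""
    return sum(a * math.sin(2 * math.pi * f * t + p) for a, f, p in components)

# A few incommensurate low frequencies give a slowly varying, non-repeating swell.
swell = [(1.0, 0.11, 0.0), (0.5, 0.17, 1.3), (0.25, 0.29, 2.1)]

def height_to_velocity(h, components, lo=20, hi=110):
    """Map a wave height to a MIDI-style note velocity (hypothetical mapping)."""
    peak = sum(a for a, _, _ in components)   # maximum possible |height|
    norm = (h + peak) / (2 * peak)            # rescale to the range 0..1
    return int(lo + norm * (hi - lo))

# Sample the swell every quarter second and turn it into a velocity contour.
velocities = [height_to_velocity(wave_height(t * 0.25, swell), swell) for t in range(16)]
```

The same height signal could just as easily drive tempo, density or register; the point is simply that one physical simulation can feed several musical parameters at once.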
 Facebook, MySpace, YouTube, Skype, Twitter, Blogs … are part of the lingua franca of the students I meet every year. Are there ways for the older generation to use these technologies to communicate our values to those who were born after (about) 1988?
Out of that list I would pick YouTube and blogs as being especially relevant to me, the latter mainly because my students use them to maintain an online portfolio and generally stay in touch.
YouTube has a much greater relevance to my teaching, in particular a music survey course where students do a presentation on a musical style from a particular era or part of the world. Before YouTube it was next to impossible to present, say, a demonstration of Tuvan harmonic singing. With YouTube it’s practically an everyday event — I exaggerate a little, but you get the point. For the students it can be an incredible mind-opener to see and hear these other ways of making music, and YouTube (and the internet in general) brings it all home. Though McLuhan seems to have been largely forgotten, the Global Village is now an established fact.
 Open area commentary.
[No response submitted.]