New Audio Technology Professor Designs Musical Tools

Photo credit: Ariana Stone

When William Brent was a teenager, his life became “consumed” by classical piano. The new Department of Performing Arts professor practiced constantly, sometimes 10 hours a day, and eventually double-majored in piano performance and music composition as an undergraduate student.

Eventually, though, classical music became predictable in some ways, enough that Brent temporarily lost interest in it. “That pushed me to experimental music,” says Brent. “In experimental music, you don’t know what you’ll get or if it will even resemble music, or what most people consider to be music.”

Brent has since revisited his favorite classical records with renewed enthusiasm. But the questions that experimental music raised for him—What is music? Why are some experimental works classified as music and others not?—are still a part of his work today. Brent fuses his love of music with his experience as a performer, composer, audio technician, and self-taught computer programmer to explore new ground in music production.

Though Brent no longer has time to practice piano for 10 hours a day, he has pursued some performance-based projects that integrate his interest in computer-based music. Most notable are his “Ludbots”—computer-controlled robots that play percussion instruments. “On a very practical level, Ludbots provide the ability to play difficult scores that cannot be achieved with human performers,” Brent writes on his Web site. “Their small size and easy mounting also make them ideal for installation use—filling a room with immersive, acoustically produced sound.”

However, most of Brent’s time is focused on research-based music projects. As a master’s student, he picked up several programming languages on his own, and that self-taught foundation led to a program he recently wrote: timbreID. Timbre is the quality that makes the sound of one instrument distinguishable from another. The software can be used in a number of ways; for instance, it can analyze the timbre of real instruments—a contrabass, a piano, or a clarinet—and synthetically recreate similar sounds.
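
As a rough illustration of the kind of analysis such software performs, the short Python sketch below is purely illustrative: it is not Brent’s timbreID software, and the audio file names are hypothetical placeholders. It summarizes each recording’s timbre as a set of spectral features (MFCCs, via the librosa library) and compares two instruments:

```python
# Illustrative sketch only: not Brent's timbreID software. It uses the
# librosa library to summarize timbre as MFCC features; the .wav file
# names are hypothetical placeholders.
import librosa
from scipy.spatial.distance import cosine

def timbre_features(path):
    """Summarize a recording's timbre as a mean MFCC vector."""
    y, sr = librosa.load(path, sr=None)                   # load at native sample rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)    # 13 coefficients per frame
    return mfcc.mean(axis=1)                              # average over time

clarinet = timbre_features("clarinet_note.wav")   # hypothetical recording
piano = timbre_features("piano_note.wav")         # hypothetical recording

# Smaller cosine distance means more similar timbre
print("clarinet vs. piano timbre distance:", cosine(clarinet, piano))
```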

While at AU, Brent wants to begin creating a digital music instrument that will help him research the role of physical gesture in music performance. The instrument will have a software component, a “sound synthesis engine” that produces the audio, and a hardware component: any interface or physical object that can be used to play it, such as a Wii remote, an iPad, or a custom-designed controller. Theoretically, the software could include any type of sound, from instruments and beyond.
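
To make the idea concrete, the Python sketch below is a minimal, hypothetical example, not Brent’s planned instrument: a single controller reading (any gesture value between 0.0 and 1.0) is mapped to the pitch and loudness of a toy synthesis engine:

```python
# Minimal sketch of the concept, not Brent's instrument: a gesture reading
# from some controller (Wii remote, iPad, custom hardware) drives the
# parameters of a toy software "sound synthesis engine". The gesture values
# here are hypothetical numbers between 0.0 and 1.0.
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def synthesize(gesture, duration=0.5):
    """Map a 0..1 gesture value to pitch and loudness; return a sine tone."""
    freq = 220.0 + gesture * 660.0   # larger gesture -> higher pitch (220-880 Hz)
    amp = 0.2 + gesture * 0.6        # larger gesture -> louder tone
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    return amp * np.sin(2.0 * np.pi * freq * t)

# Pretend the controller reported these readings over time
for reading in (0.1, 0.5, 0.9):
    samples = synthesize(reading)
    print(f"gesture {reading:.1f} -> {samples.size} samples of audio")
```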

His hope is that the instrument will give insight into people’s expectations of music performance based on gesture. For example, when you see a person raise their hands up to play a violin, “you already have in your head some idea of what sound they’re about to make,” says Brent. “If I take my Wii remote out and I [move it around], you have no idea what’s going to happen. So it’s a blank canvas. You can do whatever you want.”