Interactive Audio Fair – May 3, 4–7PM, Meyer Hall – TODAY!

April 27, 2012 in Announcements, Events, Interactive Audio, Sound Arts by vicki

RSVP for the Interactive Audio Fair

Experience It!

A Celebration of Interactive Audio Work
Thursday, May 3, 2012
Meyer Hall
4–7PM

Join us for an interactive audio fair, where you will be able to view outstanding projects representing the emerging field of interactive audio. This is your opportunity to ask questions and see first-hand the skills that are being taught in the Interactive Audio program.

RSVP: expression.edu/iafair

The Interactive Audio Fair will feature projects from the following students and alumni:
Joe Bradshaw, Alex Galler
Project: The Sound of Baseball

Cameron MacNair
Project: NoisePrint~

David Claudio
Project: FXP360 (Effects Processor 360)

Donald Scully, Kai Jorgensen
Project: Drumpad King

Jordan Suhr, Lance Tipton
Project: i ♥ Sound

Evan Shwartz, Austin Roberts
Project: Chroma: Interactive Environmental Processor

Justin Loeffler
Project: We Are All Musically Kinected

Kenneth Smith
Project: InfraRad Synth

Bryce Quig-Hartman
Project: Seismic Synth

Brett Shipes
Project: Custom Built Interactive Music and Audio Engine – Space Invaders


Read more about the projects below.

Joe Bradshaw, Alex Galler
Project: The Sound of Baseball

Let the ballgame come to you – this project turns a baseball game into a pure soundscape. Follow exactly what is happening in the game through sounds triggered by an audio engine built in Max, which reacts in real time to data extracted from an Internet broadcast of the game.
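For a flavor of the approach, here is a minimal Python sketch of the event-to-cue idea; the event names, sound files, and poll loop are illustrative stand-ins, since the actual engine is a Max patch fed by live broadcast data:

```python
# Hypothetical sketch of the event-to-sound-cue mapping; not the students' code.
import time
import random

SOUND_CUES = {
    "pitch":     "sounds/pitch.wav",
    "hit":       "sounds/bat_crack.wav",
    "strikeout": "sounds/umpire_strike3.wav",
    "walk":      "sounds/crowd_murmur.wav",
}

def fetch_game_events():
    """Stand-in for parsing events out of a live broadcast data feed."""
    return random.sample(list(SOUND_CUES), k=1)

def run_engine(poll_seconds=2.0, polls=5):
    for _ in range(polls):
        for event in fetch_game_events():
            print(f"trigger {SOUND_CUES[event]} for event '{event}'")
        time.sleep(poll_seconds)

if __name__ == "__main__":
    run_engine()
```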

Cameron MacNair
Project: NoisePrint~

NoisePrint~ is a tool for game audio designers that simulates an acoustic environment and embeds it in a game. When a “dry” (unprocessed) audio signal enters NoisePrint~, it is processed with reverb, delay, equalization, and other effects that simulate the way it would sound in the real world. The patch was originally designed in Max, redesigned in SuperCollider, and the interface/waterfall display was developed in Processing. NoisePrint~ was created in the hope of inspiring game audio designers to pursue accuracy and consistency through code.
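As a rough illustration of the “dry in, room-simulated out” idea, here is a toy Python/NumPy feedback comb filter – a crude one-line “room” standing in for NoisePrint~’s much richer Max/SuperCollider processing:

```python
# Toy "dry signal -> simulated room" sketch; a stand-in, not NoisePrint~ itself.
import numpy as np

def comb_reverb(dry, sr=44100, delay_ms=50.0, feedback=0.5, mix=0.3):
    """Run a dry signal through one feedback comb filter: y[n] = x[n] + g*y[n-d]."""
    d = int(sr * delay_ms / 1000)
    wet = np.zeros(len(dry) + d)
    wet[:len(dry)] = dry
    for n in range(d, len(wet)):
        wet[n] += feedback * wet[n - d]
    wet = wet[:len(dry)]
    return (1 - mix) * dry + mix * wet   # blend dry and "room" signals

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    dry = np.sin(2 * np.pi * 440 * t) * np.exp(-4 * t)  # a decaying 440 Hz pluck
    out = comb_reverb(dry, sr)
    print(out[:5])
```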

David Claudio
Project: FXP360 (Effects Processor 360)

The FXP360 is an audio effects processor built in Max and operated by an Xbox 360 controller. Using a driver that brings Xbox 360 controller data into Max, the user can control the parameters of four effects processors.
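Conceptually, the patch scales each controller axis into an effect-parameter range. A hedged Python sketch of that mapping – the axis names and parameter ranges here are assumptions, not the actual FXP360 assignments:

```python
# Hypothetical controller-axis -> effect-parameter mapping.
def map_axis(value, lo, hi):
    """Scale a gamepad axis reading (-1.0 .. 1.0) into a parameter range."""
    return lo + (value + 1.0) / 2.0 * (hi - lo)

# Assumed assignments: one stick axis per effect parameter.
PARAMS = {
    "delay_time_ms": ("left_stick_x",  0.0, 1000.0),
    "reverb_mix":    ("left_stick_y",  0.0, 1.0),
    "filter_cutoff": ("right_stick_x", 20.0, 20000.0),
    "distortion":    ("right_stick_y", 0.0, 1.0),
}

def update_effects(axes):
    return {name: map_axis(axes[axis], lo, hi)
            for name, (axis, lo, hi) in PARAMS.items()}

if __name__ == "__main__":
    axes = {"left_stick_x": 0.2, "left_stick_y": -0.8,
            "right_stick_x": 0.0, "right_stick_y": 1.0}
    print(update_effects(axes))
```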

Donald Scully, Kai Jorgensen
Project: Drumpad King

This project is based on the popular “Guitar Hero” and “DJ Hero” games, adapted for an eight-pad MIDI controller. The goal of the game is to accumulate points by accurately triggering the different elements of a song as they scroll across the screen in sync with the music. The game logic was developed in Max, and the music was mixed and printed in Pro Tools.
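The heart of any rhythm game is timing-window scoring. A small Python sketch of how pad hits might be scored against a note chart – the windows and point values are assumptions, not Drumpad King’s actual tuning:

```python
# Hypothetical rhythm-game scoring: reward hits by timing accuracy.
def score_hit(hit_time, chart_times, perfect=0.05, good=0.12):
    """Return (points, matched_note_time) for the closest chart note."""
    if not chart_times:
        return 0, None
    nearest = min(chart_times, key=lambda t: abs(t - hit_time))
    error = abs(nearest - hit_time)
    if error <= perfect:
        return 100, nearest
    if error <= good:
        return 50, nearest
    return 0, None   # too far off: a miss

if __name__ == "__main__":
    chart = [1.00, 1.50, 2.00, 2.50]   # note times for one pad, in seconds
    for hit in [1.02, 1.40, 2.49]:
        points, note = score_hit(hit, chart)
        print(f"hit at {hit:.2f}s -> {points} points (note {note})")
```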

Evan Shwartz, Austin Roberts
Project: Chroma: Interactive Environmental Processor

Chroma takes weather data from the RSS feed on weather.gov and sonifies it. Interaction happens through a touch-based user interface written in Processing, while Max works under the hood as the actual sonic engine. Four cities are set up – San Francisco, NYC, Nashville, and Austin – each with its own musical “personality.”
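The sonification step boils down to mapping weather readings onto musical parameters. A hypothetical Python sketch, with the feed parsing stubbed out and the field names and mappings invented for illustration – the project’s actual logic lives in Max:

```python
# Hypothetical weather-to-music mapping; field names and scalings are invented.
def sonify(city, temp_f, wind_mph, humidity_pct):
    note = int(48 + (temp_f - 30) / 70 * 36)   # 30-100 F -> MIDI notes 48-84
    note = max(48, min(84, note))
    tempo = 60 + wind_mph * 4                  # calmer wind -> slower pulse
    brightness = humidity_pct / 100.0          # drier air -> darker timbre
    return {"city": city, "midi_note": note, "tempo_bpm": tempo,
            "filter_amount": round(brightness, 2)}

if __name__ == "__main__":
    # Stand-in readings for the four cities in the patch.
    for reading in [("San Francisco", 58, 12, 75), ("NYC", 71, 8, 60),
                    ("Nashville", 80, 5, 55), ("Austin", 95, 10, 40)]:
        print(sonify(*reading))
```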

Justin Loeffler
Project: We Are All Musically Kinected

“We Are All Musically Kinected” is an audio experience controlled by the body, without a traditional touch-based interface. The project uses three software applications: “Synapse,” written by Ryan Challinor, uses the Kinect camera to track the human body and stream real-time joint coordinates as OSC messages; that data is then turned into audio through Max for Live. All the sounds and music are original and were written specifically for this project.
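On the receiving end, the joint data arrives as OSC messages that a patch can map to controls. A minimal Python sketch using the python-osc library; the address pattern, port, and scaling are assumptions, since the project actually routes this data into Max for Live:

```python
# Sketch of receiving skeleton data as OSC; address/port/scaling are assumed.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_hand(address, x, y, z):
    # Map hand height (y) into a 0..1 control value, e.g. a filter or volume.
    control = max(0.0, min(1.0, y / 1000.0))
    print(f"{address}: y={y:.0f} -> control {control:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/righthand", on_hand)   # assumed joint address

if __name__ == "__main__":
    server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
    server.serve_forever()
```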

Kenneth Smith
Project: InfraRad Synth

The InfraRad Synth is a granular synthesizer controlled by a Nintendo WiiMote. The idea behind the project is to think of synthesis as motion: the WiiMote controls four of a granular synth’s most important parameters – pitch, grain size, range, and track scrub. The project uses Max, Logic Pro, Soundflower, a Nintendo WiiMote, and an IR light bar.
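In miniature, granular synthesis overlaps many short, windowed grains read from a source buffer. A toy NumPy sketch of the four parameters named above – the real instrument is a Max patch driven by WiiMote and IR data, not this code:

```python
# Toy granular synth: pitch, grain size, scrub position, and scrub range.
import numpy as np

def granulate(source, sr, pitch=1.0, grain_ms=80, scrub=0.5, rng=0.1, n_grains=200):
    """Overlap short, windowed grains read near a scrub point in the source."""
    g = int(sr * grain_ms / 1000)
    hop = g // 2
    out = np.zeros(hop * n_grains + g)
    env = np.hanning(g)
    for i in range(n_grains):
        # Pick a grain start near the scrub point, jittered by the range knob.
        center = int((scrub + np.random.uniform(-rng, rng)) * (len(source) - g))
        center = max(0, min(len(source) - g, center))
        # Resample the grain to shift its pitch.
        idx = np.clip(center + np.arange(g) * pitch, 0, len(source) - 1).astype(int)
        out[i * hop : i * hop + g] += source[idx] * env
    return out / np.max(np.abs(out))

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr * 2) / sr
    source = np.sin(2 * np.pi * 220 * t)   # two seconds of a 220 Hz tone
    print(granulate(source, sr, pitch=1.5, scrub=0.3)[:5])
```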

Bryce Quig-Hartman
Project: Seismic Synth

The Seismic Synth is a hybrid synthesizer developed in Max. The synth is controlled by data collected via RSS feed from the USGS (United States Geological Survey). Once the data arrives in Max, it is parsed out and used to drive various synth parameters. Seismic Synth grew out of a general interest in earthquakes. Tremors are happening constantly, yet there is really no way to hear a small seismic tremor 25 miles below the earth’s surface – until now!
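The core move is parsing each quake’s magnitude and depth and mapping them to synth parameters. A hedged Python sketch of that step; it reads USGS’s present-day GeoJSON summary feed rather than the 2012-era RSS feed the project used, and the parameter mappings are invented for illustration:

```python
# Stand-in for the Max parsing: read a USGS feed, map quakes to synth params.
import json
import urllib.request

FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

def quake_to_params(magnitude, depth_km):
    amplitude = min(1.0, magnitude / 9.0)            # bigger quake -> louder
    cutoff_hz = max(100.0, 8000.0 - depth_km * 10)   # deeper quake -> darker
    return {"amplitude": round(amplitude, 3), "cutoff_hz": round(cutoff_hz, 1)}

if __name__ == "__main__":
    with urllib.request.urlopen(FEED) as resp:
        data = json.load(resp)
    for feature in data["features"][:5]:
        mag = feature["properties"]["mag"] or 0.0
        depth = feature["geometry"]["coordinates"][2]   # [lon, lat, depth_km]
        print(feature["properties"]["place"], quake_to_params(mag, depth))
```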