The Push, Sponsored by Mutual Mobile

Inter-App Audio sounds like a good idea for the music industry

By Paul Williams / August 16, 2013


One of the more obscure features announced for iOS 7 is Inter-App Audio. As described on Apple’s iOS 7 developer page, this functionality lets developers send and receive audio between apps in real time, and provides MIDI-controlled rendering of audio, making it easier to develop musical-instrument apps.

Audiobus introduced similar capabilities to the iOS platform last year, and it ushered in a revolution among iPad- and iPhone-loving musicians, with big names like Korg and Moog adding Audiobus support to their synth apps.

Audiobus made it possible for musicians to use an iOS device to send audio from a synth app like Animoog through an effects app like Jam XT, and record the results in a multi-track recording program like GarageBand or BeatMaker 2.

Inter-App Audio provides system-level access to the audio stream

One major differentiator between Inter-App Audio and Audiobus is that Inter-App Audio keeps the routing at the operating-system level, so users won’t need to switch to a separate routing app in the process. Inter-App Audio also gives developers access to the audio stream at a lower level, theoretically providing superior performance.

In the pre-release developer documentation for iOS 7, Apple also provides two code examples for Inter-App Audio. One details how you can build and publish an audio delay effect to be used by other music apps in iOS, while the other covers a sampler instrument that plays audio when it receives MIDI note information from another app.

How apps communicate using Inter-App Audio

In iOS 7, Inter-App Audio apps are known as either “nodes” or “hosts.” Essentially, any app capable of publishing an audio output stream is a node, while hosts connect and manage node applications. Some apps, like GarageBand, serve both roles, letting users record a synth sample from a node app or send a GarageBand guitar riff to a different host like BeatMaker 2.

While multi-track recording apps primarily serve as hosts, effects apps end up playing host when consuming audio from a synth app, and acting as nodes when routing the processed audio on to a multi-track recording app. Any truly useful effects app should also be able to process live audio from a mic or one of the many audio interfaces available for iOS, like the iRig guitar interface.

Music app developers now have an option from Apple

If you’re interested in developing music apps, or simply using them, the introduction of Inter-App Audio promises a simplified approach for beginners and superior performance for seasoned vets. This attractive update has also encouraged Audiobus and its competitors to amp up their own inter-app communications, giving audio producers an even larger selection of high-quality mobile software. Whether you’re an Apple developer or just an all-around music lover, Inter-App Audio for iOS 7 should make you do the happy dance.