Did you know Apple put Core Audio and Core Graphics into iOS from 4.2? I didn't.
These two APIs have been cornerstones of OS X for yonks, allowing, for example, all those iMovie, iPhoto and GarageBand tricks - and yeah, that cool Mosaic screensaver in System Preferences.
But they also power, or assist, Aperture, Final Cut Pro and Logic, Apple's professional applications for photo wrangling, movie making and audio production, and they help all these programs interoperate with each other.
Core Audio is technology for playing, processing and recording audio.
There's more about Core Graphics in iOS here, but I will concentrate on Core Audio in this post.
Thanks to Core Audio, apps can play one or more sounds simultaneously, handle streamed audio content, record, and even trigger vibrations. Core Audio also manages the audio environment, automatically routing audio when headphones, Bluetooth headsets or docks are in use.
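To give a feel for how little code this takes, here's a rough sketch in modern Swift - my own example, not from any particular app, and the sound file name is just a placeholder - that plays a sound and triggers a vibration through Core Audio's higher-level wrappers:

```swift
import AVFoundation   // AVAudioPlayer sits on top of Core Audio
import AudioToolbox   // System Sound Services, also part of Core Audio

var player: AVAudioPlayer?

func playBeepAndBuzz() throws {
    // Play a bundled sound file; "beep.caf" is a hypothetical name.
    if let url = Bundle.main.url(forResource: "beep", withExtension: "caf") {
        player = try AVAudioPlayer(contentsOf: url)
        player?.play()
    }
    // Trigger the vibration motor via System Sound Services.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
}
```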
For advanced effects, most likely in games, iOS also includes the OpenAL API, which models and plays audio in a 3D space, much as OpenGL does for graphics.
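The OpenAL idea in a nutshell: sounds are "sources" that you place around a "listener" in 3D space, and the API works out the panning and attenuation. A minimal Swift sketch of the positioning part only (loading the actual sample data is left out, and on some toolchains you may need a bridging header for the OpenAL headers instead of the import):

```swift
import OpenAL   // or a bridging header for <OpenAL/al.h> and <OpenAL/alc.h>

// Open the default device and make a context current.
let device = alcOpenDevice(nil)
let context = alcCreateContext(device, nil)
alcMakeContextCurrent(context)

// A "source" is a sound emitter placed in 3D space.
var source: ALuint = 0
alGenSources(1, &source)

// Listener at the origin, sound two metres to the listener's right;
// OpenAL pans and attenuates it accordingly.
alListener3f(AL_POSITION, 0, 0, 0)
alSource3f(source, AL_POSITION, 2, 0, 0)
```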
What this means for iOS is that, among other tricks, working on images is now much faster, and audio apps can analyse sound - apps like Spectogram Pro. (I love spectrum analysers, they're so handy. And now I have one on my iPhone, too.)
Another is VoiceAnalyzer for $1.29 - the Pro version of which, for a bank-busting NZ$5.29, lets you touch a displayed frequency marker to see what that frequency is.
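Under the hood, this kind of analysis is a Fourier transform over the incoming samples. Here's a rough sketch - mine, not from either app - of how an iOS app could find the loudest frequency in a buffer using the Accelerate framework. It assumes the buffer length is a power of two and skips the usual windowing and scaling:

```swift
import Accelerate

// Estimate the loudest frequency (in Hz) in a buffer of mono samples.
func dominantFrequency(in samples: [Float], sampleRate: Float) -> Float {
    let n = samples.count                       // assumed to be a power of two
    let halfN = n / 2
    let log2n = vDSP_Length(log2(Float(n)))
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return 0 }
    defer { vDSP_destroy_fftsetup(setup) }

    var real = [Float](repeating: 0, count: halfN)
    var imag = [Float](repeating: 0, count: halfN)
    var magnitudes = [Float](repeating: 0, count: halfN)

    real.withUnsafeMutableBufferPointer { realBuf in
        imag.withUnsafeMutableBufferPointer { imagBuf in
            var split = DSPSplitComplex(realp: realBuf.baseAddress!,
                                        imagp: imagBuf.baseAddress!)
            // Pack the real samples into split-complex form, run the FFT,
            // then take the squared magnitude of every bin.
            samples.withUnsafeBufferPointer { buf in
                buf.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: halfN) {
                    vDSP_ctoz($0, 2, &split, 1, vDSP_Length(halfN))
                }
            }
            vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            vDSP_zvmags(&split, 1, &magnitudes, 1, vDSP_Length(halfN))
        }
    }

    // The loudest bin's index maps back to a frequency in hertz.
    let peakBin = magnitudes.indices.max(by: { magnitudes[$0] < magnitudes[$1] }) ?? 0
    return Float(peakBin) * sampleRate / Float(n)
}
```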
Apps like Tremor are also appearing. This is a Core Audio mixer for the iPhone, iPod touch and iPad, turning them into DJ mixer panels.
It mixes the tracks synced from your computer right on your device, and can record from the microphone so you can enhance and create your own mixes.
You can even stream internet music with it and use it as a mixer input.
There's more info at the site.
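To show roughly what a "Core Audio mixer" means in code, here's a sketch of two decks feeding one mixer via Core Audio's modern engine wrapper. It's my own illustration - nothing to do with Tremor's actual implementation - and the file paths are made up:

```swift
import AVFoundation

// Hypothetical two-deck mixer: both players feed the engine's main mixer,
// and the "crossfader" is simply the two player volumes.
let engine = AVAudioEngine()
let deckA = AVAudioPlayerNode()
let deckB = AVAudioPlayerNode()

func startMixing() throws {
    let fileA = try AVAudioFile(forReading: URL(fileURLWithPath: "trackA.m4a"))  // placeholder paths
    let fileB = try AVAudioFile(forReading: URL(fileURLWithPath: "trackB.m4a"))

    engine.attach(deckA)
    engine.attach(deckB)
    engine.connect(deckA, to: engine.mainMixerNode, format: fileA.processingFormat)
    engine.connect(deckB, to: engine.mainMixerNode, format: fileB.processingFormat)

    deckA.scheduleFile(fileA, at: nil, completionHandler: nil)
    deckB.scheduleFile(fileB, at: nil, completionHandler: nil)

    try engine.start()
    deckA.play()
    deckB.play()

    deckA.volume = 0.8   // crossfade by adjusting the two volumes
    deckB.volume = 0.2
}
```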
Quite a surprise, in a demo that Apple Australia put on in Auckland this week, was a synthesiser app routed wirelessly through some remote Klipsch speakers via an AirPort Express. This was as much a delight to Apple's representative as it was to me. First we selected the speakers in AirPlay, then we switched to the synth, and voila - beautiful sound. This is pretty impressive. Apparently developers are working the necessary AirPlay code into their apps, so if you have anything like that, plus an AirPort Express connected to your stereo, you should try it too. And if it doesn't work, try again after the next app update.
It even worked - in my testing - with Guitar + HD, which lets you play a virtual guitar, although the lag in this case was too bad for it to be a workable 'instrument'. It didn't work with Solo Synth; it did work with MusicStudio, but again with unacceptable lag. I Can Drum worked, so did BeBot, but Thumb Drum didn't.
Still, exciting possibilities.
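I can't tell you exactly what code each developer is adding, but in broad strokes it comes down to this in today's Swift - a guess at the general shape, not any particular app's implementation: play through the shared audio session and offer the system's route button, and AirPlay outputs such as an AirPort Express show up alongside the built-in speaker.

```swift
import UIKit
import AVFoundation
import MediaPlayer

// A rough shape only; the view sizes and placement are arbitrary.
func setUpForAirPlay(in containerView: UIView) throws {
    // Route the app's sound through the shared audio session...
    try AVAudioSession.sharedInstance().setCategory(.playback)
    try AVAudioSession.sharedInstance().setActive(true)

    // ...and give the user the system route button so they can pick
    // an AirPlay output (an AirPort Express, say) instead of the speaker.
    let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 40, width: 240, height: 40))
    volumeView.showsRouteButton = true
    containerView.addSubview(volumeView)
}
```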
But the new chops for iOS also mean that what I wrote about eight months ago - that the iPad was a great Mac controller-in-waiting for Final Cut Pro and Logic - has now pretty much come to pass.
There are several apps out there already for Logic, like TouchOSC ($6.49).
TouchOSC has its own site with more information, but basically it works by letting you send and receive Open Sound Control (OSC) messages over a Wi-Fi network using the UDP protocol (there's a sketch of what one of those messages looks like at the end of this section).
The application lets you remote-control, and receive feedback from, software and hardware that implement the OSC protocol, in Apple Logic Pro and Express ... and for Renoise, Pure Data, Max/MSP/Jitter, Max for Live, OSCulator, VDMX, Resolume Avenue 3, Modul8, Plogue Bidule, Reaktor, Quartz Composer, Vixid VJX16-4, Supercollider, FAW Circle, vvvv, Derivative TouchDesigner, Isadora, and others.
The interface provides a number of different touch controls to send/receive messages, including faders, rotary controls, push buttons and LEDs.
It supports full multi-touch operation, but 'only' five controls can be used at the same time. It comes with default layouts, but hey, you can download an editor application for OS X (also available in Windows and Linux versions) to design and upload custom layouts for TouchOSC on iPhone/iPod touch and iPad - just scroll down on the TouchOSC web page.
The app can even send accelerometer data. Once again, I imagined wireless lag could be a pain in the arts for this.
Except that's not true - in my quick tests, the iPad worked instantly with no perceptible lag and made an excellent controller.
And seriously, being able to equalise instantly with finger strokes is a real buzz. Imagine - you can wander around your room trying different equalisations on the fly. Really cool.
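As promised, here's what's actually flying over the Wi-Fi: an OSC message is just an address string, a type tag and some arguments packed into a UDP datagram. This is my own illustration - the address, host and port are made-up placeholders, not TouchOSC or Logic specifics:

```swift
import Foundation
import Network

// Build a minimal OSC message: a null-padded address, a type tag string
// (",f" means one float argument) and the float itself in big-endian order.
func oscMessage(address: String, value: Float) -> Data {
    func padded(_ bytes: [UInt8]) -> [UInt8] {
        var b = bytes + [0]                       // OSC strings are null-terminated...
        while b.count % 4 != 0 { b.append(0) }    // ...and padded to a 4-byte boundary
        return b
    }
    var data = Data(padded(Array(address.utf8)))
    data.append(contentsOf: padded(Array(",f".utf8)))
    data.append(contentsOf: withUnsafeBytes(of: value.bitPattern.bigEndian) { Array($0) })
    return data
}

// Fire it at the receiving machine over UDP; host, port and address are placeholders.
let connection = NWConnection(host: "192.168.1.10", port: 9000, using: .udp)
connection.start(queue: .main)
connection.send(content: oscMessage(address: "/1/fader1", value: 0.75),
                completion: .contentProcessed({ _ in }))
```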
NZ apps
Meanwhile, our local development community (of which Apple is well aware, BTW) continues to beaver away - ProWorkflowMobile is a venture by ProActive Software that lets staff track time and view basic project details, but also add, edit and report, from iPhones and other mobile platforms.
It's not an installed app - users visit the mobile login page and sign in. Learn more at the ProActive site.
As usual, please let me know what you're doing so I can broadcast your efforts. I love to. And no, I don't require a pecuniary addition to a Swiss bank account. Just the info, please.
I will be reporting from Webstock next week, and I hope to interview luminaries who have touched to a greater or lesser extent on Apple, including John Gruber of Daring Fireball. Cool! Let me know if there's anything you'd like me to ask - but it will have to be before the end of Tuesday 15th February.
So ... see you in Wellington.