
Ableton Live Talks OSC to Cinder

The goal here is to have Ableton Live transmit all track data and user actions to a C++ program, which can use that data to generate unique visuals. If you want to create interactive visuals for a music performer, the obvious approach is to use a live audio feed. However, you can only do so much with such a feed, and the visuals end up responding only to the master level. With the help of a tool called LiveOSC, it's possible to get OSC data out of Ableton Live. OSC (Open Sound Control) is a message format usually transmitted over Wi-Fi via a protocol called UDP, and this data can be picked up by a C++ application. In this case we're using a C++ framework called Cinder, and the OSC is interpreted into visualizations. This OSC data can tell us things like:

– when the set begins or ends
– when each beat occurs (e.g., once per second at 60 bpm)
– when any track level or the master level is changed
– when the user switches to the next track
– when the user tweaks any knob
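
Every one of those events arrives as an OSC message, which is just an address path plus a list of typed arguments. Just to illustrate the anatomy, here's a sketch using Cinder's osc block types (you'll normally only receive these messages, never build them by hand, so treat this purely as illustration):

// Illustration only: what a /live/beat message is made of.
#include "OscMessage.h"   // header from Cinder's osc block

void illustrateBeatMessage( int beatCount )
{
    ci::osc::Message m;
    m.setAddress( "/live/beat" );  // the event's address path
    m.addIntArg( beatCount );      // one int32 argument: the running beat count
    // a sender would now ship this over UDP, e.g. sender.sendMessage( m );
}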

I’m using the following tools to accomplish this task.

Ableton Live 8.2.6 (OSX in my case)
LiveOSC
Cinder
Xcode or Visual Studio 2010 (or Eclipse, which I'm using, but that's a whole other topic)

Ableton Live gets LiveOSC

LiveOSC is a simple way to get OSC data out of Live.  It’s hosted here by its creator:

There’s not much help on the site, but it really is that simple to get it working.

1. Install Python 2.5.1 (do not skip this!)
2. Place LiveOSC in Live.app/Contents/App-Resources/MIDI Remote Scripts/
3. Go to Preferences -> MIDI Sync and select LiveOSC as a Control Surface

Live should now be transmitting OSC data over UDP on port 9001, as well as listening on port 9000. However, we don't yet have a way of listening to the transmission. You could use some sort of network monitoring tool to check whether it's working, but it's actually easier to proceed with setting up the next tool, Cinder, since it comes with a sample app called OscListener which will pick up this OSC data, parse it, and display it.
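
That said, if you'd like to confirm the transmission before touching Cinder, a raw UDP socket is enough. Here's a minimal sketch using POSIX sockets (OS X/Linux assumed); conveniently, an OSC packet begins with its null-terminated address string, so the printout tells you which message arrived:

// Minimal UDP dump for port 9001, to confirm LiveOSC is transmitting.
#include <cstdio>
#include <cstring>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>

int main()
{
    int sock = socket( AF_INET, SOCK_DGRAM, 0 );
    sockaddr_in addr;
    std::memset( &addr, 0, sizeof( addr ) );
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl( INADDR_ANY );
    addr.sin_port        = htons( 9001 );        // LiveOSC's outgoing port

    if( bind( sock, (sockaddr *)&addr, sizeof( addr ) ) < 0 ) {
        std::perror( "bind" );
        return 1;
    }

    char buf[1024];
    for( ;; ) {
        ssize_t n = recv( sock, buf, sizeof( buf ) - 1, 0 );
        if( n > 0 ) {
            buf[n] = '\0';                       // address string leads the packet
            std::printf( "%zd bytes, address: %s\n", n, buf );
        }
    }
}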

[As an aside, by default LiveOSC seems to transmit to its own host machine's IP; my next goal is to make it transmit to a different IP.]

Listening with Cinder

Cinder is an excellent C++ framework that makes it a breeze to build complex interactive and visual apps. You can download it for free, and it comes with many sample apps as well as a set of tutorials by a talented artist, Robert Hodgin. My preferred way of setting up Cinder is with Git, using these instructions.

Once you are set up correctly, you should be able to launch any of the sample apps that the Cinder library comes with. Go into cinder/blocks/osc/samples/OscListener/ and launch either the Xcode or VS2010 project, depending on your platform of choice.

Now in the source file OscListenerApp.cpp, make one simple change.  Look for the following line and change the port number from 3000 to 9001:

listener.setup(9001);

Now run OscListenerApp, then go to Ableton Live and move the master level slider up and down. You should see some OSC data pop up in your Xcode console!

The master level slider also affects the OscListener app's window: the white part should extend to the right as you increase the volume. This slider has that effect while other controls may not, because in the code the simple visual effect is only performed when a message's first argument is a float.
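
That logic lives in the sample's update(), which drains the listener each frame, roughly like this (paraphrased from OscListenerApp.cpp, so names may differ slightly between Cinder versions):

// Inside OscListenerApp::update(): drain every waiting OSC message.
while( listener.hasWaitingMessages() ) {
    osc::Message message;
    listener.getNextMessage( &message );

    console() << "Address: " << message.getAddress() << std::endl;

    // Only a leading float argument drives the visual, which is why the
    // master level slider moves the white bar while int-only messages
    // like /live/beat leave the window untouched.
    if( message.getNumArgs() > 0 && message.getArgType( 0 ) == osc::TYPE_FLOAT )
        positionX = message.getArgAsFloat( 0 );
}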

Making sense of the OSC

I'm still figuring out the nitty-gritty of interpreting the OSC messages from LiveOSC, but most of it is pretty self-explanatory. Here's an example; the output is reformatted so it looks more readable (to me):
 
--> /live/play  #args:1  [0]int32: 2
--> /live/beat  #args:1  [0]int32: 1
--> /live/beat  #args:1  [0]int32: 2
--> /live/beat  #args:1  [0]int32: 3
--> /live/beat  #args:1  [0]int32: 4
--> /live/beat  #args:1  [0]int32: 5
--> /live/beat  #args:1  [0]int32: 6
--> /live/scene  #args:1  [0]int32: 3
--> /live/clip/info  #args:3  [0]int32: 0  [1]int32: 2  [2]int32: 3
--> /live/beat  #args:1  [0]int32: 7
--> /live/clip/info  #args:3  [0]int32: 0  [1]int32: 1  [2]int32: 1
--> /live/clip/info  #args:3  [0]int32: 0  [1]int32: 2  [2]int32: 2
--> /live/beat  #args:1  [0]int32: 8
--> /live/beat  #args:1  [0]int32: 9
--> /live/beat  #args:1  [0]int32: 10
--> /live/beat  #args:1  [0]int32: 11
--> /live/play  #args:1  [0]int32: 1

Things to note:
– we get a message on start and stop with the address /live/play
– we get a message on every beat with the address /live/beat
– we get a message when switching tracks with the address /live/clip/info
– the last argument in some cases is an int that represents the following values:

0 = empty
1 = stopped
2 = playing
3 = triggered

So the 3 /live/clip/info messages were sent when I triggered the new clip, the old clip stopped, and the new clip started.
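
Once the addresses make sense, routing them to visuals is just a matter of branching on the address. Here's a rough sketch of how I'd dispatch the messages above (onBeat, onTransport, and onClipState are hypothetical hooks for your own visuals, not part of Cinder):

// Hypothetical dispatch for LiveOSC messages.
#include <string>
#include "OscMessage.h"   // from Cinder's osc block

// Clip states carried by /live/clip/info's last argument.
enum ClipState { CLIP_EMPTY = 0, CLIP_STOPPED = 1, CLIP_PLAYING = 2, CLIP_TRIGGERED = 3 };

void onBeat( int beatCount );
void onTransport( bool playing );
void onClipState( int track, int clip, ClipState state );

void dispatch( ci::osc::Message &message )
{
    const std::string addr = message.getAddress();

    if( addr == "/live/beat" ) {
        onBeat( message.getArgAsInt32( 0 ) );       // running beat count
    }
    else if( addr == "/live/play" ) {
        // Judging from the log above, 2 arrives on start and 1 on stop,
        // matching the playing/stopped values in the clip state table.
        onTransport( message.getArgAsInt32( 0 ) == 2 );
    }
    else if( addr == "/live/clip/info" ) {
        int track       = message.getArgAsInt32( 0 );
        int clip        = message.getArgAsInt32( 1 );
        ClipState state = (ClipState)message.getArgAsInt32( 2 );
        onClipState( track, clip, state );
    }
}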

TouchOSC talks to Cinder

This section is not really related to the above, but it's another simple proof of concept. Another way to send OSC signals to a Cinder app is with your phone. Just install an app called TouchOSC (free on Android; $5 and more full-featured on iOS). Make sure your phone and computer are connected to the same Wi-Fi network, open TouchOSC, and configure it with the following settings:

Host: 192.168.1.2 (the IP of the computer running the Cinder app)
Port (outgoing): 9001
Port (incoming): 9000

Now just start up one of the touch layouts in TouchOSC, start up your OscListener app, and, like magic, you should see messages coming into your Cinder app.
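
For what it's worth, the stock TouchOSC layouts send addresses like /1/fader1 (page 1, fader 1) with a single float between 0 and 1, so wiring a fader to one of your own parameters inside the polling loop can look something like this (a sketch; /1/fader1 and mBrightness are just example names, so check the console for what your layout actually sends):

// Sketch: drive a hypothetical visual parameter from a TouchOSC fader.
if( message.getAddress() == "/1/fader1" &&
    message.getNumArgs() > 0 &&
    message.getArgType( 0 ) == osc::TYPE_FLOAT ) {
    mBrightness = message.getArgAsFloat( 0 );   // fader range is 0.0 .. 1.0
}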

Processing + Eclipse + Android = Proclipsoid?

Right off the bat, I have doubts about using Processing for Android development. Processing only supports up to Android version 2.2 at this point, while Android is up to 4+ now. This can't be good, but I'm not sure what the repercussions are. Maybe simple apps will be forward compatible? Features like NFC, which only the latest Android versions support, are going to be out of the question, right? I hope to delve into Android more to answer such questions, but for now here are a few simple steps to get a basic Android app, which leverages Processing, up and running in the Eclipse IDE.

First we need to perform the steps documented by Google to get Android up and running on its own.

1.  Install Android SDK.

Run the SDK Manager and select the required components, as shown in the screenshot below. My screenshot says "update" in some places because I last played with Android a year ago.

[Screenshot: the Android SDK Manager with the required components selected]
2.  If you haven’t already, install Eclipse for Java.

3.  Install the Eclipse ADT Plugin.

4.  Create an AVD using Android version 2.2.

5.  Create a blank Android project, again using Android version 2.2.  On the third screen, for Package Name, be sure to use something like "com.example.helloandroid" and not "HelloAndroidProcessing", or you'll get strange errors.

[Screenshots: the New Android Project wizard screens]
6.  In Eclipse, create a new run configuration using the AVD you created earlier.

7.  Now make sure the Android app runs on its own.  Launch using your new run configuration and you should see the Android emulator pop up.  If you’ve gotten this far, you are ready to develop for Android sans Processing.

8.  If you haven't already, install Processing (just download and unzip). I'm using version 2.04a; stable version 1.5.1 won't work, according to the Processing Wiki.

9.  Now import android-core from your Processing installation. I created a folder in my new Android project directory called "lib" and placed android-core.jar into it. Then, in Eclipse, go to Project -> Properties -> Java Build Path -> Add JARs… and select android-core.jar. (No Proclipsing happening here, though that would be my method of choice for developing Processing in Eclipse for non-Android purposes.)

10.  Now make the following code replacement in your main activity class.  Notice I commented out most of what was there by default.  I grabbed a simple Processing sketch from Dan Shiffman’s Learning Processing.

package com.example.helloandroid;
//import android.app.Activity;
//import android.os.Bundle;
import processing.core.*;

/*
public class HelloAndroidProcessing extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
    }
}
*/

public class HelloAndroidProcessing extends PApplet {
    public static void main(String args[]) {
        PApplet.main(new String[] { "--present", "something.whatever" });
    }

    //@Override
    //public String sketchRenderer() {
    //    return P2D;
    //}

    PFont f;
    String message = "this text is spinning";
    float theta;

    @Override
    public void setup() {
        size(200, 200);
        f = createFont("Arial", 20, true);   // load Arial at size 20
    }

    @Override
    public void draw() {
        background(255);                     // clear to white
        fill(0);                             // black text
        textFont(f);
        translate(width / 2, height / 2);    // rotate around the window center
        rotate(theta);
        textAlign(CENTER);
        text(message, 0, 0);
        theta += 0.05;                       // advance the rotation each frame
    }
}

Running your run configuration should start up the slow, slow emulator. Once I unlock the screen, I see the following:

[Screenshot: the spinning-text sketch running in the Android emulator]
You will also now have a file called HelloAndroidProcessing.apk in your bin folder. Simply copy it over to your Android device and install it, and you'll see your app in action.