Archive for the 'Pure Data & Arduino Term 2' Category


Pure Data workshop with Ed Kelly……..

We had a Pure Data workshop with Ed Kelly and I found it to be invaluable.  My main goals for this workshop were to:

1) Modify the existing Ultrasonic sensor code to get two sensors working with the Arduino

2) Work out how to split the sensor data within Pure Data to gain independent control over both the sensors (very important for functionality of the installation!)

I had some code for using multiple Ping sensors; the problem was that both sensors would only send a single echo pulse when activated, and obviously I needed this process to repeat continuously.  As Ed was a very busy man trying his best to get everyone up and running with an Arduino and Pure Data (no mean feat!), Tim had a little look at the code for me. 

He established that it was in a bit of an odd order, the main problem being that the ‘void loop’ function (the section of code that the program repeats continuously) was in the wrong place.  Tim’s help, along with a contribution from Ed, meant that we eventually managed to get both sensors working, with both LEDs blinking nicely.

Now for phase two…….

Firstly I must state that the main problem here was that I was receiving both sensors’ data within one number box in Pure Data.  This meant that control over what the sensors manipulate, video- and audio-wise, would be erratic at best.  The solution was to split the data into two number boxes so that they would not affect each other’s readings and could be used independently of one another.

Ed’s first idea was to tell one sensor to add 100000 to its total.  The two sensors would still be contained within one number box in Pure Data, but because one reading would always be 100000 higher than the other, in theory you could use the sensor data independently. 
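Just to illustrate the idea, here is a rough Python sketch of the offset trick (the encode/decode function names are hypothetical, not part of any patch):

```python
# Sketch of the offset trick: sensor B adds 100000 to its reading so that
# both sensors can share one stream and still be told apart afterwards.
OFFSET = 100000

def encode(sensor, distance_cm):
    """Tag sensor B's readings by pushing them into a different number range."""
    return distance_cm + OFFSET if sensor == "B" else distance_cm

def decode(value):
    """Recover (sensor, distance) from a combined reading."""
    if value >= OFFSET:
        return "B", value - OFFSET
    return "A", value

# A mixed stream of readings, as a single number box would see them:
stream = [encode("A", 150), encode("B", 75), encode("A", 142)]
# decode() can then split the stream back into per-sensor readings.
```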

A good idea, but unfortunately this led to the visuals being even more difficult to control, because the [threshold] objects (the objects that determine at what distance events happen) were doubly confused by the erratic changes in numbers. 

Now enter Alfonso….

Alfonso graduated from Digital Arts last year and used the exact same sensors with Pure Data for his final project.  His piece used one sensor, but he did give me some patches that seemed to be set up for multiple sensors (although he had never tested them).

I loaded in a patch and to my surprise the sensors split themselves into two boxes.  At first I thought that this was it!  Yet on closer inspection the data was very erratic.  The terminal in Pure Data was saying that it was receiving data from sensor (1), then sensor (2), then sometimes both at once, and at other times just one sensor’s reading multiple times.  Ed said that he and Alfonso had the same problem last year and that they never worked it out… then Ed had an epiphany. 

He used a combination of [trigger] and [float] objects to split the signal, then printed the new signals out with either an ‘A’ or a ‘B’ for the terminal to comprehend.  This rectified the data flow and we finally had total control over both sensors… hell yeah!
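The split can be sketched in Python like this (the tagging mirrors the ‘A’/‘B’ prefixes above; the route function is a hypothetical stand-in for the PD objects):

```python
# Sketch of the A/B split: each incoming line is prefixed with 'A' or 'B'
# so readings can be routed to separate streams (akin to [route A B] in PD).
def route(lines):
    streams = {"A": [], "B": []}
    for line in lines:
        tag, value = line.split()
        streams[tag].append(int(value))
    return streams

readings = route(["A 150", "B 75", "A 142", "B 80"])
# readings["A"] and readings["B"] can now drive their own number boxes.
```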

Here’s a look at the patch so far.  It uses 3 videos, one of which only plays when there is no presence within the space; the other two play as a composite when an individual enters the space.





Arduino communicating with Pure Data……

I downloaded various software from the Arduino website for communication between Arduino and Pure Data.  These are Arduino2PD, Pduino-0.3.1 and SimpleMessageSystem, all of which can be found on the Arduino website.

First I uploaded the code that I had downloaded from the Arduino website to the Arduino board.  I had zero success locating the sensor’s output within Pure Data with Pduino-0.3.1 and Arduino2PD, yet SimpleMessageSystem seemed to locate this information instantaneously and output the numbers (representing the distance the sensor was reading) as a list within Pure Data. 

Here is the updated patch including the SimpleMessageSystem.

 Click Here 

I have also modified this setup with 3 array tables that can play back 3 separate audio files, with the intention of experimenting with different audio that triggers at different distances. 
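The idea of distance-banded audio can be sketched like this in Python (the band edges and sample names here are made up purely for illustration):

```python
# Sketch of distance-banded audio: three samples, each triggered in its own
# distance band.  The band edges (100/200/300 cm) are placeholder values.
def sample_for(distance_cm):
    if distance_cm < 100:
        return "sample1"   # closest band
    if distance_cm < 200:
        return "sample2"   # middle band
    if distance_cm < 300:
        return "sample3"   # far band
    return None            # out of range: no audio triggered
```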

The [route list float symbol] object connects via its list outlet to a number message; this message receives the data from the ultrasonic sensor, and the data is then passed on to the various [threshold] objects that trigger either the audio or video at set distances.

Here is a video of the sensor triggering a change in video.


This is all fine and dandy as a prototype, but there is a problem…

The sensor has a measuring distance of 3 metres, which is a good length, but the problem is the width of the sensor’s range: it is quite narrow.  I have tried placing the sensor at various heights and had minimal improvement, but it does seem that the best position is for the sensor to hit you right on the nose (a premise advocated by various forums I had read about ultrasonic placement).  But it is still not good enough, as the results are often erratic and temperamental.

The solution is to have multiple sensors so that the field of sensing is widened.  I have downloaded a multiple-sensor code for the Ping.  Tim had a look at the code the other day and we managed to get it working with Pure Data, with the signals being received one after the other for the ‘multiple’ sensors (albeit we only had one sensor, but the principle was there).

This opens up another problem though…..

The [threshold] object waits until it receives a distance from the sensor that is lower than the value set, and then it alters the visuals or audio.  When the distance rises above this number it switches back.  The problem is that if only one [threshold] object is receiving information from, say, three sensors, it will get confused as to which number to read.  If one sensor drops below the threshold value and the other two sensors stay above it, the video or audio would flip back and forth uncontrollably and behave erratically… not what I had in mind.

The way to solve this would be to employ some sort of gate before the threshold.  The idea being that if the [threshold] object received a distance from one sensor that was below the threshold value, the gate would block all information from the other sensors, and the threshold would only read the sensor that is currently below the threshold value. 

When that sensor’s distance rises above the threshold value, the gate would open, allowing the other sensors to be read again and control the imagery.
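The gating idea can be sketched in Python like this (the class and method names are hypothetical; in the actual patch it would have to be built from PD objects):

```python
# Sketch of the proposed gate: once one sensor drops below the threshold,
# it "owns" the threshold until its own reading rises back above it;
# readings from the other sensors are ignored in the meantime.
class GatedThreshold:
    def __init__(self, threshold):
        self.threshold = threshold
        self.owner = None  # which sensor currently holds the gate

    def update(self, sensor, distance):
        """Feed one reading; return True while something is inside the zone."""
        if self.owner is None:
            if distance < self.threshold:
                self.owner = sensor   # gate closes around this sensor
            return self.owner is not None
        if sensor == self.owner and distance >= self.threshold:
            self.owner = None         # owning sensor retreated: gate reopens
        return self.owner is not None

gate = GatedThreshold(200)
gate.update(1, 150)   # sensor 1 trips the threshold and holds the gate
gate.update(2, 300)   # sensor 2 is ignored while the gate is held
gate.update(1, 250)   # sensor 1 rises above 200, so the gate reopens
```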

This is possible, I know.  I think I may need the help of a Mr Ed Kelly….. 



Arduino connected to ultrasonic sensor……

I finally received my Arduino board and a Parallax Ping ultrasonic sensor, together with a few bits and bobs such as wires and a solderless breadboard.  Having no previous experience of wiring up electronics, and having never seen a solderless breadboard before, I did feel a little in over my head, but to my surprise……… it was easy to wire up!

My reason for wanting to use an ultrasonic sensor for my project is so that I can detect the audience’s movement.  Movement is a very important factor within the theoretical side of my practice, as our movement through the non-place has a predominant effect on how we define it as a space.

Here’s a picture of the setup below:


The Arduino website has a good amount of documentation on both hardware and software to get you up and running.  There is also a good community who seem happy to share their knowledge within the forum pages.

I downloaded the ultrasonic code from the website and uploaded it to the board.  I ran the serial monitor within the Arduino environment and the code seemed to work as described, as the sensor was reading a distance back to the computer.  Below is a video of me moving the sensor to and from my computer to demonstrate this:


The challenge now is to get the Arduino to communicate with Pure Data and run a series of tests to begin to work out how this project can be realised with the best possible functionality.

I’ve got a feeling that this next process is not going to be as straightforward……


Updated Pure Data patch……

I’ve been working hard this week trying to get my head around the program Pure Data, and the learning curve is steep!

But saying that, the open-source nature of this program is to be revered, and by scouring the forums for examples and advice I have got to the stage where I have a patch that is not pretty, but it works!

The more I use Pure Data the more it makes sense.  PD is primarily an audio-based application, but the objects that the audio part of the program uses more often than not transpose over into the video element of the program (the only difference is that audio objects use a ‘~’). 

I spent a lot of my teenage years playing around with musical hardware and problem solving by bypassing an input here to get an output there and so forth, so the idea of patching objects together is not foreign to me.

Also, as some of the objects have audio titles, such as threshold (which is used in audio compressors), they do make some sense to me, and I have found that my instinct on what an object may do has usually been correct.  This knowledge, combined with a bit of mathematics (long time since I had to do that!), has led me to create the patch below.


larger picture – click here

The patch is a combination of 2 video mixers and an audio sampler (GEM, the environment that PD uses for video, does not support audio, so you have to extract the audio separately and sync it up later in the process).

The videos are switched by a horizontal fader and the audio is switched by a vertical fader.  These are all connected to a ‘Bang’ (the object that makes stuff happen in PD).

The Bang is connected to a [threshold] object; the threshold object emits a bang when a value passed through it is greater than the value you set, and (with a bit of math) sends another bang when the number drops below that value.

It is my hope that when I get an ultrasonic sensor up and running, this value sent to the [threshold] object will be the distance the sensor is reading.  So when an individual gets within, say, 200cm of the sensor, I can set the threshold to emit a bang when the number drops below 200 to flip the video, and send another bang when the individual moves more than 200cm away to flip the video back to its original state.
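That flip/flip-back behaviour can be sketched in Python (a simple stand-in for the [threshold] object plus the ‘bit of math’; the 200cm value is the example from above):

```python
# Emits "flip" when the distance drops below the threshold and "flip back"
# when it rises above it again; otherwise nothing (no repeated bangs).
class FlipThreshold:
    def __init__(self, threshold_cm):
        self.threshold = threshold_cm
        self.inside = False  # is someone currently within range?

    def feed(self, distance_cm):
        if distance_cm < self.threshold and not self.inside:
            self.inside = True
            return "flip"       # bang: switch the video
        if distance_cm >= self.threshold and self.inside:
            self.inside = False
            return "flip back"  # bang: switch the video back
        return None

t = FlipThreshold(200)
events = [t.feed(d) for d in [250, 180, 170, 230]]
# -> [None, "flip", None, "flip back"]
```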

If this works I will have good control over whatever space I have to show my work, as the size of the room can be accounted for by increasing or reducing the threshold value.

Fingers crossed……….    


My first Pure Data success!!!!

I have managed to use Pure Data and get it to do something that I need it to do!!  That is….. play video!  I followed some tutorials on the net and managed to create a video player by making this patch.


I know this sounds like a bit of an anti-climax, but after all the problems I’ve been encountering when trying to understand coding, scripting etc., and my non-existent knowledge of everything programming-related, it feels good to actually get a result.  Morale is up again!!

One thing down, just another ‘God knows how many’ to go………