Archive for January, 2008


Arduino connected to ultrasonic sensor……

I finally received my Arduino board and a Parallax Ping ultrasonic sensor, together with a few bits and bobs such as wires and a solderless breadboard.  Having no previous experience of wiring up electronics, and having never seen a solderless breadboard before, I did feel a little in over my head, but to my surprise……… it was easy to wire up!

My reason for wanting to use an ultrasonic sensor for my project is so that I can detect the audience’s movement.  Movement is a very important factor within the theoretical side of my practice, as our movement through the non-place has a predominant effect on how we define it as a space.

Here’s a picture of the setup below:


The Arduino website has a good amount of documentation on both hardware and software to get you up and running.  There is also a good community who seem happy to share their knowledge within the forum pages.

I downloaded the ultrasonic code from the website and uploaded it to the board.  Running the serial monitor within the Arduino environment, the code seemed to work as described, with the sensor reading a distance back to the computer.  Below is a video of me moving the sensor towards and away from my computer to demonstrate this:


The challenge now is to get the Arduino to communicate with Pure Data and run a series of tests to begin to work out how this project can be realised with the best possible functionality.

I’ve got a feeling that this next process is not going to be as straightforward……


updated pure data patch……

I’ve been working hard this week trying to get my head around the program Pure Data, and the learning curve is steep!

But that said, the open-source nature of this program is to be revered, and by scouring the forums for examples and advice I have got to the stage where I have a patch that is not pretty, but it works!

The more I use Pure Data the more it makes sense.  PD is primarily an audio-based application, but the objects that the audio part of the program uses more often than not transpose over into the video element of the program (the only difference is that audio objects use a ‘~’).

I spent a lot of my teenage years playing around with musical hardware and problem solving by bypassing an input here to get an output there and so forth, so the idea of patching objects together is not foreign to me.

Also, as some of the objects have audio titles such as threshold (used in audio compressors), they do make some sense to me, and I have found that my instinct about what an object may do has usually been correct.  This knowledge, combined with a bit of mathematics (long time since I had to do that!), has led me to create the patch below.



The patch is a combination of two video mixers and an audio sampler (GEM, the environment that PD uses for video, does not support audio, so you have to extract the audio separately and sync it up later in the process).

The videos are switched by a horizontal fader and the audio by a vertical fader.  These are all connected to a ‘Bang’ (the object that makes things happen in PD).

The Bang is connected to a threshold object, which emits a bang when the value passed through it rises above the value you set and (with a bit of maths) sends another bang when the number drops back below that value.

It is my hope that when I get an ultrasonic sensor up and running, the value sent to the threshold object will be the distance the sensor is reading.  So when an individual gets within, say, 200cm of the sensor, I can set the threshold to emit a bang when the number drops below 200 to flip the video, and send another bang when the individual moves more than 200cm away to flip the video back to its original state.
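To sanity-check the flip/flip-back logic before wiring it up in PD, it can be sketched in a few lines of Python.  This is only a stand-in for the patch: the 200cm threshold comes from the post, but the function names and the ‘flip’/‘flip back’ labels are placeholders for the bangs the threshold object would send.

```python
def make_switcher(threshold_cm=200):
    """Return a function that emits a 'bang' label on threshold crossings.

    Mimics the PD threshold object with hysteresis: one bang when the
    distance reading drops below the threshold, another when it rises back.
    """
    inside = False  # is the viewer currently within the threshold distance?

    def update(distance_cm):
        nonlocal inside
        if distance_cm < threshold_cm and not inside:
            inside = True
            return "flip"        # viewer has come within range
        if distance_cm >= threshold_cm and inside:
            inside = False
            return "flip back"   # viewer has moved away again
        return None              # no crossing, no bang

    return update
```

Feeding it a stream of readings, `update(300)` gives nothing, `update(150)` gives `"flip"`, a second `update(150)` gives nothing (no repeated bangs while the viewer stays close), and `update(250)` gives `"flip back"`.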

If this works I will have good control over whatever space I have to show my work, as the size of the room can be accounted for by increasing or reducing the threshold value.

Fingers crossed……….    


keep it simple……

I had a tutorial with Andy last week in which we spent part of the time talking about the relationship between the technology behind the work and the work itself.

Andy’s opinion was that the work (visually) should always come first. He felt that the technology shouldn’t be at the forefront of the audience’s gaze; it should not be the first thing they notice.  Quite the contrary: they should not notice it at all.  It should blend into the background and become a part of the experience, not all of the experience.

He then said those wise old words: ‘keep it simple, the simplest ideas are always the best’.

After this discussion, looking back through my blog at some of the preliminary ideas I have come up with to date, they now look quite cumbersome and ugly.  They place the technology at the forefront of the audience’s gaze, and I couldn’t agree more that the work itself could become lost within these (physically structural) ideas.

So with all that in mind, here is a new idea that is clean and simple.  My project focuses on the non-place, and how our movement within this space shapes and defines it, our presence is its identity.  It seems obvious that movement has to be an important variable in the final installation, and in a way shape how it is experienced.


Andy and I spoke about how it would be interesting to have an installation that reacted and transformed itself only when there was a presence in the room.  When there is no movement it remains an undefinable entity, but when it senses a presence, your movement through the space, it begins to reveal its true identity to you.

In this diagram the sensor is placed so that its ultrasonic echo spans the distance between it and the entrance to the installation.  If you are walking past or standing outside the room, the visuals remain unidentifiable (either completely static or perhaps moving at such great speed that they cannot be recognised).  If you choose to enter the space, the ultrasonic sensor picks up your presence and the visuals change, becoming something that you can begin to understand.

We also spoke about how sound could be used: maybe having no visuals and only abstracted sound, and when you enter the space the visuals start up and complete the puzzle?  But this is all for a later date; for now I need to concentrate on getting a prototype up and running with an ultrasonic sensor, then I can settle down to the nitty-gritty of developing the visuals.

It’s back to Pure Data…….. (deep breath)


my first Pure Data success!!!!

I have managed to use Pure Data and get it to do something that I need it to do!!  That is….. play video!  I followed some tutorials on the net and managed to create a video player by making this patch.


I know this sounds like a bit of an anti-climax, but after all the problems I’ve been encountering when trying to understand coding, scripting etc., and my non-existent knowledge of everything programming-related, it feels good to actually get a result.  Morale is up again!!

One thing down, just another ‘God knows how many’ to go………    



Installation idea number 3…….

This is the same as installation idea number 2, except that instead of an ultrasonic sensor it uses pressure pads to ‘feel’ the presence of the viewer and slow down or speed up the video accordingly.

I’ve had a look at FXScript in Final Cut Pro, but it seems that this application is best suited to creating new filters and transitions, not manipulating video (well, not in the way that I intend to, anyway).  I’m getting dragged down and down into a coding nightmare!!

Every program I have looked at thus far is just not suitable for this project. I know I have very little knowledge of this, but I can get an idea of whether a program is going to be able to do what I need pretty quickly (by looking at the manuals, playing around with a few functions, seeing how other people have used the application with examples on the net, and so forth).

With all that in mind (and my mind becoming quite disheartened with this whole affair), I am going to tackle the beast that is Pure Data.  I know this application can do what I need it to do, and people tell me that when you get into it, it’s not all that scary.

So, here we go……..  



installation idea number 2…….

This idea revolves around using an ultrasonic sensor to judge the position of the viewer in relation to the projection and, depending on this distance, either slow down or speed up the projection accordingly.

From what I understand, an ultrasonic sensor works by transmitting a short burst of ultrasonic sound towards a target, which reflects the sound back to the sensor.  The sensor measures the time it takes for the echo to return and computes the distance between itself and the target.
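That time-of-flight calculation is simple enough to sketch in Python.  The only physics it assumes is that sound travels at roughly 343 m/s (0.0343 cm per microsecond) at room temperature, and that the measured time covers the round trip, so it is halved to get the one-way distance:

```python
# Approximate speed of sound in air at ~20 °C, in cm per microsecond.
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_distance_cm(round_trip_us):
    """Convert a round-trip echo time (in microseconds) to a distance in cm.

    The pulse travels out to the target and back, so the time is halved.
    """
    return (round_trip_us * SPEED_OF_SOUND_CM_PER_US) / 2
```

So an echo that takes about 5,830 microseconds to come back corresponds to a target roughly one metre away.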

I intend to build a frame to house the sensor, with a v-shaped construction that complements the sensor’s field of vision.  This way the viewer, when within the installation, would not be able to stand outside the sensor’s field of vision, and so would always be affecting the visuals.

The housing for the sensor would act as a barrier between the viewer and the projection ensuring that the closest point the viewer could reach would still be an acceptable distance from which to view the work.


The image below is a bird’s-eye view of the installation.  It aims to give a rough idea of what speed the projection would play back at when the viewer is standing in one of the four zones.
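The zone idea could be prototyped as a simple lookup from distance to playback speed.  The boundaries and speed multipliers below are purely illustrative (the diagram doesn’t fix any numbers), but the shape of the logic would be the same:

```python
# Hypothetical zones: (maximum distance in cm, playback speed multiplier).
# Closer to the projection = slower playback, further away = faster.
ZONES = [
    (100, 0.25),
    (200, 0.5),
    (300, 1.0),
    (400, 2.0),
]

def speed_for_distance(distance_cm):
    """Return the playback speed for the zone the viewer is standing in."""
    for max_cm, speed in ZONES:
        if distance_cm <= max_cm:
            return speed
    return ZONES[-1][1]  # beyond the last zone, keep the last speed
```

A viewer 50cm away would get quarter-speed playback, while anyone past the last boundary would see the video at the fastest rate.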


To use the ultrasonic sensor I would have to use an Arduino running alongside another application that I have yet to decide upon.  Tim suggested checking out FXScript in Final Cut Pro, and there is obviously the program Pure Data.

This is all so rough at the moment, and also very confusing for me, as I have never taken on anything quite like this before, but I hope that as my knowledge grows and I work out exactly how to achieve this, I will be able to write more fluently in my blog about what exactly I am doing.  As it stands I feel like I’m just waving my hands around in the dark, hoping to bash into something that I can get my little brain around and use successfully.  It’s getting tough now……… and time is of the essence.



chris o’shea – “out of bounds”

Chris O’Shea’s ‘Out of Bounds’ installation allows people to see through walls. The experience is made convincing by the artist giving visitors an infrared torch to shine onto the walls and interact with the work. The software tracks the position of the IR emitter via an overhead security camera, and the whole thing is coded to make the effect realistic.

The software is coded with OpenCV (an open-source computer vision library from Intel, in C++) and openFrameworks (a lightweight multimedia C++ framework for artists).


Chris O’Shea stated:

‘There is a childlike quality about wanting the ability to see through walls with x-ray vision like a superhero character. I want to encourage visitors to bore through the walls of the museum and engage in a ‘behind the scenes’ experience with an x-ray torch. This playful interaction encourages childlike curiosity in young and old alike, and opens up a portal into the Museum’s forbidden spaces.  Shine the torch at the wall to reveal the secrets hidden beneath. Pay an anonymous visit to the staff office, collection’s store, workshop, roof hatch or plant room.’

I like this idea: it involves the audience and creates an experience that the viewer can take away from the gallery and savour.  Everyone at some point must have wished that they had some form of superhero power! (I know I have; I keep asking my girlfriend for ‘The Force’ for Christmas, but so far she hasn’t come through.) This is a playful and fun installation.  What a day that would be when (even if only virtually) you had the ability to see through walls!