Archive for February, 2008
We had a Pure Data workshop with Ed Kelly and I found it to be invaluable. My main goals for this workshop were to:
1) Modify the existing Ultrasonic sensor code to get two sensors working with the Arduino
2) Work out how to split the sensor data within Pure Data to gain independent control over both the sensors (very important for functionality of the installation!)
I had some code for using multiple Ping sensors; the problem was that both sensors would only send a single echo pulse when activated, and obviously I needed this process to repeat continuously. As Ed was a very busy man trying his best to get everyone up and running with an Arduino and Pure Data (no mean feat!), Tim had a little look at the code for me.
He established that it was in a bit of an odd order, the main problem being that the ‘void loop’ function (the section of code that the program repeats continuously) was in the wrong place. Tim’s help along with a contribution from Ed meant that we eventually managed to get both sensors working with both LEDs blinking nicely.
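The fix can be sketched roughly like this in plain C++, with the Arduino-specific ultrasonic read mocked out (the pin numbers and readings here are placeholders, not the actual workshop code): the key point is that the ping code for both sensors lives inside loop(), which runs forever, rather than outside it, where it would fire only once.

```cpp
#include <cassert>
#include <vector>

// Mock of an Arduino-style ultrasonic read: on real hardware this would
// pulse the sensor's trigger pin and time the echo with pulseIn().
long readPingCm(int pin) {
    return 100 + pin;  // placeholder reading for this sketch
}

std::vector<long> readings;  // stands in for the serial output stream

// setup() runs once at power-on; loop() then repeats forever.
void setup() {
    // pinMode(...) calls would go here on real hardware
}

void loop() {
    // Both sensors are pinged on every pass through the loop,
    // so each one keeps sending echo pulses continuously.
    readings.push_back(readPingCm(7));  // sensor A on a hypothetical pin 7
    readings.push_back(readPingCm(8));  // sensor B on a hypothetical pin 8
}
```

On an Arduino the runtime calls loop() for you; the mock simply lets the structure be exercised off the board.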
Now for phase two…….
Firstly I must state that the main problem here was that I was receiving both sensors’ data within one number box in Pure Data. This meant that control over what the sensors manipulate, video- and audio-wise, would be erratic at best. The solution was to split the data into two number boxes so the sensors would not affect each other’s readings and could be used independently of one another.
Ed’s first idea was to tell one sensor to add 100000 to its total. This would mean that the two sensors would still be contained within one number box in Pure Data, but the data readings would be so different, because one reading would be 100000 higher than the other, that in theory you could use the sensor data independently.
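In plain terms, the offset trick multiplexes two readings down one channel by shifting one of them out of the other’s range. A minimal sketch of the idea in ordinary C++ (not the actual patch; only the 100000 offset comes from Ed’s suggestion):

```cpp
#include <cassert>

const long OFFSET = 100000;  // added to sensor B's readings at the source

// Sensor A sends its raw distance; sensor B sends distance + OFFSET.
long encodeB(long distanceCm) { return distanceCm + OFFSET; }

struct Reading {
    char sensor;      // 'A' or 'B'
    long distanceCm;  // the recovered distance
};

// On the receiving side, anything at or above OFFSET must be sensor B.
Reading decode(long value) {
    if (value >= OFFSET) return { 'B', value - OFFSET };
    return { 'A', value };
}
```

The weakness, as described above, is that the receiving [threshold] objects still see one stream of wildly jumping numbers rather than two clean ones.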
A good idea, but unfortunately this led to the visuals being even more difficult to control because the [threshold] objects (the objects that determine at what distance events happen) were being doubly confused by the erratic changes in numbers.
Now enter Alfonso….
Alfonso graduated from Digital Arts last year and used the exact same sensors with Pure Data for his final project. His piece used one sensor, but he did give me some patches that seemed to be set up for multiple sensors (although he had never tested them).
I loaded in a patch and to my surprise the sensors split themselves into two boxes. At first I thought that this was it! Yet on closer inspection the data was very erratic. The terminal in Pure Data was saying that it was receiving data from sensor (1), then sensor (2), then sometimes a multiple of both, and at other times it just sent one sensor’s reading multiple times. Ed said that he and Alfonso had the same problem last year and that they never worked it out… then Ed had an epiphany.
He used a combination of trigger and float objects to split the signal, then printed the new signals out with either an ‘A’ or a ‘B’ prefix for the terminal to comprehend. This rectified the data flow and we finally had total control over both sensors… hell yeah!
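The same split can be sketched in ordinary C++: each reading arrives prefixed with its sensor’s tag, and the receiver routes on that tag, much as [route] separates tagged messages in Pure Data (the exact “A 123” message format here is my guess at the idea, not the literal patch):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Parse lines of the form "A 123" or "B 456" into two independent
// streams, one per sensor, so neither corrupts the other's readings.
void routeReadings(const std::string& serialText,
                   std::vector<long>& sensorA,
                   std::vector<long>& sensorB) {
    std::istringstream in(serialText);
    std::string tag;
    long distance;
    while (in >> tag >> distance) {
        if (tag == "A") sensorA.push_back(distance);
        else if (tag == "B") sensorB.push_back(distance);
    }
}
```

Because the tag travels with every reading, it no longer matters in what order, or how many times, each sensor reports.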
Here’s a look at the patch so far. This uses 3 videos, one of which only plays when there is no presence within the space; the other two play as a composite when an individual enters the space.
After some experimentation I feel that the sense of field of one ultrasonic sensor is not going to be wide enough to reliably control the work. With this in mind I have also been thinking about how to use a space and its dimensions to my advantage.
This idea uses two ultrasonic sensors in a space that is 3 metres long and 1 metre wide (enough room to comfortably fit one person). The work is shown on a 20” (or so) TFT monitor set at the furthest point from the entrance. The sensors are mounted opposite the screen (my other ideas placed the sensor either above or below the projection). I’ve found that the sensors work best when they are placed at upper body height, so it is not possible to have them on the same wall as the work, for their positioning would be quite obtrusive. Placing them on the opposite wall also means that they will distract less from the work, keeping the technology hidden as much as possible.
I want this to be an investigative environment. I feel that by using a relatively small TFT as opposed to a large projection, the viewer will be more inclined to approach the screen, therefore walking the length of the installation. This will mean that the sensors can use their optimum range to maximum effect, and hopefully the space will be more successful for it.
When there is no presence in the room a video plays at an unidentifiable speed. When a person enters the space the video will slow down and change. The two sensors will control two separate videos composited on top of one another. They will also handle separate audio.
Using your movement you will have the ability to stop and start the audio and the two videos, either independently or simultaneously (depending on your distance and position within the room).
For me this process has some interesting connotations. Firstly, having an element of control over when the video is played or stopped gives the viewer the opportunity to inspect the footage and see imagery within it that would otherwise have been lost. This relates to how we experience the non-space at speed, which prevents us from ever truly identifying with it.
And secondly, each individual having an influence over whether the videos are playing or stopped means that the composite will progressively change. By the end of the show you would have an alternative edit or mix, a mix that has been shaped by the presence of the people that chose to inspect it. It would have shaped its own identity.
‘Even within the most anonymous of zones, signals of identity litter the landscape, embodying the space with a certain identity.’
I’ve been reading a bit of Michel Foucault to broaden my understanding of space and non-space. I’ve found his writings on heterotopias to be quite insightful. According to Foucault:
First there are the utopias. Utopias are sites with no real place. They are sites that have a general relation of direct or inverted analogy with the real space of Society. They present society itself in a perfected form, or else society turned upside down, but in any case these utopias are fundamentally unreal spaces.
There are also, probably in every culture, in every civilization, real places – places that do exist and that are formed in the very founding of society – which are something like counter-sites, a kind of effectively enacted utopia in which the real sites, all the other real sites that can be found within the culture, are simultaneously represented, contested, and inverted. Places of this kind are outside of all places, even though it may be possible to indicate their location in reality.
Because these places are absolutely different from all the sites that they reflect and speak about, I shall call them, by way of contrast to utopias, heterotopias. I believe that between utopias and these quite other sites, these heterotopias, there might be a sort of mixed, joint experience, which would be the mirror. The mirror is, after all, a utopia, since it is a placeless place. In the mirror, I see myself there where I am not, in an unreal, virtual space that opens up behind the surface; I am over there, there where I am not, a sort of shadow that gives my own visibility to myself, that enables me to see myself there where I am absent: such is the utopia of the mirror. But it is also a heterotopia in so far as the mirror does exist in reality, where it exerts a sort of counteraction on the position that I occupy.
From the standpoint of the mirror I discover my absence from the place where I am since I see myself over there. Starting from this gaze that is, as it were, directed toward me, from the ground of this virtual space that is on the other side of the glass, I come back toward myself; I begin again to direct my eyes toward myself and to reconstitute myself there where I am. The mirror functions as a heterotopia in this respect: it makes this place that I occupy at the moment when I look at myself in the glass at once absolutely real, connected with all the space that surrounds it, and absolutely unreal, since in order to be perceived it has to pass through this virtual point which is over there.
Foucault separates the basis of all heterotopias into six main principles.
1) All cultures have heterotopias.
2) They can be given different functions in relation to society changing over time.
3) They can juxtapose several real places into one.
4) They have a temporal dimension.
5) They are isolated through a system of opening and closing.
6) They are related to all other spaces.
For a more detailed explanation of these principles see this website.
Food for thought….
Julian Opie attempts to find new ways of dealing with our world as one of accelerated movement and overabundant information. As representations of reality supplement reality itself, everyday experience is increasingly about a fast-paced flow of imagery.
Society is in flux, buffeted by a constant flow of information and of people. Our lives are channeled through road, air and rail routes, around airports, service and railways stations, dependent on invisible and interconnecting cable and wireless networks.
Augé believes that we do not yet know how to look at this world; it is in fact a world that we read rather than look at, a world through which we pass at speed. Speed drastically alters our perception of the landscape and Opie’s art is much inspired by the idea of travel and motion.
‘Imagine you are driving’ presents us with an endless sequence of images of the road ahead: we have less of a sense of the inspiring and exhilarating pace of movement, and more an expression of the anonymity and monotony of motorway travel. But the obsessiveness of the depiction is compelling. We fix on the white lines marking the tarmac, propelled by the vanishing point towards a horizon we never reach.
Opie attempts to capture the real effects of driving, how the car both liberates and distances us from the world – so that we pass through the landscape quickly and are closed off from direct experience of it. The sights, sounds, tastes, temperatures and smells of the material world are reduced to the two-dimensional view through a windscreen. We find ourselves in a sealed, stable, weightless environment, with our senses impoverished and our bodies fragmented, and we fall into a dream-like state.
Opie feels that when we drive through the city, the streets and buildings become the backdrop to our thoughts, virtual passages through which we move, on the way to another place. The signs and texts planted along the motorway tell us about the landscape through which we are passing, making its features explicit. This fact might enable us to refrain from the need to stop and really look.
Because we are constantly on the move we are always in a state of distraction, having to deal with a barrage of visual and social stimuli. This is where my own practice and Opie’s work overlap. We are both questioning whether, because of our ‘modern’ state, and specifically when within these ‘modern’ landscapes, we have learned to overlook the subtlety and detail of the space. And how does this change our experience of it?
In contrast to the visual stimuli that Opie presents to us in ‘Imagine you are driving’, Opie’s ‘Cityscape’ is an audio recording of a journey through London by car. In it, he and fellow artists Lisa Milroy, Richard Patterson and Fiona Rae recorded what they saw en route, each of them focusing on a specific subject category. Opie listed the brands of cars seen, Patterson identified building types, Rae read from posters and billboards and Milroy described the people she observed along the way.
For me this work emphasises that we are unable to assimilate everything that surrounds us; that we reduce what we do see to the essentials in order to negotiate our way. But on the other hand, the abstract flow of words can be as evocative as actual images, and just by listening to them, we can conjure up images that we know intuitively from memory.
Our senses are always working and deciphering the world around us. My task now is to break down these spaces into their component parts, to discover the D.N.A. of the space (a phrase that Andy should have credit for!). Once I can understand this, I can use this information to manipulate the environment, to change the code as it were, and present a different representation that will attempt to challenge the conventional way most of us experience this landscape.
I downloaded various pieces of software from the Arduino website for communication between Arduino and Pure Data. These are Arduino2PD, Pduino-0.3.1 and SimpleMessageSystem (all of which can be found at http://www.arduino.cc/playground/Interfacing/PD).
First I uploaded to the Arduino board the code that I downloaded from the Arduino website (http://www.arduino.cc/en/Tutorial/UltrasoundSensor). I had zero success locating the sensor’s output within Pure Data with Pduino-0.3.1 and Arduino2PD, yet the SimpleMessageSystem seemed to locate this information instantaneously and output the numbers (representing the distance the sensor was reading) as a list within Pure Data.
Here is the updated patch including the SimpleMessageSystem.
I have also modified this setup with 3 array tables that can play back 3 separate audio files, with the intention of experimenting with different audio that triggers at different distances.
The [route list float symbol] object connects via the list to a number message. This message receives the data from the ultrasonic sensor, and the data is then passed on to the various [threshold] objects that trigger either the audio or the video at set distances.
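The behaviour of one of these [threshold] objects boils down to comparing each incoming distance against a set value. A rough C++ sketch of that logic (the 150 cm trigger distance is an illustrative value, not one from the patch):

```cpp
#include <cassert>

// Mimics the threshold behaviour described above: output 1 (trigger the
// "presence" audio/video) while the reading is below the set distance,
// 0 (switch back) once it rises above it.
struct Threshold {
    long triggerCm;  // distance at which the event fires
    int state = 0;   // current output: 1 = triggered, 0 = idle

    // Returns the new state for a fresh sensor reading.
    int update(long distanceCm) {
        state = (distanceCm < triggerCm) ? 1 : 0;
        return state;
    }
};
```

Several of these can sit side by side with different trigger distances, so different audio files fire as a person walks closer.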
Here is a video of the sensor triggering a change in video.
This is all fine and dandy as a prototype, but there is a problem…
The sensor has a measuring distance of 3 metres, which is a good length, but the problem is the width of the sensor’s range: it is quite narrow. I have tried placing the sensor at various heights and had minimal improvement, but it does seem that the best position is for the sensor to hit you right on the nose (a premise advocated by various forums I had read about ultrasonic placement). But it is still not good enough, as the results are often erratic and temperamental.
The solution is to have multiple sensors so that the sense of field is widened. I have downloaded a multiple-sensor code for the Ping. Tim had a look at the code the other day and we managed to get it working with Pure Data, with the signals being received one after the other for the ‘multiple’ sensors (albeit we only had one sensor, but the principle was there).
This opens up another problem though…..
The [threshold] object waits until it receives a distance from the sensor that is lower than the value set, and then it alters the visuals or audio. When the distance rises above this number it switches back. The problem is that if only one [threshold] object is receiving information from, say, three sensors, it will get confused as to which number to read. If one sensor drops below the threshold rate while the other two sensors stay above it, it would flip the video or audio uncontrollably back and forth and behave erratically… not what I had in mind.
The way to solve this would be to employ some sort of gate on the threshold. The idea being that if the [threshold] object received a distance from one sensor that was below the threshold rate, then the gate would block all information from the other sensors, and the threshold would only read the sensor that is currently below the threshold rate.
When that sensor’s distance rises above the threshold rate, the gate would open, allowing the other sensors to be read again and control the imagery.
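The gate idea can be sketched like this in C++ (the 150 cm threshold and the sensor indices are illustrative): once one sensor drops below the threshold it “owns” the gate, and readings from the other sensors are ignored until that same sensor rises back above the threshold.

```cpp
#include <cassert>

struct GatedThreshold {
    long triggerCm;   // the threshold distance
    int owner = -1;   // index of the sensor holding the gate, or -1 if open

    // Feed one reading from one sensor; returns 1 while the gate is held
    // (someone is within the threshold distance), 0 otherwise.
    int update(int sensorId, long distanceCm) {
        if (owner == -1) {
            // Gate open: any sensor dropping below the threshold claims it.
            if (distanceCm < triggerCm) owner = sensorId;
        } else if (sensorId == owner && distanceCm >= triggerCm) {
            // Only the owning sensor rising back above the threshold
            // reopens the gate; all other sensors are ignored meanwhile.
            owner = -1;
        }
        return owner == -1 ? 0 : 1;
    }
};
```

This way three sensors can feed one threshold without the output flapping: the first sensor to trigger holds the state steady until that same sensor clears.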
This is possible I know. I think I may need the help of a Mr Ed Kelly…..