Thursday, April 16, 2009

My Part of the Final Thing


http://www.youtube.com/watch?v=WTMWZmzHn6w Doc VIDEO

The Flash part of HAND IT was created to take in the data sent by MAXMSP and manipulate it to work with the interface. MAXMSP sends the two tracked colors as "xyxy" strings in an XML message: the first xy pair is the first color being tracked, and the second xy pair is the second. The two colors we used are red and blue because they are the most stable to track.

In Flash, we created the main objects for the game: the gates, the notes, the hands, and the scoring system. The hands are controlled by the xyxy data sent from MAXMSP. The gates sit in the four corners of the interface and are used to increase the player's score. Notes shoot from the middle to one of the corners of the interface to match the beat of the game. A gate becomes active when a hand is hovering over it, and when a note passes over a gate while it is active, the score increases.

The score determines the ending that the user receives. If the score is less than 900, the user gets a fail screen. If the score is between 900 and 1200, the user gets a RANK B ending. If the user miraculously scores over 1200 points, the RANK A ending appears, showing the magnificent universe that he or she has created.

Most of the Flash logic is done with if statements, and everything that needs constant updating runs off enterFrame events. We used ActionScript 2.0 because that is what FLOSC (Flash Open Sound Control) was written in. Originally the entire game was written in ActionScript 3.0, but because we were unable to rewrite FLOSC, we were forced to translate the game to ActionScript 2.0. The part of the Flash that links to MAXMSP receives the data, identifies how the xyxy values were sent, and breaks them down into individual x and y integers to synchronize the hands.
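As an illustration only (in Python rather than ActionScript, and with an assumed "x1 y1 x2 y2" payload format, since the exact FLOSC XML layout isn't shown above), the break-it-down step and the rank thresholds reduce to:

```python
# Hypothetical sketch of the two pieces of logic described above.
# parse_xyxy: split the "xyxy" string into the integer pairs that
# drive the two hand cursors. rank: map a final score to an ending.

def parse_xyxy(payload):
    """Split an 'x1 y1 x2 y2' string into two (x, y) integer pairs."""
    x1, y1, x2, y2 = (int(v) for v in payload.split())
    return (x1, y1), (x2, y2)  # (first tracked color, second tracked color)

def rank(score):
    """Map a final score to the three endings described above."""
    if score < 900:
        return "FAIL"
    elif score <= 1200:
        return "RANK B"
    return "RANK A"
```

For example, `parse_xyxy("120 80 500 410")` yields `((120, 80), (500, 410))`, one pair per hand.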


CODE:


For score detection:

stageMC.onEnterFrame = function() {

//TOPLEFT SCORE CHECK
if ((HANDL._x > 60 && HANDL._x < 140 && HANDL._y > 50 && HANDL._y < 130) || (HANDR._x > 60 && HANDR._x < 140 && HANDR._y > 50 && HANDR._y < 130)) {
hitS.gotoAndStop(2);
if (s1._x > 60 && s1._x<140) {

pointS += 1;
ScoreS.text = pointS;
}

if (s2._x > 60 && s2._x<140) {

pointS += 1;
ScoreS.text = pointS;
}
} else {
hitS.gotoAndStop(1);
}
//TOPRIGHT SCORECHECK
if ((HANDL._x > 500 && HANDL._x < 580 && HANDL._y > 50 && HANDL._y < 130) || (HANDR._x > 500 && HANDR._x < 580 && HANDR._y > 50 && HANDR._y < 130)) {
hitP.gotoAndStop(2);
if (s3._x > 500 && s3._x < 580) {

pointP += 1;
ScoreP.text = pointP;
}

if (s4._x > 500 && s4._x < 580) {

pointP += 1;
ScoreP.text = pointP;
}
} else {
hitP.gotoAndStop(1);
}

//BOTTOMLEFT SCORECHECK
if ((HANDL._x > 60 && HANDL._x < 140 && HANDL._y > 350 && HANDL._y < 430) || (HANDR._x > 60 && HANDR._x < 140 && HANDR._y > 350 && HANDR._y < 430)) {
hitM.gotoAndStop(2);
if (s5._x > 60 && s5._x<140) {

pointM += 1;
ScoreM.text = pointM;
}

if (s6._x > 60 && s6._x<140) {

pointM += 1;
ScoreM.text = pointM;
}
} else {
hitM.gotoAndStop(1);
}

//BOTTOMRIGHT SCORECHECK
if ((HANDL._x > 500 && HANDL._x < 580 && HANDL._y > 350 && HANDL._y < 430) || (HANDR._x > 500 && HANDR._x < 580 && HANDR._y > 350 && HANDR._y < 430)) {
hitO.gotoAndStop(2);
if (s7._x > 500 && s7._x < 580) {

pointO += 1;
ScoreO.text = pointO;
}

if (s8._x > 500 && s8._x < 580) {

pointO += 1;
ScoreO.text = pointO;
}
} else {
hitO.gotoAndStop(1);

}

};
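All four corner checks do the same thing, so the logic generalizes to a single rectangle test. A sketch of that shared logic (in Python for illustration; the gate rectangle shown is the top-left one from the code above):

```python
# A gate is active when either hand is inside its rectangle, and a
# point is scored for each note whose x falls inside the gate while
# it is active. This mirrors the repeated per-corner checks above.

def in_rect(pos, rect):
    """True if (x, y) lies strictly inside (x_min, x_max, y_min, y_max)."""
    x, y = pos
    x_min, x_max, y_min, y_max = rect
    return x_min < x < x_max and y_min < y < y_max

def gate_score(hand_l, hand_r, notes_x, rect):
    """Count how many notes score at this gate while a hand activates it."""
    if not (in_rect(hand_l, rect) or in_rect(hand_r, rect)):
        return 0
    x_min, x_max, _, _ = rect
    return sum(1 for nx in notes_x if x_min < nx < x_max)

# Top-left gate from the code above: x in (60, 140), y in (50, 130)
TOP_LEFT = (60, 140, 50, 130)
```

With the left hand at (100, 90) inside the top-left gate and notes at x = 70 and x = 300, only the first note scores.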


CODE for Shooting STUFF to the corners


//FUNCTION TO SHOOT THE STARS to the CORNERS.
//These use the v2 Tween class, which needs these imports:
import mx.transitions.Tween;
import mx.transitions.easing.None;
function topLeft1() {
var Tx1:Tween = new Tween(s1, "_x", None.easeIn, Stage.width/2, 0, 2, true);
var Ty1:Tween = new Tween(s1, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topLeft2() {
var Tx1:Tween = new Tween(s2, "_x", None.easeIn, Stage.width/2, 0, 2, true);
var Ty1:Tween = new Tween(s2, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topRight1() {
var Tx1:Tween = new Tween(s3, "_x", None.easeIn, Stage.width/2, 640, 2, true);
var Ty1:Tween = new Tween(s3, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topRight2() {
var Tx1:Tween = new Tween(s4, "_x", None.easeIn, Stage.width/2, 640, 2, true);
var Ty1:Tween = new Tween(s4, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function bottomLeft1() {
var Tx1:Tween = new Tween(s5, "_x", None.easeIn, Stage.width/2, 0, 2, true);
var Ty1:Tween = new Tween(s5, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomLeft2() {
var Tx1:Tween = new Tween(s6, "_x", None.easeIn, Stage.width/2, 0, 2, true);
var Ty1:Tween = new Tween(s6, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomRight1() {
var Tx1:Tween = new Tween(s7, "_x", None.easeIn, Stage.width/2, 640, 2, true);
var Ty1:Tween = new Tween(s7, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomRight2() {
var Tx1:Tween = new Tween(s8, "_x", None.easeIn, Stage.width/2, 640, 2, true);
var Ty1:Tween = new Tween(s8, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

topRight2();
bottomRight1();
bottomLeft2();
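Each Tween above moves a star from the stage centre to a corner over two seconds with the "None" easing, which is just linear interpolation. A sketch of that interpolation (Python for illustration; 640x480 stage as in the code above):

```python
# Linear tween: position moves from start to end at a constant rate,
# matching the None easing used in the Tween calls above.

def tween_linear(start, end, duration, t):
    """Position at time t (seconds) of a linear tween from start to end."""
    if t >= duration:
        return end
    return start + (end - start) * (t / duration)

# Star heading for the top-right corner of a 640x480 stage:
# x runs 320 -> 640 and y runs 240 -> 0, both over 2 seconds.
x_mid = tween_linear(320, 640, 2.0, 1.0)  # halfway across: 480.0
y_mid = tween_linear(240, 0, 2.0, 1.0)    # halfway up: 120.0
```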

//FUNCTIONS THAT MOVE THE HANDS


moveLEFT(10,10);
moveRIGHT(501,100);

//HANDL.startDrag();
//TEST PURPOSE
stageMC.onKeyDown = function() {
//topLeft1();
//var Tx1:Tween = new Tween(s7, "_x", None.easeNone, 0, 190, 3, true);

};

Thursday, March 5, 2009

023 TAI IAT 320

Well this time our project did NOT go as smoothly as I hoped for.
I really wish I could've done more with the programming, because I would have cleaned it up, and I understand how the code works, but I never got a chance to touch it. If I could, I would have added more images to the series and made the growth smoother.

The slides were pretty bad. I wanted to illustrate that the USER's energy is being captured by the camera, and from there transferred to the plant on the screen.

We found our technical research from Motion Capture by Chris Teso
And our theory research from

Christa Sommerer and Laurent Mignonneau's

INTERACTIVE PLANT installation.

In their installation, the plants' growth on the screen is affected by the real plants around the installation.

OUR EOC

Our attempt was to grow the plants from our own energy: if plants could absorb the energy emitted by humans, this would be a simulation of how they would grow.

Tuesday, February 24, 2009

EOC, Energy Over Camera

Our project EOC (sketch 2) has so far progressed quite well.
Currently it only possesses 3 thresholds for a series of 3 images.
The camera captures how much motion is created and sums it up per pixel through frame differencing.
Since motion requires energy, our concept is that our energy is captured through the camera and transferred to the image on the screen, affecting its state.
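The frame-differencing idea can be sketched as follows (Python for illustration; the threshold values here are made up, not the ones in our patch, and frames are shown as flat grayscale pixel lists):

```python
# Sketch of the motion-energy idea described above: sum per-pixel
# frame differences, then pick an image from the series by threshold.

def motion_energy(prev_frame, frame):
    """Sum of absolute per-pixel differences between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev_frame, frame))

def image_state(energy, thresholds=(50, 150)):
    """Index into a 3-image series: more energy selects a later image."""
    state = 0
    for t in thresholds:
        if energy >= t:
            state += 1
    return state
```

Low energy leaves the image in its first state; crossing each threshold advances it through the series.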

Currently we lack... feedback, immediate feedback to be exact. If a user went in front of our project, they would know nothing because it lacks feedback to the user. As an improvement, we will add a feedback bar, to display the amount of energy captured through the camera.

AND

Next time, the series of images will be more meaningful, something with a little more life:
like a seed that can grow into a giant plant from the energy it captures, and that keeps its state rather than reverting when the motion stops.

Thursday, February 19, 2009

MAXMSP

Well for the last week we've been learning MAXMSP and Jitter.
It actually seems quite complicated, and there's a lot I don't understand in terms of syntax.
Sketch 2 will be quite difficult to do without understanding that syntax.

DAY2:

Now that I've hooked up MAXMSP and the webcam, I noticed a compatibility issue between 4.7 and 5.0: patches saved in 5.0 CANNOT be opened in 4.7, which forces me to work in 5.0 with no way to move back to 4.7.

The syntax is still Very complicated.

Thursday, February 5, 2009

Our Code for Sketch1

The code is very similar to the AnalogInput example provided by Arduino.

int potPin = 2; // select the input pin for the first sensor
int potPin2 = 3; // select the input pin for the second sensor
int ledPin = 13; // select the pin for the first LED
int ledPin2 = 12; // select the pin for the second LED
int val = 0; // variable to store the value coming from the first sensor
int val2 = 0; // variable to store the value coming from the second sensor

void setup() {
pinMode(ledPin, OUTPUT); // declare both LED pins as OUTPUTs
pinMode(ledPin2, OUTPUT);
Serial.begin(9600);
Serial.println("Systems On");
}

void loop() {

val = analogRead(potPin); // read the value from the sensor
Serial.println(val);
val2 = analogRead(potPin2);
Serial.println(val2);

if (val > 100) {
Serial.println("Object Detected on Sensor 1");
digitalWrite(ledPin, HIGH); // turn the ledPin on
} else {
digitalWrite(ledPin, LOW); // turn the ledPin off
}
if (val2 > 100) {
Serial.println ("Object Detected on Sensor 2");
digitalWrite(ledPin2, HIGH); // turn the ledPin on
} else {
digitalWrite(ledPin2, LOW); // turn the ledPin off
}
}

All the images/videos are located on Zack's blog.
The biggest issue we had with this project was the wiring. The coding was easy, but the wiring and attaching the Arduino was really, really difficult. The wires kept popping out and breaking the connection.

It is strange that one of the lights is dimmer than the other, and I don't know how to supply more power to that pin without causing the code to malfunction.

Anyhow, what's done is done.

Monday, January 26, 2009

Week 3 - Sketch One Optic Blast




For this week our team decided on creating an optic visor which is supposed to relate to the visually impaired.

At first the idea was more of a children's toy relating to Cyclops's optic visor from X-Men, where it can simulate the character's attacks. However, it doesn't seem to possess any purpose other than being a children's toy or a cosplay prop.

Back to the visually impaired relation: a possible direction is to use this as a communication tool for the visually impaired. Similar to a white cane, it can be used to identify those who are blind or legally blind, and perhaps give signals to them to help them around their environment.

The original attempt at this idea was to use infrared proximity sensors to identify objects around the user, using the strength of the light to judge whether an object is close. However, due to technical issues, our team was unable to do that. Also, only those who are legally blind (not completely blind) can use this optic visor, as the user will still need his/her perception of light.

Instead, we're going to use this optic visor to help communicate with the outside world. The user will be a visually impaired person; the maximum level of blindness is "unable to see, but still able to sense light intensity".
It will feature the following:
1. Identify day and night: as a blind person, you are unable to tell whether or not people can see you. By using a light sensor, when the surroundings become dark the LED will constantly emit light, allowing others to see you.
2. Communicate by Morse code: on the side of the visor will be at least 2 buttons which cause the LED on the visor to blink. The long blinks are dashes, and the short blinks are dots. The most unique scenario for this is a blind person trying to communicate with a deaf person.

3. This feature is currently unclear, but it will have something to do with acknowledgment.
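The Morse-blink idea in feature 2 can be sketched as a translation from text to LED on-durations (Python for illustration; the timing units and the tiny letter table are standard Morse conventions, not something we have implemented):

```python
# Hypothetical sketch: translate text into a series of LED on-times,
# with a dot lasting one time unit and a dash lasting three.

MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",  # small demo table
}

def blink_schedule(text, unit_ms=200):
    """List of LED on-times (ms) for each dot/dash in the message."""
    on_times = []
    for letter in text.upper():
        for symbol in MORSE.get(letter, ""):
            on_times.append(unit_ms if symbol == "." else 3 * unit_ms)
    return on_times
```

So "SOS" becomes three short blinks, three long blinks, three short blinks.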


Here is some of the concept work:



Tuesday, January 20, 2009

Week 2 - Circuitry and Conductive Material

This week we had a review on the circuitry of Arduino and digital input.
The digital input we had was a push button: in the example code, if we pushed the button the LED lit up, and when we released the button, it turned off.

From that we were supposed to edit the code so that there are different modes for the light. Our team's code gave the LED three different modes: OFF, Blinking, and ON.
The code is as follows:

int ledPin = 13; // choose the pin for the LED
int inputPin = 2; // choose the input pin (for a pushbutton)
int val = 0; // variable for reading the pin status
int mode = 0;
boolean press;
void setup() {
pinMode(ledPin, OUTPUT); // declare LED as output
pinMode(inputPin, INPUT); // declare pushbutton as input
}

void loop(){
val = digitalRead(inputPin); // read input value
if (val == HIGH) { // check if the input is HIGH
press = true;
if (mode == 0) {
digitalWrite(ledPin, LOW); // mode 0: LED stays OFF
}

else if (mode == 1) {
digitalWrite(ledPin, LOW); // mode 1: blink the LED
delay(500);
digitalWrite(ledPin, HIGH);
delay(500);

}
else if (mode == 2) {
digitalWrite(ledPin, HIGH); // mode 2: LED stays ON
}
}

else {
if (press == true) {
press = false;
digitalWrite(ledPin, LOW); // turn LED OFF before the next mode
mode += 1;
if (mode == 3) {
mode = 0;
}
}
}
}
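The mode-cycling part of the sketch only advances when the button goes from pressed to released, then wraps back to 0 after the last mode. That press/release state machine can be sketched on its own (Python for illustration, outside the Arduino environment):

```python
# Small state machine mirroring the loop above: `update` takes one
# digitalRead() sample per call; the mode advances only on the
# press -> release transition and wraps after the last mode.

class ModeButton:
    def __init__(self, num_modes=3):
        self.num_modes = num_modes
        self.mode = 0
        self.pressed = False

    def update(self, button_high):
        """Feed one button sample (True = HIGH); returns the current mode."""
        if button_high:
            self.pressed = True
        elif self.pressed:  # falling edge: the button was just released
            self.pressed = False
            self.mode = (self.mode + 1) % self.num_modes
        return self.mode
```

Feeding it HIGH then LOW advances 0 -> 1; after three releases it wraps back to 0, just like `mode = 0` in the sketch.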

As for the lab assignment we are trying to make a bracelet using the same concept.
//FASTER BLINKING Mode
//----------------------------
else if (mode == 2) {
digitalWrite(ledPin, HIGH);
delay(200);
digitalWrite(ledPin, LOW); // turn LED OFF
delay(200);

}

//2 FAST BLINKS, 3 FASTER BLINKS Mode
//--------------------------------------------
else if (mode == 3) {
digitalWrite(ledPin, HIGH); // turn LED ON
delay(100);
digitalWrite(ledPin, LOW);
delay(100);
digitalWrite(ledPin, HIGH); // turn LED ON
delay(100);
digitalWrite(ledPin, LOW);
delay(100);
digitalWrite(ledPin, HIGH); // turn LED ON
delay(50);
digitalWrite(ledPin, LOW);
delay(50);
digitalWrite(ledPin, HIGH); // turn LED ON
delay(50);
digitalWrite(ledPin, LOW);
delay(50);
digitalWrite(ledPin, HIGH); // turn LED ON
delay(50);
digitalWrite(ledPin, LOW);
delay(50);

}

// "And the screen went BeepBeepBeepBeepBeepBeep!" Mode
else if (mode == 4) {
digitalWrite(ledPin, HIGH); // turn LED ON
delay(50);
digitalWrite(ledPin, LOW);
delay(50);
}

(Input IMAGE HERE)

The code remained pretty much the same except for the 3 added modes, and the LEDs are connected in parallel to Pin 13. Also, the LEDs are copper-taped onto the felt and a button is sewn in.
However, we do notice that sometimes the connection is weak.



Tuesday, January 13, 2009

Week 1 - Progressing with I/O


During this week in the lab, we were told to combine an input of a certain thing, in my case DIRECTION, to produce a certain output, in my case FIRE.

Through my brainstorming, I started with objects which had similar qualities of input and output: a FLAMETHROWER, or a TORCH. However, I then realized that we can take qualities of the output element and manipulate the scenario which uses that element for its qualities.

So my scenario became a direction-controlled heater. The user is in a structure where the human becomes the dial to control the heat. If the user spins on the spot clockwise, the sensors in the room detect that and increase the heat; if counter-clockwise, the heat in the room is reduced.

Later in a different exercise, I teamed up with Zack Bush, who possessed the input of BRAINWAVES, and the output of WIND. Our job was to make a complete system with all the inputs contributing to the output.

We began to brainstorm by doing each one individually and picking out the similarities. By combining WIND and FIRE, we created air conditioning as the OUTPUT, and by combining BRAINWAVES and DIRECTION, we created a locator which detects human comfort. Our whole system was supposed to be a room rigged with brainwave sensors, motion sensors, and an air circulation system. Depending on where you are and what your brainwaves emit, warm or cool air may blast toward your direction.

After that was done, we tested out Arduino, by lighting up and programming the blinking of LEDs and that was the end of the lab.

-THOMAS