Friday, April 15, 2011

Tippy the Telepresence Robot

So Vincent, Benny and I have been working on a cute little project over the past few weeks. It's a compact telepresence robot, similar to the one that the famous Johnny Lee wrote a tutorial for on his Procrastineering blog. The key feature of our design is the optical coupling between the mobile device and the robot control hardware: in our implementation, we use Skype for two-way video, but embed the control signals in the video stream as well, thereby reducing the amount of hardware and software interfacing work. We've submitted an ICEC demo paper along with the following video, which should do a slightly better job of explaining how it works:
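To give a feel for the optical-coupling idea, here is a minimal sketch in Python. This is not the actual Tippy implementation; the block layout, sizes and thresholds are my own assumptions. The idea: commands are drawn as black/white squares in a region of the video, and light sensors placed over those screen regions read them back as on/off control bits.

```python
import numpy as np

def encode_command(bits, block=20):
    """Render a list of control bits as a horizontal strip of
    black (0) / white (255) squares, as it might appear on screen."""
    strip = np.zeros((block, block * len(bits)), dtype=np.uint8)
    for i, b in enumerate(bits):
        strip[:, i * block:(i + 1) * block] = 255 if b else 0
    return strip

def decode_command(strip, n_bits, block=20):
    """Simulate the sensor side: average each region's brightness
    and threshold at mid-grey to recover the bit."""
    return [int(strip[:, i * block:(i + 1) * block].mean() > 127)
            for i in range(n_bits)]

cmd = [1, 0, 1, 1]                            # e.g. drive/turn bits
print(decode_command(encode_command(cmd), 4))  # -> [1, 0, 1, 1]
```

In the real robot the "decode" step happens in hardware (photosensors taped to the screen feeding the motor controller), which is what keeps the interfacing so simple.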


We hope to put together a tutorial and a DIY kit soon so people in the community can play around with the idea (if the video isn't enough to get you going :)

A few points on my personal motivation/comments for this project:

1.) Robots are cool; telepresence robots are also cool.

2.) Mobile devices are so powerful these days that you can do a lot with them. However, for the DIY hobbyist/hacker, it's not easy to interface these devices with custom hardware that you build. For example, on an iOS device (iPhone, iPod Touch, iPad, etc.), you are quite limited when it comes to interfacing with hardware. Even if you do have a developer license, it's still pretty hard to get a physical connection to the serial port working (at the risk of voiding the warranty, blowing up the device, etc.). Using a platform such as Android does away with the license hurdle, but the hardware interface is still not easy. This project is essentially a demonstration of a quick-and-dirty (but working!) solution to this problem. There are many ways it could be improved (e.g. modulating the visual signal, auto-calibrating the sensors), but it should provide a good starting point.

3.) I'm surrounded by a bunch of amazing people with different skills and really appreciate the opportunity to work with them. This project was definitely worth the evenings and weekends we spent at home or in the lab tinkering away. Good job guys!

Sunday, February 06, 2011

access granted

It's been a while since we finished this project, but I've been neglecting to update this blog... so here goes. (The video should be pretty self-explanatory.)

Sunday, November 28, 2010

The Singing Notebook

As exploratory steps in controlling articulatory speech and singing synthesis (research topic), I hacked some sensors into an old notebook that fittingly contained some notes regarding the hardware I was using.


The book contains a bend sensor and two pressure sensors hooked up to an Arduino Pro Mini, which talks to the laptop through a BlueSMiRF serial-to-Bluetooth interface and runs off a 1000 mAh LiPo cell. In the first mapping, one pressure sensor controls the lung parameter of the source model driving the synthesis, and the other controls the blend between the tube shapes for the I and AH vowels. Then I switch mappings using a preset (on the laptop keyboard), which changes the target vowels used for the blending. Finally, I open the book and show the bend sensor controlling the pitch.
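The mappings described above boil down to simple range conversions from raw sensor readings to synthesis parameters. Here is a hypothetical Python sketch of that layer; the sensor ranges, parameter names and output ranges are my own assumptions, not the values used in the actual patch. Raw readings are assumed to be Arduino 10-bit ADC values (0-1023).

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping from one range to another, clamped to the output."""
    t = (x - in_lo) / float(in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def map_sensors(pressure1, pressure2, bend):
    """Hypothetical version of the three mappings:
    - pressure1 -> lung pressure of the source model
    - pressure2 -> blend between the two target vowel tube shapes
    - bend      -> pitch, once the book is opened"""
    return {
        "lung_pressure": map_range(pressure1, 0, 1023, 0.0, 1.0),
        "vowel_blend":   map_range(pressure2, 0, 1023, 0.0, 1.0),  # 0 = I, 1 = AH
        "pitch_hz":      map_range(bend, 200, 800, 110.0, 440.0),
    }

print(map_sensors(512, 1023, 500))
```

Switching presets would then just swap which tube shapes the `vowel_blend` value interpolates between, leaving the sensor side untouched.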

Friday, June 04, 2010

Simple IR Filter using exposed film

A friend is doing a project that involves building a robot that finds and blows out candles placed within a course. They are thinking about using an infrared filter on a webcam to locate the candles, and asked me for suggestions. I know one can purchase filter sheets that serve this purpose, but figured there must be cheaper/quicker options. First I thought about using dark red cellophane, since its passband is close enough to infrared and maybe with enough layers it would block out most visible light. But where do you find the right coloured candy wrappers? Arts and crafts stores? There MUST be a cheap and simple way to get an infrared filter... (without having to crack open a TV remote or a Wii controller, all of which have infrared filters on the front).

After a quick Google search and two minutes of tinkering, I sent an excited message back to my friend with the following images:

2010-06-04 23-35-50.726



The solution: exposed film negatives! It turns out that developed film that has been exposed to light makes a great infrared filter. The bit at the beginning of the roll is usually exposed when you load it into the camera, unless you load it in the dark. (I remember sometimes trying to do this under a blanket to save the first few shots of a roll... that way a roll of 36 can get you 38~40 shots, if you're lucky.) You can see the tea-light candle and the film strip in my hand in the above image. And here is what the captured image looks like with the film strip taped over the front of the webcam:



2010-06-04 23-36-51.840


The solution was so quick, cheap and simple that I had to post about it, right away! :-)

Now the problem of course is the lack of availability of film these days... luckily I kept all my photos/negatives from before...

Credits to here and here.

Monday, April 19, 2010

Headtrack+Servo+Webcam = fun

For our EECE541 project, we're building a 3D webcam chat system. I've been working on the headtracking portion, and decided this would be a neat way to demonstrate the results:



Ingredients:

-Processing sketch
-Arduino Microcontroller
-Servo
-Two webcams: one for head tracking, one for the view. One is slightly hacked to fit onto the servo.

The system demonstrates three concepts:

1. Simple pixel-based 1D head tracking: take the difference between a static background and the live view, threshold it, and return the horizontal position of the top-most changed pixel

2. Fishtank AR/VR systems: utilizes the viewer's position to render a scene. Kinda like this.

3. Arduino/Processing: An awesome platform to work with hardware and software to prototype new ideas. Servos, webcams, image processing... all done with a couple lines of code!
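The head-tracking step in concept 1 can be sketched in a few lines. This is an illustrative Python/NumPy version (the actual project used Processing); the threshold value and grayscale-frame assumption are mine.

```python
import numpy as np

def track_head_x(background, frame, threshold=30):
    """Simple pixel-based 1D head tracking: difference the live frame
    against a static background, threshold it, and return the
    horizontal position of the top-most changed pixel."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    rows, cols = np.nonzero(diff > threshold)
    if rows.size == 0:
        return None                  # no motion detected
    top = rows.argmin()              # index of the top-most changed pixel
    return int(cols[top])            # its horizontal (x) coordinate

# toy example: a 5x5 grayscale "background" and a frame where the
# "head" appears as one bright pixel at row 1, column 3
bg = np.zeros((5, 5), dtype=np.uint8)
fr = bg.copy()
fr[1, 3] = 200
print(track_head_x(bg, fr))          # -> 3
```

The returned x position would then be mapped onto the servo angle (via the Arduino) and onto the virtual camera position for the fishtank-VR rendering.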

Monday, September 08, 2008

Wii-mote Controller for Google Earth

A really simple controller for Google Earth using the Wii-mote: it simply maps a few buttons and gestures to keyboard keys using GlovePIE. This was one of the first YouTube videos I ever uploaded... I somehow forgot about it...

Saturday, April 05, 2008

Wii-mote Force-feedback Joystick

What do you get when you combine a Wii-mote, a force-feedback joystick and a few lines of GlovePIE script?


code for part 1:

// map the wii-mote's tilt (pitch/roll) onto the joystick's two vibration motors
var.Y = MapRange(wiimote2.Pitch, -90, 90, 1, -1)
var.X = MapRange(wiimote2.Roll, -90, 90, 1, -1)
var.S = 0.8 // overall vibration strength

joystick3.Vibration1 = var.X * var.S
joystick3.Vibration2 = var.Y * var.S

code for part 2:

// drive the motors from the wii-mote's relative acceleration instead
var.xRot = Wiimote2.RelAccX
var.yRot = Wiimote2.RelAccY

joystick3.Vibration1 = MapRange(var.xRot, -50, 50, -1, 1)
joystick3.Vibration2 = MapRange(var.yRot, -50, 50, -1, 1)


Note the joystick/wii-mote numbers; your mileage may vary.

Friday, April 04, 2008

Time Lapse

Nothing to do with the wii-mote here. A simple time-lapse video of a trip to the Van Dusen Botanical Gardens in the heart of Vancouver.




I've wanted to do this for some time now, but never got around to it. Finally, on a sunny spring day, between writing a big project report and working on the presentation for said project, I managed to get it done. I had my camera set to the lowest resolution (640x480) and lowest compression setting (the icon that looks like stairs on Canon cameras :p). The entire video contains 400-something shots, taken at roughly 4~5-step intervals (I used steps instead of time to try to create a perceived constant motion). There are some skips here and there, since I wasn't being totally meticulous about the shooting intervals...



The pictures were imported into Windows Movie Maker, essentially creating a slideshow 'movie'. I set the image/fade durations to give it the accelerated-motion feel (about 0.3 seconds per image). And that's it... more or less. Nothing fancy...

Saturday, January 12, 2008

A 'full' Wii-Drum kit

So I've been playing with wii-motes on the PC for a while now, on and off. Karl Kenner's GlovePIE makes it very easy to fiddle around with the various input and output parameters of the device. One of the demos that comes with GlovePIE is a simple 'drumming' script based on the work done by Bob Somers. The script continuously measures the wii-mote's acceleration values, and if one exceeds a certain threshold, a MIDI drum note is triggered depending on the buttons being pressed at the time.

A snippet of the code appears as follows:

var.yRot = Wiimote1.RelAccY
...
if var.yRot > 25 then
    var.S = Wiimote1.A and ... [combination of buttons]
else
    var.S = false
endif

Midi.AcousticSnare = var.S
...

If you duplicate the code for a second wii-mote, you get two drum 'sticks'. However, another major component of a 'real' drum kit is still missing: the foot pedals. Since you can attach a nunchuk controller to each wii-mote, and the connecting cord is long enough to reach from your hands to your feet (when sitting down), it seemed like a good idea to emulate the foot pedals with the nunchuks. And that's what I did. The result is here:




I simply added another variable hooked up to the nunchuk's acceleration values, and an additional if block that triggers the bass drum and hi-hat on the two nunchuks, respectively. An 'open' hi-hat is also added by looking at the pitch of the hi-hat nunchuk: if you have your foot up and hit a normal hi-hat with the 'stick' (wii-mote in hand), it generates the open hi-hat sound.
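For clarity, here is the trigger logic above rendered as a small Python function. This is an illustrative sketch, not the GlovePIE script itself; the threshold values and the pitch cutoff for open-vs-closed hi-hat are my own assumptions.

```python
def drum_hits(stick_accel, pedal_accel, pedal_pitch,
              hit_thresh=25, open_pitch=30):
    """Decide which drum notes to trigger this frame:
    - the wii-mote 'stick' fires a hi-hat when its acceleration spikes,
      with the hi-hat nunchuk's pitch (foot up vs down) choosing
      open vs closed;
    - the pedal nunchuk fires the bass drum on its own spike."""
    hits = []
    if stick_accel > hit_thresh:
        hits.append("open_hihat" if pedal_pitch > open_pitch
                    else "closed_hihat")
    if pedal_accel > hit_thresh:
        hits.append("bass_drum")
    return hits

# stick swings hard while the pedal nunchuk is tilted up (foot raised)
print(drum_hits(40, 5, 45))   # -> ['open_hihat']
```

In the real script each returned note name would correspond to a MIDI drum variable (like Midi.AcousticSnare in the snippet above) being set true for that frame.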

Here is the code. If you have any questions feel free to ask me.

And by replacing the MIDI note triggers with keyboard inputs corresponding to the ones used in the DTXMania simulator, I was able to do this: (sorry about the song, which apparently not a lot of people like ;)


Monday, September 25, 2006