Eyetracking. The replacement for headtracking.

16 Feb

I’ve been a bit lax with my updates the past week or so. Not because I haven’t had anything to post, but because I’ve been developing so much tech in such a short amount of time.

It all started when my headtracking went down the toilet. It proved to be the biggest problem and threat to my entire project.

I spent weeks trying to fix the perpetually broken headtracking system, which was held together with shoddy pieces of cardboard.

After plenty of headaches, and having proved to myself that the headtracking system I had in place was going nowhere, I decided to research the backup plan: eyetracking.

I noted in an earlier post that eyewriter.org had published information about building a simple, cheap eyetracking system with their own custom software. I looked into what I needed, the whole system came out at about £40, so I decided to give it a shot.

Using a PS3 Eye, a pair of simple sunglasses and lots of tape, I put the system together.

It works remarkably well for something that took a little over an hour to put together. The hardest part was taking the damn camera apart.

The camera needs its IR-blocking filter taken out and an IR bandpass filter put in, so that it only sees IR light. The theory: bathe the eye in IR light, point the camera at it, and the right software can track the pupil. IR illumination makes the pupil much easier and more accurate to track.
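To give a feel for what the software is doing with that image, here’s a toy sketch, purely illustrative ActionScript: the eyewriter app does the real work in openFrameworks, and the threshold value here is made up. In a grayscale IR frame the pupil is the darkest blob, so mark everything below a darkness threshold and take the centre of what’s left.

import flash.display.BitmapData;
import flash.geom.Point;
import flash.geom.Rectangle;

function findPupil(frame:BitmapData):Point {
    var marked:BitmapData = new BitmapData(frame.width, frame.height, false, 0x000000);
    // Paint white every pixel darker than the (hand-picked) threshold.
    marked.threshold(frame, frame.rect, new Point(0, 0), "<", 0xFF303030, 0xFFFFFFFF, 0x00FFFFFF, false);
    // Bounding box of the white pixels; its centre approximates the pupil.
    var blob:Rectangle = marked.getColorBoundsRect(0xFFFFFFFF, 0xFFFFFFFF, true);
    return new Point(blob.x + blob.width / 2, blob.y + blob.height / 2);
}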

And, it works. Reliably.

I can keep coming back to this system and it will work, straight away. It’s such an improvement over the headtracking.

There was one downside: I can’t wear the sunglasses over my normal glasses.

I had a look round for goggles and safety visors, and decided that what I needed was ski goggles. They look good, are tinted for extra glamour, fit easily over glasses, and don’t need to be held on because they have an adjustable strap.

Getting the camera to work with the new goggles was a little tricky and took a bit of tweaking, but after finding the right position it works like a charm.

I’ve even tried the new goggles out on my housemates: they put them on and the whole system worked straight away. Brilliant!

The next step was to get it working in Flash. This took a little fiddling about with Xcode and openFrameworks: the eyewriter app is written with openFrameworks, which I’d never touched before, so this was fun for me. I had to look up how to get data out of OF and into OSCulator, the program I use for shifting data between Flash and other software. I’d already used it for grabbing data from the Wiimotes, so I was confident I could get data from OF into it.

Luckily, someone had already written a library for handling exactly this task, so I just had to drop the code in. Even that came down to luck: copying files over and pasting in code until all the errors went away and data started flowing into OSCulator.

Better still, I had already written the Flash code to handle input from OSCulator; I just had to adapt the Wiimote code to handle the new eyetracking data.
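The adapted handler ends up looking roughly like this. This is a sketch rather than my exact code: it assumes the data reaches Flash as flosc-style XML over an XMLSocket, and the port and the /eye/x and /eye/y address names are placeholders.

import flash.events.DataEvent;
import flash.net.XMLSocket;

var socket:XMLSocket = new XMLSocket();
var _eyeX:Number = 0;
var _eyeY:Number = 0;

socket.addEventListener(DataEvent.DATA, onEyeData);
socket.connect("127.0.0.1", 3000); // hypothetical gateway port

function onEyeData(e:DataEvent):void {
    var packet:XML = new XML(e.data);
    // Same shape as the old Wiimote handler, just different addresses.
    for each (var msg:XML in packet.MESSAGE) {
        if (msg.@NAME == "/eye/x") _eyeX = Number(msg.ARGUMENT.@VALUE);
        if (msg.@NAME == "/eye/y") _eyeY = Number(msg.ARGUMENT.@VALUE);
    }
}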

That part was simple enough. Translating the data into something that mapped nicely onto the Flash stage was slightly trickier; it was mostly me changing numbers around until the movement felt good and accurate.

I came out with some very odd figures. The formula for the x axis, for example, is:

bl.x = ((_eyeX-25)*stage.stageWidth/12);

I’m not really sure why those numbers work, but they do, so I can’t complain.
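That said, staring at it now, the numbers look like nothing more exotic than a linear remap: subtract an offset, then stretch the remaining range across the stage. Written out generally (the 25 and the 12 come from the line above; reading them as the tracker’s minimum and its range is my inference, not something I measured):

// Map a raw tracker value onto a stage dimension. rawMin = 25 and
// rawRange = 12 reproduce the formula above; whether those really
// are the tracker's min and range is inferred.
function mapEyeToStage(raw:Number, rawMin:Number, rawRange:Number, size:Number):Number {
    return (raw - rawMin) * size / rawRange;
}

bl.x = mapEyeToStage(_eyeX, 25, 12, stage.stageWidth); // identical to the formula above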

The final hurdle was the calibration screen. The Xcode app needs to be calibrated for every user, as everyone is individual like a snowflake.

What I needed was to send a byte out from Flash into Xcode to trigger the calibration event, which I could then mimic in Flash.

The plan was that if I could get this action to occur in Xcode (and judging by how their calibration screen is triggered by two simple functions, I knew I could), then I could lay out and animate the screen in exactly the same way in Flash, so that when people look at the points in the Flash version, Xcode thinks they are looking at the right points in the Xcode version.

Sneaky sneaky.

After more playing about with ports, OSCulator and all the functions from the OFOSC library, I finally got actions triggered in Xcode from Flash.
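The Flash end of that trigger is tiny, something like this, reusing the socket from earlier. Again a sketch: the packet format assumes the flosc-style bridge, the /calibrate address and port are stand-ins for whatever the OF app actually listens on, and playCalibrationAnimation is a hypothetical name for the mirrored Flash-side screen.

// Fire one message at the openFrameworks app to start its calibration
// sequence, then run the identical Flash animation in sync with it.
function startCalibration():void {
    socket.send('<OSCPACKET TIME="0" ADDRESS="127.0.0.1" PORT="9000">'
        + '<MESSAGE NAME="/calibrate"><ARGUMENT TYPE="i" VALUE="1"/></MESSAGE>'
        + '</OSCPACKET>');
    playCalibrationAnimation(); // hypothetical: the mirrored calibration screen
}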

Everything is golden.

I can now calibrate the eyetracking in Xcode via Flash, I can control Flash via Xcode, and the eyetracking works.

My project is becoming reliable.

I’m very happy now. I’m confident that this project could be used in an exhibition setting without falling apart or demanding the minute tolerances the headtracking did.

Eyetracking is also more natural: you look at the things you want to destroy.

The only limitation I’ve found so far is that you have to keep your head still, but a well-crafted chair can take care of that.

It’s going well.
