A technical perspective on developing Spaced Out

23 May

So, I’ve talked a lot on here about the whole process of making Spaced Out. All the technical stuff is covered somewhere, but it’s dotted around in various places.

For those of you who are new to this blog/project and wondering what the hell I’m talking about, Spaced Out is a mind-controlled arcade game that I made as part of my final project at university. It’s a game all about a giraffe in space who gains mind powers and has to battle his way home through various enemies, including David Bowie.

The game is created using Adobe Flash, and for input I use a brainwave-reading headset and a Kinect. And this is how I did it.

On reflection, I developed four separate interface systems to control my game. One of them, of course, was mind control. That one was set up within the first few weeks, and I had it running, and running reliably, which is something I’m always keen on.

But the others? Well, I only kept one of those in the end, the Kinect of course. Before that, though, I had a couple of other motion controls, both of them a bit ropey.

Let’s discuss.

Brainwave reading was an obvious first choice for me. It would allow me to control Flash games through the power of thought, and that’s just crazy, even now, nine months after I started this project. (Nine months. Holy crap. It’s just flown by.) I can show this project off to people and they look at me like I’m actually using magic.

And that’s why I chose mind control. Who wouldn’t want to make a game where people go “Are you kidding me? What?!” when you tell them that you just have to THINK to control it?

That was the simplest choice I had to make during the project.

It’s pretty straightforward in terms of its technology and implementation too, actually.

I use a Star Wars Force Trainer, which I took apart and found some pins inside. I read up online and found out that you could hook a microcontroller up to two of those pins and read serial data out from them. A little more digging and I found that someone had written code for the Arduino so that it could read that serial data in. This was perfect for me: I like Arduinos, and I know how to hook them up to Flash and control things in real time.

To get the data from the Arduino into Flash, I use Tinkerproxy, the Mac version of serialProxy, a little program that takes the data from the Arduino and sends it out on a local socket, which you can then tap into from a bunch of programs, such as Flash.

I then use some pretty standard code for getting the data into Flash.

private var arduinoSocket:Socket = new Socket("localhost", 5331);

That’s how I set up a basic socket in AS3. I then split the incoming string to separate out all the data, which I store in an array, so I have the concentration data, the relaxation data and the signal data all streaming in from the headset.

var arduinoOutput:String = arduinoSocket.readUTFBytes(arduinoSocket.bytesAvailable);

output = arduinoOutput.split("&", 4);

This code isn’t anything special, it’s just some basic AS3 socket code that reads incoming data from an Arduino. It’s the barebones stuff that gets things up and running quickly.

NOTE: I have three items of incoming data, but I split my string up into 4 chunks. Why? To stop an annoying bug that was rendering my array useless a lot of the time.

When streaming data over sockets from the Arduino to Flash, you tend to get an invisible character turning up every now and again, usually a carriage return or something like that. It’s a little thing, but it can throw the whole game out of balance.

In my Arduino code I just add on an extra character, and then in Flash I split the string into four, which protects my first three data chunks from getting malformed by this bug. Annoying when it happens, but a relatively simple fix.
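
To give you an idea of how those bits fit together, here’s a rough sketch of the whole Flash side in one little class. The class name and variable names are just illustrative, and the order of the three values depends on whatever order your Arduino sketch prints them in:

package
{
    import flash.events.ProgressEvent;
    import flash.net.Socket;

    public class BrainSocket
    {
        // Tinkerproxy relays the Arduino's serial stream on this local port.
        private var arduinoSocket:Socket = new Socket("localhost", 5331);

        public var concentration:Number = 0;
        public var relaxation:Number = 0;
        public var signal:Number = 0;

        public function BrainSocket()
        {
            arduinoSocket.addEventListener(ProgressEvent.SOCKET_DATA, onSocketData);
        }

        private function onSocketData(e:ProgressEvent):void
        {
            var arduinoOutput:String = arduinoSocket.readUTFBytes(arduinoSocket.bytesAvailable);
            // Split into four chunks so any stray carriage return lands in the throwaway fourth slot.
            var output:Array = arduinoOutput.split("&", 4);
            if (output.length >= 3)
            {
                concentration = Number(output[0]);
                relaxation = Number(output[1]);
                signal = Number(output[2]);
            }
        }
    }
}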

Now, that’s pretty much it for my brainwave reading dabbling. Next up is an interface method that took up half of my total dev time: head tracking.

This, on reflection, was a silly mistake really. I’m not sure why I spent so much time trying to get it to work instead of looking for other methods.

What I did was mount IR LEDs on top of the brainwave-reading headset. Then, using a Wiimote, I could track the player’s head position. They could then move their head side to side and rotate it to control a cursor on screen. This is old hat for us IMers; we’ve done motion controls with Wiimotes before. When it came out and was found to be hackable, of course we used it. Full-on cheap motion controls for our installations. Brilliant.

Well, this method kind of worked, if you sat in exactly the right place under the mounted Wiimote(s). But even then, shift a bit out of place and the whole control mechanism goes out the window. That’s not cool.

But I did learn how to make a Wiimote-to-Flash connection package that worked a hell of a lot better than WiiFlash Server. On the Mac, WiiFlash Server can spend ages not connecting your Wiimotes, and sometimes it just won’t work at all. It’s really weird.

I found out about a neat little program called OSCulator. Now, this is Mac only, but if you’re a Flash developer with a Mac playing around with Wiimotes, I’ll let you in on how I made instantly connecting, Wiimote-powered Flash games. It’s bloody marvellous.

What I did was create an OSCulator file that routes all the necessary data to the right place: it sends the Wiimote data out over OSC on localhost port 9000. This was a little tricky to set up, so if you want to use my files, you can. I’ve put all my files for connecting Wiimotes to Flash, including my AS3 code, up on GitHub.

https://github.com/JonathanReid/Osculator-to-Flash—Wiimote

That’s the link right there; feel free to download it and maybe improve it. It’s a nice little standalone class that you can plug into your games and use straight away.

Right now, that code just deals with the IR sensor input from the Wiimotes, but it can easily be adapted to use button presses instead if you prefer.

The code uses FLOSC to get the data from OSC into Flash and parse it correctly; my code then takes the OSC stream and makes it more human friendly.
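
For the curious, the Flash end of that boils down to listening on an XMLSocket and picking the values out of the XML the gateway sends over. This is a sketch from memory rather than the exact code in the repo: the port is whatever you start the FLOSC gateway with, and the attribute names are how I remember flosc formatting its packets, so double check against what your gateway actually spits out.

package
{
    import flash.events.DataEvent;
    import flash.net.XMLSocket;

    public class OscReceiver
    {
        private var oscSocket:XMLSocket = new XMLSocket();

        public function OscReceiver()
        {
            oscSocket.addEventListener(DataEvent.DATA, onOscData);
            // 3000 here is just the port I started the FLOSC gateway on.
            oscSocket.connect("localhost", 3000);
        }

        private function onOscData(e:DataEvent):void
        {
            // flosc wraps each OSC bundle in an <OSCPACKET> element with <MESSAGE> children.
            var packet:XML = new XML(e.data);
            for each (var message:XML in packet.MESSAGE)
            {
                var address:String = String(message.@NAME);
                // Each argument arrives as an <ARGUMENT VALUE="..."/> child.
                var values:Array = [];
                for each (var argument:XML in message.ARGUMENT)
                {
                    values.push(Number(argument.@VALUE));
                }
                trace(address, values);
            }
        }
    }
}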

Anyway, developing that little system is one of my more exciting moments in coding to date.

Back to Wiimote head tracking. It became a little problematic, as I found that people don’t all sit down in the same way. Also, I would have had to build a huge cabinet to hold everything. Not fun for me; lots of hard work there.

So, after some time, I decided not to bother with head tracking. I thought it was time to move on to something that I personally found quite exciting: eye tracking.

Now, I had seen projects where people had created really cheap ways of doing eye tracking, and I thought that with a little jiggery-pokery I could get that info into Flash and control things with my eyes.

Well, I did.

I built myself a little eye tracking system based on the design from the guys over at the Graffiti Research Lab and used their code as a base to start from.

They had built their code to track eyes in openFrameworks, and this was my first foray into the OF world. It’s confusing at first, because if you don’t link up your files correctly (there are lots of files, and I don’t know where any of them are) you can’t compile.

So after a long while trying to figure all that out, I finally got it to work, and from there it was only a matter of time before I had it sending out the data in a Flash-compatible way. It didn’t take too long, actually; someone had written an OF-to-Flash plugin, so all I had to do was send over the data that I needed: the X and Y coordinates of your eye.
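
Just to illustrate what you do with those two numbers once they reach Flash, here’s a hypothetical little helper (not the actual project code): I’m assuming the coordinates arrive at the camera’s resolution, that there’s a cursor Sprite already sitting on the stage, and the easing is only there to stop the cursor twitching with every tiny eye movement.

// Hypothetical helper inside the document class. Assumes a 320x240 capture.
private const CAMERA_WIDTH:Number = 320;
private const CAMERA_HEIGHT:Number = 240;
private const EASING:Number = 0.2;

private function updateEyeCursor(eyeX:Number, eyeY:Number):void
{
    // Scale camera-space coordinates up to the stage, then ease towards them.
    var targetX:Number = (eyeX / CAMERA_WIDTH) * stage.stageWidth;
    var targetY:Number = (eyeY / CAMERA_HEIGHT) * stage.stageHeight;
    cursor.x += (targetX - cursor.x) * EASING;
    cursor.y += (targetY - cursor.y) * EASING;
}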

And I did, and I had a game that you controlled by looking at the screen. Cool, eh?

Apparently not to anyone apart from me; others found it cumbersome and not very interesting or exciting. There was no magic: everyone could see exactly how it was done, so it didn’t excite anyone.

Now the Kinect, that brought up all kinds of excitement.

I had the eye tracking running for about two weeks before I decided to ditch it. That’s just the kind of world I live in. So the Kinect it is.

This was to be my final choice for input. This is the one that I would be exhibiting with. Luckily I had it up and running with my game in less than a day.

Now, to do that I used TUIOKinect, a little program by the makers of the Reactable that detects hand-sized blobs with the Kinect’s camera and then broadcasts them over your computer in a TUIO-friendly manner.

Luckily for me, there’s some native AS3 TUIO code that can harness the power of blobs. After implementing that, I had control over my game just by waving my hands around.

This proved to be a much more intuitive interface than the previous two, which is good, because at this point I had run out of ideas to fall back on.

People are also still wowed by the Kinect: they can just put out their arm and, there it is, they’re controlling stuff on screen. That’s pretty cool, right?

I would talk more about the TUIO AS3 client, but I simply don’t know much about it; I got it working by sheer luck, really. It slots into my code and stores all the blobs on screen in an array, which I can access to create a cursor for everything in that array, which is what I did. I’m not sure what’s going on past that. I’m not much of a TUIO man; I’ve dabbled in it before, trying to see how it would work, and I’d never got it up and running until now. It’s not exactly the easiest of things to get running, to be honest. So, if you want to learn more, have a look at their site, tuio.org.
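
The bit I do understand is what happens once the blobs arrive, and roughly it looks like this. I’m treating each blob as an object with an id and normalised x/y coordinates, which is more or less what the client hands you, and CursorGraphic is a stand-in for whatever your cursor art is:

// Member code in the document class. import flash.display.Sprite is assumed.
private var cursors:Object = {}; // maps blob id -> cursor Sprite

private function updateCursors(blobs:Array):void
{
    for each (var blob:Object in blobs)
    {
        var cursor:Sprite = cursors[blob.id];
        if (cursor == null)
        {
            cursor = new CursorGraphic(); // stand-in for your own cursor clip
            cursors[blob.id] = cursor;
            addChild(cursor);
        }
        // TUIO positions are normalised 0..1, so scale them up to the stage.
        cursor.x = blob.x * stage.stageWidth;
        cursor.y = blob.y * stage.stageHeight;
    }
    // (Removing cursors for blobs that have disappeared is left out here.)
}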

And that’s how I went about creating four different interfaces for my game.

The game itself is quite a simple one. It’s my first step into the Flash gaming world, and I now know that I made plenty of obvious coding mistakes while creating it, but hey, now I know what to do. I can code more cleanly now, and I’ve learned a lot about keeping your code manageable, all through my own proud mistakes.

The game is essentially a little tower defence type game: you have a bunch of enemies, held in an array, that get animated towards the left-hand side of the screen. The enemies are generated based upon your meditation levels; if you are relaxing over a certain amount, an enemy gets generated. Simple stuff. But, as I found out while testing, this won’t work for everyone, because some people just don’t relax, so as a failsafe I added an enemy that gets generated every 5 seconds or so to keep the game going.
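
In code, the spawning side looks something like this. It’s a sketch rather than the exact game code: the threshold is illustrative, Enemy stands in for whatever your enemy clip is, and relaxation is the meditation value coming off the headset.

// Member code in the document class; the usual Timer/TimerEvent/Event imports apply.
private const RELAX_THRESHOLD:Number = 60; // illustrative, tune to your headset's values
private var enemies:Array = [];
private var failsafeTimer:Timer = new Timer(5000);
private var wasRelaxed:Boolean = false;

private function setupSpawning():void
{
    failsafeTimer.addEventListener(TimerEvent.TIMER, onFailsafeTick);
    failsafeTimer.start();
    addEventListener(Event.ENTER_FRAME, onSpawnCheck);
}

private function onSpawnCheck(e:Event):void
{
    // Only spawn on the frame the player crosses the threshold,
    // otherwise one relaxed player buries the screen in enemies.
    var isRelaxed:Boolean = relaxation > RELAX_THRESHOLD;
    if (isRelaxed && !wasRelaxed)
    {
        spawnEnemy();
    }
    wasRelaxed = isRelaxed;
}

private function onFailsafeTick(e:TimerEvent):void
{
    spawnEnemy(); // the failsafe for players who just won't relax
}

private function spawnEnemy():void
{
    var enemy:Enemy = new Enemy();  // Enemy is a stand-in for your enemy clip
    enemy.x = stage.stageWidth;     // enemies start at the right and animate left
    enemy.y = Math.random() * stage.stageHeight;
    addChild(enemy);
    enemies.push(enemy);
}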

Then I use a simple hit test between the cursor (generated by the Kinect-sensed hands) and each enemy. When that hit test is triggered, I also check to see whether you are concentrating over a certain level, and if you are, the enemy explodes!

So the in-game process goes Point > Think > Explode.
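
And that Point > Think > Explode check is just a once-per-frame loop over the enemies array, testing every Kinect cursor against every enemy. Again a sketch: explode() is whatever plays your explosion animation, and the threshold is illustrative.

private const CONCENTRATION_THRESHOLD:Number = 60; // illustrative again

private function checkHits(e:Event):void
{
    // Loop backwards so splicing exploded enemies out of the array is safe.
    for (var i:int = enemies.length - 1; i >= 0; i--)
    {
        var enemy:Enemy = enemies[i];
        for each (var cursor:Sprite in cursors)
        {
            if (cursor.hitTestObject(enemy) && concentration > CONCENTRATION_THRESHOLD)
            {
                explode(enemy);       // play the explosion, score points, etc.
                enemies.splice(i, 1);
                break;
            }
        }
    }
}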

Simple stuff in terms of gameplay, but it’s one that people enjoy because of the physical and mental aspects of the game rather than any exciting gameplay mechanics.

Hey, I taught myself how to make Flash games while making all the rest of that stuff, what do you want from me?

Being truly indie, I also drew all my own graphics, which was fun. I love drawing, so it’s nice when I get down to drawing and art again. I have a very distinct style of photos plus vector outlines. It does mean that it takes me some time to do just one character, since I draw them by hand first, but hey, to me it’s worth it to get that visual style. It fits in well with the whole surreal aspect of my game too. Handy, that.

So yeah, learn to embrace ports and sockets, because you can make some pretty fun stuff when you start streaming data all over the place. Get a Wiimote and try it out yourself.


		