Nanoflix.net Blog

by Peter Rasmussen

Home Made Motion Capture

Posted by Peter Rasmussen on April 17, 2007

This is an extended reply to Hugh Hancock’s comment on the “Up to your armpits in virtual space” post.

Yes, Gypsy looks great. It’s so simple.

Around the time I was starting in on Stolen Life, I knew that eventually I wanted to do a machinima movie with human characters. Wouldn’t that be a novelty? I took a close look at the Gypsy motion capture system. As low cost as it is, it’s still a ton of money to scrape together.

Being someone who loves to reinvent the wheel, I made a serious attempt at building at least a partial motion capture system of my own, taking a few tips from the Gypsy system. I took apart an old “ball” style mouse and connected a long lever to the light wheel via a large pulley, then set up a virtual arm in 3DGameStudio to match it. For a while it looked like it might work. I’m fairly sure there was no mechanical slippage in my system; the problem seemed to be in the electronics. The first swing was fine, but with continued movement the error accumulated, and the zero start point of the virtual arm would drift further and further away from the arm in the physical world. The real showstopper was that Windows XP, unlike previous operating systems, only supports one mouse. You can attach two, but you can’t collect x/y coordinates from each one discretely. So I put it all away in a box and got on with making Stolen Life.
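The drift can be sketched in a few lines. This is purely illustrative Python (the real rig fed 3DGameStudio, not a script); the counts-per-revolution and pulley figures are made-up numbers, and the missed counts stand in for whatever the mouse electronics were actually dropping:

```python
def angle_from_counts(total_counts, counts_per_rev=400, pulley_ratio=5.0):
    """Convert accumulated mouse-wheel encoder counts to a lever angle.

    counts_per_rev and pulley_ratio are invented values: the large
    pulley multiplies the wheel's resolution, giving more counts per
    degree of lever swing.
    """
    return total_counts / (counts_per_rev * pulley_ratio) * 360.0

# Simulate ten out-and-back swings. If the electronics miss a few
# counts per stroke, and the misses aren't perfectly symmetric, the
# readout never returns exactly to zero -- the error accumulates.
total = 0
for swing in range(10):
    total += 1000 - 3  # out stroke: 3 counts missed
    total -= 1000 - 1  # return stroke: 1 count missed

print(angle_from_counts(total))  # about -3.6 degrees of drift
```

Because the error compounds with every swing, the virtual arm walks steadily away from the physical one over time, rather than just being wrong by a fixed offset.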

Gypsy
http://www.animazoo.com/

5 Responses to “Home Made Motion Capture”

  1. Hugh "Nomad" Hancock said

    You rock, man. Seriously, that’s very very cool.

    I’m watching the new inertial motion capture systems closely – they use technology similar to the Wii to perform motion capture using accelerometers and gyroscopes. The Gypsy exoskeletons are apparently a complete pain in the proverbial to actually use, but the new Gypsy inertial stuff looks great.

    Alternatively, if you’re more of an optical man, you could check out http://www.naturalmotion.com, who are promising a motion-capture system before the end of the year using their $200 cameras. I’m reckoning they’ll be offering motion capture for a few thousand dollars at most.

    And there are various people offering camera and silhouette-based motion capture too. Paul Marino wrote about one of those recently.

    I reckon it’s three years until we’re all using mocap in our homes to do Machinima.

  2. Thanks.

    Do you know if anyone has found a way to make all the data from motion capture more manageable? As I’m sure you know, motion capture nails down every single frame of every single reference point, so the timeline is wall-to-wall keyframes. Is there a way to group sets of frames “in between” master keyframes? I’m not aware of anything like that in Maya 5.0.

  3. Malkyne said

    Hi Peter! NaturalPoint has a budget (in relative terms) optical mocap system they’ve had on the market for a couple of years:

    http://www.naturalpoint.com/optitrack/products/flex-c120/

    However, as far as I know, they only provide an SDK, and not a full software solution.

    As for making mocap data manageable, I don’t know of any inexpensive software that can do that sort of thing. I could probably write you some very nice code, if I didn’t have a day job. 😛

  4. Hi Malkyne

    Thanks for that. It looks really powerful. And the cost is certainly survivable.

    Cheers

    Peter

  5. D’oh. NaturalPoint, not Natural Motion. My comment above about the low-cost mocap solution that’s going to be unveiled soon relates to NaturalPoint – they’ve apparently now developed a mocap solution.
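On the keyframe question in comment 2: one common approach (not a Maya 5.0 feature, just a generic sketch with hypothetical names) is to decimate the captured curve, keeping only the keys that linear interpolation between their neighbours cannot reproduce within a tolerance:

```python
def decimate(keys, tol=0.01):
    """Greedy keyframe reduction for one animation channel.

    keys: time-sorted list of (time, value) pairs, one per captured
    frame. A key is dropped when interpolating straight across it
    from the previous kept key to the next raw key stays within tol.
    """
    if len(keys) < 3:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        t0, v0 = kept[-1]
        t1, v1 = keys[i]
        t2, v2 = keys[i + 1]
        frac = (t1 - t0) / (t2 - t0)
        predicted = v0 + frac * (v2 - v0)
        if abs(predicted - v1) > tol:
            kept.append(keys[i])  # key carries real motion: keep it
    kept.append(keys[-1])
    return kept

# A straight ramp collapses to its endpoints; a spike survives.
print(decimate([(0, 0.0), (1, 1.0), (2, 2.0), (3, 3.0)]))
print(decimate([(0, 0.0), (1, 0.0), (2, 5.0), (3, 0.0), (4, 0.0)]))
```

A Ramer–Douglas–Peucker-style pass per channel does the same job more globally, but even this single greedy pass turns a wall-to-wall keyframe track into something you can actually edit.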
