Interaction Development Progress

I'm not sure how to show the development of the Kinect integration with Unity other than with recorded screen captures, so here is my interaction development progress.

Video 1: Handpoint Test



Video 2: Menu scroll test (using handpoint)
  


Video 3: User Map (might not be included in the application; I'm not sure it would be of much use)




Video 4: Menu Scene Test



In the video above, the character mirrors my movements (hands and legs) as well as my body rotation (camera view). However, I had problems moving back and forth, as the character flips my movements: it goes forward when I move backwards and vice versa.

To get around this, I decided to make the character move forward using head detection instead. I narrowed down the variables in the original OpenNiSkeleton script and focused on the head and hand joints, then added my own variables (saving the head joint data into them) so that a second script, which I would add later, can read the head's position and move the character when I lean in. (The hand joint is for hand control on the menu.)
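Roughly, the idea looks like this. This is only a minimal sketch: the class and member names (SkeletonJointTracker, HeadPosition) are hypothetical, not the actual OpenNiSkeleton variables; it just shows the pattern of keeping the two joints I care about and exposing the head position for another script to read.

```csharp
using UnityEngine;

// Hypothetical sketch: track only the head and hand joints and
// expose the head's position so a separate movement script can use it.
public class SkeletonJointTracker : MonoBehaviour
{
    public Transform head;  // head joint from the skeleton rig
    public Transform hand;  // hand joint, used for menu hand-point control

    // Latest head position, read by the movement script.
    public Vector3 HeadPosition { get; private set; }

    void Update()
    {
        if (head != null)
            HeadPosition = head.position;
    }
}
```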


I then wrote another simple script that uses the Character Controller's Move function instead of the Transform (the character now has a Character Controller assigned to it) to move the character by following the head joint's changes.
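A sketch of that movement script, under the same assumptions as above (the hypothetical SkeletonJointTracker component; the threshold and speed values are made up):

```csharp
using UnityEngine;

// Sketch of lean-to-move: when the head moves forward past a small
// dead zone relative to its calibrated rest position, push the
// character forward with CharacterController.Move.
[RequireComponent(typeof(CharacterController))]
public class HeadLeanMovement : MonoBehaviour
{
    public SkeletonJointTracker tracker;
    public float leanThreshold = 0.15f; // forward lean (in metres) before moving
    public float moveSpeed = 2f;

    CharacterController controller;
    Vector3 restHeadPosition;
    bool calibrated;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        if (!calibrated)
        {
            // Take the first tracked head position as the rest pose.
            restHeadPosition = tracker.HeadPosition;
            calibrated = true;
            return;
        }

        float lean = tracker.HeadPosition.z - restHeadPosition.z;
        if (lean > leanThreshold)
        {
            // Move, unlike setting the Transform directly, lets the
            // Character Controller handle collisions with the scene.
            controller.Move(transform.forward * moveSpeed * Time.deltaTime);
        }
    }
}
```

That collision handling is the main reason for using Move rather than changing the Transform directly.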

To give it a feel as if the users themselves are in the environment, I turned off the mesh renderer for my character and made it look as if the user is controlling the camera view in the scene.
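One way to do that toggle from code (just a convenience sketch; unticking the Mesh Renderer component in the editor works equally well):

```csharp
using UnityEngine;

// Hide the avatar so the scene reads as first-person: disable every
// renderer on the character while keeping its movement and camera.
public class HideAvatarMesh : MonoBehaviour
{
    void Start()
    {
        foreach (var r in GetComponentsInChildren<Renderer>())
            r.enabled = false;
    }
}
```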





Boar Progress

I did a little bit of modification on the model. Just a little bit: it now has slightly bigger legs and a bigger upper body and head. It looks more like a boar now rather than a commercial pig. Heh





Painting Progress








Final (kinda)