
Leapmotion for Max

Thanks Riccardo and Jules!
It does compile correctly with VS2013.
I'll study and explore all the project settings to help with my compiling adventures (I'm studying some Max external development in my spare time).
Count me in for testing or any help here and on other projects.

thanks again
best

a.

Hello Jules,
I am a beginner. I have downloaded Leapmotion for Max and it does work with my Leap Motion. A naive question: do you have a tutorial (or an example) of a concrete implementation? E.g. combining it with sfrecord~, sfplay~, or even with jit.grab, jit.qt.movie, etc., for use in an installation. Thanks!

best

Shing-kwei

Hi Shing-kwei,

Thanks for the interest!

We developed the leapmotion external for experiments within the research team, but we have not released any examples of sound-control use so far.
The only other example you can find is a patch for gesture recognition with the leapmotion in Mubu (see http://forumnet.ircam.fr/product/mubu/).

To get started, I advise you to experiment with connecting some high-level parameters from the tracking (for example, the position or speed of the hands) to sound control parameters, and then iterate with other parameters from the skeleton tracking.
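
As a very rough illustration of what I mean (a minimal sketch; the exact message names coming out of the leapmotion external are an assumption here, so check the help patch for the actual format), mapping the palm height to the playback speed of sfplay~ could look like this:

[leapmotion]                      <- streams the tracking data
|
[route palm_position]             <- assumed message selector; see the help patch
|
[unpack 0. 0. 0.]                 <- x, y, z of the palm
     |
    (y: palm height, in mm)
     |
[scale 50. 400. 0.5 2.]           <- map the useful height range to a speed factor
|
[speed $1]
|
[sfplay~ 2]

The same pattern (route a parameter, scale it, send it to a control message) works for sfrecord~, the rate of jit.qt.movie, and so on.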

We will make announcements on the forum if we make more example patches public in the future.

Best,
Jules

thanks! Jules.

Shing-kwei

Hi Jules,

I have started using your external, and I'm trying to zoom in and out with the jit.gl.render object. Usually this is done with the z axis of the camera attribute, but here it only seems to rotate the object (which makes sense). I'm using the help patch as a basis.

Is there anything we should take into account to achieve the zoom?
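
To illustrate, this is roughly what I am sending to the renderer (a minimal sketch; the context name and the distance values are just examples):

[camera 0. 0. $1]          <- pulling the camera back along z, e.g. with values from 2. to 10.
|
[jit.gl.render myctx]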

Thank you

Coralie

Hi Jules, thank you for this link and for the leap max object. It works great… but I'm trying to understand something:
I am doing a BA project on interaction within a 3D world, and I would love to use your leap object to interact with other jit.phys.body objects.

Unfortunately, the jit.gl.sketch objects that render the GL visualization of the incoming data from the leap are not phys bodies themselves, so they cannot collide or interact with anything (not sure if I am right here, please help!).
Do you think this is the best way of doing it? Or should I simply take the data from your leap patch and send it to a jit.phys.body for interaction? I do like the fact of seeing the hand in order to interact within the 3D world.

Please see example

Screen-Shot-2015-09-09-at-11.27.08.png

Bonjour Francesco,

Thank you for the message and for sharing your research.

Indeed, the leapmotion external was only made for streaming the data from the LeapMotion tracking into Max. The 3D sketch of the hand is just there to give a simple visualization of the skeleton, and it is probably not the right way to create interactions with 3D objects.

I guess you have two possibilities. Either you develop your own model of interaction to link the hand skeleton to actions upon 3D objects, or you find a way to represent the hand itself as a 3D object within Jitter's physical modeling framework, to take advantage of all the existing physical behaviors. The second option seems more sensible; however, I have never used jit.phys, so I can't give you much advice on this part… Are there any existing 3D hand models that you could link to the skeleton to facilitate the control?
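
As a very rough sketch of the second option (untested, and the message selector from the leapmotion external is an assumption, so check the help patch), you could drive a kinematic phys body from the tracking data:

[leapmotion]
|
[route palm_position]                        <- assumed message selector
|
[position $1 $2 $3]
|
[jit.phys.body @shape sphere @kinematic 1]   <- kinematic: you set its position, it pushes dynamic bodies

If I understand the jit.phys documentation correctly, a kinematic body follows the positions you give it while still colliding with the dynamic bodies in the world, which sounds like what you want for a hand.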

Don’t hesitate to share the progress of your project on the forum!

Best,
Jules

Hi Jules,
Thank you so much. OK, that makes sense. Yes, the second option sounds like what I had in mind.

I have found an example here that uses a jit.gl.sketch linked to a jit.phys.body and a jit.phys.ghost:
Day 5 ( https://cycling74.com/2012/09/19/00-physics-patch-a-day/#.VfFgF2RVikr )
Patch attached.

I think I need to make something like this linked to your jit.gl.sketch hand… what do you think?

Definitely going to post all the updates, and once it's all ready I will share the patch too.
Thank you for all the help.

4434.ForceSystem.zip (3.38 KB)

Hi Benjamin, I hope you are well.

I have designed physical hands (using a Leap Motion) to interact with objects within a world.
Everything works great. I have made six jit.phys bodies which represent my palm position and fingertip positions, so now I can grab and move things and perform other interactions in the world.

I have a quick question: I am using Graham's Oculus patch for the Rift (working fine, btw).
GitHub - grrrwaaa/max_oculus: Max/MSP/Jitter external & example for the Oculus Rift

The main problem I have right now is that when I move and navigate around, my hands (the visualization of the Leap Motion) do not follow the head tracking position, so they remain in the same place while I navigate within the world.

I know I can get the head position (quat/position) from Graham's Oculus object, but I am not sure how or where to send those values so that the hands follow the character's position (the navigation/head position).

Please let me know if anyone can help.
Thank you so much.

Hi @fr4nky8oy,

I haven’t used the occulus rift at all so I can’t give you much help on this…

My guess is that you need to change the coordinate system according to what you get from the Oculus Rift. You should probably translate and rotate the coordinates from the leapmotion to fit your position and orientation in the 3D environment.
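
As a sketch of one way to do this in Jitter (untested, and the outputs of the Oculus object are assumptions here; adapt them to what max_oculus actually sends), you could attach the hand visualization to an animation node that follows the head, so that world_point = head_position + head_quat applied to leap_point:

[head position x y z]  [head quat x y z w]       <- from the max_oculus object (assumed outputs)
|                      |
[position $1 $2 $3]    [quat $1 $2 $3 $4]
|                      |
[jit.anim.node @name head_node]

[jit.gl.sketch @anim head_node ...]              <- the hand drawing inherits the head transform

Any GL object with @anim head_node is then drawn relative to the head, so the hands should move and turn with you as you navigate.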

Good luck!

Jules