Hello,
Is there a way I can look inside the motionfollower object? I’m trying to build a system where the live audio of an acoustic piano can follow a prerecorded sound file of the same piece (because I want some sort of interactive backing track that follows the live piano, instead of the piano player having to follow the backing track or a click track).
I’ve tried voicefollower~, but because it is built for voices, and not for a piano with so many different pitches and overtones, it isn’t working well enough yet for what I’m trying to make.
So now I’m thinking about using motionfollower~ to compare the first 10 or 20 partials (or however many are needed) of the live audio with those of the prerecorded file, for more accurate following. I use iana~ to extract the partials from the audio, but I’m not sure how to send them into the motionfollower~ object, because it expects its data in a MuBu container. I think I could fill a MuBu container with lists of the first 10/20/etc. partials of the prerecorded file, but I’m not sure whether it’s possible to write the live data directly into something the motionfollower~ object can read… Maybe if I could look inside the motionfollower object I could try some things? Or maybe one of you knows how to do this, or has a tip on how I could do it better?
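To make it a bit more concrete, here is roughly the kind of following logic I have in mind, written as a plain Python sketch rather than a Max patch. This is not the MuBu/gf algorithm, just the idea of matching frames of partial amplitudes; the names (PartialFollower, ref_partials, etc.) are made up for illustration, and I assume both the live input and the prerecorded file are reduced to frames holding the amplitudes of the first N partials:

```python
# Conceptual sketch only -- not the MuBu/motionfollower~ implementation.
# Both the prerecorded file and the live input are reduced to frames of the
# first N partial amplitudes; the follower walks monotonically through the
# reference and picks the best match inside a small search window.

import numpy as np

N_PARTIALS = 20        # first 10/20/... partials, as described above
SEARCH_WINDOW = 30     # how many reference frames ahead a step may jump

def frame_distance(ref_frame, live_frame):
    """Euclidean distance between two partial-amplitude vectors."""
    return np.linalg.norm(ref_frame - live_frame)

class PartialFollower:
    """Follows a live stream of partial frames along a prerecorded reference."""
    def __init__(self, ref_partials):
        # ref_partials: shape (n_frames, N_PARTIALS), e.g. the partial
        # analysis of the prerecorded file exported frame by frame
        self.ref = np.asarray(ref_partials, dtype=float)
        self.position = 0  # current index into the reference

    def step(self, live_frame):
        """Advance the follower with one live frame; return the new position."""
        live_frame = np.asarray(live_frame, dtype=float)
        lo = self.position
        hi = min(self.position + SEARCH_WINDOW, len(self.ref))
        distances = [frame_distance(f, live_frame) for f in self.ref[lo:hi]]
        self.position = lo + int(np.argmin(distances))
        return self.position

# Toy usage: simulate live frames that trace the reference with a little noise.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((500, N_PARTIALS))   # stand-in for the prerecorded analysis
    follower = PartialFollower(reference)
    for i in range(0, 100, 10):
        live = reference[i] + 0.01 * rng.normal(size=N_PARTIALS)
        print(i, follower.step(live))           # reported positions should track i
```

In the real patch the "step" would happen once per analysis hop of the live iana~ output, and the reference frames would come from the analysis of the prerecorded file, however it ends up being stored.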
Thanks in advance!!
Silva