I’m struggling to find information about the real differences between the bpf and wave track views.
Actually, I can imagine one is for data and the other for audio, with all the sample-rate, time-critical handling that implies, but I see (in tutorials and help files) that some iMubu objects provide wave+multiwave interfaces while others in the examples provide bpf+multibpf+score.
Which attributes allow setting this up?
In my case, I’d need ONLY MIDI CC data, and possibly MIDI notes (as that seems more convenient than multiple marker tracks).
The main difference between bpf and wave is that bpf applies to MuBu time-tagged tracks, while wave applies to sampled tracks. Depending on the nature of the track, imubu proposes a set of compatible GUI interfaces that can be used to display its content. You can set it via the imubu message “interface bpf”, for example, provided that the interface is compatible with the track.
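As a sketch, the message you’d send to the imubu object looks like this (only “interface bpf” is confirmed above; the other interface names are assumptions to be checked against the maxhelp):

```
; send to imubu's inlet, e.g. via a message box
interface bpf       <- display a time-tagged track as a breakpoint function (confirmed)
interface wave      <- display a sampled track as a waveform (assumption)
interface markers   <- display a markers track (assumption)
```

If the interface isn’t compatible with the track’s type, imubu will refuse to switch, so the track type effectively determines which of these messages will work.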
Hi @borghesi , thanks a lot. I got it.
Btw, is there an official documentation for MuBu ?
I struggled to find it.
I’m building my own, for my own use and project purposes, from the Max help patches, the tutorials, and posts here, but an official one would save a lot of time (even though the Max help patches are VERY well documented).
At the moment the only documentation available is the maxhelp files and the MuBu package examples.
However, there are some nice video tutorials made by Matthew Ostrowski which complement the information in the help files. For now there are three, but more will come soon. You can find them here: https://forum.ircam.fr/article/detail/tutoriels-mubu/
Another question: if MuBu data are not time-tagged, are they wave? I mean, can tracks only be bpf or wave? I’d really like to know everything a track can be, i.e. all the track “features” and types.
I have to deduce it from the iMubu GUI.
I can see bpf, multibpf (does that mean I can use several bpfs in the same track? and what’s the purpose of that versus one bpf each in several tracks?), matrix, scatterplot, sonogram, markers, trace, spline…
In my project, I won’t record data realtime ever.
I’ll use MuBu to create my data “offline”, then play it back during live performance.
Two ways of getting data in:
data from outside, scaled and injected into tracks,
data generated by scripts (probably js) and written to tracks.
I need only :
sometimes tracks with breakpoint functions (long, evolving interpolations between points),
smoother evolution (ideally curves between points, like the function object’s line vs curve modes),
sporadic triggers (markers, or a proper score with MIDI notes used as sporadic triggers but with more dimensions, from pitch to velocity…).
I’m a bit confused at the moment, as I’d need a more global picture of all of this.