
Headtracking & live sound augmentation

Hello everyone,
I aim to use the AirPods Max and their head-orientation sensor data in a setup in which footstep sounds are augmented and moved three-dimensionally in real time, based on live footstep coordinate input, ideally sounding as "real" as possible.

I tried doing it with Logic Pro: live head tracking and live object panning work, but the Dolby Atmos spatialization is limiting when it comes to vertical object placement, and Logic Pro itself when it comes to augmenting the space.

Did I understand correctly that Max/MSP with Panoramix or PanoLive is the recommended Spat way of approaching this?

My Max/MSP knowledge is very basic and my Spat knowledge non-existent so far, but I am eager to dive into both.

(Getting the AirPods head-tracking data stream into Max/MSP should be possible by running an iOS SDK test app on an Apple-silicon Mac.)

Any hint would be very much appreciated.
Thanks a lot!

Hi,

Happy Easter,

In PanoLive I include head tracking, and I would like to add the AirPods, but I have to get some first; the dialogue with Max should be feasible… Since you are on Logic, you should use Panoramix. How do you make it work?

Hi,

Yes, your understanding is correct: Spat5 in Max, or Panoramix (or PanoLive), might be useful in your context, as long as you can access the head-tracking data stream (typically as a stream of OSC data).
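
To make concrete what such a stream contains on the wire, below is a minimal Swift sketch of the OSC 1.0 framing for one orientation message (NUL-padded address, NUL-padded type tags, then big-endian float32 arguments). The address /listener/orientation/ypr is only a placeholder for the example; the address you actually send depends on how your receiving Max patch or Panoramix session routes incoming OSC.

import Foundation

// Pad a string to a 4-byte boundary with NULs, as OSC 1.0 requires.
func oscPaddedString(_ s: String) -> Data {
    var data = Data(s.utf8)
    data.append(0)                          // mandatory NUL terminator
    while data.count % 4 != 0 { data.append(0) }
    return data
}

// Build a single OSC message carrying float32 arguments.
func oscMessage(address: String, floats: [Float]) -> Data {
    var data = oscPaddedString(address)
    data.append(oscPaddedString("," + String(repeating: "f", count: floats.count)))
    for f in floats {
        let be = f.bitPattern.bigEndian     // OSC floats are big-endian IEEE 754
        withUnsafeBytes(of: be) { data.append(contentsOf: $0) }
    }
    return data
}

// Example: yaw, pitch, roll in degrees under the placeholder address.
let packet = oscMessage(address: "/listener/orientation/ypr",
                        floats: [12.5, -3.0, 0.8])

Sent over UDP, packets like this can be received in Max with [udpreceive] and routed (for instance with spat5.osc.route) before driving the listener or source parameters.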

If your Max knowledge is limited, Panoramix might be easier to start with.

Best,
T.


Thanks, Jerome & T.! I will start looking into Spat5 in Max. If you have any tutorial links for head tracking and live sound augmentation, I would be more than happy to hear about them.

@Jerome: Getting the AirPods head-tracking data stream is, to my knowledge, currently only possible through the iOS SDK. If you have a Mac with an M1 chip or later, you can run iOS apps on your Mac directly; otherwise you need an iOS device as a forwarder in between.

One app that gathers the tracking data and forwards it over OSC is M1 Monitor Control: M1 MNTR CTRL on the App Store.

A Swift project for gathering AirPods tracking data that could be extended with custom OSC mappings is GitHub - tukuyo/AirPodsPro-Motion-Sampler: a sample project to get the motion sensor values of the AirPods Pro (1st or 2nd gen), AirPods Max, AirPods (3rd gen), or Beats Fit Pro.

In case I modify the latter, I will share the link here.
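
For what it is worth, here is a rough, self-contained sketch of the capture side, along the lines of that sampler project: CoreMotion's CMHeadphoneMotionManager delivers the AirPods attitude, which is converted to yaw/pitch/roll in degrees and handed to a callback, which is where the OSC/UDP send (for example the framing sketched earlier in this thread) would go. The class name and callback are made up for the example; the app target needs NSMotionUsageDescription in its Info.plist and has to run as an iOS app, on-device or on an Apple-silicon Mac, as discussed above.

import CoreMotion

// Reads AirPods head orientation via CoreMotion and reports
// yaw/pitch/roll in degrees through a callback.
final class AirPodsHeadTracker {
    private let manager = CMHeadphoneMotionManager()

    // Called on every motion update with (yaw, pitch, roll) in degrees.
    var onOrientation: ((Float, Float, Float) -> Void)?

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("Headphone motion not available - are the AirPods connected?")
            return
        }
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let attitude = motion?.attitude else { return }
            let toDegrees = 180.0 / Double.pi
            self.onOrientation?(Float(attitude.yaw * toDegrees),
                                Float(attitude.pitch * toDegrees),
                                Float(attitude.roll * toDegrees))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}

// Usage: print the stream; swap the closure body for an OSC/UDP send.
let tracker = AirPodsHeadTracker()
tracker.onOrientation = { yaw, pitch, roll in
    print(yaw, pitch, roll)
}
tracker.start()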

Hello Emanuel,

I’ve managed to install M1 MNTR CTRL on my Mac mini M1 and I can run it. I only need the AirPods Pro now.

To be budgeted…

Many thanks,

Jerome

PS: Since it’s over Bluetooth, I think there will be some latency.


Hi there,
I found the built-in tutorials (really helpful) and got started with spat5.oper in Max.

Currently, I am looking for a way to simulate the floor as a nearby reflective surface for the footstep sound sources. Is there a function that can add that? Or is there a possibility to set the room dimensions and their position relative to the listener?

Thanks in advance!