
Headphone Motion Unity Plugin

HeadphoneMotion is a plugin for Unity3D that exposes Apple's Headphone Motion API (CMHeadphoneMotionManager) in Unity.

You can use this to get head tracking data from Apple headphones like AirPods Pro into your Unity scenes.

Headphone Motion Example Gif

Table of Contents

  • What is this?
  • Installation
  • Running the example scene
  • How to use
  • FAQ
  • License

What is this?

Apple released the Headphone Motion API in iOS 14, which provides head tracking data from compatible headphones. Currently (as of September 2020), the only compatible device is the AirPods Pro.

Please note that Apple's Headphone Motion API only provides rotational data (3dof), not positional data.


Installation

Option 1: Clone this repository into the Assets folder in your Unity project.

Option 2: Download this repository as a zip, rename the extracted folder to HeadphoneMotion, and place it anywhere in your Unity project's Assets folder.


Running the example scene

In order to verify that the plugin works, it's best to test with the provided example scene.

Here is what you will need in order to build and run the example scene:

  • Unity3D (tested with version 2019.4.1f1)
  • Xcode version 12 or higher
  • AirPods Pro running the latest firmware
  • Apple mobile device running iOS 14 or greater
  • Basic knowledge of how to sign and install iOS builds

To build and run:

  • Make sure that your project is set to build for iOS: go to File > Build Settings...
    • If iOS is not already selected, select it and click the Switch Platform button.
  • Add the example scene HeadphoneMotionExample to the build, and click Build and Run.
  • Choose a folder where the Xcode project will be saved, and click the Save button. This will start the build process.
  • Make sure that your iOS device is connected to the computer and unlocked.

If at this point you get a Code Signing Error, take care of the signing steps. For the sake of brevity, app signing is outside the scope of these instructions.

On the first build attempt, you will likely get a crash with the following error:

[access] This app has crashed because it attempted to access privacy-sensitive data 
without a usage description. The app's Info.plist must contain an NSMotionUsageDescription 
key with a string value explaining to the user how the app uses this data.

This means that head tracking data requires a special permission from the user, and you, the developer, must give the user a reason why they need to grant this permission.

To do that, open Info.plist in the root folder of the Xcode project, add a new key named NSMotionUsageDescription, and enter the permission text in the Value field.

Add NSMotionUsageDescription Value Gif
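
Note that Unity may regenerate Info.plist when you rebuild, so manual edits can be lost. Optionally, you can add the key automatically with a post-build script. Below is a minimal sketch using Unity's PlistDocument API (this script is not part of the plugin; the class name and description text are placeholders, and the file must be placed in an Editor folder in your project):

using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode;

public static class AddMotionUsageDescription
{
    // Runs after Unity generates the Xcode project and patches its Info.plist.
    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string pathToBuiltProject)
    {
        if (target != BuildTarget.iOS) return;

        string plistPath = Path.Combine(pathToBuiltProject, "Info.plist");
        var plist = new PlistDocument();
        plist.ReadFromFile(plistPath);

        // Placeholder text: explain to the user why the app needs motion data.
        plist.root.SetString("NSMotionUsageDescription",
            "Head tracking is used to control the scene with your headphones.");

        plist.WriteToFile(plistPath);
    }
}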

  • Hit the Play button in Xcode to build again, and the project should successfully build and install on your device.
  • If you don't already have your AirPods Pro in, put them on now, and you should see the cube rotating to match your headpose.

Headphone Motion Example App Screenshot

Example App Features

  • Disable Tracking / Enable Tracking - this button toggles the overall Headphone Motion functionality. Note that if the tracking is disabled, you will not receive headphone connection events.
  • Headphone motion is available (or "not available") - shows whether your device supports the Headphone Motion API
  • Headphones are connected (or "not connected") - shows the headphone connection status

Dealing With Rotation Offset

You may notice that the "default" rotation of the cube is slightly offset when it starts tracking headpose. This is due to the position of one of the AirPods in your ear. The best way to fix this is to adjust the AirPod position in your ear.

You can also try to use the Calibrate starting rotation button (click it when you feel that you're looking perfectly straight forward). The cube rotation will be adjusted based on this calibration, but it will not work well with large offsets.

Clicking the Reset calibration button will reset the calibration value.
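
For reference, here is a minimal sketch of how such a calibration offset can be implemented in your own scripts (an illustration only, not necessarily how the example scene does it): store the inverse of the head rotation at the moment of calibration, and pre-multiply incoming samples with it.

using UnityEngine;

// Illustration only: applies a stored calibration offset to incoming headpose rotations.
// Feed ApplyHeadRotation() the quaternion delivered by the tracking callback described
// in the "How to use" section below.
public class CalibratedRotation : MonoBehaviour
{
    private Quaternion _calibrationOffset = Quaternion.identity;
    private Quaternion _lastHeadRotation = Quaternion.identity;

    // Call this (e.g. from a "Calibrate starting rotation" button) while the user
    // is looking straight ahead.
    public void Calibrate()
    {
        _calibrationOffset = Quaternion.Inverse(_lastHeadRotation);
    }

    // Call this (e.g. from a "Reset calibration" button) to clear the offset.
    public void ResetCalibration()
    {
        _calibrationOffset = Quaternion.identity;
    }

    // Apply an incoming headpose sample, re-centered around the calibrated "straight ahead".
    public void ApplyHeadRotation(Quaternion rotation)
    {
        _lastHeadRotation = rotation;
        transform.rotation = _calibrationOffset * rotation;
    }
}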


How to use

Here is the minimum amount of code you will need to start using this functionality.

Include the HearXR namespace by adding the following line of code to the top of your script:

using HearXR;

Place the following code in an initialization function in your script (for example, in Unity's Start() method).

// This call initializes the native plugin.
HeadphoneMotion.Init();

// Check if headphone motion is available on this device.
if (HeadphoneMotion.IsHeadphoneMotionAvailable())
{
    // Subscribe to the rotation callback.
    // Alternatively, you can subscribe to OnHeadRotationRaw event to get the 
    // x, y, z, w values as they come from the API.
    HeadphoneMotion.OnHeadRotationQuaternion += HandleHeadRotationQuaternion;
    
    // Start tracking headphone motion.
    HeadphoneMotion.StartTracking();
}

In the same script, add the handler function below, which receives the headpose rotation data. In this example, the received rotation is applied to the GameObject that this script is attached to.

private void HandleHeadRotationQuaternion(Quaternion rotation)
{
    transform.rotation = rotation;
}
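
As noted in the code comment above, the plugin also exposes an OnHeadRotationRaw event that delivers the raw quaternion values. A handler sketch, assuming the event passes the four components as individual floats (check the plugin source for the exact signature):

private void HandleHeadRotationRaw(float x, float y, float z, float w)
{
    // Assumption: the raw event delivers the quaternion components individually.
    transform.rotation = new Quaternion(x, y, z, w);
}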

Also examine the example scene for some additional usage examples.
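
Finally, when the object is destroyed or you no longer need head tracking, it is good practice to clean up. A minimal sketch, assuming the plugin exposes a StopTracking() counterpart to StartTracking() (the example scene's Disable Tracking button suggests that it does):

private void OnDestroy()
{
    // Unsubscribe so the plugin no longer calls into a destroyed object.
    HeadphoneMotion.OnHeadRotationQuaternion -= HandleHeadRotationQuaternion;

    // Assumption: StopTracking() mirrors StartTracking(); verify against the plugin source.
    HeadphoneMotion.StopTracking();
}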


FAQ

  • Is this 3dof or 6dof?

    • Apple's Headphone Motion API only provides 3dof information
  • Why does the rotation seem "crooked" when my head is perfectly straight?

    • This is likely due to the position of one of the AirPods in your ear. Only one earbud is used for the rotation data at a time. Wiggle each one to figure out which is currently being used, then adjust it until your target object appears straight when your head is straight.

License

See the license file included with this repository.

headphonemotion's People

Contributors

anastasiadevana

headphonemotion's Issues

Tracking in the background?

Thanks for this great plugin. One question: is there a way to get it to work in the background? I can't find anything in the API documentation saying it's not allowed, so I'm assuming this functionality is possible?

Headphones not connected

Hi, great plugin!
I'm not able to get it to recognise the AirPods Pro I'm using, though. They work normally and "Headphone motion is available", but it says "Headphones are not connected".
I'm getting this hang-risk warning; could that be the cause?

Thread Performance Checker: Thread running at QOS_CLASS_USER_INTERACTIVE waiting on a thread without a QoS class specified. Investigate ways to avoid priority inversions

Can this be debugged directly in Unity without building to the iPhone?

I can't run this sample program on my computer. When trying to run the base code, the console reports that the API can only run on iOS devices. This means that whenever I want to write something with it, I need to build to my phone every time I debug, so I want to ask whether I can run it directly in Unity.

No support for iOS 17.3? How to fix?

My iOS device is an iPhone 11 Pro, currently running the latest version of iOS 17.3. However, when I try to deploy this program to my phone using Xcode, it tells me "The run destination my iPhone is not valid for Running the scheme 'Unity-iPhone'."
Xcode does not support iOS 17.3.
It was already not working when I was on iOS 17.1, so I thought updating iOS might help. I downloaded Xcode 15, which supports iOS 17.2 and above, but it still doesn't work.
I see that this project supports Apple mobile devices running iOS 14 or greater, so in theory, it should support version 17.3, right? Why doesn't it?

Application crash after loading a new scene with AirPods connected

I am having an issue with an iOS application crashing after loading a new scene with AirPods connected. Without AirPods connected, the application runs properly after loading a new scene. However, as soon as AirPods are connected, the application crashes. I am using Unity 2020.3.0f1, running iOS 14.6 on an iPhone 12 Pro. Below is the log collected from Xcode when the application crashes:

2021-08-23 19:09:26.697980-0400 airpods-plugin-debugging[10247:3918284] Built from '2020.3/staging' branch, Version '2020.3.0f1 (c7b5465681fb)', Build type 'Release', Scripting Backend 'il2cpp'
2021-08-23 19:09:26.702616-0400 airpods-plugin-debugging[10247:3918284] MemoryManager: Using 'Default' Allocator.
-> applicationDidFinishLaunching()
-> applicationDidBecomeActive()
GfxDevice: creating device client; threaded=1
Initializing Metal device caps: Apple A14 GPU
Initialize engine version: 2020.3.0f1 (c7b5465681fb)
2021-08-23 19:09:27.611487-0400 airpods-plugin-debugging[10247:3918284] Unbalanced calls to begin/end appearance transitions for <UnityViewControllerStoryboard: 0x13d307e50>.
UnloadTime: 3.378333 ms
2021-08-23 19:09:29.407236-0400 airpods-plugin-debugging[10247:3918284] HeadphoneMotion init complete
2021-08-23 19:09:29.407428-0400 airpods-plugin-debugging[10247:3918284] Set the headphone connection delegate
2021-08-23 19:09:29.407501-0400 airpods-plugin-debugging[10247:3918284] Set the rotation delegate
2021-08-23 19:09:29.407622-0400 airpods-plugin-debugging[10247:3918284] Motion is available. Started tracking motion
2021-08-23 19:09:29.531496-0400 airpods-plugin-debugging[10247:3918284] Headphones connected
2021-08-23 19:09:36.017451-0400 airpods-plugin-debugging[10247:3918284] Stopped tracking motion
2021-08-23 19:09:36.017978-0400 airpods-plugin-debugging[10247:3918284] Headphones disconnected
2021-08-23 19:09:37.220800-0400 airpods-plugin-debugging[10247:3918284] Motion is available. Started tracking motion
2021-08-23 19:09:37.242229-0400 airpods-plugin-debugging[10247:3918284] Headphones connected
-> applicationWillResignActive()
-> applicationDidBecomeActive()
-> applicationWillResignActive()
-> applicationDidBecomeActive()
Unloading 5 Unused Serialized files (Serialized files now loaded: 0)
UnloadTime: 6.361625 ms
2021-08-23 19:09:49.988894-0400 airpods-plugin-debugging[10247:3918284] Set the headphone connection delegate
2021-08-23 19:09:49.989204-0400 airpods-plugin-debugging[10247:3918284] Set the rotation delegate
Headphone Motion Initialized!
HearXR.airpodsTracking:Start()

Checking if Headphone Motion is available...
HearXR.airpodsTracking:Start()

Headphone Motion Available!
HearXR.airpodsTracking:Start()

Headphone Motion Started Tracking!
HearXR.airpodsTracking:Start()

libc++abi: terminating with uncaught exception of type Il2CppExceptionWrapper
terminating with uncaught exception of type Il2CppExceptionWrapper
(lldb) 

Example App loads up on iPhone but no cube appears

Hi! Have you ever encountered an issue where the app loads onto the phone perfectly fine, but when you open the app, the cube you're supposed to be able to move with your head is not there? Thank you so much!

Laggy and stuttering data?

The incoming data appears to be lagging by about 100-200ms. Is that normal and the best performance that we can get from that API?

Also, the data seems to be sampled at around 30 Hz... Can you confirm that, or is it possible to get 60 Hz?

Fixing the lag would be more important for my application though... (spatializing mono sources and looking around in ambisonics spheres)

Thanks!
