Make your ARKit experiment compatible with mobile VR/AR headsets. A template using SceneKit. Useful for experimenting with head-worn Mixed Reality, Augmented Reality, Virtual Reality with head tracking, or a mix of all three.
Hello,
Thanks for sharing this interesting work! Do you have any idea about compatibility with https://holokit.io/ or https://www.aryzon.com/ cardboards? Will they need any customization of the parameters (eyeFOV, cameraImageScale), or should it just work?
ToDo: Update for iOS 11.3 Beta / ARKit 1.5 compatibility.
The relationship between eyeFOV and cameraImageScale has changed, since the captured camera image in ARKit 1.5 has gone up from 720p to 1080p.
For example, at eyeFOV = 90, instead of
cameraImageScale = 6.0
it would be 6.0 × 1080 / 720, so cameraImageScale would be 9.0 for ARKit 1.5.
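The rescaling above can be sketched as a small helper. This is only a sketch: `adjustedCameraImageScale` is a hypothetical name, `baseScale` is assumed to be the value tuned for the old 720p feed (e.g. 6.0 at eyeFOV = 90), and the current vertical resolution would in practice be read from the session's video format.

```swift
import Foundation

/// Rescales a cameraImageScale that was tuned for a 720p capture feed
/// to the session's actual vertical capture resolution.
func adjustedCameraImageScale(baseScale: Double, captureHeight: Double) -> Double {
    // Scale linearly with the vertical resolution of the captured image.
    return baseScale * captureHeight / 720.0
}

// Pre-1.5 ARKit (720p) keeps the original value:
let oldScale = adjustedCameraImageScale(baseScale: 6.0, captureHeight: 720)   // 6.0
// ARKit 1.5 (1080p) needs 1.5x the scale:
let newScale = adjustedCameraImageScale(baseScale: 6.0, captureHeight: 1080)  // 9.0
```

On iOS 11.3+ the actual height should be available via the session configuration's `videoFormat.imageResolution`, rather than hard-coding 1080.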
Your work is fantastic! I've learned a lot from your GitHub repos!
You mentioned that for optimal performance, Metal should be used. I'm wondering if you could point us in the direction of how to use MTKView for these stereoscopic views.
Currently, we need to access depth data to create depth images for depth image classification (with Core ML), so we use an MTKView as a preview view to show what the camera is seeing. It works fine as a single-view application, but what we are trying to achieve is stereoscopic (or triple) views for the headset view. Since capturedDepthData on ARFrame is only available with the front-facing, depth-sensing camera, we probably can't use SceneKit to achieve the same result you've shown in this repo.
I would like to seek your advice. Any suggestion will help.
Hi, I really appreciate your project; it has helped me a lot. Now I'm interested in whether this can be rebuilt with the SwiftUI framework. I notice that in this project the two camera feed previews are placed in a UIStackView; is that set up from the storyboard?
Is there any way to switch from Normal View to Headset View and keep all the models from the previous view?
I'm placing models on the ground in Single View; then the user taps a button to switch to Headset View (and puts on a Google Cardboard).
I want to keep all the models the user placed on the ground after switching views.
Thank you.
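One way to sketch this (an assumption about the app's structure, not this repo's actual API; `normalSceneView` and `headsetSceneView` are hypothetical names) is to hand the same ARSession and SCNScene to the headset view instead of starting a fresh session, so world tracking continues and the placed nodes survive the switch:

```swift
import ARKit
import SceneKit

// Hypothetical switch: reuse the running session and scene so that
// models anchored in Normal View are still present in Headset View.
func switchToHeadsetView(from normalSceneView: ARSCNView,
                         to headsetSceneView: ARSCNView) {
    // Share the scene graph: every node the user placed stays attached.
    headsetSceneView.scene = normalSceneView.scene
    // Share the session: world tracking (and thus anchor positions)
    // continues instead of being reset by a new run(_:) call.
    headsetSceneView.session = normalSceneView.session
}
```

The key point is not to call `run(_:)` with a fresh configuration on the new view, since that would reset the world origin and detach the existing anchors.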
I understand the README does not list support for these versions. But I unknowingly updated my computer and device, and now I'm seeing sluggish performance and crashes.
The log just before it crashed:
2018-10-23 16:25:21.280046+0530 ARKit Headset View[652:69347] [Technique] World tracking performance is being affected by resource constraints [1]
I couldn't find any error in the log when it actually crashed.
Thank you for the awesome code!
Running it on an iPhone 11 Pro with iOS 13, though, each side's camera feed view is much narrower than what is shown in the preview image in your Readme.md.
How would one adjust the camera feed's width-to-height scale?
On iOS 12.1.2 with an iPhone 8 Plus, it's choppy and crashes at `let cgimage = context.createCGImage(ciimage, from: ciimage.extent)` in `updateImages()`. I hope the code can be updated to a new, stable version.
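A common cause of crashes at `createCGImage(_:from:)` in per-frame code is memory pressure from allocating a fresh CIContext, or letting temporary buffers accumulate, on every frame. Whether that is what happens inside this repo's `updateImages()` is an assumption; the following is just a defensive sketch with a hypothetical wrapper type:

```swift
import CoreImage

final class FrameImageConverter {
    // Create the CIContext once; constructing one per frame is expensive
    // and can exhaust memory on older devices like the iPhone 8 Plus.
    private let context = CIContext()

    func cgImage(from ciImage: CIImage) -> CGImage? {
        // autoreleasepool drains the frame's temporary buffers immediately
        // instead of letting them pile up until the app is killed.
        return autoreleasepool {
            context.createCGImage(ciImage, from: ciImage.extent)
        }
    }
}
```

If the context is already cached in the project, the remaining suspect would be the per-frame CGImage allocations themselves, which the autoreleasepool above helps reclaim promptly.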