
ARKit Headset View

Make your ARKit experiment compatible with mobile VR/AR headsets*. An example setup using SceneKit. Useful for playing with head-worn Mixed Reality, Augmented Reality, Virtual Reality with head tracking, or a mix of everything.

This is the newer, more experimental (code-wise), and heavier sibling of "iOS Stereoscopic ARKit Template".

stereoscopic image of spaceship in augmented reality

Language: Swift

Content Technology: SceneKit

Written in: Xcode 9 beta 2 (9M137d)

Tested on: iPhone 7+ running iOS 11 beta 2 (15A5404i)

Footnotes

* Mobile Headset needs to have an opening for the iPhone's camera (i.e. a headset that supports Mixed Reality or Augmented Reality). You could also use a Google Cardboard with a hole cut out.

Notes:

  • At the beginning of the code, there's easy access to key variables: interpupillary distance, background color, and the eyes' field of view (which must be changed alongside cameraImageScale); see the sketch after these notes.

  • Metal should ideally be used for optimal performance in the future. (We're currently using SceneKit here.)

  • This code was written as a proof of concept. Production-level applications should ideally use more accurate calculations (e.g. for camera.projectionMatrix and FOV).

  • The framerate displayed by the SceneViews is effectively 3x the actual rate (as we're rendering the same scene 3 times).

  • This is experimental code running on beta software (iOS 11) that is likely to change.
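
For orientation, here's a minimal sketch of what those top-of-file variables might look like (the identifier names and the interpupillary-distance value are illustrative, not necessarily the exact ones in the project):

let interpupillaryDistance: Float = 0.066 // metres between the two eye cameras (illustrative value)
let viewBackgroundColor = UIColor.black   // colour shown behind the scene views
let eyeFOV = 60                           // per-eye field of view, in degrees
let cameraImageScale = 3.478              // must be changed alongside eyeFOV (see Step 4)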

Steps Taken

(P.S. If needed, the commits show more granular steps.)

1. Add a left-eye and right-eye SceneView

Same steps as in "iOS Stereoscopic ARKit Template", except now we have 3 SceneViews (the original, the left eye, and the right eye).
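
Roughly, the setup inside the view controller might look like the sketch below (outlet names match the ones used later in this README; the view types and wiring are an assumption, not the repository's exact code):

// Illustrative outlets — only sceneViewRaw drives the ARSession; the eye views just render.
@IBOutlet var sceneViewRaw: ARSCNView!
@IBOutlet var sceneViewLeft: ARSCNView!
@IBOutlet var sceneViewRight: ARSCNView!

override func viewDidLoad() {
    super.viewDidLoad()
    // All three views render the same SCNScene.
    sceneViewLeft.scene = sceneViewRaw.scene
    sceneViewRight.scene = sceneViewRaw.scene
    sceneViewLeft.isPlaying = true  // keep the eye views rendering every frame
    sceneViewRight.isPlaying = true
}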

2. Adjust PointOfView according to a FieldOfView value

In updateFrame:

let eyeFOV = 60 // (To be Adjusted later on)

let eyeCamera : SCNCamera = SCNCamera()
eyeCamera.zNear = 0.001
eyeCamera.fieldOfView = CGFloat(eyeFOV)

let pointOfView : SCNNode = SCNNode()
pointOfView.transform = (sceneViewRaw.pointOfView?.transform)!
pointOfView.scale = (sceneViewRaw.pointOfView?.scale)!
pointOfView.camera = eyeCamera
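
The new point of view then has to be handed to each eye's view, offset sideways by half the interpupillary distance. A minimal sketch of one way to do this (using the illustrative interpupillaryDistance variable from the notes above; the exact eye-separation maths in the project may differ):

// Clone the tracked point of view for each eye and nudge it along its local x-axis.
let pointOfViewLeft = pointOfView.clone()
let pointOfViewRight = pointOfView.clone()
pointOfViewLeft.localTranslate(by: SCNVector3(-interpupillaryDistance / 2, 0, 0))
pointOfViewRight.localTranslate(by: SCNVector3(interpupillaryDistance / 2, 0, 0))

sceneViewLeft.pointOfView = pointOfViewLeft
sceneViewRight.pointOfView = pointOfViewRight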

3. Re-Render Camera Image

In the Storyboard, create two UIImageViews behind the ARSCNViews. These represent the camera feed of the left and right eye.

Clear the camera image.

In updateFrame:

sceneViewLeft.scene.background.contents = UIColor.clear // This sets a transparent scene bg for all sceneViews - as they're all rendering the same scene.

Re-Render the Camera Image - adjusting for scale and orientation

let cameraImageScale = 3.478 // (To be Adjusted later on)

// Read Camera-Image
// Read Camera-Image
guard let pixelBuffer = sceneViewRaw.session.currentFrame?.capturedImage else { return }
let ciimage = CIImage(cvPixelBuffer: pixelBuffer)

// Convert ciimage to cgimage, so uiimage can affect its orientation
let context = CIContext(options: nil)
let cgimage = context.createCGImage(ciimage, from: ciimage.extent)

// Determine Camera-Image Orientation
let imageOrientation : UIImageOrientation = (UIApplication.shared.statusBarOrientation == UIInterfaceOrientation.landscapeLeft) ? UIImageOrientation.down : UIImageOrientation.up

// Display Camera-Image
let uiimage = UIImage(cgImage: cgimage!, scale: CGFloat(cameraImageScale), orientation: imageOrientation)
self.imageViewLeft.image = uiimage
self.imageViewRight.image = uiimage
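
Everything above lives in an updateFrame-style method. One way to call it once per frame is from the raw scene view's renderer callback, hopping back onto the main thread for the UIImageView work (a sketch, assuming the view controller is named ViewController and is set as sceneViewRaw's delegate):

extension ViewController: ARSCNViewDelegate {
    // Called by SceneKit once per rendered frame.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        DispatchQueue.main.async {
            self.updateFrame() // UIKit updates (the image views) must run on the main thread
        }
    }
}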

4. Determine an appropriate cameraImageScale for different eye fieldOfView values

This was done with some maths, but mainly lots of guessing. Here are some values I've estimated based on the iPhone 7+'s size.

//    let eyeFOV = 38.5; let cameraImageScale = 1.739; // (FOV: 38.5 ± 2.0) Brute-force estimate based on iPhone 7+
//    let eyeFOV = 60; let cameraImageScale = 3.478; // Calculation based on iPhone 7+
//    let eyeFOV = 90; let cameraImageScale = 6; // (Scale: 6 ± 1.0) Very rough guesstimate
//    let eyeFOV = 120; let cameraImageScale = 8.756; // Rough guesstimate
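
If you want to try different pairs without editing two lines each time, a purely illustrative option is to keep them together as presets:

// Illustrative helper: keep each estimated eyeFOV with its matching cameraImageScale,
// so the two values can only change together. Values are the estimates above.
struct EyePreset {
    let eyeFOV: Double
    let cameraImageScale: Double
}

let presets = [
    EyePreset(eyeFOV: 38.5, cameraImageScale: 1.739), // brute-force estimate, iPhone 7+
    EyePreset(eyeFOV: 60,   cameraImageScale: 3.478), // calculation based on iPhone 7+
    EyePreset(eyeFOV: 90,   cameraImageScale: 6),     // very rough guesstimate
    EyePreset(eyeFOV: 120,  cameraImageScale: 8.756), // rough guesstimate
]

let chosen = presets[1] // e.g. the 60° preset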

Done!

Feel free to use this however you like (MIT License). Have fun! 😁

stereoscopic image of spaceship in augmented reality, 90 degree FOV
