Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device’s camera in a way that makes those elements appear to inhabit the real world.
I am going to show how neat it is to write an AR iOS application with ARKit, a framework that provides high-level classes for tracking, scene understanding, and rendering. More specifically, ARKit is a session-based framework: everything happens within a concrete session, which relates the virtual objects to the real world by means of tracking.
This app runs an ARKit world-tracking session with content displayed in a SpriteKit 2D view. Every session has a scene that renders the virtual objects in the real world, which is accessed through the iOS device's sensors.
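A minimal sketch of such a session, assuming a view controller that owns an `ARSKView` outlet named `sceneView` (names are illustrative, not the exact project code):

```swift
import ARKit
import SpriteKit

class MinimalARViewController: UIViewController {
    @IBOutlet var sceneView: ARSKView!  // hypothetical outlet wired up in the storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking relates virtual content to real-world positions.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session while the view is not visible to save power.
        sceneView.session.pause()
    }
}
```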
This app is based on the ARBalloons sample code.
- This app has been successfully tested on Xcode 9.4 + iOS 11.4.
- When running it on your device, simply tap where you want to see Post Malone. Then tap again to pop the balloon! 😎
An information property list file is an XML file that contains essential configuration information for a bundled executable. Examples of the information you may want to add include:
- The name of your app (`<string>PostMaloneBalloon</string>`).
- Camera usage (`<key>NSCameraUsageDescription</key>`).
- Frameworks you need (`<key>UIRequiredDeviceCapabilities</key>` with `<string>armv7</string>` and `<string>arkit</string>`).
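In the Info.plist, those entries might look like the fragment below (the key names follow Apple's documented plist keys; the usage description string is illustrative):

```xml
<key>CFBundleName</key>
<string>PostMaloneBalloon</string>

<key>NSCameraUsageDescription</key>
<string>The camera is used for the augmented reality experience.</string>

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
    <string>arkit</string>
</array>
```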
This is where you place assets such as the images used in your App (the Post Malone head) and icons. A `Contents.json` file is placed inside every directory to describe the assets.
Contains two storyboard files, borrowed from this example:
- `LaunchScreen.storyboard`
- `Main.storyboard`
Anchors are 3D points that correspond to real-world features that ARKit detects. Anchors are created in this class, together with the SpriteKit scene (Scene.sks). The class `Scene` controls how the App operates within the scenes. Rendering brings tracking and scene understanding together with your content.
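Creating an anchor a short distance in front of the camera, as the ARBalloons sample this app is based on does, can be sketched as follows (a sketch under those assumptions, not the exact project code; the 0.2 m offset is illustrative):

```swift
import ARKit

/// Adds an ARAnchor roughly 20 cm in front of the current camera position.
func addAnchor(to view: ARSKView) {
    guard let currentFrame = view.session.currentFrame else { return }

    // Start from the identity transform and offset along the camera's -z axis,
    // which points out of the screen toward the scene.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.2

    // Combine the camera transform with the offset to get a world-space pose.
    let transform = simd_mul(currentFrame.camera.transform, translation)

    // The anchor ties this real-world position to the session; the ARSKView
    // delegate is then asked for a SpriteKit node to display there.
    view.session.add(anchor: ARAnchor(transform: transform))
}
```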
For our App, we are:
- Defining the method `touchesBegan`, where we define what happens when we tap the scene.
- The sequence of movements is defined by `let sequence = SKAction.sequence([popSound, moveDown, moveDownFloating, moveToBottom])`.
- When you touch the scene, a Post Malone balloon head appears and starts to behave like a balloon (`moveDownFloating = ((arc4random() % 2) == 0) ? moveLeftDown : moveRightDown`).
- The balloon either pops (`let popSound = SKAction.playSoundFileNamed("pop", waitForCompletion: false)`) or fades out after a second (`fadeOut = SKAction.fadeOut(withDuration: 1.0)`).
- An `ARAnchor` uses a 4×4 matrix that represents the combined position, rotation (orientation), and scale of an object in three-dimensional space (as in `var translation = matrix_identity_float4x4`).
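Putting those pieces together, the touch handling can be sketched as below. The action names follow the list above; the distances, durations, and node lookup are assumptions, not the exact project code:

```swift
import SpriteKit

class BalloonScene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Balloon-like drift: pick a left or right path at random.
        let moveLeftDown = SKAction.moveBy(x: -40, y: -60, duration: 1.5)
        let moveRightDown = SKAction.moveBy(x: 40, y: -60, duration: 1.5)
        let moveDownFloating = ((arc4random() % 2) == 0) ? moveLeftDown : moveRightDown

        // Pop sound plays immediately; the moves then run one after another.
        let popSound = SKAction.playSoundFileNamed("pop", waitForCompletion: false)
        let moveDown = SKAction.moveBy(x: 0, y: -30, duration: 1.0)
        let moveToBottom = SKAction.moveBy(x: 0, y: -200, duration: 2.0)
        let sequence = SKAction.sequence([popSound, moveDown, moveDownFloating, moveToBottom])

        // Run the sequence on whatever node sits at the touch location.
        if let touch = touches.first {
            let location = touch.location(in: self)
            nodes(at: location).first?.run(sequence)
        }
    }
}
```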
This view is managed by the class ViewController, which conforms to `ARSKViewDelegate` so that we can create a `sceneView` variable. This class has methods for:
- Views:
  - Scaling and placing the view.
  - Loading the view (and loading the pre-defined scene from `SKScene`).
  - Making the view appear and disappear.
  - Running the session.
- Sessions:
  - Handling session interruptions.
  - Handling session end.
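Besides the lifecycle methods, the delegate supplies a SpriteKit node for each new anchor. A sketch of that method, assuming the balloon image asset is named "postMalone" (the asset name is an assumption):

```swift
import ARKit
import SpriteKit

extension ViewController: ARSKViewDelegate {
    // Called once per newly added ARAnchor; the returned node is rendered
    // at the anchor's real-world position in the ARSKView.
    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        let balloon = SKSpriteNode(imageNamed: "postMalone")  // assumed asset name
        balloon.name = "balloon"
        return balloon
    }
}
```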
This is where the class `AppDelegate` is defined; it serves as the app's entry point via `UIApplicationMain`. In this class, we create a variable that works as the UI window, and we have UI methods for:
- Knowing when the application is about to move from the active to the inactive state (for example, to pause ongoing tasks).
- Releasing shared resources and saving user data.
- Moving from the background to the active state.
- Restarting any tasks that were paused while the application was inactive.
- Handling termination, for when the application is about to terminate (for example, to save data if appropriate).
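These hooks correspond to the standard `UIApplicationDelegate` lifecycle methods; a minimal skeleton might look like this (a sketch, not the exact project code):

```swift
import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?  // the window that hosts the UI

    func applicationWillResignActive(_ application: UIApplication) {
        // About to move from active to inactive: pause ongoing tasks here.
    }

    func applicationDidEnterBackground(_ application: UIApplication) {
        // Release shared resources and save user data.
    }

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Restart any tasks that were paused while the app was inactive.
    }

    func applicationWillTerminate(_ application: UIApplication) {
        // The app is about to terminate: save data if appropriate.
    }
}
```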
- ARKit Developer Portal (Apple): Apple’s developer portal for ARKit.
- ARCore Developer Portal (Google): Google’s developer portal for ARCore.
- AR Studio (Facebook): Facebook's tools.
- ViroAR for ARKit and ARCore.
- Apple's Human Interface Guidelines.
- The State of Mobile AR: A Platform Overview.
- AR-First Mobile second.
- How is ARCore better than ARKit.
- Why is ARKit better than the alternatives.
- Using ARKit with Metal.
- Augmented Reality With ARKit for iOS.
- ARKit and CoreLocation.
- How to build an interactive AR app in 5 mins w/ React Native & Viro AR.
- Building an iPhone AR Museum App in iOS 11 with Apple’s ARKit Image Recognition.
- Building an AR Game with ARKit.
- Add Snapchat-like AR Lenses to any app w/ Viro AR in React Native.
- Tutorial: Image Recognition - How to use markers to showcase a product in AR.
- Image/Marker Recognition for ARCore (Android).
- ARKit by Example.
- Medium's Augmented Reality.
- r/ARKitCreators.
- r/LearnARdev.
- Building Your First AR Experience.
- Google's AR Experiments.
- The Complete iOS 11 ARKit Developer Course.
- iOS Augmented Reality course on ARKit.
- Augmented Reality Portal using ARKit.
- ARKit Unity & Xcode - Build 7 Augmented Reality apps.
- Mastering ARKit for iOS.
- Learn ARKit 2 for iOS 12 from Scratch.
- Udacity's 'Learn ARKit'.
- Made with ARKit.
- A Software Engineer’s Guide to Unity and AR/VR Development.
- Building AR/VR with Javascript and HTML.
- iOS Learning Material.
- SwiftShot: Creating a Game for Augmented Reality.
- Awesome ARKit.
- Creating an Immersive AR Experience with Audio.
- Using Vision in Real Time with ARKit.
- Viro code samples.
- ARKit Sampler.
- Unity: uses C# as its primary programming language.
- Unreal: uses C++ and a node-based language called Blueprints Visual Scripting.
- React 360.
- WebVR: open standard with a JavaScript API that makes it possible to experience VR in your browser.
- A-frame: framework for building virtual reality experiences with HTML.
- React VR: library developed by Facebook based on React and React Native.
- three.js.
- Design guidelines from Google.
- UX Guide for VR.
- The UX of VR.
- UI / UX design patterns in virtual reality.
- VR Design best practices.
- Udemy Introduction to VR with Unity.
- Udemy Google's VR Course.
- Udemy Google's VR Software Development.
- VR Dev School.
- Coursera's VR Specialization.
- Unity 3D tutorials.
- Cat Like Coding tutorial.
- Google's Unreal Tutorial.
- Google's VR SDK for iOS.
- Google's VR SDK for Android.
- Creating Virtual Reality (VR) Apps.
- VRTalk.
- VR Heads.
- Quora.
- Oculus VR.
- Unity.
- /r/oculus.
- /r/augmentedreality.
- /r/virtualreality.
- /r/Singularitarianism.
- /r/HTC_Vive.
- /r/learnVRdev.
- Field of view: measured in degrees, this is the extent of the observable world that is seen at any given moment (humans have a FOV of around 180°, but most HMDs offer between 50° and 110°).
- Latency: in VR, a latency of 20 milliseconds is considered low and acceptable for a comfortable experience.
- Haptics: recreates the sense of touch by applying forces, vibrations, or motions to the user through feedback devices (for example, vibrating game controllers).
- Stitching: the process of combining multiple video sources with overlapping fields of view to produce a fully immersive 360° video.
- Visual Inertial Odometry: ARKit analyzes the phone's camera and motion data to keep track of the world around it; an `ARSession` object manages the motion tracking and image processing.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.