googlesamples / arcore-depth-lab

ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. (UIST 2020)

Home Page: https://augmentedperception.github.io/depthlab/

License: Apache License 2.0

Languages: C# 77.62%, ShaderLab 18.37%, HLSL 2.53%, GLSL 1.48%
Topics: arcore, arcore-unity, depth, mobile, ar, interaction, depth-api, depthlab

arcore-depth-lab's Introduction

ARCore Depth Lab - Depth API Samples for Unity

Copyright 2020 Google LLC

Depth Lab is a set of ARCore Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. Some of these features have been used in this Depth API overview video.

DepthLab examples

ARCore Depth API is enabled on a subset of ARCore-certified Android devices. iOS devices (iPhone, iPad) are not supported. Find the list of devices with Depth API support (marked with Supports Depth API) here: https://developers.google.com/ar/devices. See the ARCore developer documentation for more information.

Download the pre-built ARCore Depth Lab app on the Google Play Store today.


Branches

ARCore Depth Lab has two branches: master and arcore_unity_sdk.

The master branch contains a subset of Depth Lab features in v1.1.0 and is built upon the recommended AR Foundation 4.2.0 (preview 7) or newer. The master branch supports features including oriented 3D reticles, depth map visualization, collider with depth mesh, avatar locomotion, raw point cloud visualization, and recording and playback.

The arcore_unity_sdk branch contains the full set of Depth Lab features and is built upon ARCore SDK for Unity v1.24.0 or newer. We recommend using the master branch to build new projects with the AR Foundation SDK, and referring to the arcore_unity_sdk branch when necessary.

Getting started

These samples target Unity 2020.3.6f1 and require AR Foundation 4.2.0-pre.7 or newer and ARCore Extensions 1.24 or newer. The ARCore Extensions sources are automatically included via the Unity Package Manager.

This project only builds with the Android build platform. Build the project to an Android device instead of using the Play button in the Unity Editor.

Sample features

The sample scenes demonstrate three different ways to access depth. Features supported in the master branch are labeled with ⭐, while the remaining features can be found in the arcore_unity_sdk branch.

  1. Localized depth: Sample single depth values at certain texture coordinates (CPU).
    • Oriented 3D reticles ⭐
    • Character locomotion on uneven terrain ⭐
    • Collision checking for AR object placement
    • Laser beam reflections
    • Rain and snow particle collision
  2. Surface depth: Create a connected mesh representation of the depth data (CPU/GPU).
    • Point cloud fusion ⭐
    • AR shadow receiver
    • Paint splat
    • Physics simulation
    • Surface retexturing
  3. Dense depth: Process depth data at every screen pixel (GPU).
    • False-color depth map ⭐
    • AR fog
    • Occlusions
    • Depth-of-field blur
    • Environment relighting
    • 3D photo

Building samples

Individual scenes can be built and run by enabling a particular scene (e.g., OrientedReticle to try out the oriented 3D reticle) and the ARFDepthComponents object in the scene. Remember to disable the ARFDepthComponents object in individual scenes when building all demos with the DemoCarousel scene.

We also provide a demo user interface that allows users to seamlessly switch between examples. Please make sure to set the Build Platform to Android and verify that the main DemoCarousel scene is the first enabled scene in the Scenes In Build list under Build Settings. Enable all scenes that are part of the demo user interface.

Assets/ARRealismDemos/DemoCarousel/Scenes/DemoCarousel.unity
Assets/ARRealismDemos/OrientedReticle/Scenes/OrientedReticle.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthEffects.unity
Assets/ARRealismDemos/Collider/Scenes/Collider.unity
Assets/ARRealismDemos/AvatarLocomotion/Scenes/AvatarLocomotion.unity
Assets/ARRealismDemos/PointCloud/Scenes/RawPointClouds.unity

The following scenes can be found in the arcore_unity_sdk branch, but are not yet available with the AR Foundation SDK.

Assets/ARRealismDemos/MaterialWrap/Scenes/MaterialWrap.unity
Assets/ARRealismDemos/Splat/Scenes/OrientedSplat.unity
Assets/ARRealismDemos/LaserBeam/Scenes/LaserBeam.unity
Assets/ARRealismDemos/Relighting/Scenes/PointsRelighting.unity
Assets/ARRealismDemos/DepthEffects/Scenes/FogEffect.unity
Assets/ARRealismDemos/SnowParticles/Scenes/ArCoreSnowParticles.unity
Assets/ARRealismDemos/RainParticles/Scenes/RainParticlesScene.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthOfFieldEffect.unity
Assets/ARRealismDemos/Water/Scenes/Water.unity
Assets/ARRealismDemos/CollisionDetection/Scenes/CollisionAwareObjectPlacement.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/ScreenSpaceDepthMesh.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/StereoPhoto.unity

Sample project structure

The main sample assets are placed inside the Assets/ARRealismDemos folder. Each subfolder contains sample features or helper components.

AvatarLocomotion

The AR character in this scene follows user-set waypoints while staying close to the surface of an uneven terrain. This scene uses raycasting and depth lookups on the CPU to calculate a 3D point on the surface of the terrain.
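
A minimal sketch of that lookup pattern, assuming DepthSource.GetVertexInWorldSpaceFromScreenXY takes integer screen coordinates and returns a world-space Vector3 (the exact signature in the sample may differ):

using UnityEngine;

// Hypothetical waypoint setter: moves an avatar to the depth-derived
// surface point under the user's touch.
public class WaypointFromTouch : MonoBehaviour
{
    public Transform avatar;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
        {
            return;
        }

        Vector2 screenPos = Input.GetTouch(0).position;

        // CPU depth lookup: world-space vertex under the screen point.
        Vector3 terrainPoint = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)screenPos.x, (int)screenPos.y);

        avatar.position = terrainPoint;
    }
}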

Collider

This physics simulation playground uses screen-space depth meshes to enable collisions between Unity's rigid-body objects and the physical environment.

After pressing an on-screen button, a Mesh object is procedurally generated from the latest depth map. This is used to update the sharedMesh parameter of the MeshCollider object. A randomly selected primitive rigid-body object is then thrown into the environment.
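
A condensed sketch of that flow; BuildDepthMesh() is a hypothetical stand-in for the sample's depth-mesh generation:

using UnityEngine;

public class DepthColliderSketch : MonoBehaviour
{
    public MeshCollider environmentCollider;

    // Wired to the on-screen button.
    public void OnThrowPressed()
    {
        // Rebuild the collision mesh from the latest depth map.
        environmentCollider.sharedMesh = BuildDepthMesh();

        // Throw a randomly selected primitive rigid body into the environment.
        PrimitiveType[] types =
        {
            PrimitiveType.Cube, PrimitiveType.Sphere, PrimitiveType.Capsule
        };
        GameObject body = GameObject.CreatePrimitive(types[Random.Range(0, types.Length)]);
        Transform cam = Camera.main.transform;
        body.transform.position = cam.position;
        body.AddComponent<Rigidbody>().AddForce(cam.forward * 5f, ForceMode.Impulse);
    }

    // Hypothetical helper: triangulates the latest depth map into a Mesh.
    private Mesh BuildDepthMesh()
    {
        return new Mesh();
    }
}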

CollisionDetection

This AR object placement scene uses depth lookups on the CPU to test collisions between the vertices of virtual objects and the physical environment.

Common

This folder contains scripts and prefabs that are shared between the feature samples. For more details, see the Helper Classes section below.

DemoCarousel

This folder contains the main scene, which provides a carousel user interface. This scene allows the user to seamlessly switch between different features. A scene can be selected by directly touching a preview thumbnail or dragging the carousel UI to the desired position.

DepthEffects

This folder contains three dense depth shader processing examples.

The DepthEffects scene contains a fragment-shader effect that can transition from the AR camera view to a false-color depth map. Warm colors indicate closer regions in the depth map; cold colors indicate regions farther away.

The DepthOfFieldEffect scene contains a simulated Bokeh fragment-shader effect. This blurs the regions of the AR view that are not at the user-defined focus distance. The focus anchor is set in the physical environment by touching the screen. The focus anchor is a 3D point that is locked to the environment and always in focus.

The FogEffect scene contains a fragment-shader effect that adds a virtual fog layer on the physical environment. Close objects will be more visible than objects further away. A slider controls the density of the fog.

LaserBeam

This laser reflection scene allows the user to shoot a slowly moving laser beam by touching anywhere on the screen.

This uses:

  • The DepthSource.GetVertexInWorldSpaceFromScreenXY(..) function to look up a raycasted 3D point.
  • The ComputeNormalMapFromDepthWeightedMeanGradient(..) function to look up the surface normal for a given 2D screen position (see the sketch below).
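
A sketch of how the two lookups might combine for a single bounce; both signatures are assumptions, and the placeholder normal function only stands in for the sample's implementation:

using UnityEngine;

public class LaserBounceSketch : MonoBehaviour
{
    // Reflects an incoming laser direction off the surface under screenPos.
    Vector3 Bounce(Vector2 screenPos, Vector3 incomingDir, out Vector3 hitPoint)
    {
        // Raycasted 3D point looked up from the depth map (CPU).
        hitPoint = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)screenPos.x, (int)screenPos.y);

        // Surface normal from the depth-weighted mean gradient
        // (assumed to take a screen position and return a normal).
        Vector3 normal = ComputeNormalMapFromDepthWeightedMeanGradient(screenPos);

        // Mirror the beam about the surface normal.
        return Vector3.Reflect(incomingDir, normal);
    }

    // Placeholder so the sketch is self-contained; the real function
    // estimates the normal from local depth gradients.
    Vector3 ComputeNormalMapFromDepthWeightedMeanGradient(Vector2 screenPos)
    {
        return Vector3.up;
    }
}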

MaterialWrap

This experience allows the user to change the material of real-world surfaces through touch. This uses depth meshes.

OrientedReticle

This sample uses depth hit testing to obtain the 3D position and surface normal of a raycasted screen point.
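
The essence of the placement step, assuming the hit position and surface normal come from the depth hit test described above:

using UnityEngine;

public class OrientedReticleSketch : MonoBehaviour
{
    // Places the reticle at the hit point with its up axis
    // aligned to the estimated surface normal.
    public void Place(Vector3 hitPosition, Vector3 surfaceNormal)
    {
        transform.position = hitPosition;
        transform.rotation = Quaternion.FromToRotation(Vector3.up, surfaceNormal);
    }
}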

PointCloud

This sample computes a point cloud on the CPU using the depth array. Press the Update button to compute a point cloud based on the latest depth data.
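
A sketch of the CPU unprojection behind such a computation, assuming a row-major depth array in millimeters and pinhole intrinsics fx, fy, cx, cy (the sample's actual accessors and conventions may differ):

using System.Collections.Generic;
using UnityEngine;

public static class PointCloudSketch
{
    // Back-projects each valid depth pixel into a camera-space 3D point.
    public static List<Vector3> Unproject(
        short[] depthMm, int width, int height,
        float fx, float fy, float cx, float cy)
    {
        var points = new List<Vector3>();
        for (int v = 0; v < height; ++v)
        {
            for (int u = 0; u < width; ++u)
            {
                float z = depthMm[v * width + u] * 0.001f;  // mm -> m
                if (z <= 0f)
                {
                    continue;  // zero means no depth estimate at this pixel
                }

                // Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
                points.Add(new Vector3((u - cx) * z / fx, (v - cy) * z / fy, z));
            }
        }

        return points;
    }
}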

RawPointClouds

This sample fuses point clouds with the raw depth maps on the CPU using the depth array. Drag the confidence slider to change the visibility of each point based on the confidence value of the corresponding raw depth.

RainParticles

This sample uses the GPU depth texture to compute collisions between rain particles and the physical environment.

Relighting

This sample uses the GPU depth texture to computationally re-light the physical environment through the AR camera. Areas of the physical environment close to the artificial light sources are lit, while areas farther away are darkened.

ScreenSpaceDepthMesh

This sample uses depth meshes. A template mesh containing a regular grid of triangles is created once on the CPU. The GPU shader displaces each vertex of the regular grid based on the reprojection of the depth values provided by the GPU depth texture. Press Freeze to take a snapshot of the mesh and press Unfreeze to revert to the live updating mesh.
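
A minimal sketch of the one-time template grid construction on the CPU; the per-vertex displacement happens later in the vertex shader and is omitted here:

using UnityEngine;

public static class GridMeshSketch
{
    // Builds a regular (width x height) vertex grid; the UVs are the
    // normalized coordinates used to sample the depth map per vertex.
    public static Mesh Build(int width, int height)
    {
        var vertices = new Vector3[width * height];
        var uvs = new Vector2[width * height];
        for (int y = 0; y < height; ++y)
        {
            for (int x = 0; x < width; ++x)
            {
                int i = y * width + x;
                vertices[i] = new Vector3(x, y, 0f);  // displaced on the GPU later
                uvs[i] = new Vector2(x / (width - 1f), y / (height - 1f));
            }
        }

        // Two triangles per grid cell.
        var triangles = new int[(width - 1) * (height - 1) * 6];
        int t = 0;
        for (int y = 0; y < height - 1; ++y)
        {
            for (int x = 0; x < width - 1; ++x)
            {
                int i = y * width + x;
                triangles[t++] = i;
                triangles[t++] = i + width;
                triangles[t++] = i + 1;
                triangles[t++] = i + 1;
                triangles[t++] = i + width;
                triangles[t++] = i + width + 1;
            }
        }

        var mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        mesh.vertices = vertices;
        mesh.uv = uvs;
        mesh.triangles = triangles;
        mesh.RecalculateBounds();
        return mesh;
    }
}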

StereoPhoto

This sample uses depth meshes and ScreenSpaceDepthMesh. After freezing the mesh, we cache the current camera's projection and view matrices, orbit the camera along a circle, and perform projection mapping onto the depth mesh with the cached camera image. Press Capture to create the animated 3D photo and press Preview to go back to camera preview mode.

SnowParticles

This sample uses the GPU depth texture to compute collisions between snow particles and the physical environment, and to orient each snowflake.

Splat

This sample uses the oriented reticle and the depth mesh to place a surface-aligned texture decal in the physical environment.

Water

This sample uses a modified GPU occlusion shader to create a flooding effect with artificial water in the physical environment.

Helper classes

DepthSource

A singleton instance of this class contains references to the CPU array and GPU texture of the depth map, the camera intrinsics, and many other depth lookup and coordinate transformation utilities. This class acts as a high-level wrapper for the MotionStereoDepthDataSource class.
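
A sketch of typical consumption. DepthSource.Initialized and DepthSource.DepthTexture appear in the sample code quoted in the issues below; everything else here is an assumption:

using UnityEngine;

public class DepthConsumerSketch : MonoBehaviour
{
    public Material material;

    void Update()
    {
        // Wait until the singleton has received its first depth frame.
        if (!DepthSource.Initialized)
        {
            return;
        }

        // Bind the latest GPU depth map to a custom material.
        material.SetTexture("_CurrentDepthTexture", DepthSource.DepthTexture);
    }
}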

DepthTarget

Each GameObject containing a DepthTarget becomes a subscriber to the GPU depth data. DepthSource will automatically update the depth data for each DepthTarget. At least one instance of DepthTarget has to be present in the scene in order for DepthSource to provide depth data.
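
The subscription pattern therefore reduces to attaching the component; a sketch:

using UnityEngine;

public class DepthSubscriberSketch : MonoBehaviour
{
    void Start()
    {
        // Register this GameObject as a depth subscriber; DepthSource
        // then keeps the attached material's depth texture up to date.
        if (GetComponent<DepthTarget>() == null)
        {
            gameObject.AddComponent<DepthTarget>();
        }
    }
}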

MotionStereoDepthDataSource

This class contains low-level operations and provides direct access to the depth data. It should only be used by advanced developers.

User privacy requirements

You must prominently disclose the use of Google Play Services for AR (ARCore) and how it collects and processes data in your application. This information must be easily accessible to end users. You can do this by adding the following text on your main menu or notice screen: "This application runs on Google Play Services for AR (ARCore), which is provided by Google LLC and governed by the Google Privacy Policy".

Related publication

Please refer to https://augmentedperception.github.io/depthlab/ for our paper, supplementary material, and presentation published in ACM UIST 2020: "DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality".

References

If you use ARCore Depth Lab in your research, please reference it as:

@inproceedings{Du2020DepthLab,
  title = {{DepthLab: Real-time 3D Interaction with Depth Maps for Mobile Augmented Reality}},
  author = {Du, Ruofei and Turner, Eric and Dzitsiuk, Maksym and Prasso, Luca and Duarte, Ivo and Dourgarian, Jason and Afonso, Joao and Pascoal, Jose and Gladstone, Josh and Cruces, Nuno and Izadi, Shahram and Kowdle, Adarsh and Tsotsos, Konstantine and Kim, David},
  booktitle = {Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology},
  year = {2020},
  publisher = {ACM},
  pages = {829--843},
  series = {UIST '20},
  doi = {10.1145/3379337.3415881}
}

or

Ruofei Du, Eric Turner, Maksym Dzitsiuk, Luca Prasso, Ivo Duarte, Jason Dourgarian, Joao Afonso, Jose Pascoal, Josh Gladstone, Nuno Cruces, Shahram Izadi, Adarsh Kowdle, Konstantine Tsotsos, and David Kim. 2020. DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST '20), 829-843. DOI: http://dx.doi.org/10.1145/3379337.3415881.

We would like to also thank Levana Chen, Xinyun Huang, and Ted Bisson for integrating DepthLab with AR Foundation.

Additional information

You may use this software under the Apache 2.0 License.

arcore-depth-lab's People

Contributors

baobao, cs-util, kidavid, ruofeidu


arcore-depth-lab's Issues

Continuous environment mesh building

Hello, first of all - thank you for such a great project!

I have a question about continuous environment mesh building. Could you point me to a way to implement an environment "scanner"? I want to create an approximate model of the surrounding environment. I imagine that every new frame would fill in and deform the already "scanned" mesh and make it more accurate (in the future, with environment texture mapping). I think the simplest way is to merge each newly scanned mesh into the already scanned one.

Maybe it could be a good and very interesting objective for a new sample!

Doesn't compile in Unity 2019.3; several scripts are missing

This is the output after importing the ARCore SDK in Unity with the realism examples. The project won't compile past this, and I checked to make sure all the scripts were imported. (None of the scripts below are included in this repo.)

Assets/ARRealismDemos/Common/Scripts/AttachDepthTexture.cs(64,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/DepthTextureController.cs(90,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/DepthVisualizationEffect.cs(130,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/AttachDepthTextureToMaterial.cs(83,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/NoticeHelper.cs(101,22): error CS0117: 'Session' does not contain a definition for 'IsDepthModeSupported'
Assets/ARRealismDemos/Common/Scripts/NoticeHelper.cs(101,43): error CS0103: The name 'DepthMode' does not exist in the current context
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scripts/DepthPulseEffect.cs(213,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/CollisionDetection/Scripts/FreeSpaceRenderer.cs(157,35): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'

Unreal Engine

What about Unreal Engine?
It has been a long time without Unreal Engine support, and many features are missing.

Is it deprecated for UE4? Or is a release for UE5 with Depth API support planned?

How to access the dense depth texture

Hello

I read in this publication that "a dense depth texture created by arcore sdk in the gpu has every pixel in the color camera image with a depth value mapped to it" (page 5).

How can I get the dense depth texture and its depth values? This would help me get the depth value for each pixel in the high-resolution camera background RGB texture.

I am guessing that pairing a depth value with each pixel in the camera background RGB texture can be hard because the depth texture has a very low resolution (160x120) versus the camera background RGB texture resolution (2560x1440 on a Pixel 2 XL), i.e., not 1:1 when comparing pixels.

Thank you for your help, Sergio

Depth in WebCam on Web Browser

Great work with the project!

We would like to apply real time 3D AR interactions with the video stream from a WebCam in a browser.

Is this package available in JavaScript? It would greatly improve accessibility.

I was unable to find the 3D portrait feature, which only works on humans. We are looking for the entire 3D surface so that we can create our robotic interface.

Many thanks!

App opens on Pixel 2 XL but does not run AR - Google Play Services for AR Required

Hello all

I'm on 2018.4.19f1 and a Pixel 2 XL with Android 11, and the build installed on the smartphone does not run the AR experience.

I have Google Play Services for AR installed and updated.

I can see the sprites to choose the AR experience, but the rest of the screen is black. If I build & run a single scene, when the app opens I get "this application requires the latest version of google play services for AR"; however, I have Play Services for AR installed and updated.

If I build all scenes with the carousel, I get the same message that I require Play Services for AR, and after pressing OK it shows me the Play Services for AR app to install, but there is nothing to do because it's already installed. So next I go back to the app I built, which is still open, and it says "depth api is not supported on this device. Please make sure your device is compatible".

The Pixel 2 XL is indeed an AR-enabled device.

Furthermore, I can confirm I can run AR on this phone with Unity's AR Foundation and with the ARCore Depth Lab example from the Play Store.

I did a factory reset on the phone, but I still get this behaviour.

I thought it might be a problem with the arcore-unity-sdk version, but when trying to upgrade from arcore-unity-sdk-1.18 to 1.21 I get Gradle errors and can't build.

Is anyone else experiencing this? Any guidance is really appreciated.

Depth-lab in WebXR

Can we use arcore-depth-lab with WebXR? I want to try a 3D cursor that uses arcore-depth-lab for accurate placement and orientation of the reticle.

MaterialWrap and ScreenSpaceDepthMesh issue on Samsung S20+

Hi there,

I have a small issue regarding the MaterialWrap and ScreenSpaceDepthMesh scenes of the Depth Lab.
Instead of getting a sculpted mesh that corresponds to the colored depth map of the DepthEffects scene, I get a flat, oval-shaped mesh. This happens with the MaterialWrap scene, and also with the ScreenSpaceDepthMesh scene when I press 'freeze'.


The issue happens on both the Play Store version of the app and the version I built in Unity. I have a Samsung Galaxy S20+ with a ToF sensor that seems to work in other scenes.

Is there a fix for this issue?

Thanks, Nick

ScreenSpaceDepthMesh example broken?

Hi,
I am trying the ScreenSpaceDepthMesh example, but nothing happens. I loaded the ScreenSpaceDepthMesh and StereoPhoto scenes.

Is this expected? What am I missing?

Material doesn't have a texture property '_CurrentDepthTexture'

Hi,
I'm running the Depth Lab samples on Unity 2019.4.9, ARCore SDK 1.18, on a Pixel 2. It seems SetDepthTexture(DepthTarget) assumes the target material has a _CurrentDepthTexture property, which is not always the case. In the collider sample, none of the depth target occlusion materials have a _CurrentDepthTexture property. This results in huge numbers of log errors on the device monitor.

09-09 09:17:47.552: E/Unity(8720): Material doesn't have a texture property '_CurrentDepthTexture'
09-09 09:17:47.552: E/Unity(8720): DepthSource:SetDepthTexture(DepthTarget)
09-09 09:17:47.552: E/Unity(8720): DepthSource:Update()
09-09 09:17:47.552: E/Unity(8720): [./Runtime/Shaders/Material.cpp line 1452]
09-09 09:17:47.552: E/Unity(8720): (Filename: ./Runtime/Shaders/Material.cpp Line: 1452)

Offending code in DepthSource.cs:

private static readonly string k_CurrentDepthTexturePropertyName = "_CurrentDepthTexture";

// ...

private static void SetDepthTexture(DepthTarget target)
{
    Texture2D depthTexture = DepthTexture;

    if (target.SetAsMainTexture)
    {
        if (target.DepthTargetMaterial.mainTexture != depthTexture)
        {
            target.DepthTargetMaterial.mainTexture = depthTexture;
        }
    }
    else if (target.DepthTargetMaterial.GetTexture(k_CurrentDepthTexturePropertyName) !=
        depthTexture)
    {
        target.DepthTargetMaterial.SetTexture(k_CurrentDepthTexturePropertyName,
            depthTexture);
    }
}

Collider demo is not working properly

Hi there,
When I build the collider scene to test the physics simulation, the app does not respond when clicking the throw button. I tried tweaking the project settings, but that didn't yield any result. The project is set up on Unity 2019.4 LTS, and I ran the app on a Pixel 2 XL. Can anybody help me get this working?
Thanks in advance,
Vishal

Is the depth estimation algorithm open source?

Thanks for the excellent open-source work.
Is the algorithm used for depth estimation in this project open source? If so, which file contains the relevant code?

No Shadows in Collision Scene

Nice work. I'm trying to get shadows from the thrown objects in the Collision scene, but they seem to be missing. They are there in the non-AR Foundation version. It looks like the shadow-receiving mesh is obscured by the AR Foundation occlusion.

Camera Choppiness/Jittering

When running the Depth API there is a very noticeable choppiness/jittering - https://youtu.be/1KoE8fglirM
When running the Depth Lab App from Play Store everything is very smooth - https://youtu.be/iRogGT21ke8
I tested this on a S9+ and S20 Ultra and had same issue.
I believe the camera is, for some reason, running at a very low FPS.
I checked the FPS settings and camera settings within the Depth API and everything appeared correct.
Not sure what is causing this.

Device compatibility

Are all ARCore-compatible devices compatible with this API? I have a Samsung Galaxy S8 and S9.

relighting implementation ignores camera intrinsic parameters

Hi, I observe that your implementation ignores the camera intrinsic parameters at:

float3 result = SampleColor(uv);
float depth = SampleDepth(uv);
if (_RenderMode == kRenderCameraImage) {
    return result;
}
if (_RenderMode == kRenderDepthMap) {
    return TurboColormap(depth);
}
result = lerp(result, result * 0.5, _GlobalDarkness);

// Common inputs:
float2 aspectRatio = CalculateAspectRatio(uResolution);
// aspectRatio = _AspectRatio;
float2 normalizedUv = NormalizeCoord(uv, aspectRatio);
float3 samplePos = float3(normalizedUv, depth);

How much are the relighting results affected by this calculation?
Since the pinhole camera is an approximation of real-world imaging and the vertical and horizontal focal lengths are mostly equal:

Given:
P = vec3(uv, depth)
P = K * W    // K is a 3x4 matrix
so
W = K^-1 * P // W is [x, y, z, 1]

calculating W to perform raymarching may not be necessary at all.

Is it possible to play recorded ARSessions in the Unity Editor or on an emulator?

Hello, first of all really great work on the arcore-depth-lab!

I was wondering whether it is possible to play recorded ARSessions with the ARPlaybackManager in the Unity Editor or on an Android emulator. This would allow people to iterate faster when developing AR applications using AR Foundation.
When trying to test our application in an Android Studio emulator, the camera didn't show anything in our application, despite being set to a virtual scene. It did work in the camera app.
In case this feature doesn't work in either the emulator or the Editor, is it a planned feature?
Thank you in advance!

Rodion

Continuous environment mesh building

Hello @ruofeidu, thank you for all the effort you have put into making this repo and the research behind it. I am doing something similar to #31, and I am looking at KinectFusion and some SLAM alternatives.

One question though: how could I reuse the primitives that you have for obtaining the depth information and the point cloud? I have found resources online saying that I have to write my own shaders, etc., and was wondering if there is an API I could use. (I have not tried the ARCore Raw Depth API; could you maybe explain what is and isn't possible with it? What did you have to build yourselves that I could maybe reuse from your work?)

Depth API not supported: OnePlus 11

I'm using a OnePlus 11 and I installed this app from the Google Play Store. Upon opening the app, I get the warning: "Depth API is not supported on this device. Please make sure your device is compatible."
However, OnePlus 11 is listed as supporting the Depth API: https://developers.google.com/ar/devices
What could be causing this issue?

Linking ToF depth data with the main camera's image

For turntable photogrammetry, I'm wondering if there's an easy way to save the depth map from my LG V60's ToF sensor along with the RAW picture from my main camera. There's a way to sort of do it with an annoying workaround: taking the photo in portrait mode and then extracting the depth map from the metadata. This, however, is very time consuming, and you also lose all manual control of the camera settings. I'm taking the dataset back to Metashape on my PC, so all the hardware-intensive computing will be done on that machine. I can't believe that the simple task of tagging depth information alongside a photo is, so far, impossible to do. How did that dumb fruit company implement ToF/LiDAR hardware so much better than Android? It's not even close. Embarrassing.

Can the sample app run on Apple Silicon Macs?

Hello, I ran the sample app with 2020.3.29f1 / 2021.3.0f1 (Apple Silicon Mac)
and got the following errors:

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(24,7): error CS0246: The type or namespace name 'Google' could not be found (are you missing a using directive or an assembly reference?)

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(38,12): error CS0246: The type or namespace name 'ARCoreExtensions' could not be found (are you missing a using directive or an assembly reference?)

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(40,13): error CS0246: The type or namespace name 'ARRecordingManager' could not be found (are you missing a using directive or an assembly reference?)

What might be the problem?

Point Cloud Example Crash when using Raw Depth

The point cloud example built on an Android device crashes when pressing
the Update button, which calls ComputePointCloud() in PointCloudGenerator.cs.

This example works fine without checking "Use Raw Depth" in the Inspector.

Branch: arcore_unity_sdk
Unity version: 2020.3.9f1
Device: Pixel 4a

Implementation of ARCore Depth API functions in AR Foundation 4.x.x?

I want to implement some depth functions of the ARCore Depth API in an already existing AR Foundation project.
Is it possible to copy the scripts and adapt the Gradle settings?

I want to improve the functionality to toss a die on the ground with occlusion: you should be able to cover virtual objects with your hand or with a person.

Problem with updated packages

I faced difficulties assembling this project as per the provided instructions. The suggested versions of Unity (2020.3.6f1), AR Foundation (4.2.0-pre.7), and ARCore Extensions (1.24) didn't work for me. I had to use the latest available versions to make it functional: Unity (2023.1.0b20), AR Foundation (5.0.6), and ARCore Extensions (1.38.0).

Unfortunately, I encountered some issues with this updated configuration. In comparison to the Google Play version of the project, objects tend to partly fall through surfaces and behave worse in my build. Overall, the project's performance is considerably worse than the Google Play version on the same device.

Given the nature of library updates, I understand that such discrepancies are expected. However, being new to this technology, I'm curious to know which specific aspect is causing the most difficulties.

I haven't made any changes to the code. The issues arose solely from updating the libraries, which was necessary for the project to function.

If necessary, I can provide full information about the build failure that occurs with the versions required in the README.

Create a simple working demo

Hi everybody,
I'm having some problems creating a simple working demo from scratch.
My goal is to put a GameObject in the scene with a custom material and use the depth map to occlude part of the cube when it's not visible. What are the steps to achieve this?

  • Add prefabs for the camera (ARCore Device) and for illumination (Environmental Light).

  • Enable Depth Mode in the session config attached to the ARCore Device.

  • Add a DepthSource component to the ARCore Device.

  • Add some cubes with the DepthTarget component attached.

When I run it, no depth info is used and the cubes appear the old way.
What is my mistake?

How does Depth Lab provide 0-65m depth values?

I noticed that Depth Lab uses AR Foundation v4 but provides depth estimation for 0-65 m, although AR Foundation v4 does not support this.
Currently, I am working with AR Foundation v5 since I need the whole 0-65 m depth estimation, but I also need to work with the ARCore Extensions, which are only available for AR Foundation v4.
When I try to use AR Foundation v4, I cannot access the 0-65 m depth estimation but only the 0-8 m range, since everything above 8 m is set to 0. However, when using the Depth Lab app and activating the depth map, Depth Lab seems to detect depth even beyond 8 m.
I looked into https://github.com/googlesamples/arcore-depth-lab/blob/22cd7f1ce4eb2ed73bda19a0ea1bf3e636831799/Assets/ARRealismDemos/Common/Scripts/DepthSource.cs, but I don't understand why Depth Lab is able to provide the 65 m depth values.

It would be great if someone could help me with that. Maybe someone can explain where values greater than 8 m are set to 0, how Depth Lab bypasses this, and how I can do the same.

I really appreciate any help you can provide.

Depth API iOS Platform support

I'm using an iPhone 7 running iOS 15.8 with a single camera. My question is whether the ARCore Depth API supports iOS devices. If it does, I can use it in my app for sensing depth data.

Device should be compatible

Hi,

I set up the Depth Lab project as described on my Samsung S8 (which should be compatible). In fact, I ran a sample that used the ARCore depth feature through AR Foundation and it worked fine. However, in the Depth Lab project the app compiles but states that my device is incompatible, and it does not receive a camera image (the background is black).
I ran this on Unity 2019.4.4f1 with the XR Legacy Input Helpers and Multiplayer HLAPI packages. I did not change anything other than adding those packages and switching to Android as a platform.
Any ideas on what is going wrong?

Thanks for your hard work,
ARPatrick

[Question] Collision Detection object placement

First of all, this is a very impressive collection of different use cases for the Depth API. In particular, the localized depth examples are very interesting.
I'm developing with ARCore and still with "Sceneform", and therefore I'm using the traditional way to place anchors via a hit test and processing the hit results. That works great, but there are areas where it is not possible to retrieve a hit result, or it is just a pain until something is returned. Exactly for that reason I adopted some ideas from the oriented 3D reticles and the collision checking for AR object placement. So far it is working, but I'm not very satisfied with the anchoring. I'm using the session to create an anchor based on a ray, which is the result of my depth computation. The problem is that the anchoring is not very stable and tends to move with the camera. In comparison, the anchoring in the collision checking for AR object placement example feels more solid to me. I'm not a Unity person, and I have some difficulty understanding all the pieces of the code.

  • How is the anchoring of the object achieved?
  • Does someone have a workaround to get more stable anchoring?

Collider runtime error after local Unity 2018 build/install (was: Build error in Unity)

I'm trying to Build and Run ARRealismDemos/Collider/Scenes/Collider for the Android platform on the Windows 10 version of Unity 2018.4.22f1 Personal.

I'm seeing these build errors:

Error building Player because scripts have compile errors in the editor
Build completed with a result of 'Failed'
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr) (at C:/buildslave/unity/build/Modules/IMGUI/GUIUtility.cs:179)
UnityEditor.BuildPlayerWindow+BuildMethodException: Error building Player because scripts have compile errors in the editor
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x00242] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:194
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x0007f] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:97
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr) (at C:/buildslave/unity/build/Modules/IMGUI/GUIUtility.cs:179)

Any suggestions on how to fix this?

Can't build

Hello,
I can't build on Unity 2018.4.19f1 for Android 11 on Windows 10. I'm getting this error:
CommandInvokationFailure: Gradle build failed.
C:\Program Files\Unity\2018.4.29f1\Editor\Data\PlaybackEngines\AndroidPlayer/Tools\OpenJDK\Windows\bin\java.exe -classpath "C:\Program Files\Android\gradle-5.6.4\lib\gradle-launcher-5.6.4.jar" org.gradle.launcher.GradleMain "-Dorg.gradle.jvmargs=-Xmx4096m" "assembleDebug"

stderr[

FAILURE: Build failed with an exception.

  • Where:
    Build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle' line: 136

  • What went wrong:
    Could not compile build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle'.

startup failed:
build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle': 136: expecting '}', found '' @ line 136, column 1.
1 error

S20 Ultra - InvalidOperationException("Invalid depth value");

When testing on an S20 Ultra, it throws InvalidOperationException("Invalid depth value").
Objects and wraps are not pinned and just float around when moving the phone.
I did not encounter these issues when testing on an S9+.
Also, the S20 Ultra has no issues running the Depth Lab app from the Play Store.

Why exclude the Packages directory?

It makes this much more fragile and prone to breakage. The whole point of putting an entire Unity project in a repo is to have a reliable, reproducible setup. The vast majority of Unity projects on GitHub include Assets, Packages, and ProjectSettings, and in my experience the ones that don't are much less likely to work without some fixes on my side.

Noisy Point Cloud

Hi,
I tried the point cloud sample and noticed "flying points" where there are depth discontinuities in the scene. For example, if I visualize the point cloud of a chair in a room, there are points flying in the air behind the chair besides the ones correctly visualized. As you know, this is a common problem with ToF sensors, where you can adjust a confidence threshold to minimize the anomalies.
I tried the sample with a Samsung Galaxy S8 and with a Samsung Galaxy S20+ that has its own ToF camera with VGA resolution; they have the same problem:
https://www.youtube.com/watch?v=YlbYWrjLWAQ

Is there a way to access the confidence of the DepthSource?

PointCloud not getting updated each frame

Hi, I was trying to build an application using the RawPointClouds scene. I want to save the generated point cloud to a .ply file. While I was able to create a sample PLY, which can be found here, I noticed that each frame produces new point clouds instead of updating the previous points when the user hasn't moved.

So in my .ply file I can see layers of points with slightly different positions in world space. Are the depth points tracked across frames through some identifiers?

For creating the .ply, I keep a list that saves vertices and their colors during a session and then save them as .ply:

if (vertex != Vector3.zero)
{
    updatedPoints.Add(vertex);
    updatedColors.Add(color);
}

Building Error (Unity 2019.3.3f1)

I'm using Unity 2019.3.3f1 Personal. After opening arcore-depth-lab and importing arcore-unity-sdk-1.20.0.unitypackage, I set the version of AR Foundation from 2.0.2 to 3.1.3.

But when I build the app, I always receive the following errors:

BuildFailedException: GoogleARCore detected. Google's "ARCore SDK for Unity" and Unity's "ARCore XR Plugin" package cannot be used together. If you have already removed GoogleARCore, you may need to restart the Editor.
UnityEditor.XR.ARCore.ARCorePreprocessBuild.EnsureGoogleARCoreIsNotPresent () (at Library/PackageCache/[email protected]/Editor/ARCoreBuildProcessor.cs:79)
UnityEditor.XR.ARCore.ARCorePreprocessBuild.OnPreprocessBuild (UnityEditor.Build.Reporting.BuildReport report) (at Library/PackageCache/[email protected]/Editor/ARCoreBuildProcessor.cs:30)
UnityEditor.Build.BuildPipelineInterfaces+<>c__DisplayClass15_0.b__1 (UnityEditor.Build.IPreprocessBuildWithReport bpp) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEditor.Build.BuildPipelineInterfaces.InvokeCallbackInterfacesPair[T1,T2] (System.Collections.Generic.List1[T] oneInterfaces, System.Action1[T] invocationOne, System.Collections.Generic.List1[T] twoInterfaces, System.Action1[T] invocationTwo, System.Boolean exitOnFailure) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

BuildFailedException: 'Preferences > External Tools > Android > Gradle' is empty. ARCore SDK for Unity requires a customized Gradle with version >= 5.6.4.
GoogleARCoreInternal.ARCoreAndroidSupportPreprocessBuild.CheckGradleVersion () (at Assets/GoogleARCore/SDK/Scripts/Editor/ARCoreAndroidSupportPreprocessBuild.cs:123)
GoogleARCoreInternal.ARCoreAndroidSupportPreprocessBuild.OnPreprocessBuild (UnityEditor.BuildTarget target, System.String path) (at Assets/GoogleARCore/SDK/Scripts/Editor/ARCoreAndroidSupportPreprocessBuild.cs:59)
GoogleARCoreInternal.PreprocessBuildBase.OnPreprocessBuild (UnityEditor.Build.Reporting.BuildReport report) (at Assets/GoogleARCore/SDK/Scripts/Editor/PreprocessBuildBase.cs:52)
UnityEditor.Build.BuildPipelineInterfaces+<>c__DisplayClass15_0.b__1 (UnityEditor.Build.IPreprocessBuildWithReport bpp) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEditor.Build.BuildPipelineInterfaces.InvokeCallbackInterfacesPair[T1,T2] (System.Collections.Generic.List1[T] oneInterfaces, System.Action1[T] invocationOne, System.Collections.Generic.List1[T] twoInterfaces, System.Action1[T] invocationTwo, System.Boolean exitOnFailure) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Error building Player: 2 errors

Build completed with a result of 'Failed'
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

UnityEditor.BuildPlayerWindow+BuildMethodException: 3 errors
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x00275] in <9a184ab867bb42c296d20ace04f48df3>:0
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x00080] in <9a184ab867bb42c296d20ace04f48df3>:0
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

I also always receive an Android Resolver message.

I'm using Android Studio 4.0.1 and I already have JDK 14.0.2. I don't understand the error.
How to fix the error?

Got an error when I tried to build the master code on my device (NullReferenceException)

I have had the issue below and haven't been able to fix it for a long time!
How can I fix this?

I got this Android sample app code (the master branch) from this page and tried to build it for my Android device.
I imported the ARCore Extensions package using the Package Manager.
Thank you!

⑴ NullReferenceException: Object reference not set to an instance of an object
Google.XR.ARCoreExtensions.Internal.RuntimeConfig+<>c.b__7_0 (UnityEngine.Object x) (at Library/PackageCache/com.google.ar.core.arfoundation.extensions@c3bc1636a644-1622675792614/Runtime/Scripts/Internal/RuntimeConfig.cs:79)

⑵ Error building Player: NullReferenceException: Object reference not set to an instance of an object

⑶ Build completed with a result of 'Failed' in 0 seconds (268 ms)
UnityEditor.EditorApplication:Internal_CallGlobalEventHandler () (at /Users/bokken/buildslave/unity/build/Editor/Mono/EditorApplication.cs:428)

⑷ UnityEditor.BuildPlayerWindow+BuildMethodException: 2 errors
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x002be] in /Users/bokken/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:190
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x00080] in /Users/bokken/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:9

⑸ Asset Packages/com.google.ar.core.arfoundation.extensions/Editor/BuildResources/DependenciesTempFolder has no meta file, but it's in an immutable folder. The asset will be ignored.

NullReference to DepthSource

Hi @ruofeidu ,

I tried installing the master code with the DemoCarousel as well as separate scenes, but every time I get a NullReferenceException for the DepthSource object. I tried adding loggers in the Awake method of DepthSource, but they never get printed.

In your documentation, you mention that at least one DepthTarget should be present in the scene in order for DepthSource to work. Can you please explain which game object has a DepthTarget attached by default? (I tried searching for one but could not find it.)

NullReferenceException stack trace:
Scene : Collider
Script : DepthMeshCollider.cs
Method : private void Update() 
Line # 252 ->
else
{
    if (DepthSource.Initialized)


Can you please check?
Thanks and Regards,
Mihir

BUG: Occlusion is not working properly in HelloAR using the Depth API on an Android app

Hello, I am using:
Unity 2018.4.22 for Windows 10
ARCore SDK for Unity 1.18.0 (arcore-unity-sdk-1.18.0)
Huawei P30 Pro (Depth API supported)

I pasted the arcore-depth-lab-master folder, including the project settings, into a new Unity project.
I could build the Depth Lab APK and run it on my phone without a problem.

But when I try to build the HelloAR example with the Depth API enabled, the occlusion does not work as well as in the Depth Lab app built from Unity and run on the same phone.


Sometimes the area gets dark like a shadow or takes on the color of the object. But I think the object's body should disappear when a physical object is between it and the camera, as in the Depth Lab avatar demo.
Is there any solution for this?

v1.18.0?

I am excited about these samples, but v1.18.0 is still not available. When will it be published?

URP?

Does this project support URP?
