
Mixed Reality Extension SDK

The Mixed Reality Extension SDK lets developers and community members extend the AltspaceVR host app's worlds with multi-user games and other dynamic experiences.

Prerequisites

  • Install Git
  • Install Node.js 8.12 or newer, which includes NPM 6.4.1 or newer, from nodejs.org

Get Started

The easiest way to start with the MRE SDK is to head over to the mixed-reality-extension-sdk-samples repo, and build a sample.

If you want to build the SDK itself, or want to build the functional test showcase MRE, jump to the Build and Run section of this document.

To see the APIs, jump to SDK documentation

If you have made a game or application in Unity3D, and you want it to support MREs, or you want to debug into the client runtime code itself, go to the Mixed Reality Extension Unity repository.

Other resources are on AltspaceVR's developer page

Overview

  • Written in TypeScript, and built on top of Node.js.
  • Utilizes a traditional game engine-style client-server model. All logic runs on the server, but the client handles the most CPU-intensive and latency-sensitive tasks: animation, collision, rigid body physics simulation, rendering, and input handling.
  • Hides the complexity of multi-user synchronization and prediction, so developers focus on adding content, not debugging networking code.
  • Designed to be secure for users, tolerate high latency, minimize server activity, and seamlessly blend with the host app's native content.
  • Quick to start: Clone the samples repo, deploy an MRE locally, and see the extension running in AltspaceVR or other host apps within minutes.
  • We welcome contributions. Please see CONTRIBUTING.md if you are interested in helping out.

Features

The SDK enables you to create extensions that can do the following (a minimal code sketch follows this list):

  • Modify the scene graph by loading glTF assets and scene files, instantiating primitives or the host app's built-in assets, or programmatically building meshes.
  • Create actors with 3d meshes (static or skinned), realtime lights and text objects
  • Create, load, and trigger keyframe animations, sounds, music, textures
  • Assign rigid body properties, physics forces, collision geometry, and have objects collide naturally with the host app world, or with other extensions.
  • Filter actors and behaviors to groups of users, or even have single-user actors
  • Apply click detection behaviors and register event handlers on the behaviors.
  • Attach actors to the host app's avatars
  • Make actors grabbable and clickable when held
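
As a flavor of the API, here is a minimal sketch modeled on the hello-world sample from the samples repo. Factory methods, return types (such as the forward-promise .value), and behavior-handler signatures changed across preview releases, so treat the details as assumptions rather than the definitive API.

import * as MRE from '@microsoft/mixed-reality-extension-sdk';

export default class HelloWorld {
    constructor(private context: MRE.Context) {
        this.context.onStarted(() => this.started());
    }

    private started() {
        // Spawn a box primitive with a collider so it can receive input.
        const cube = MRE.Actor.CreatePrimitive(this.context, {
            definition: { shape: MRE.PrimitiveShape.Box, dimensions: { x: 0.4, y: 0.4, z: 0.4 } },
            addCollider: true,
            actor: { name: 'Cube' }
        }).value;

        // React when any user clicks the cube.
        cube.setBehavior(MRE.ButtonBehavior).onClick('pressed', () => {
            // e.g. trigger an animation or flip the cube here
        });
    }
}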

Current State

Developer Preview

Limitations

The MRE SDK has been in developer preview in AltspaceVR since December 2018, and while the API surface is fairly stable, internals are regularly refactored. We strive to maintain backwards compatibility when possible, but until we ship the 1.0 release, there will be occasional breaking changes, which will require MRE redeploys.

The SDK does not have a feature-rich set of APIs yet. We have focused on networking and synchronization rather than on adding more APIs. Expect this to improve over time.

The SDK deploys anywhere Node.js works.

Goal

We want to deliver a feature-rich set of APIs, enabling creation of high quality, rich 3d experiences. There are many features we want to add, including

  • Rigid body constraints
  • Input latency improvements
  • Protocol optimization
  • 2D UI layout system with common UI controls

However, our highest priority is reliability, predictability, and ease of use; as a consequence we spend much more time making sure we have a solid, flexible infrastructure than adding shiny new features.

Major known Issues

  • Rigid body physics state synchronization is jittery.
  • Users can't reliably directly collide with rigid body objects, except by grabbing.
  • A number of client-side errors don't get sent to the Node.js log, which makes debugging hard. This includes glTF loading errors and using the wrong name when playing animations.

Roadmap

We are working towards 1.0 release. Please look at our project page for remaining work.

How to Build and Deploy the SDK and functional tests

From a command prompt (a minimal server entry-point sketch follows these steps):

  • git clone http://github.com/microsoft/mixed-reality-extension-sdk
  • cd mixed-reality-extension-sdk
  • npm install -g lerna
  • npm install: installs all dependent packages (and does very little if there are no changes)
  • npm run build: should not report any errors
  • npm start: should print "INF: Multi-peer Adapter listening on..."
  • See also: Using Visual Studio Code instead of command line
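
For orientation, the npm start entry point of the samples and functional tests boils down to a small Node.js server along these lines. This is a sketch: App is a placeholder for your own MRE class, and WebHost option names may differ slightly between SDK versions.

import { resolve as resolvePath } from 'path';
import { WebHost } from '@microsoft/mixed-reality-extension-sdk';
import App from './app'; // placeholder for your own MRE app class

// Serve static files from ./public and listen for MRE clients (port 3901 by default).
const server = new WebHost({
    baseDir: resolvePath(__dirname, '../public'),
    port: 3901
});

// Each connecting session gets its own app instance.
server.adapter.onConnection(context => new App(context, server.baseUrl));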

Testing an MRE In AltspaceVR

  • In AltspaceVR, go to your personal home
  • Make sure you are signed in properly, not as a guest
  • Activate the World Editor
  • Click Basics group
  • Click on SDKApp
  • In the URL field, enter ws://localhost:3901
  • Click Confirm
  • After the app has been placed, you will see the MRE anchor (a white box with red/green/blue spikes on it). You can use the anchor to move the MRE around, and the icon on the anchor shows the status of the MRE connection. To hide the anchor, uncheck "Edit Mode".

You should now see a functional test load up inside AltspaceVR.

Pre-deployed MREs

We have deployed the hello world and functional test MREs to servers in the cloud. The URLs are

  • ws://mres.altvr.com/helloworld
  • ws://mres.altvr.com/solarsystem
  • ws://mres.altvr.com/tests/latest
  • ws://mres.altvr.com/tictactoe

Using Visual Studio Code

We recommend Visual Studio Code, a lightweight code editor, which is easy to use and offers full debugging capabilities for Node.js servers.

  • Install from here: Visual Studio Code
  • You may want to add the TSLint extension to get style tips: use View->Extensions (Ctrl+Shift+X), search for TSLint, and click Install.
  • To build: use Tasks->Run Build Task... (Ctrl+Shift+B), and select npm: Build for some or all packages.
  • To choose which MRE to launch: open the debugger sidebar (Ctrl+Shift+D) and choose the desired MRE from the dropdown.
  • To launch the MRE server: use Debug->Start Debugging (F5). To stop the server: use Debug->Stop Debugging (Shift+F5).

Hosting and Multi-User Testing

To learn about additional deployment options and multi-user testing in AltspaceVR, see DEPLOYING.md

Getting In Touch

To report issues and feature requests: Github issues page.

To chat with the team and other users: join the MRE SDK Discord community.

Or attend the biweekly AltspaceVR developer meetups.


Reporting Security Issues

Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) at [email protected]. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the Security TechCenter.

License

Code licensed under the MIT License.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

mixed-reality-extension-sdk's People

Contributors

anthonyneacevr, dependabot[bot], eanders-ms, eedmond, jabenk, jtfell, microsoftopensource, msftgits, nik0la31, norybiak, olga-microsoft, patole, sorenhan, sorenhannibal, stevenvergenz, tombums, univrsity, willneedit


mixed-reality-extension-sdk's Issues

Feature Request: portals/teleport

In-space teleporting and movement:

  • The ability to move a user to a desired location.
  • Setting a base velocity vector. (Allows for activities like downhill skiing, where the user is set in motion)
  • Reparenting the user to a given actor. (Allows for the user making use of a vehicle like a mine cart)

World teleporting:

  • Portals and teleporters

Input Behaviors are not received if multiple MREs have executed in same AltspaceVR session

Input Behaviors don't always get piped through to the MRE. It seems related to having had multiple instances of the same MRE running within a single AltspaceVR session (but not necessarily in the same scene).

Repro steps:

  1. Instantiate the Hello World MRE in your AltspaceVR homespace. Observe that the cube reacts when you hover, click, and unhover.
  2. Enter another world in AltspaceVR, and instantiate the Hello World MRE.
    Observed: The cube does not react to hover, click, or unhover.
    Expected: The cube should react

Note that when going back to the homespace, the first MRE instance launches and interacts properly. That leads me to guess this is an AltspaceVR integration or Client Lib issue, not a server-side issue.

Attach avatar to object - allow avatar position to be synced to object position.

I realize this might seem like it's not a good idea, but without it people will find other, much worse ways to achieve the same thing, like putting avatars inside inverted boxes or moving them with planes.

Attach the avatar to a rigid body/object to move with it.
Allow the avatar to move about within a limited space - on a large vessel.
Allow multiple users to sync to the same object - maybe attachment slots.

This allows users to be put inside things: cars, boats, bikes, elevators, etc.

Users should be able to disconnect at any time, and it should only control the position, but possibly allow for the initial rotation to be set as an offset.

Reconnect when the URL changes

If an app is playing and you change the URL, it remains connected to the old app until you toggle Is Playing off and back on.

Feature Request: Access AltspaceVR Space Editor placed entities

If I place a number of space components in a world, I'd want to be able to access and modify them from within my MRE.

Note that it needs to be able to restore space components when a user re-enters the space or stops/restarts the MRE.
One possible implementation:

  • In the space editor, assets are associated with the MRE (or tagged as Used By MRE).
  • When the MRE is in the isPlaying state, the original space components are hidden from the scene; when the MRE stops playing or is removed, the original assets are unhidden.
  • The MRE itself could call CreateFromLibrary with a custom parameter, and it would clone all of the associated hidden space components.

Physics forces are in worldspace, not in MRE-space

Physics forces are in worldspace (absolute orientation and non-scaled) - but should be MRE relative (or alternatively local), both for orientation and scale.

Repro Steps

  1. Load the functional test scene for rigid-body-test. If placed at identity orientation/scale it works as expected, but if the MRE root has been rotated or scaled, it is off. This can be simulated using the "change global root" function of the testbed.
    Observed: Projectile flies in different direction, and misses target, if MRE root is rotated
    Expected: Regardless of rotation/scale, projectile should hit the target.

Event listeners don't work in other world spaces

When trying to have an actor listening to events (like onHover and onClick), it only works if the item is created in the home space, but not if the item is created in a new world space.

Reproducible with hello-world example: Works okay (growth/shrink on hover, sideflip on click) in Homespace, but no reaction on user interaction when created in world space (just spins)

(UPDATE): The error is more general: when an item is used in more than one worldspace, the first instance I encountered worked correctly, but not the second one. "Re-enter space" and "Reset Space" don't affect the bug; one has to completely exit AltspaceVR and restart the client.

Programmatically create functional tests

Right now we hardcode each functional test into the functional test level. This isn't scalable. We should instead read each test configuration from a JSON file and dynamically instantiate each test station (a loading sketch follows the example). The JSON file might look like:

{
    "host": "mre-hello-world.azurewebsites.net",
    "tests": {
        "rigid-body-test": {
            "name": "Rigid Body Test"
        },
        "text-test": {
            ...
        }
    }
}
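
A sketch of how the functional test MRE could load such a file at startup; createTestStation and the TestConfigFile shape are hypothetical, not existing SDK or repo APIs.

import { readFileSync } from 'fs';

// Hypothetical shape of the proposed config file.
interface TestConfigFile {
    host: string;
    tests: { [id: string]: { name?: string } };
}

// Hypothetical factory; in the functional test MRE this would spawn whatever
// actors make up a test station.
function createTestStation(id: string, displayName: string, host: string) {
    console.log(`Creating test station '${displayName}' (${id}) against ${host}`);
}

// Read the config and instantiate one station per entry instead of hardcoding them.
const config: TestConfigFile = JSON.parse(readFileSync('./tests.json', 'utf8'));
for (const id of Object.keys(config.tests)) {
    createTestStation(id, config.tests[id].name || id, config.host);
}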

CreateFromGLTF's collision volumes aren't enabled until actor has been updated

When creating box collision as a parameter for CreateFromGLTF, for some scenes (Suzanne),
it creates the rigid body, but it doesn't appear to be added properly to the Unity physics scene. It isn't visible in the Unity physics debugger view (and therefore doesn't collide) until the owning GameObject (or a parent GameObject) has moved around, or the collision component has been modified.
It seems like a potential Unity physics bug.
There are no reliable repro steps for this yet.

gltf-gen does not build or publish documentation

The code is already documented for typedoc, but we need to automate its generation, and we need to decide where to publish it. I see two possibilities:

  1. Build docs to a markdown file and host it directly from the github repo.
  2. Build docs as HTML to /docs/gltf-gen, and link from the developer portal.

Feature Request: Audio

Key features (a sketch of one possible API shape follows this list):

  • play one-shot sounds at 3d position
  • preload sounds
  • play audio stream at 3d position
  • Set/modify pitch and volume per sound instance
  • Automatically synchronize audio streams on multi-user join
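
A sketch of one possible API shape for these features. The names (AssetContainer.createSound, Actor.startSound, and the option fields) mirror how sound support later appeared in the SDK samples, but treat them as assumptions rather than the API described by this issue.

import * as MRE from '@microsoft/mixed-reality-extension-sdk';

// Preload a sound asset, then play a one-shot instance at the actor's 3D
// position with per-instance volume/pitch overrides.
function playOneShot(assets: MRE.AssetContainer, actor: MRE.Actor, baseUrl: string) {
    const hit = assets.createSound('hit', { uri: `${baseUrl}/hit.ogg` });
    actor.startSound(hit.id, { volume: 0.8, pitch: 2, looping: false });
}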

Add functional test for scene hierarchy

We should add something that can easily show the scene graph parent/child relationships. For example: spawn 5 primitives in a row, parent them to each other, and start spinning each of them slowly, so the children spin around each other; after a few seconds, unparent them all and watch them keep spinning at the point where they were detached.

Non-looping keyframe animations don't always stop at the end frame

In AltspaceVR, it seems keyframe animations sometimes don't stop exactly at the last frame if they are in 'once' mode.
Repro steps:

  1. In the Hello World sample, in generateSpinKeyframes, remove the last frame.
  2. During the DoAFlip animation creation, change duration from 1.0 to 0.5, so the animation executes quicker
  3. Build, Launch, and instantiate the MRE
  4. Click on the cube
    Observed: Sometimes the cube stops spinning a little early (or a little late), and it does not always stop pointing straight up.
    Expected: After the rapid spin, the cube should always stop with the top pointing straight up

Material backface culling/no culling

Be able to dynamically create and apply materials to geometry.
Set/update glTF textures or colors (albedo/metal-roughness/normal/occlusion/emissive)?

Lookat math is in worldspace, not in MRE space

Repro steps

  1. Instantiate the lookat functional test.
  2. Move and rotate the MRE root so it is no longer at identity.
    Observed: The Monkey Head dreamily stares off into space
    Expected: The Monkey Head stares you straight in the eyes

Feature Request: Player Masking

Need the ability to enable behaviors for one or all users, or possibly user group/teams.
Need the ability to enable actors for one user, all users, all users except one, or possibly user group/teams.

Must be dynamic (as people join/leave teams).
Must support recording and playback.

Some use cases:
First-person shooter: the player sees their own first-person rig while everyone else sees a full 3D model; when players die, maybe they can see everyone?
Card games: hide cards from everyone but the player, but the moment the player plays a card, it becomes visible to everyone.

Expose user's device performance characteristics

A user's basic performance characteristics can allow targeted assets that match the user's device's capabilities. I'm imagining a tiered system, with lower values for weaker hardware and higher values for more powerful devices. Initially, 1 for mobile, 2 for desktops with integrated graphics, and 3 for desktops with discrete GPUs.
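
A minimal sketch of the proposed tiers; the enum name, values, and usage are hypothetical.

// Hypothetical tier values as proposed above.
enum DevicePerformanceTier {
    Mobile = 1,              // phones and standalone/mobile VR hardware
    IntegratedGraphics = 2,  // desktops with integrated graphics
    DiscreteGraphics = 3     // desktops with discrete GPUs
}

// Hypothetical usage: pick assets that match the joining user's hardware.
function modelUrlFor(tier: DevicePerformanceTier, baseUrl: string): string {
    return tier >= DevicePerformanceTier.DiscreteGraphics
        ? `${baseUrl}/ship-highpoly.glb`
        : `${baseUrl}/ship-lowpoly.glb`;
}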

Mac server support

Currently MREs can be deployed to Macs just fine, and the logic executes, but static files are not served correctly.

Repro steps

  1. Deploy the Hello World MRE to a Mac
  2. Instantiate it in AltspaceVR
    Observed: Hello World text shows up and spins, but cube does not
    Expected: Hello World text and cube both show up and spin.

Report glTF load network errors

If your baseUrl is bad, or for some other reason clients cannot download the model, there is no indication anywhere (even in the AltspaceVR log) that the load failed. The create promise just never resolves.

Rigid body patches are in scene coordinates, not in MRE coordinates

Rigid body patch data is all in global space, but should be relative to the MRE root.

The code shouldn't assume that the MRE root has the same global offset in scene 1 as in scene 2. It is possible to share a session ID across multiple different AltspaceVR scenes, for example.

Feature request: Attach object to user avatars

Need the ability to attach MRE actors to an individual user's avatar - either to the root node or relative to different body parts.
It must work regardless of any avatar model that exists now or in the future, so system can't make assumptions about different models.

This should probably be done by having pre-defined attach points on the avatars (hat, backpack, etc.). That would allow easy customization of avatars, and we would update all avatar rigs with these attach points.
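
A sketch of the call shape this could take; the attach-point names and the attach() signature are assumptions, not current API.

import * as MRE from '@microsoft/mixed-reality-extension-sdk';

// Hypothetical: attach an actor to one of the proposed pre-defined attach
// points ('hat', 'backpack', ...) on a specific user's avatar.
function giveBackpack(actor: MRE.Actor, user: MRE.User) {
    actor.attach(user.id, 'backpack');
}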

Rigid body transforms do not synchronize across multiple users

Rigid body simulations work fine with a single user. However, once a second user enters, the objects teleport away.

Repro steps:

  1. Instantiate an MRE that uses rigid body physics, for example the functional test named rigid-body-test
  2. Have a second user enter the same space.
  3. Observe rigid bodies

Add procedural glTF handlers to WebHost

Though gltf-gen exists, there's currently no way to use it in an MRE. I want to add a method to WebHost that registers a binary blob under a URL for clients to load. This would let you generate a buffer from a GLTFFactory, then load it via Actor.CreateFromGLTF.
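
A sketch of the proposed flow. registerStaticBuffer is the hypothetical WebHost method this issue asks for (assumed to return a client-reachable URL), and the CreateFromGLTF option names may differ between SDK versions.

import * as MRE from '@microsoft/mixed-reality-extension-sdk';

// Hypothetical extension of WebHost with the proposed method.
interface ProceduralWebHost extends MRE.WebHost {
    registerStaticBuffer(name: string, blob: Buffer): string;
}

// Register a generated glTF binary (e.g. produced by gltf-gen) and spawn it.
function spawnProceduralGltf(webHost: ProceduralWebHost, context: MRE.Context, glb: Buffer) {
    const url = webHost.registerStaticBuffer('procedural.glb', glb);
    MRE.Actor.CreateFromGLTF(context, { resourceUrl: url });
}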

AltspaceVR Android doesn't run

MRE connection fails in Android versions (GearVR etc.)

Repro steps:

  1. Place an MRE in any space in AltspaceVR, and enter a valid URL, for example ws://mre-hello-world.azurewebsites.net.
  2. On Android platforms, the MRE does not show up. Entering the same space on Windows, the MRE shows up fine.

Figure out why functional test deploy to openode requires tslib dependency

The functional tests seem to require a tslib dependency when deploying to openode.io. For some reason this does not show up as a dependency in local builds.
The package exists locally because it is used as a dev dependency by tslint/typescript, so it may be because of that, combined with Lerna?

Either way, we should figure out why it's needed and why our local builds don't complain about it.

Send MREAPI.Logger.LogXXXXX() text through to Node.js

When an MRE is instantiated, and the client has errors, the errors should be sent back to the MRE, so they can be piped through to where developers expect it. By default I would expect it to show up in the Visual Studio Debug Console, and in Node.js' log files.

Examples where I would expect errors to show up:
If an MRE client tries to play an animation that doesn't exist on an actor
If the MRE fails to load a glTF
If the MRE tries to access an actor that doesn't exist

Feature Request: Exact Timing/Synchronization API

Currently all APIs say "do it now", and due to unpredictable latency between server and host app, it's not possible to do reliably timed API calls. We should implement a way to call things relative to some other time (ideally a timer that restarts for example every minute, but alternatively time-since-start). A couple of changes needed are:

  1. Server should add a timestamp into each packet it sends out
  2. Client should continually calculate its latency, and try to delay server packets so they are smoothed out as much as possible to match the server timestamp. (This means added latency at the cost of smoother delivery.)
  3. Client should have a queue of commands to execute X frames into the future, rather than apply them instantly.

We'd need to be very careful to make sure all clients execute all packets in the exact same order between instant and delayed commands.
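
A rough client-side sketch of items 2 and 3; all names are illustrative, not SDK API, and it assumes roughly synchronized clocks (a real implementation would also estimate clock offset).

// Queue that delays incoming commands so they execute relative to the server
// timestamp rather than their (jittery) arrival time.
interface TimestampedPacket {
    serverTimeMs: number;
    apply: () => void;
}

class DelayedCommandQueue {
    private smoothedLatencyMs = 0;
    private readonly alpha = 0.1;     // EWMA smoothing factor
    private readonly bufferMs = 50;   // extra delay to absorb jitter
    private queue: Array<{ runAtMs: number; apply: () => void }> = [];

    enqueue(packet: TimestampedPacket) {
        // Update the smoothed latency estimate from this packet's transit time.
        const sample = Date.now() - packet.serverTimeMs;
        this.smoothedLatencyMs = this.alpha * sample + (1 - this.alpha) * this.smoothedLatencyMs;

        // Schedule execution relative to the server timestamp.
        const runAtMs = packet.serverTimeMs + this.smoothedLatencyMs + this.bufferMs;
        this.queue.push({ runAtMs, apply: packet.apply });
    }

    // Call once per frame; runs every command whose time has come, in order.
    update(nowMs = Date.now()) {
        this.queue.sort((a, b) => a.runAtMs - b.runAtMs);
        while (this.queue.length && this.queue[0].runAtMs <= nowMs) {
            this.queue.shift()!.apply();
        }
    }
}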

Feature Request: TeleportTo/InterpolateTo

Ability to instantly teleport (don't collide with anything between the old and new positions) and to interpolate (move through space over time, colliding with any collision geometry on the way).
