
MixedReality-WebRTC

Licensed under the MIT License Holodevelopers channel on Slack NuGet

MixedReality-WebRTC is a collection of libraries to help mixed reality app developers to integrate peer-to-peer real-time audio and video communication into their application and improve their collaborative experience.

  • Enables multi-track real-time audio / video / data communication with a remote peer
  • Provides an abstracted signaling interface to easily switch implementation
  • Exposes an API for C++ and C# to integrate into existing apps
  • Provides a set of Unity3D components for rapid prototyping and integration
  • Includes support for Microsoft HoloLens (x86) and Microsoft HoloLens 2 (ARM)
  • Allows easy use of Mixed Reality Capture (MRC) to stream the user's point of view for multi-device experiences

MixedReality-WebRTC is part of the collection of repositories developed and maintained by the Mixed Reality Sharing team.

Caution

MR-WebRTC has been deprecated. We're no longer committing development resources to it, taking pull requests for it, or planning a replacement for it.

  • If you want to continue updating it, you must fork and maintain your own branch.

Download

NuGet (C++, C#) and UPM (Unity) packages are available for stable releases (release/* branches). See the Release page on GitHub or the Download documentation page for details.

Note: The master branch may contain breaking API changes from the latest stable release. It's therefore not guaranteed to work with NuGet packages, which are only available for stable releases. In particular, the Unity library scripts are only guaranteed to be compatible with NuGet packages if copied from a release/* branch, though it is strongly recommended to use the UPM packages instead.

Branches

The release/2.0 branch contains the latest stable version of the API, from which the NuGet and UPM packages are published.

The master branch contains the latest developments. Care has been taken to keep this branch in a fairly clean state (branch can build, tests pass). However the master branch contains API breaking changes compared to the latest release, and therefore is not compatible with NuGet/UPM packages and should be built from sources instead (see Building from sources documentation).

Documentation

The official documentation is hosted at https://microsoft.github.io/MixedReality-WebRTC/.

User Manual

The User Manual contains a general overview of the various libraries of the project and some tutorials on how to use them.

  • The Hello, Unity world! tutorial introduces the Unity integration by building a simple audio and video chat client.
  • The C# tutorials introduce the .NET Standard 2.0 C# API, which can be used outside Unity.
    • Hello, C# world! (Desktop) shows how to build a simple console app in .NET Core 3.0, which runs as a Windows Desktop (Win32) app.
    • Hello, C# world! (UWP) shows how to build a GUI app with a UI based on WPF (XAML), including how to render the local and remote video.

API reference

An API reference is available for the C# library and the Unity integration.

Overview

The overall architecture is as follows:

MixedReality-WebRTC architecture

Library Lang Description
mrwebrtc C/C++ Native C/C++ library providing a low-level interface to the underlying WebRTC implementation from Google. Compared to the API exposed by the Google implementation (PeerConnection), the current interface is simplified to remove the burden of setup and configuration. It also tries to prevent common threading errors with the UWP wrappers. This library exposes a pure C API that is easily integrated into any C/C++ application.
Microsoft.MixedReality.WebRTC C# 7.3 C# .NET Standard 2.0 library providing access to the same API as the native C library, exposed with familiar C# concepts such as async / await and Task.
Microsoft.MixedReality.WebRTC.Unity C# 7.3 Unity3D integration - a set of Unity MonoBehaviour components with almost no required setup, to enable rapid prototyping and simplify integration into an existing app.
Microsoft.MixedReality.WebRTC.Unity.Examples C# 7.3 Unity3D samples showcasing typical use scenarios like a peer-to-peer video chat app.
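
To illustrate the C# API style described in the table above (async / await initialization plus events such as LocalSdpReadytoSend), here is a minimal hedged sketch of creating a peer connection and generating an offer. It assumes the 2.0-era Microsoft.MixedReality.WebRTC API (InitializeAsync, PeerConnectionConfiguration, SdpMessage); earlier 1.x releases used slightly different signatures, and the signaling transport is intentionally left out.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

class MinimalPeer
{
    static async Task Main()
    {
        // Create and initialize the native peer connection (2.0-era API assumed).
        var pc = new PeerConnection();
        var config = new PeerConnectionConfiguration
        {
            IceServers = new List<IceServer>
            {
                new IceServer { Urls = { "stun:stun.l.google.com:19302" } }
            }
        };
        await pc.InitializeAsync(config);

        // The SDP offer is delivered asynchronously; forward it to the remote
        // peer through whatever signaling solution the application uses.
        pc.LocalSdpReadytoSend += (SdpMessage message) =>
            Console.WriteLine($"Send {message.Type} message via signaling: {message.Content}");

        pc.CreateOffer();
        await Task.Delay(1000); // keep the process alive long enough for the callback to fire
        pc.Dispose();
    }
}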

MixedReality-WebRTC is currently available for Windows 10 Desktop and UWP, with or without Unity, and Android (Unity only).

Note - In the following and elsewhere in this repository the term "Win32" is used as a synonym for "Windows Desktop", the historical Windows API for desktop application development, as opposed to the "Windows UWP" API. However, Microsoft Windows versions older than Windows 10 with Windows SDK 17134 (April 2018 Update, version 1803) are not officially supported for this project. In particular, older versions of Windows (Windows 7, Windows 8, etc.) are explicitly not supported.

Sources

This repository follows the Pitchfork Layout in an attempt to standardize its hierarchy:

bin/               # Binary outputs (generated)
build/             # Intermediate build artifacts (generated)
docs/              # Documentation sources
+ manual/          # User manual sources
examples/          # Examples of use and sample apps
external/          # Third-party external dependencies (git submodules)
libs/              # Source code for the individual libraries
tests/             # Source code for feature tests
tools/             # Utility scripts
+ build/           # Build scripts for the various platforms
  + android/       # Android Studio project to build libmrwebrtc.so
  + libwebrtc/     # Android build scripts for Google's WebRTC library
  + mrwebrtc/      # Windows build tools to build mrwebrtc.dll
+ ci/              # CI Azure pipelines

The Microsoft.MixedReality.WebRTC.sln Visual Studio 2019 solution located at the root of the repository contains several projects:

  • The native C/C++ library mrwebrtc, which can be compiled:
    • for Windows Desktop with the mrwebrtc-win32 project
    • for UWP with the mrwebrtc-uwp project
  • A C/C++ library unit tests project mrwebrtc-win32-tests
  • The C# library project Microsoft.MixedReality.WebRTC
  • A C# unit tests project Microsoft.MixedReality.WebRTC.Tests
  • A UWP C# sample app project Microsoft.MixedReality.WebRTC.TestAppUWP based on WPF and XAML which demonstrates audio / video / data communication by means of a simple video chat app.

Note - Currently, due to CI limitations, some projects are downgraded to VS 2017, as the Google M71 milestone that the master and release/1.0 branches build upon does not support VS 2019, and Azure DevOps CI agents do not support multiple Visual Studio versions on the same agent. This will be reverted to VS 2019 eventually (see #14).

Building MixedReality-WebRTC

See the user manual section on Building from sources.

Special considerations for HoloLens 2

  • Mixed Reality Capture (MRC) has some inherent limitations:
    • MRC only works up to 1080p (see the Mixed reality capture for developers documentation), but the default resolution of the webcam on HoloLens 2 is 2272 x 1278 (see the Locatable Camera documentation). In order to access different resolutions, one needs to use a different video profile, like the VideoRecording or VideoConferencing ones. This is handled automatically in the Unity integration layer (see here) if WebcamSource.FormatMode = Automatic (default), but must be handled manually if using the C# library directly (a sketch of that manual handling follows this list).
    • MRC requires special permission to record the content of the screen:
      • For shared apps (2D slates), this corresponds to the screenDuplication restricted capability, which cannot be obtained by third-party applications. In short, MRC is not available for shared apps. This is an OS limitation.
      • For exclusive-mode apps (fullscreen), there is no particular UWP capability, but the recorded content is limited to the application's own content.
  • Be sure to use PreferredVideoCodec = "H264" to make use of the hardware encoder present on the device; software encoding with e.g. VP8 or VP9 codecs is very CPU intensive and strongly discouraged.
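
Below is a hedged sketch of the manual handling mentioned above when using the C# library directly: requesting a VideoConferencing profile so the capture resolution stays compatible with MRC's 1080p limit. The DeviceVideoTrackSource / LocalVideoDeviceInitConfig names are taken from the 2.0-era C# API and may differ in other releases; the resolution and framerate values are illustrative only.

using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

static class HoloLens2Capture
{
    // Sketch only - assumes the 2.0-era C# API; the Unity WebcamSource component
    // does the equivalent automatically when FormatMode = Automatic.
    public static async Task<LocalVideoTrack> OpenWebcamForMrcAsync()
    {
        // Ask for a VideoConferencing profile instead of the default 2272 x 1278
        // photo/video profile, so Mixed Reality Capture (limited to 1080p) can be used.
        var source = await DeviceVideoTrackSource.CreateAsync(new LocalVideoDeviceInitConfig
        {
            videoProfileKind = VideoProfileKind.VideoConferencing,
            width = 960,       // pick a mode actually exposed by that profile (illustrative)
            framerate = 30,
            enableMrc = true   // mix holograms into the captured stream
        });
        return LocalVideoTrack.CreateFromSource(source,
            new LocalVideoTrackInitConfig { trackName = "webcam_track" });
    }
}

On HoloLens 2 this is typically combined with PreferredVideoCodec = "H264" on the Unity PeerConnection component, as noted in the last bullet above, so that the hardware encoder is used.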

Known Issues

The current version is a public preview which contains known issues:

  • By default, the WebRTC UWP implementation accesses video capture devices on the system in exclusive mode. This will cause an error when enabling video capture if another application is using the device.
  • ARM64 build: Not supported by WebRTC m71/WebRTC UWP SDK.
  • H.265: Not supported by WebRTC m71/WebRTC UWP SDK.
  • HoloLens 2 exhibits some small performance penalty due to the missing support (#157) for SIMD-accelerated YUV conversion in WebRTC UWP SDK on ARM.
  • H.264 hardware video encoding (UWP only) exhibits some quality degradation (blockiness). See #74 and #153 for details.
  • H.264 is not currently available on Desktop at all (even in software). Only VP8 and VP9 are available instead (software encoding/decoding).
  • The NuGet packages (v1.x) for the former C++ library Microsoft.MixedReality.WebRTC.Native include some WebRTC headers from the Google repository, which are not shipped with any of the NuGet packages themselves, but instead require cloning this repository and its dependencies (see #123).

In addition, the Debug config of WebRTC core implementation is known to exhibit some performance issues on most devices, including some higher-end PCs. Using the Release config of the core WebRTC implementation usually prevents this, and is strongly recommended when not debugging.

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Reporting security issues and bugs

MixedReality-WebRTC builds upon the WebRTC implementation provided by Google. Security issues and bugs related to this implementation should be reported to Google.

Security issues and bugs related to MixedReality-WebRTC itself or to WebRTC UWP SDK should be reported privately, via email, to the Microsoft Security Response Center (MSRC) [email protected]. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the Security TechCenter.

mixedreality-webrtc's People

Contributors

adrianimajion, atosnicos, boenkemeyer, desto12, djee-ms, eanders-ms, fibann, grbury, jacques-p-amiot, jongfeelkim-virnect, karenconsoli, microsoftopensource, msftgits, nmlgc, p4blom4rcos, pablomarcos, prothen, rygo6, rygo6-ms, sanderaernouts, simonchiarel, sipsorcery, stephenatwork, thaivan, torepaulsson, wpmanoj


mixedreality-webrtc's Issues

Upgrade to WebRTC m75

The WebRTC UWP team finished their upgrade to m75 and are now providing a branch for it. This notably includes support for ARM64.

Remove deprecated offer_to_receive_xxx options

From #51:

Not sure about if it has any effect on performance, it could also be interesting to be able to modify lines 275-276 and 291-291 in peer_connection.cpp

options.offer_to_receive_audio = true;
options.offer_to_receive_video = true;

Now this is a special case for my implementation, but I only either send OR receive video/audio on each peer connection, so previously to this plugin, I modified offer_to_receive_audio/video to false depending on the connection. I also changed sendrecv in the sdp to sendonly or recvonly depending on the use-case.

Those options are deprecated anyway, so should be removed entirely. This mainly needs testing.

API versioning

The RTCPeerConnection API utilizes RTCConfiguration in the RTCPeerConnection constructor.

In practice, RTCConfiguration has been extended to enable API versioning, so as to enable the transition from the "Plan B" SDP dialect to "Unified Plan".

How does versioning work in the Mixed Reality SDK? Does the MR SDK use "Plan B" or "Unified Plan" by default? Is it possible to specify a non-default API?

Note that the "Plan B" API dialect has been deprecated and could be slated for removal from Chrome and Edge as soon as 2020.

Add a unit test compiling the solution

See #32 - the fact that CI compiles projects independently means we are missing solution-wide breaks. We should have a unit test which attempts to compile the entire solution.

Hololens Video Render is Mirrored

I just noticed this issue. The video that is sent from the HoloLens to a peer is mirrored left to right. I have noticed this when running with MRC turned on.

Enable "Allow 'unsafe' code" in Player Settings with custom MediaPlayer

When creating my own MediaPlayer for rendering the video stream on a UI element instead of in a Renderer, Unity makes me enable "Allow 'unsafe' code" in Player Settings because the script uses the unsafe statement.
However, when I use the MediaPlayer provided with the example, it is not necessary to enable this option in Player Settings.
Are you using any external or additional file in the Assets folder to avoid this error with your MediaPlayer.cs script? Why do I need to enable the option when using my script but not when using yours?
Thanks

Can't compile solution with VS 2017

I tried to compile the solution with VS 2017, but ran into some issues. Many include files cannot be found in pch.h.
image

I have installed all the NuGet packages (17 packages like Microsoft.MixedReality.WebRTC.Native.****), and it still does not work.

Am I missing something? Please help me, thanks.

Add event with ice connection state or onSuccess negotiation

It would be nice to either have the whole iceConnectionState log available as a separate event, or at least a success event fired when iceConnectionState reaches Connected or Completed, and a failed event when iceConnectionState reaches Failed or hangs indefinitely on Checking.

The current event Connected in PeerConnection is fired even though ice candidates are not correctly resolved. I test this by just commenting out the AddIceCandidate method, thus never giving the peer connection any ice candidates to use.

Having these events will allow me to e.g.

  • Reconnect internally/automatically to the specific peer that fails
  • Develop UI that changes depending on if the peer is connecting vs connected
  • Debug reasons why specific connections won't work
  • Set up a stress test, where I auto-reconnect a peer immediately after ICE candidates are resolved (and then do this with 10 peers at the same time)

Speaking of ICE negotiation, would it be possible to open up ICE gathering settings, such as bundlePolicy and iceTransportPolicy?
E.g. in my situation, I only need relay ICE candidates (usually 1 or 2), and not the ~17 candidates given to me through the IceCandidateReadytoSend event.
This would be easily fixed by changing iceTransportPolicy to Relay; instead I have to filter out candidates whose type is "host".
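
For illustration, here is a hedged sketch of how such a notification could be consumed if exposed as an event on PeerConnection. The IceStateChanged event and IceConnectionState enum used below are assumptions made for this example (a similar event was added in later releases of the library) and are not the API available at the time of this issue.

using System;
using Microsoft.MixedReality.WebRTC;

static class IceMonitoring
{
    // Hypothetical: the IceStateChanged event and IceConnectionState enum are
    // assumptions for illustration, not the API available when this was filed.
    public static void Attach(PeerConnection pc)
    {
        pc.IceStateChanged += (IceConnectionState newState) =>
        {
            if (newState == IceConnectionState.Connected || newState == IceConnectionState.Completed)
            {
                Console.WriteLine("ICE negotiation succeeded - safe to consider the peer connected");
            }
            else if (newState == IceConnectionState.Failed)
            {
                Console.WriteLine("ICE failed - trigger an automatic reconnect to this peer here");
            }
        };
    }
}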

Improve debuggability of C++ library

Currently the preview packages for the core input dependencies, being very large, ship without PDBs, which makes debugging difficult.

Possible solutions:

  1. Ship NuGet symbol packages to nuget.org, making the symbols available to debuggers through the nuget.org symbol server

    This is the standard solution to shipping PDBs, but is most suited for managed code. In the case of webrtc.lib, a single lib file is associated with many PDBs, which appears not to be supported by nuget.org. Moreover in the case of a static library the PDBs are needed during linking, not only at debug time, and I don't think symbol servers are used at that point.

  2. Ship NuGet packages with PDBs inside.

This is different from 1. in that the packages are regular .nupkg, not symbol packages (.symbols.nupkg or .snupkg). There, PDBs are treated as any other content file, and essentially ignored by nuget.org. This has the advantage of making the PDB files available at link time too.

Working with local webrtc.lib build:

In addition to the missing PDBs, the current NuGet packages contain .targets files to configure the header and library paths. Because those are .targets and not .props, they are included last, overriding any local path. This means users currently need to tweak the .vcxproj to make things work with the locally-built core dependencies, and uninstall the NuGet packages locally.

More options for SDP and video settings

Hi,

I'm currently initiating calls with the SDP I receive from LocalSdpReadytoSend, but as I understand it, it has already been set as the local description at that time.
It would be beneficial if I could e.g. modify specific Opus parameters such as Max Average Bitrate, Max Capture Rate, Max Playback Rate and so forth. Same goes for video parameters such as resolution and framerate, but as I understand, that's already in the works. 👍

Maybe CreateOffer() could have an optional input variable for specific SDP options?

DataChannel crash on 32-bit platforms

The PeerConnection.MemCpy(void*, void*, ulong) signature uses the C# ulong (64-bit), but the C/C++ implementation mrsMemCpy(void* dst, const void* src, size_t size) uses std::size_t whose size is platform-dependent, resulting in stack corruption on 32-bit platforms.

See #47 for full discussion with @pablomarcos.
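
A hedged sketch of the kind of fix discussed in #47: marshaling the native size_t parameter as UIntPtr so the managed signature matches the platform word size on both 32-bit and 64-bit builds. The DLL and entry-point names below mirror the ones quoted above, but the exact declaration in the repository may differ.

using System;
using System.Runtime.InteropServices;

static class MemCpyInterop
{
    // size_t is pointer-sized, so marshal it as UIntPtr rather than ulong.
    // DLL name and entry point follow the issue text and are illustrative.
    [DllImport("Microsoft.MixedReality.WebRTC.Native.dll",
        CallingConvention = CallingConvention.StdCall, EntryPoint = "mrsMemCpy")]
    public static unsafe extern void MemCpy(void* dst, void* src, UIntPtr size);

    public static unsafe void CopyBytes(byte* dst, byte* src, long byteCount)
    {
        // Narrow the 64-bit managed length down to the platform word size.
        MemCpy(dst, src, new UIntPtr((ulong)byteCount));
    }
}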

[Regression] Media Foundation video capture failing on HoloLens 1 and 2

There is currently a regression issue with the video capture, so far confirmed only on HoloLens 2, where the Media Foundation capture engine successfully opens the video capture device and acquires some video samples, but fails to write those video samples to its sink for an as yet unknown reason.

Symptoms are, on the remote peer, that the video track coming from HoloLens 2 is negotiated and added (TrackAdded signal fired, optionally preceded by RenegotiationNeeded if appropriate) but no data is sent over the network, and consequently the per-frame callback I420RemoteVideoFrameReady is never fired, which e.g. for TestAppUWP will prevent the media player from starting.

Stale webrtc-uwp-sdk dependency is missing the --cpp17 flag fix

The issue is simple but the solution requires some knowledge of git submodules, so detailing here for future reference.
See also the excellent https://tech.labs.oliverwyman.com/blog/2015/01/31/git-submodules/


Currently all branches are pointing at an earlier version of the releases/m71 branch of webrtc-uwp-sdk, namely webrtc-uwp/webrtc-uwp-sdk@cd7b538 from May 17th:

> git ls-files --stage -- external\webrtc-uwp-sdk
160000 cd7b538edafe414764a5bce0dc7b20d40ff1d7db 0       external/webrtc-uwp-sdk

The change introducing the --cpp17 flag into the build scripts was committed on June 28th, namely webrtc-uwp/webrtc-uwp-sdk@271ea0f.

This results in repositories freshly cloned following the instructions in the README.md not being able to compile:

> git clone --recursive https://github.com/microsoft/MixedReality-WebRTC.git
[...]
> cd tools\build
> build.ps1 -BuildConfig Debug -BuildArch x64 -BuildPlatform Win32
[...]
run.py: error: unrecognized arguments: --cpp17

There are multiple ways to fix that locally:

  1. Tell git to update the webrtc-uwp-sdk submodule to the latest commit on the releases/m71 branch:

    git submodule update --remote --force --recursive

    Note the --remote argument which reads the branch=releases/m71 config in .gitmodules and updates to the associated commit.

  2. Explicitly checkout the releases/m71 branch and force a submodule sync:

    cd external\webrtc-uwp-sdk
    git checkout releases/m71
    git submodule update --force --recursive
    cd ..\..

To check that the submodule has been updated, run the command again:

> git ls-files --stage -- external\webrtc-uwp-sdk
160000 134c00e8320133ba1e8fa1d5fe4e076b211653e5 0       external/webrtc-uwp-sdk

The fix simply consists of applying the local fix and then committing the change to external/webrtc-uwp-sdk, which shows the updated commit:

diff --git a/external/webrtc-uwp-sdk b/external/webrtc-uwp-sdk
index cd7b538..134c00e 160000
--- a/external/webrtc-uwp-sdk
+++ b/external/webrtc-uwp-sdk
@@ -1 +1 @@
-Subproject commit cd7b538edafe414764a5bce0dc7b20d40ff1d7db
+Subproject commit 134c00e8320133ba1e8fa1d5fe4e076b211653e5

Kurento Media Server Support

I am trying to understand exactly what differentiates a Kurento server client from a regular WebRTC client. I'd be willing to have a go at creating that interoperability if I can figure out where to start. More specifically, I am looking to be able to stream from Unity to Kurento.

Mixed Reality Collaboration question

I have a couple of questions. First, I am trying to build a remote collaboration capability and wondering how to access the data channel in Unity in order to send not only the HoloLens spatial mesh but also holograms/annotations to other devices. Second, is there a way to use a WebSocket signaling server instead of the HTTP server?

Unity Test App: CreateOffer button does not update RemotePeerId

The create offer button of NodeDssSignalerUI.cs checks if the entered id is valid but fails to actually assign it to the RemotePeerId variable of NodeDssSignaler.cs

Right now it is easy to work around by manually entering it in the Unity Editor, but I am guessing that the button should be performing that functionality.

Video stream to multiple peers

Hi,

Do you know what it would take to be able to stream your local video to several remote peers?
So a Peer A and Peer B both receiving video from Peer C. Is it supported at the moment, and if not will it be in the future?

A quick test with your Unity sample didn't seem to work, but I might be missing a step or two.

Thanks in advance.
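
Not an official answer, but the usual WebRTC pattern for this scenario is a full mesh: the sending peer opens one PeerConnection per remote peer and attaches a local video track to each. A hedged sketch using the 2.0-era C# API follows; sharing one DeviceVideoTrackSource across several tracks is an assumption about that API, and note the exclusive-mode capture limitation listed under Known Issues above.

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

static class MultiPeerVideo
{
    // Sketch only: one PeerConnection per remote peer, each fed by its own
    // LocalVideoTrack created from a single shared capture source.
    // API names assume the 2.0-era C# library and may differ in other versions.
    public static async Task<List<PeerConnection>> StreamToPeersAsync(int remotePeerCount)
    {
        var source = await DeviceVideoTrackSource.CreateAsync();
        var connections = new List<PeerConnection>();
        for (int i = 0; i < remotePeerCount; ++i)
        {
            var pc = new PeerConnection();
            await pc.InitializeAsync(new PeerConnectionConfiguration());
            var track = LocalVideoTrack.CreateFromSource(source,
                new LocalVideoTrackInitConfig { trackName = $"video_to_peer_{i}" });
            var transceiver = pc.AddTransceiver(MediaKind.Video);
            transceiver.LocalVideoTrack = track;   // offer/answer and signaling handled elsewhere
            connections.Add(pc);
        }
        return connections;
    }
}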

Remove ISignaler as a required input for a new PeerConnection

Suggestion.

When creating a new WebRtc.PeerConnection, an ISignaler derived class is required. But looking at the code, the requirement is redundant. Most events are fired as stand-alone events, and if these events are correctly subscribed to in a generic signaler class, the class provided to PeerConnection isn't needed.

In fact, changing lines 429-433:

429        public PeerConnection(ISignaler signaler)
430        {
431            Signaler = signaler;
432            Signaler.OnMessage += Signaler_OnMessage;
433        }

to the following code

public PeerConnection(ISignaler signaler)
{
    if (signaler != null)
    {
        Signaler = signaler;
        Signaler.OnMessage += Signaler_OnMessage;
    }
}

would allow for a null signaler to be inserted and still work, since the rest of the ISignaler method calls in PeerConnection check for a null ISignaler, e.g.

1215        Signaler?.SendMessageAsync(msg);

Suitable for remote rendering?

Is this project suitable for remote rendering, or is it meant primarily to communicate between MR devices?

I'm looking for a solution similar to 3D Streaming Toolkit for Hololens 2 use. This project seems like it has better support though.

Works in Unity editor but doesn't work in build, DLL not found.

I followed the build instructions for the whole solution and then copied the DLLs into the included Unity project. Everything opened and ran correctly in the editor; however, when I make a non-UWP Windows x64 build I get a DllNotFoundException saying that it can't find Microsoft.MixedReality.WebRTC.Native.dll. I tried this with both the Mono and IL2CPP scripting backends, and tried a UWP build too, and still got this exception. Do you have any idea why this might be happening?

TestAppUwp: Audio not working

Neither audio playback nor audio transmission works in TestAppUwp; video playback works correctly.

Tested by connecting to both another instance of TestAppUwp and the Unity example client. The Unity example client is able to both transmit and receive audio when connected to another Unity example client.

Unity Build Error:The type 'Object' is defined in an assembly that is not referenced

Unity Version 2018.3.0f2
It always gives this error when I build the Unity project:
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\VideoSource.cs(34,32): error CS0012: The type 'Object' is defined in an assembly that is not referenced. You must add a reference to assembly 'netstandard, Version=2.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'.
I think I'm missing something, but I don't know what.
It works fine in the Unity Editor.

Support iOS and Android SDK

Summary:

Support iOS and Android devices

Value proposition:

Creates standard communications model for 1st party and 3rd party devices (iOS, Android, Magic Leap) and introduces network effects that can be leveraged with existing investments in the cloud/edge infrastructure across providers (Azure, AWS, Verizon, etc).

Background:

Existing native SDKs:
iOS WebRTC SDK: https://webrtc.org/native-code/ios/
Android WebRTC SDK: https://webrtc.org/native-code/android/
Cross Platform support: TBD

Builds fail to ARM64

When building to ARM64, VS throws the error

LNK1104 cannot open file 'webrtc.lib' Microsoft.MixedReality.WebRTC.Native.UWP D:\MixedReality-WebRTC\libs\Microsoft.MixedReality.WebRTC.Native\src\uwp\

The other build targets work fine.

NodeDssSignaler Replacement

In the documentation, it is noted that NodeDssSignaler is simple and useful for getting started with WebRTC but should not be used in production. Would it be within the scope of this project to create something that could be used for production or do you have recommendations for a replacement?

Problem building - 'Could not copy the file "C:\mr-webrtc\bin\UWP\ARM\Debug\Microsoft.MixedReality.WebRTC.Native.dll" because it was not found'

I followed the instructions over at https://microsoft.github.io/MixedReality-WebRTC/manual/building.html
I encountered a problem using Visual Studio 2019, then I saw an issue (#14) which seemed to suggest only Visual Studio 2017 was supported, so I tried with Visual Studio 2017 after resetting my git status.

Trying to build with VS 2017 v15.9.13 I get these three errors

Could not copy the file "C:\mr-webrtc\bin\UWP\ARM\Debug\Microsoft.MixedReality.WebRTC.Native.dll" because it was not found. Microsoft.MixedReality.WebRTC.TestAppUWP

Could not copy the file "C:\mr-webrtc\bin\UWP\ARM\Debug\Microsoft.MixedReality.WebRTC.Native.pdb" because it was not found. Microsoft.MixedReality.WebRTC.TestAppUWP

Cannot open include file: 'cryptopp/allocatorwithnul.h': No such file or directory Microsoft.MixedReality.WebRTC.Native.UWP c:\mr-webrtc\external\webrtc-uwp-sdk\webrtc\xplatform\zslib-eventing\zslib\eventing\types.h 40

Microsoft.MixedReality.WebRTC.Native.dll API functions are not imported correctly in Microsoft.MixedReality.WebRTC

C++ API functions imported in PeerConnection.cs can return unexpected values due to a boolean type size mismatch between C++ and C#.

Tested in VS 2017, Debug x86

Steps to reproduce:

  1. Add the method ForceCodecsImportTest given below to PeerConnection.cs
  2. Run _peerConnection.ForceCodecsImportTest() in TestAppUwp
  3. Attach a debugger to Microsoft.MixedReality.WebRTC.Native.dll and observe how the SdpForceCodecs() C++ method returns false, but the variable x in ForceCodecsImportTest() is assigned true.

Proposed Solution:
Add [return: MarshalAs(UnmanagedType.I1)] to the DllImport, e.g.:

[DllImport(dllPath, CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi, EntryPoint = "mrsSdpForceCodecs")]
[return: MarshalAs(UnmanagedType.I1)]
public static unsafe extern bool SdpForceCodecs(string message, string audioCodecName, string videoCodecName, StringBuilder messageOut, ref ulong messageOutLength);

Test function:

public void ForceCodecsImportTest()
{
    String kSdpFullString =
        "v=0\r\n" +
        "o=- 18446744069414584320 18446462598732840960 IN IP4 127.0.0.1\r\n" +
        "s=-\r\n" +
        "t=0 0\r\n" +
        "a=msid-semantic: WMS local_stream_1\r\n" +
        "m=audio 2345 RTP/SAVPF 111 103 104\r\n" +
        "c=IN IP4 74.125.127.126\r\n" +
        "a=rtcp:2347 IN IP4 74.125.127.126\r\n" +
        "a=candidate:a0+B/1 1 udp 2130706432 192.168.1.5 1234 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/1 2 udp 2130706432 192.168.1.5 1235 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/2 1 udp 2130706432 ::1 1238 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/2 2 udp 2130706432 ::1 1239 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/3 1 udp 2130706432 74.125.127.126 2345 typ srflx " +
        "raddr 192.168.1.5 rport 2346 " +
        "generation 2\r\n" +
        "a=candidate:a0+B/3 2 udp 2130706432 74.125.127.126 2347 typ srflx " +
        "raddr 192.168.1.5 rport 2348 " +
        "generation 2\r\n" +
        "a=ice-ufrag:ufrag_voice\r\na=ice-pwd:pwd_voice\r\n" +
        "a=mid:audio_content_name\r\n" +
        "a=sendrecv\r\n" +
        "a=rtcp-mux\r\n" +
        "a=rtcp-rsize\r\n" +
        "a=crypto:1 AES_CM_128_HMAC_SHA1_32 " +
        "inline:NzB4d1BINUAvLEw6UzF3WSJ+PSdFcGdUJShpX1Zj|2^20|1:32 " +
        "dummy_session_params\r\n" +
        "a=rtpmap:111 opus/48000/2\r\n" +
        "a=rtpmap:103 ISAC/16000\r\n" +
        "a=rtpmap:104 ISAC/32000\r\n" +
        "a=ssrc:1 cname:stream_1_cname\r\n" +
        "a=ssrc:1 msid:local_stream_1 audio_track_id_1\r\n" +
        "a=ssrc:1 mslabel:local_stream_1\r\n" +
        "a=ssrc:1 label:audio_track_id_1\r\n" +
        "m=video 3457 RTP/SAVPF 120\r\n" +
        "c=IN IP4 74.125.224.39\r\n" +
        "a=rtcp:3456 IN IP4 74.125.224.39\r\n" +
        "a=candidate:a0+B/1 2 udp 2130706432 192.168.1.5 1236 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/1 1 udp 2130706432 192.168.1.5 1237 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/2 2 udp 2130706432 ::1 1240 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/2 1 udp 2130706432 ::1 1241 typ host " +
        "generation 2\r\n" +
        "a=candidate:a0+B/4 2 udp 2130706432 74.125.224.39 3456 typ relay " +
        "generation 2\r\n" +
        "a=candidate:a0+B/4 1 udp 2130706432 74.125.224.39 3457 typ relay " +
        "generation 2\r\n" +
        "a=ice-ufrag:ufrag_video\r\na=ice-pwd:pwd_video\r\n" +
        "a=mid:video_content_name\r\n" +
        "a=sendrecv\r\n" +
        "a=crypto:1 AES_CM_128_HMAC_SHA1_80 " +
        "inline:d0RmdmcmVCspeEc3QGZiNWpVLFJhQX1cfHAwJSoj|2^20|1:32\r\n" +
        "a=rtpmap:120 VP8/90000\r\n" +
        "a=ssrc-group:FEC 2 3\r\n" +
        "a=ssrc:2 cname:stream_1_cname\r\n" +
        "a=ssrc:2 msid:local_stream_1 video_track_id_1\r\n" +
        "a=ssrc:2 mslabel:local_stream_1\r\n" +
        "a=ssrc:2 label:video_track_id_1\r\n" +
        "a=ssrc:3 cname:stream_1_cname\r\n" +
        "a=ssrc:3 msid:local_stream_1 video_track_id_1\r\n" +
        "a=ssrc:3 mslabel:local_stream_1\r\n" +
        "a=ssrc:3 label:video_track_id_1\r\n";
    ulong len = 32; // too short on purpose
    StringBuilder buffer = new StringBuilder((int)len);

    bool x = NativeMethods.SdpForceCodecs(kSdpFullString, "opus", "a very long codec name causing method return false", buffer, ref len);
    if( x == true)
    {
        throw new Exception("Something went horribly wrong");
    }
}

VideoChatDemo Build Errors in Unity 2019.1.0f2

I have been trying to build the VideoChatDemo in Unity to run on a Hololens but have been getting the following errors:

Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\RemoteVideoSource.cs(187,48): error CS0246: The type or namespace name 'I420AVideoFrame' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\LocalVideoSource.cs(125,47): error CS0246: The type or namespace name 'I420AVideoFrame' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\VideoSource.cs(34,16): error CS0246: The type or namespace name 'VideoFrameQueue<>' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\VideoSource.cs(34,32): error CS0246: The type or namespace name 'I420VideoFrameStorage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\MediaPlayer.cs(64,16): error CS0246: The type or namespace name 'VideoFrameQueue<>' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Media\MediaPlayer.cs(64,32): error CS0246: The type or namespace name 'I420VideoFrameStorage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\Signaler.cs(14,53): error CS0246: The type or namespace name 'ISignaler' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\PeerConnection.cs(106,23): error CS0234: The type or namespace name 'PeerConnection' does not exist in the namespace 'Microsoft.MixedReality.WebRTC' (are you missing an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\PeerConnection.cs(219,40): error CS0234: The type or namespace name 'PeerConnection' does not exist in the namespace 'Microsoft.MixedReality.WebRTC' (are you missing an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\Signaler.cs(28,29): error CS0246: The type or namespace name 'SignalerMessage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\Signaler.cs(39,47): error CS0246: The type or namespace name 'SignalerMessage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\NodeDssSignaler.cs(71,47): error CS0246: The type or namespace name 'SignalerMessage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\Signaler.cs(48,26): error CS0234: The type or namespace name 'PeerConnection' does not exist in the namespace 'Microsoft.MixedReality.WebRTC' (are you missing an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\NodeDssSignaler.cs(160,42): error CS0246: The type or namespace name 'SignalerMessage' could not be found (are you missing a using directive or an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\PeerConnection.cs(208,24): error CS0234: The type or namespace name 'PeerConnection' does not exist in the namespace 'Microsoft.MixedReality.WebRTC' (are you missing an assembly reference?)
Assets\Microsoft.MixedReality.WebRTC.Unity\Scripts\Signaling\NodeDssSignaler.cs(179,49): error CS0246: The type or namespace name 'SignalerMessage' could not be found (are you missing a using directive or an assembly reference?)

Any ideas what I could be missing? Demo runs fine in editor.

Can it work with iOS & Android?

master branch with webrtc-uwp-sdk m71, windows 10
Two questions:

  1. iOS can receive video from the Unity app running on a PC, but the Unity app can't receive video from iOS. From Wireshark running on the PC I can see that a lot of video data is being received, but the remote_video_observer_ callback is never invoked. I want to debug this code path:

    else if (trackKindStr == webrtc::MediaStreamTrackInterface::kVideoKind) {
      trackKind = TrackKind::kVideoTrack;
      if (auto* sink = remote_video_observer_.get()) {
        rtc::VideoSinkWants sink_settings{};
        sink_settings.rotation_applied = true;  // no exposed API for caller to handle rotation
        auto video_track = static_cast<webrtc::VideoTrackInterface*>(track.get());
        video_track->AddOrUpdateSink(sink, sink_settings);
      }
    }

  2. So I want to build webrtc.lib from webrtc-uwp-sdk (m71), but unfortunately I can't compile it (many compile errors). I can build webrtc-uwp-sdk (m75), but MixedReality-WebRTC (feature/m75) doesn't work yet, and I don't know how to fix the issue.

Has anybody tested webrtc-uwp with iOS?

Unity crashes with speaker chosen as default playback device on Windows 10

For some reason, when my speaker is chosen as the default playback device, Unity crashes when trying to initialize the PeerConnection. No error message is given, and attaching VS to Unity does not hit any exception breakpoints. Unity simply closes.

image

This does not happen if any other playback device is chosen, as seen below

image

I'm still investigating the issue, but is there any reason why this should happen?

Unity 2019.1.14
Latest master branch
Windows 10 Build 18362

VideoChatDemo Example not working

Platform: Windows 10
Unity: 2018.3.14f1
Commit: de1db21

When I run the VideoChatDemo scene in Unity, LocalVideoSource correctly shows the webcam feed, and in the console I see ICE signals being received, but no video shows up on RemoteVideoSource. I added a Connected handler, but it is never called. Can you suggest a next step in debugging this failure?

VideoChatDemo data transfer failed

I've been trying to get the VideoChatDemo scene deployed to 2 clients and was able to get them both connected, but once I tried to create the offer I didn't see any data transferred. The server printed out a lot of info that I'm not sure what to make of:

dss GET /data/123 404 - - 0.203 ms +409ms
dss GET /data/321 404 - - 0.112 ms +122ms
dss:body {"MessageType":1,"Data":"v=0\r\no=- 8056751785094099368 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS local_av_stream\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 102 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:3meE\r\na=ice-pwd:P2HnrFWRx5F/alVQdmKNIP36\r\na=ice-options:trickle\r\na=fingerprint:sha-256 7F:EB:C6:03:42:88:18:C4:97:34:00:F3:62:27:54:5D:C6:92:15:0D:F9:77:6C:FD:9D:3F:B8:C2:CF:C6:4E:1A\r\na=setup:actpass\r\na=mid:audio\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=sendrecv\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC/16000\r\na=rtpmap:104 ISAC/32000\r\na=rtpmap:9 G722/8000\r\na=rtpmap:102 ILBC/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:106 CN/32000\r\na=rtpmap:105 CN/16000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:112 telephone-event/32000\r\na=rtpmap:113 telephone-event/16000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:2387358855 cname:Uqgu1Ku8ixh5j3qk\r\na=ssrc:2387358855 msid:local_av_stream local_audio\r\na=ssrc:2387358855 mslabel:local_av_stream\r\na=ssrc:2387358855 label:local_audio\r\nm=video 9 UDP/TLS/RTP/SAVPF 96 97 98 99 100 101 127 124 125\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:3meE\r\na=ice-pwd:P2HnrFWRx5F/alVQdmKNIP36\r\na=ice-options:trickle\r\na=fingerprint:sha-256 7F:EB:C6:03:42:88:18:C4:97:34:00:F3:62:27:54:5D:C6:92:15:0D:F9:77:6C:FD:9D:3F:B8:C2:CF:C6:4E:1A\r\na=setup:actpass\r\na=mid:video\r\na=extmap:2 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type\r\na=extmap:8 http://www.webrtc.org/experiments/rtp-hdrext/video-timing\r\na=extmap:10 http://tools.ietf.org/html/draft-ietf-avtext-framemarking-07\r\na=sendrecv\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtpmap:97 rtx/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:98 VP9/90000\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=fmtp:98 x-google-profile-id=0\r\na=rtpmap:99 rtx/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:100 multiplex/90000\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=fmtp:100 acn=VP9;x-google-profile-id=0\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:127 red/90000\r\na=rtpmap:124 rtx/90000\r\na=fmtp:124 apt=127\r\na=rtpmap:125 ulpfec/90000\r\na=ssrc-group:FID 3584077471 988498931\r\na=ssrc:3584077471 cname:Uqgu1Ku8ixh5j3qk\r\na=ssrc:3584077471 msid:local_av_stream local_video\r\na=ssrc:3584077471 mslabel:local_av_stream\r\na=ssrc:3584077471 label:local_video\r\na=ssrc:988498931 cname:Uqgu1Ku8ixh5j3qk\r\na=ssrc:988498931 msid:local_av_stream local_video\r\na=ssrc:988498931 mslabel:local_av_stream\r\na=ssrc:988498931 label:local_video\r\n","IceDataSeparator":"|"} +0ms
dss POST /data/123 200 - - 37.884 ms +272ms
dss:body {"MessageType":3,"Data":"candidate:995267180 1 udp 2122063615 10.88.68.134 57769 typ host generation 0 ufrag 3meE network-id 4 network-cost 10|1|video","IceDataSeparator":"|"} +41ms
dss POST /data/123 200 - - 5.998 ms +13ms
dss:body {"MessageType":3,"Data":"candidate:1154973090 1 udp 2122197247 2001:4898:e0:2077:c409:46b6:64ab:b4d4 57763 typ host generation 0 ufrag 3meE network-id 6 network-cost 10|0|audio","IceDataSeparator":"|"} +11ms
dss POST /data/123 200 - - 5.801 ms +12ms
dss:body {"MessageType":3,"Data":"candidate:1203989023 1 udp 2122262783 2001:4898:e0:2077:810b:d91f:8e56:3b05 57762 typ host generation 0 ufrag 3meE network-id 5 network-cost 10|0|audio","IceDataSeparator":"|"} +10ms
dss POST /data/123 200 - - 3.194 ms +8ms
dss:body {"MessageType":3,"Data":"candidate:77142221 1 udp 2122129151 192.168.137.1 57764 typ host generation 0 ufrag 3meE network-id 1 network-cost 10|0|audio","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 5.917 ms +10ms
dss:body {"MessageType":3,"Data":"candidate:1154973090 1 udp 2122197247 2001:4898:e0:2077:c409:46b6:64ab:b4d4 57767 typ host generation 0 ufrag 3meE network-id 6 network-cost 10|1|video","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 3.164 ms +6ms
dss:body {"MessageType":3,"Data":"candidate:995267180 1 udp 2122063615 10.88.68.134 57765 typ host generation 0 ufrag 3meE network-id 4 network-cost 10|0|audio","IceDataSeparator":"|"} +7ms
dss POST /data/123 200 - - 6.374 ms +9ms
dss:body {"MessageType":3,"Data":"candidate:77142221 1 udp 2122129151 192.168.137.1 57768 typ host generation 0 ufrag 3meE network-id 1 network-cost 10|1|video","IceDataSeparator":"|"} +9ms
dss POST /data/123 200 - - 2.984 ms +7ms
dss:body {"MessageType":3,"Data":"candidate:1203989023 1 udp 2122262783 2001:4898:e0:2077:810b:d91f:8e56:3b05 57766 typ host generation 0 ufrag 3meE network-id 5 network-cost 10|1|video","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 7.342 ms +11ms
dss:body {"MessageType":3,"Data":"candidate:155227887 1 tcp 1518283007 2001:4898:e0:2077:810b:d91f:8e56:3b05 53518 typ host tcptype passive generation 0 ufrag 3meE network-id 5 network-cost 10|0|audio","IceDataSeparator":"|"} +10ms
dss POST /data/123 200 - - 2.401 ms +6ms
dss:body {"MessageType":3,"Data":"candidate:173310290 1 tcp 1518217471 2001:4898:e0:2077:c409:46b6:64ab:b4d4 53519 typ host tcptype passive generation 0 ufrag 3meE network-id 6 network-cost 10|0|audio","IceDataSeparator":"|"} +11ms
dss POST /data/123 200 - - 5.101 ms +13ms
dss:body {"MessageType":3,"Data":"candidate:1243276349 1 tcp 1518149375 192.168.137.1 53520 typ host tcptype passive generation 0 ufrag 3meE network-id 1 network-cost 10|0|audio","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 3.884 ms +7ms
dss:body {"MessageType":3,"Data":"candidate:155227887 1 tcp 1518283007 2001:4898:e0:2077:810b:d91f:8e56:3b05 53522 typ host tcptype passive generation 0 ufrag 3meE network-id 5 network-cost 10|1|video","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 5.332 ms +9ms
dss:body {"MessageType":3,"Data":"candidate:1976659612 1 tcp 1518083839 10.88.68.134 53521 typ host tcptype passive generation 0 ufrag 3meE network-id 4 network-cost 10|0|audio","IceDataSeparator":"|"} +7ms
dss POST /data/123 200 - - 2.332 ms +5ms
dss:body {"MessageType":3,"Data":"candidate:1243276349 1 tcp 1518149375 192.168.137.1 53524 typ host tcptype passive generation 0 ufrag 3meE network-id 1 network-cost 10|1|video","IceDataSeparator":"|"} +6ms
dss POST /data/123 200 - - 6.486 ms +10ms
dss:body {"MessageType":3,"Data":"candidate:173310290 1 tcp 1518217471 2001:4898:e0:2077:c409:46b6:64ab:b4d4 53523 typ host tcptype passive generation 0 ufrag 3meE network-id 6 network-cost 10|1|video","IceDataSeparator":"|"} +11ms
dss POST /data/123 200 - - 3.461 ms +7ms
dss:body {"MessageType":3,"Data":"candidate:1976659612 1 tcp 1518083839 10.88.68.134 53529 typ host tcptype passive generation 0 ufrag 3meE network-id 4 network-cost 10|1|video","IceDataSeparator":"|"} +8ms
dss POST /data/123 200 - - 7.485 ms +12ms
dss GET /data/123 200 - - 0.433 ms +5ms
dss GET /data/321 404 - - 0.169 ms +112ms

No Video Stream On Hololens

It worked well between two Unity Editor instances, but when I run it on the HoloLens I cannot receive the remote video & audio from the HoloLens in the Unity Editor, and the HoloLens has no remote video either. It took me a whole day but I have made no progress.
Is there something missing on the HoloLens? And is an IceServer necessary?

How can I choose a specific webcam?

Hi,
As far as I understand, the first available webcam is chosen for the local video source. Is it possible to choose a specific one, either by name or id, on the Unity side? Or to get all available webcam devices, similar to WebCamTexture.devices in Unity?
Thanks.
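
Not an official answer, but a hedged sketch of device enumeration with the C# library is below. The GetCaptureDevicesAsync helper and the videoDevice field of LocalVideoDeviceInitConfig are 2.0-era API names and are assumptions relative to the version this question targets; the Unity WebcamSource of the time did not expose an equivalent setting.

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

static class WebcamSelection
{
    // Enumerate capture devices and open a specific one by (partial) name.
    // API names assume the 2.0-era C# library and may differ in other releases.
    public static async Task<DeviceVideoTrackSource> OpenByNameAsync(string nameFragment)
    {
        var devices = await DeviceVideoTrackSource.GetCaptureDevicesAsync();
        var device = devices.FirstOrDefault(d => d.name.Contains(nameFragment));
        if (string.IsNullOrEmpty(device.id))
        {
            throw new InvalidOperationException($"No webcam matching '{nameFragment}' was found.");
        }
        return await DeviceVideoTrackSource.CreateAsync(new LocalVideoDeviceInitConfig
        {
            videoDevice = device   // select this device instead of the system default
        });
    }
}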

Upgrade to Visual Studio 2019

The WebRTC UWP SDK project currently hard-codes the version of Visual Studio to 2017. For consistency with the MixedReality-WebRTC libraries, it would be best to also compile the core implementation with VS 2019.

This issue is tracked as #175 on the WebRTC UWP SDK project.

Distorted video in TestAppUWP

The remote video stream from the HoloLens looks broken in TestAppUWP. The same stream works fine in the Unity Editor. One can recognize the image, but it is replicated in multiple overlapping copies in the preview window. The attached image shows a Logitech mouse.

2019-08-14 15_46_52-Microsoft MixedReality WebRTC (Running) - Microsoft Visual Studio

Missing Relay Ice Candidate

Hi,

I'm currently moving from an implementation built around the m71 release of webrtc-uwp-sdk to this one, and have stumbled upon a problem I'm not able to resolve. The solution works on webrtc-uwp-sdk.

I have a TURN/STUN Kurento server running on Azure, and when I connect to it (or to any peers through it) with the previous solution, my client always sends 5 ICE candidates.
4 local and 1 relay. The relay one looks like this:

candidate:[number] 1 udp [priority] [Server IP] [WebRTC port] typ relay raddr [Local IP] rport [WebRTC port] generation 0 ufrag 9dPY network-id 1 network-cost 50

However, with this plugin, the relay candidate is never generated (or at least never given to me). I only receive the 4 local ones, and therefore can't set up a connection through the server.
I have added the ice servers to the peer as

iceServers = new List<string>
{
      $"stun: {url}{port}",
      $"turn: {url}{port}",
      $"turn: {url}{port}?transport=tcp"
};

and have also added Username and Credential for the TURN server.

image

Within the same environment, same computer/network/external peer, I can connect with the old solution both before and after trying this one.

Am I missing anything? Shouldn't this then generate the relay candidate? Or is the plugin somehow configured for local communication only?

While on the topic, it would be amazing if we had access to modifying the transport and bundle policies of ICE gathering.
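
For reference, here is a hedged sketch of ICE server configuration with the C# library. Note that standard WebRTC server URIs have no space after the scheme ("stun:host:port"), unlike the strings quoted above; whether that is related to the missing relay candidate is not established here. The IceServer shape with Urls / TurnUserName / TurnPassword is the 2.0-era API and is an assumption relative to the version used in this issue (earlier versions took raw strings).

using System.Collections.Generic;
using Microsoft.MixedReality.WebRTC;

static class IceConfig
{
    // Sketch only - 2.0-era configuration shape (assumption).
    public static PeerConnectionConfiguration Build(string host, int port, string user, string password)
    {
        return new PeerConnectionConfiguration
        {
            IceServers = new List<IceServer>
            {
                new IceServer
                {
                    // No space after "stun:" / "turn:" in standard WebRTC server URIs.
                    Urls =
                    {
                        $"stun:{host}:{port}",
                        $"turn:{host}:{port}",
                        $"turn:{host}:{port}?transport=tcp"
                    },
                    TurnUserName = user,
                    TurnPassword = password
                }
            }
        };
    }
}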

Only Video Stream from Hololens

Hello, I use the VideoChatDemo once on the PC as a UWP app and once on the HoloLens. So far the transmission works fine.
But I would like to see only the video stream from the HoloLens in the UWP app on the PC. I disabled the LocalVideoPlayer in the UWP app on the PC. Unfortunately, the app still needs a webcam or microphone on the PC; if I disconnect it and start the app, I get the following error message:

Audio/Video access failure: One or more errors occurred.

Is it possible to run the app without a webcam or microphone? Only the video stream from the HoloLens should be displayed.

Any suggestions?

Thanks for your help
