Comments (16)
I have been working on the Steam Deck haptics. There are limits to what the Steam Deck haptics can do compared to the DualSense (especially since we have to translate the PCM signal on the fly, which leads to a trade-off between resolution / minimum frequency and latency, though we can cheat the future a little bit), but the majority of the haptics seem to be working reasonably well, though they are a bit noisy at the moment. One thing I ran into: the GPU Jungle section of Astro's Playroom (where you play as the monkey) seems to break when using the enable-DualSense option. Specifically, when you are climbing, it doesn't let you grab one of the higher ledges. It seems to be waiting for a trigger input to advance to the next stage of the adaptive triggers, or something similar. Have you seen this, @jbaiter? Also, in general, have you made any updates to the haptics since the link you posted before? I have been using that as a starting point but will want to pull in any changes before the next release.
from chiaki4deck.
Thanks for adding the issue. I will definitely look into this.
This was fixed in a recent commit on my branch; I had flipped the branches of a ternary by accident 🙈
https://git.sr.ht/~jbaiter/chiaki/commit/5c58a6bf525f01f835629bd8a9226d8eb1ac24a8
The other update was support for hot plugging DS5 controllers, but that's not relevant for you, I think. The code is currently in review.
I think there's a way. Some googling suggests you can talk directly to the haptics without going through the Steamworks library, e.g. https://github.com/Roboron3042/SteamControllerSinger.
In other news, I got some starting help from @thestr4ng3r and @grill2010 and am now knee-deep in implementing haptics and trigger effects for DualSense controllers attached via USB. It's probably going to take a while, but I think I'll get there :-)))
-- Update: Trigger effects are working; they feel a bit weaker than in the official client, but the effects are all there.
> As far as haptics for the Steam Deck controller, how would it switch between simulated haptics for rumble (what it's doing now) and "real" haptics, or do fake haptics get sent for PS4 games too, so we can still use those for PS4 games? As far as the PCM to PWM, here is an example of someone trying this in a different project, but I'm not sure how much help it will be.
On the API level, rumble events are sent through SDL's gamepad routines to the HID device, while haptics are sent as audio samples to the DualSense audio device. When sending a rumble signal, SDL sets some bits to disable audio haptics, which are cleared again once the rumble effect is disabled. When emulating rumble and haptics through the Steam Deck's touchpads, I think we can follow the same approach: use SDL for regular rumble and direct USB access for the haptics. They will almost certainly never interfere with each other, since games that use rumble don't use haptics, and vice versa.
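The "rumble mutes haptics" rule described above can be sketched as a tiny router. The names here (FeedbackRouter, FeedbackPath) are hypothetical, just to illustrate the arbitration; the real code would forward to SDL_GameControllerRumble / the USB haptics path at the marked points:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch: regular rumble goes through SDL's gamepad API,
// haptics go to a separate audio/USB path. Since games use one or the
// other, a simple mode latch is enough to arbitrate between them.
enum class FeedbackPath { None, Rumble, AudioHaptics };

class FeedbackRouter {
public:
    // A rumble event arrived from the Remote Play session.
    FeedbackPath onRumble(uint16_t low, uint16_t high) {
        // Non-zero motor values latch rumble mode (mirroring the bits SDL
        // sets to mute audio haptics); zero values release it.
        mode_ = (low || high) ? FeedbackPath::Rumble : FeedbackPath::None;
        return FeedbackPath::Rumble; // always forward via SDL, even the "stop"
    }
    // A haptics PCM frame arrived.
    FeedbackPath onHapticsFrame() {
        if (mode_ == FeedbackPath::Rumble)
            return FeedbackPath::None; // rumble active: drop the frame
        mode_ = FeedbackPath::AudioHaptics;
        return FeedbackPath::AudioHaptics; // forward to the audio/USB path
    }
private:
    FeedbackPath mode_ = FeedbackPath::None;
};
```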
Thanks for the link, I'll take a look!
It sure will!
If you want to help, you can beta-test the feature by downloading the build here and reporting back (via e-mail; Windows is probably out of scope for this issue) whether it works for you :-)
https://ci.appveyor.com/api/buildjobs/tw8xu4p3hv02hv52/artifacts/Chiaki.zip
> As for the trigger effects, is that something that you know when you get it and can then play the appropriate haptic, or does it work some other way?
Yes, it's simply a discrete packet sent over the Remote Play protocol that tells you which effect to apply to the triggers, just like rumble. The data is sent basically as-is (with some rearranging) via HID (or rather via SDL2, which has an API for that) to the DualSense.
> Also, how does this work with the sound output haptics? Are you guaranteed to only get one at a time, or does one take precedence over another?
Those two are completely independent. Trigger effects are discrete events, while haptic effects are implemented as a semi-continuous audio stream, just like the regular video and audio streams ("semi-continuous" because silence is not encoded explicitly, i.e. when there are no haptics, no data is sent). The handling of the two is completely separate as well; there's no relationship between them at the protocol or application level. Haptics are resampled from 3 kHz to 48 kHz and sent as raw PCM to the DualSense audio device, separate from the HID stack.
> Based on the files you provided, it seems like these effects would be something we would provide a specific function for and trigger at the appropriate time. Is this what you had in mind?
If we decide to map both the trigger effects and the regular haptics to the Steam Deck's haptics, we're going to need to interleave the two sources somehow, probably with a queue and some prioritization heuristics.
It's probably wiser to leave that for later and start with mapping only the haptics first :-)
Links to relevant code sections:
- Receiving trigger effects: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/streamsession.cpp#L548-559
- Sending trigger effects to controller: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/controllermanager.cpp#L337-352
- Setting up haptics reception: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/streamsession.cpp#L150-155
- Setting up haptics audio system: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/streamsession.cpp#L420-448
- Connecting a controller to the haptics: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/streamsession.cpp#L458-490
- Receiving haptics and sending them to the controller: https://git.sr.ht/~jbaiter/chiaki/tree/gui-haptics/item/gui/src/streamsession.cpp#L499-523
@jbaiter as far as the hot plugging of the DualSense, I made two minor tweaks:
- Changed from using GUIDs to PID/VID, since missing some GUIDs resulted in the controller working sometimes and not others (it was random on plug-in): set valid IDs here and check the ID here
- Increased the timer from 250 ms to 1000 ms for DualSense hot plug, because 250 ms sometimes wasn't enough time for the haptics audio device to become available (resulting in it not being detected): here
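For reference, a PID/VID check along the lines of that first tweak might look like the sketch below. 0x054C is Sony's vendor ID and 0x0CE6 the DualSense product ID (0x0DF2 is the DualSense Edge), but the exact ID set should be taken from the linked commit rather than from here:

```cpp
#include <cassert>
#include <cstdint>

// Known Sony USB identifiers (verify against the actual devices).
constexpr uint16_t SONY_VID = 0x054C;
constexpr uint16_t DUALSENSE_PID = 0x0CE6;
constexpr uint16_t DUALSENSE_EDGE_PID = 0x0DF2;

// Match on vendor/product ID instead of GUID, so detection does not
// depend on which GUID variant the controller happens to enumerate with.
bool is_dualsense(uint16_t vid, uint16_t pid) {
    return vid == SONY_VID &&
           (pid == DUALSENSE_PID || pid == DUALSENSE_EDGE_PID);
}
```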
This would be incredible, even if only for DualSense controllers connected to the Deck. But I think it's going to require some reverse engineering, since the current Takion protocol definitions in Chiaki have nothing related to haptics.
Supporting it for attached DualSense controllers will probably require some additional work as well, since Sony's kernel driver does not support haptics, and neither does the general kernel API for controllers.
However, from what I've found so far, haptics on the DualSense seem to be implemented using audio data, i.e. the controller's haptic actuators function as a kind of speaker, and haptic effects are sent as waveforms. See for example this PoC Windows implementation:
https://gist.github.com/Ohjurot/b0c04dfbd25fb71bc0da50947d313d1b
Maybe this means that if the haptics data comes in as a PCM or compressed audio stream from the PlayStation, we could simply send it to the DualSense audio device?
So the tasks involved seem to be:
- Reverse engineer the Takion protocol to find out if and how it passes haptics data to clients
- Find out how to process the haptics data to map to the Steamworks TriggerHapticPulse / TriggerRepeatedHapticPulse / TriggerVibration APIs or native Linux APIs (implementing it via Steamworks probably means no Flatpak?)
- Find out how to pass the haptics data to connected DualSense controllers
Here's a talk from chiaki's original author, @thestr4ng3r, on how he reverse engineered the protocol: https://mdco2.mini.debconf.org/talks/9-chiaki-bringing-playstation-4-remote-play-to-linux/
-- edit:
I did some more digging around in the binary of the official Remote Play app, and it seems there have been quite a few revisions to the Takion protocol: Chiaki supports up to v12, while the official app is currently at v17. Does anyone have older versions of the Remote Play app, so we can check whether haptics are available with v12, or whether we need to support a newer version of the protocol to get access to them?
-- edit:
The earliest version I could find in the Wayback machine was 4.5. I tested it with my PS5, and haptics and adaptive triggers are working perfectly. With some reversing I found out that it's using v14 of the protocol.
Thanks for starting to look into this @jbaiter. As for Steamworks, that is something we would have to get permission from @thestr4ng3r to use, because it violates the GPL license (see https://partner.steamgames.com/doc/sdk/uploading/distributing_opensource), so if there is a way to do this without Steamworks, that would be preferred.
As far as the Takion side of things - I haven't really tested it much, but in theory pyremoteplay now supports haptics - ktnrg45/pyremoteplay@master...rumble
As to Steamworks for haptics, a worst-case workaround might be to create a separate haptic daemon that links to the library and talks with chiaki using some form of IPC. But probably there's a sysfs trigger somewhere for the rumble?
> As far as the Takion side of things - I haven't really tested it much, but in theory pyremoteplay now supports haptics - ktnrg45/pyremoteplay@master...rumble
That's only for the DS4 rumble emulation, not for the 'real' haptics; Chiaki already supports that emulation.
As for the 'real' haptics, I managed to get them working with a connected DualSense just a few moments ago. The current requirement is using the PipeWire audio driver for SDL, but that should not be a problem on the Deck, since it uses PipeWire for audio handling anyway, afaik.
> As to Steamworks for haptics, a worst-case workaround might be to create a separate haptic daemon that links to the library and talks with chiaki using some form of IPC. But probably there's a sysfs trigger somewhere for the rumble?
I think the way to go is to use libusb or hidapi to send raw USB data (see the SteamControllerSinger example app mentioned above); this way we can do it directly from the app and still remain GPL-compliant.
The only remaining task now is to find a way to map the haptics audio data (3 kHz, 16-bit stereo PCM) to the Steam Deck's touchpad haptics in a way that feels good! From what I understand, we need to convert our PCM samples into PWM parameters that translate to the Steam Deck's haptics commands (pulse high, pulse low, and repeat).
I'm afraid I know too little about signal processing to do this justice; does anybody have suggestions on how best to do this?
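For what it's worth, one naive starting point: estimate the dominant frequency of a short PCM window by counting zero crossings, and use the RMS amplitude as a duty cycle to split the period into pulse-high / pulse-low durations. This is only a sketch under those assumptions, not anything from the branch, and it would certainly need filtering and tuning to feel good:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical target triple for a touchpad haptic command:
// on-duration, off-duration (both in microseconds) and a repeat count.
struct HapticPulse {
    uint16_t high_us;
    uint16_t low_us;
    uint16_t repeat;
};

HapticPulse pcm_to_pulse(const std::vector<int16_t> &window, int sample_rate_hz) {
    if (window.empty())
        return {0, 0, 0};
    // Count sign changes: crossings / 2 ~= number of cycles in the window.
    int crossings = 0;
    double sum_sq = 0.0;
    for (size_t i = 0; i < window.size(); ++i) {
        sum_sq += double(window[i]) * window[i];
        if (i > 0 && (window[i - 1] < 0) != (window[i] < 0))
            ++crossings;
    }
    double window_s = double(window.size()) / sample_rate_hz;
    double freq = (crossings / 2.0) / window_s;                 // estimated Hz
    double amp = std::sqrt(sum_sq / window.size()) / 32768.0;   // 0..1 RMS
    if (freq <= 0.0 || amp < 0.01)
        return {0, 0, 0}; // treat as silence
    uint32_t period_us = uint32_t(1e6 / freq);
    // Use amplitude as duty cycle: a louder signal gets a longer "on" pulse.
    uint16_t high = uint16_t(period_us * amp);
    return {high, uint16_t(period_us - high), uint16_t(freq * window_s)};
}
```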
Awesome work @jbaiter!
I agree that libusb or hidapi is the way to go if not using Steam Input. Based on what's exposed outside of Steam Input, it seems to be the only option at the moment, so it's the best path forward.
Any software dependencies that need to be added (such as PipeWire), if not already included in the Flatpak, can be added to the Flatpak container during the build.
As far as haptics for the Steam Deck controller, how would it switch between simulated haptics for rumble (what it's doing now) and "real" haptics, or do fake haptics get sent for PS4 games too, so we can still use those for PS4 games? As far as the PCM to PWM, here is an example of someone trying this in a different project, but I'm not sure how much help it will be.
Jumping back in only because I saw that @jbaiter is also contributing over on the main Chiaki repo: in addition to this effort for the Deck, will haptics and adaptive trigger support for the DualSense be coming to the Win64 build of Chiaki? Because holy crap, that would rule.
I absolutely will. 🙂
As per the discussion in #7, I have some questions for when I start working on playing the files / inputs you get from the PS5 using the Steam Deck haptics, using the play-haptic functionality from my hidapi code.
Once I start that effort, a sample haptic file would be nice for testing. As for the trigger effects, is that something that you know when you get it and can then play the appropriate haptic, or does it work some other way? Also, how does this work with the sound output haptics? Are you guaranteed to only get one at a time, or does one take precedence over another? Based on the files you provided, it seems like these effects would be something we would provide a specific function for and trigger at the appropriate time. Is this what you had in mind?