
curl-impersonate


A special build of curl that can impersonate the four major browsers: Chrome, Edge, Safari & Firefox. curl-impersonate is able to perform TLS and HTTP handshakes that are identical to those of a real browser.

curl-impersonate can be used either as a command line tool, similar to the regular curl, or as a library that can be integrated instead of the regular libcurl. See Usage below.

Why?

When you use an HTTP client with a TLS website, it first performs a TLS handshake. The first message of that handshake is called Client Hello. The Client Hello message that most HTTP clients and libraries produce differs drastically from that of a real browser.

If the server uses HTTP/2, then in addition to the TLS handshake there is also an HTTP/2 handshake where various settings are exchanged. The settings that most HTTP clients and libraries use differ as well from those of any real browsers.

For these reasons, some web services use the TLS and HTTP handshakes to fingerprint which client is accessing them, and then present different content to different clients. These methods are known as TLS fingerprinting and HTTP/2 fingerprinting, respectively. Their widespread use has led to the web becoming less open, less private and much more restrictive towards specific web clients.

With the modified curl in this repository, the TLS and HTTP handshakes look exactly like those of a real browser.

How?

To make this work, curl was patched significantly to resemble a browser. Specifically, the following modifications were needed:

  • Compiling curl with NSS, the TLS library that Firefox uses, instead of OpenSSL. For the Chrome version, compiling with BoringSSL, Google's TLS library.
  • Modifying the way curl configures various TLS extensions and SSL options.
  • Adding support for new TLS extensions.
  • Changing the settings that curl uses for its HTTP/2 connections.
  • Running curl with some non-default flags, for example --ciphers, --curves and some -H headers.

The resulting curl looks, from a network perspective, identical to a real browser.

Read the full technical description in the blog posts: part a, part b.

Supported browsers

The following browsers can be impersonated.

| Browser | Version | Build | OS | Target name | Wrapper script |
|---------|---------|-------|-----|-------------|----------------|
| Chrome | 99 | 99.0.4844.51 | Windows 10 | chrome99 | curl_chrome99 |
| Chrome | 100 | 100.0.4896.75 | Windows 10 | chrome100 | curl_chrome100 |
| Chrome | 101 | 101.0.4951.67 | Windows 10 | chrome101 | curl_chrome101 |
| Chrome | 104 | 104.0.5112.81 | Windows 10 | chrome104 | curl_chrome104 |
| Chrome | 107 | 107.0.5304.107 | Windows 10 | chrome107 | curl_chrome107 |
| Chrome | 110 | 110.0.5481.177 | Windows 10 | chrome110 | curl_chrome110 |
| Chrome | 116 | 116.0.5845.180 | Windows 10 | chrome116 | curl_chrome116 |
| Chrome | 99 | 99.0.4844.73 | Android 12 | chrome99_android | curl_chrome99_android |
| Edge | 99 | 99.0.1150.30 | Windows 10 | edge99 | curl_edge99 |
| Edge | 101 | 101.0.1210.47 | Windows 10 | edge101 | curl_edge101 |
| Firefox | 91 ESR | 91.6.0esr | Windows 10 | ff91esr | curl_ff91esr |
| Firefox | 95 | 95.0.2 | Windows 10 | ff95 | curl_ff95 |
| Firefox | 98 | 98.0 | Windows 10 | ff98 | curl_ff98 |
| Firefox | 100 | 100.0 | Windows 10 | ff100 | curl_ff100 |
| Firefox | 102 | 102.0 | Windows 10 | ff102 | curl_ff102 |
| Firefox | 109 | 109.0 | Windows 10 | ff109 | curl_ff109 |
| Firefox | 117 | 117.0.1 | Windows 10 | ff117 | curl_ff117 |
| Safari | 15.3 | 16612.4.9.1.8 | macOS Big Sur | safari15_3 | curl_safari15_3 |
| Safari | 15.5 | 17613.2.7.1.8 | macOS Monterey | safari15_5 | curl_safari15_5 |

This list is also available in the browsers.json file.

Basic usage

For each supported browser there is a wrapper script that launches curl-impersonate with all the needed headers and flags. For example:

curl_chrome116 https://www.wikipedia.org

You can add command line flags and they will be passed on to curl. However, some flags change curl's TLS signature which may cause it to be detected.

Please note that the wrapper scripts use a default set of HTTP headers. If you want to change these headers, modify the wrapper scripts to fit your own purpose.

See Advanced usage for more options, including using libcurl-impersonate as a library.

Documentation

More documentation is available in the docs/ directory.

Installation

There are two versions of curl-impersonate for technical reasons. The chrome version is used to impersonate Chrome, Edge and Safari. The firefox version is used to impersonate Firefox.

Pre-compiled binaries

Pre-compiled binaries for Linux and macOS (Intel) are available at the GitHub releases page. Before you use them you need to install nss (Firefox's TLS library) and CA certificates:

  • Ubuntu - sudo apt install libnss3 nss-plugin-pem ca-certificates
  • Red Hat/Fedora/CentOS - yum install nss nss-pem ca-certificates
  • Archlinux - pacman -S nss ca-certificates
  • macOS - brew install nss ca-certificates

Also ensure you have zlib installed on your system. zlib is almost always present, but on some minimal systems it might be missing.

The pre-compiled binaries contain libcurl-impersonate and a statically compiled curl-impersonate for ease of use.

The pre-compiled Linux binaries are built for Ubuntu systems. On other distributions, if you see certificate verification errors, you may have to tell curl where to find the CA certificates. For example:

curl_chrome116 https://www.wikipedia.org --cacert /etc/ssl/certs/ca-bundle.crt

Also make sure to read Notes on Dependencies.

Building from source

See INSTALL.md.

Docker images

Docker images based on Alpine Linux and Debian, with curl-impersonate compiled and ready to use, are available on Docker Hub. The images contain the binary and all the wrapper scripts. Use them as follows:

# Firefox version, Alpine Linux
docker pull lwthiker/curl-impersonate:0.6-ff
docker run --rm lwthiker/curl-impersonate:0.6-ff curl_ff109 https://www.wikipedia.org

# Chrome version, Alpine Linux
docker pull lwthiker/curl-impersonate:0.6-chrome
docker run --rm lwthiker/curl-impersonate:0.6-chrome curl_chrome110 https://www.wikipedia.org

Distro packages

AUR packages are available for Arch Linux users.

Unofficial Homebrew recipes for Mac (Chrome only) are available here:

brew tap shakacode/brew
brew install curl-impersonate

Advanced usage

libcurl-impersonate

libcurl-impersonate.so is libcurl compiled with the same changes as the command line curl-impersonate. It has an additional API function:

CURLcode curl_easy_impersonate(struct Curl_easy *data, const char *target,
                               int default_headers);

You can call it with a target name, e.g. chrome116, and it will internally set all the options and headers that are otherwise set by the wrapper scripts. If default_headers is set to 0, the built-in list of HTTP headers will not be set, and the user is expected to provide them instead using the regular CURLOPT_HTTPHEADER libcurl option.

Calling the above function sets the following libcurl options:

  • CURLOPT_HTTP_VERSION
  • CURLOPT_SSLVERSION, CURLOPT_SSL_CIPHER_LIST, CURLOPT_SSL_EC_CURVES, CURLOPT_SSL_ENABLE_NPN, CURLOPT_SSL_ENABLE_ALPN
  • CURLOPT_HTTPBASEHEADER, if default_headers is non-zero (a non-standard HTTP option created for this project).
  • CURLOPT_HTTP2_PSEUDO_HEADERS_ORDER, CURLOPT_HTTP2_NO_SERVER_PUSH (non-standard HTTP/2 options created for this project).
  • CURLOPT_SSL_ENABLE_ALPS, CURLOPT_SSL_SIG_HASH_ALGS, CURLOPT_SSL_CERT_COMPRESSION, CURLOPT_SSL_ENABLE_TICKET, CURLOPT_SSL_PERMUTE_EXTENSIONS (non-standard TLS options created for this project).

Note that if you call curl_easy_setopt() later with one of the above options, it will override the value set by curl_easy_impersonate().
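
As a concrete sketch, calling the function from C might look like the following minimal program. Note this is illustrative: curl_easy_impersonate() exists only in libcurl-impersonate (it will not build against stock libcurl), the extern declaration is a fallback in case the installed curl.h does not declare it, and error handling is abbreviated.

```c
#include <stdio.h>
#include <curl/curl.h>

/* Provided by libcurl-impersonate; not part of stock libcurl.
 * CURL is a typedef for struct Curl_easy in curl.h. */
extern CURLcode curl_easy_impersonate(CURL *data, const char *target,
                                      int default_headers);

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* Apply all the chrome116 TLS/HTTP options plus the built-in
     * headers (default_headers = 1), as the wrapper script would. */
    if (curl_easy_impersonate(curl, "chrome116", 1) != CURLE_OK) {
        fprintf(stderr, "unknown impersonation target\n");
        return 1;
    }

    curl_easy_setopt(curl, CURLOPT_URL, "https://www.wikipedia.org");
    CURLcode res = curl_easy_perform(curl);

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```

Link against libcurl-impersonate (e.g. the libcurl-impersonate-chrome shared library) rather than the stock libcurl.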

Using CURL_IMPERSONATE env var

If your application uses libcurl already, you can replace the existing library at runtime with LD_PRELOAD (Linux only). You can then set the CURL_IMPERSONATE env var. For example:

LD_PRELOAD=/path/to/libcurl-impersonate.so CURL_IMPERSONATE=chrome116 my_app

The CURL_IMPERSONATE env var has two effects:

  • curl_easy_impersonate() is called automatically for any new curl handle created by curl_easy_init().
  • curl_easy_impersonate() is called automatically after any curl_easy_reset() call.

This means that all the options needed for impersonation will be automatically set for any curl handle.

If you need precise control over the HTTP headers, set CURL_IMPERSONATE_HEADERS=no to disable the built-in list of HTTP headers, then set them yourself with curl_easy_setopt(). For example:

LD_PRELOAD=/path/to/libcurl-impersonate.so CURL_IMPERSONATE=chrome116 CURL_IMPERSONATE_HEADERS=no my_app
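
In application code, that combination might look like the sketch below. The header values are illustrative only, and it assumes the process was started with LD_PRELOAD, CURL_IMPERSONATE=chrome116 and CURL_IMPERSONATE_HEADERS=no as shown above.

```c
#include <curl/curl.h>

int main(void)
{
    /* With the env vars set, libcurl-impersonate applies the chrome116
     * TLS/HTTP settings automatically inside curl_easy_init(); with
     * CURL_IMPERSONATE_HEADERS=no, no HTTP headers are added for us. */
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* Supply the full header list yourself, in the desired order.
     * (Illustrative values; copy the exact headers of the browser
     * you impersonate for a faithful signature.) */
    struct curl_slist *headers = NULL;
    headers = curl_slist_append(headers,
        "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Safari/537.36");
    headers = curl_slist_append(headers,
        "Accept-Language: en-US,en;q=0.9");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);

    curl_easy_setopt(curl, CURLOPT_URL, "https://www.wikipedia.org");
    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return res == CURLE_OK ? 0 : 1;
}
```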

Note that the LD_PRELOAD method will NOT WORK for curl itself because the curl tool overrides the TLS settings. Use the wrapper scripts instead.

Notes on dependencies

If you intend to copy the self-compiled artifacts to another system, or use the Pre-compiled binaries provided by the project, make sure that all the additional dependencies are met on the target system as well. In particular, see the note about the Firefox version.

Contents

This repository contains two main folders:

  • chrome - Scripts and patches for building the Chrome version of curl-impersonate.
  • firefox - Scripts and patches for building the Firefox version of curl-impersonate.

The layout is similar for both. For example, the Firefox directory contains:

  • Dockerfile - Used to build curl-impersonate with all dependencies.
  • curl_ff91esr, curl_ff95, curl_ff98 - Wrapper scripts that launch curl-impersonate with the correct flags.
  • curl-impersonate.patch - The main patch that makes curl use the same TLS extensions as Firefox. Also makes curl compile statically with libnghttp2 and libnss.

Other files of interest:

  • tests/signatures - YAML database of known browser signatures that can be impersonated.

Contributing

If you'd like to help, please check out the open issues. You can open a pull request with your changes.

This repository contains the build process for curl-impersonate. The actual patches to curl are maintained in a separate repository forked from the upstream curl. The changes are maintained in the impersonate-firefox and impersonate-chrome branches.

Sponsors

Sponsors help keep this project open and maintained. If you wish to become a sponsor, please contact me directly at: lwt at lwthiker dot com.



Contributors

alkarex, bjia56, djoldman, izzues, jwilk, lilyinstarlight, lwthiker, matheusfillipe, nicoandmee, peterupfold, rizialdi, weebdatahoarder, wrobelda


Issues

CURLOPT_USERAGENT ignored when using CURL_IMPERSONATE env var

Reported originally by @momala454 in #42:

I have a small problem with your pull request (actually, with the whole libcurl version). When I'm using CURL_IMPERSONATE=chrome98, I'm unable to overwrite the default user-agent header to Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.75 Safari/537.36;
it stays Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36.

When I set CURLOPT_USERAGENT, it does not overwrite the default user-agent set with the CURL_IMPERSONATE env variable. It only replaces it if I modify the user-agent header using CURLOPT_HTTPHEADER.

Response Headers

Hello there, is it possible to get the response headers? The example in the readme, curl_chrome99 https://www.wikipedia.org, only returns the body of the response. It would be nice if we could get the full response by passing an additional flag on the CLI; some websites return useful cookies.

Fedora packaging?

Hey there,

This project is super useful, but manually updating binaries/building from source can be tedious. Would it be possible to package it for Fedora, perhaps utilising Copr (effectively Fedora user repos)?

(While I'm here, it might be nice to replace yum with dnf in the readme :3)

Thanks,
Elliott

curl: /usr/local/lib/libcurl.so.4: no version information available (required by curl)

When using libcurl-impersonate.so (I'm not sure which one to use; there is .so, .so.4 and .so.4.7.0), compiled and installed in place of the system libcurl, curl shows a "no version information available" error. I'm not sure if I did something wrong? I compiled it using the Docker instructions.

curl --version
curl: /usr/local/lib/libcurl.so.4: no version information available (required by curl)
curl 7.68.0 (x86_64-pc-linux-gnu) libcurl/7.81.0 BoringSSL zlib/1.2.11 brotli/1.0.9 nghttp2/1.46.0
Release-Date: 2020-01-08
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp
Features: alt-svc AsynchDNS brotli HTTP2 HTTPS-proxy IPv6 Largefile libz NTLM NTLM_WB SSL UnixSockets
WARNING: curl and libcurl versions do not match. Functionality may be affected.
ldd /usr/bin/curl
/usr/bin/curl: /usr/local/lib/libcurl.so.4: no version information available (required by /usr/bin/curl)
        linux-vdso.so.1 (0x00007ffc4bf3c000)
        libcurl.so.4 => /usr/local/lib/libcurl.so.4 (0x00007f68424ca000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f68424a4000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f6842481000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f684228f000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f68427c3000)

Musl .so version

Is it possible to get a musl version?

/etc/supervisor/conf.d # LD_PRELOAD='/var/www/resources/binary/curl/libcurl-impersonate-chrome.so' curl --version
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __fdelt_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __memcpy_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __vsnprintf_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __strcpy_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __memset_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __fprintf_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __sprintf_chk: symbol not found
/etc/supervisor/conf.d # curl --version
curl 7.83.1 (x86_64-alpine-linux-musl) libcurl/7.83.1 OpenSSL/1.1.1q zlib/1.2.12 brotli/1.0.9 nghttp2/1.47.0
Release-Date: 2022-05-11
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp 
Features: alt-svc AsynchDNS brotli HSTS HTTP2 HTTPS-proxy IPv6 Largefile libz NTLM NTLM_WB SSL TLS-SRP UnixSockets

User agent client hints also sent in http

https://wicg.github.io/ua-client-hints/#security-privacy
Client Hints will not be delivered to non-secure endpoints (see the secure transport requirements in Section 2.2.1 of [[RFC8942]](https://wicg.github.io/ua-client-hints/#biblio-rfc8942)).

The sec-ch-xxx headers must not be sent when the URL is http://, only https://,
but if I set the CURL_IMPERSONATE=chrome98 env variable, those user agent headers are always set, even over http.

putenv('CURL_IMPERSONATE=chrome98');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://headers.cf');
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt($ch, CURLOPT_ENCODING, '');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
//curl_setopt($ch, CURLOPT_HTTPHEADER, ['Host: abc.com']);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);

echo curl_exec($ch);
print_r(curl_getinfo($ch));

(Note that the website redirects to its https version, but we are not following the redirect.)
Headers sent

GET / HTTP/1.1
Host: headers.cf
Connection: Upgrade, HTTP2-Settings
Upgrade: h2c
HTTP2-Settings: AAEAAQAAAAMAAAPoAAQAYAAAAAYABAAAjau_38Px
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

real headers sent by chrome :

GET / HTTP/1.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate
Accept-Language: fr
Connection: keep-alive
Host: headers.cf
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.88 Safari/537.36

You can also see some differences; for example, only with curl do I see the HTTP2-Settings header.

Also, there are a lot of user agent client hint headers. Once the website tells Chrome that it wants more headers, Chrome will send them: go to https://headers.cf/ and a few headers are sent; refresh the page and a lot of headers are sent. curl-impersonate only sends the minimal set of the first request. The browser keeps in cache the list of headers that the domain wants.

I also don't know whether, if the website sends an empty Accept-CH header, Chrome stops sending the three base headers (sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98", sec-ch-ua-mobile: ?0, sec-ch-ua-platform: "Windows") or still sends them.
If Chrome doesn't send them, that's another way to detect a spoofed Chrome. But this only works on the second request, as the browser must first learn which headers the domain supports.

"curl-impersonate-chrome: No such file or directory" (but the file exists)

Hey,

I downloaded the binaries from the Releases page. It works on Debian, but on NixOS I'm getting the following error:

$ ./curl_chrome99
# [...]/curl_chrome99: line 10: [...]/curl-impersonate-chrome: No such file or directory

When I run ldd [...]/curl-impersonate-chrome, this is what I see:

    linux-vdso.so.1 (0x00007ffe85ba6000)
    libz.so.1 => not found
    libpthread.so.0 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib/libpthread.so.0 (0x00007f03352ed000)
    libc.so.6 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib/libc.so.6 (0x00007f03350ef000)
    /lib64/ld-linux-x86-64.so.2 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib64/ld-linux-x86-64.so.2 (0x00007f0335606000)

Unsure if related: I see a test is performed in the build script, but it's not checking for zlib? In addition, I don't think it works as expected (if I'm not mistaken, grep -q returns success if any of the patterns is found):

RUN ! (ldd ./out/curl-impersonate | grep -q -e libcurl -e nghttp2 -e brotli -e ssl -e crypto)

Any problematic flags?

Hi, thanks for this nice project!

I was looking to use the program with some extra curl flags, such as --cookie or --ipv4, and wondered whether any of these could cause the default fingerprint to change. I've heard that even the order of the headers sent can affect the fingerprint, so I'm wondering if there are any issues with using additional flags such as the above. Are there any known "problematic" flags which could possibly affect the default fingerprint?

Non-dockerized build script

We should have a build script that builds curl-impersonate and its dependencies on the local system, and not within a container. This should allow compiling curl-impersonate for a broader range of platforms, including Mac OS, other Linuxes, etc.

Once there's a build script it could be used in the GitHub Actions workflow to automatically publish compiled binaries for multiple platforms.

User supplied HTTP headers are wrongly ordered when using libcurl-impersonate

When using libcurl-impersonate (either with the CURL_IMPERSONATE env var or with curl_easy_impersonate()), user-supplied HTTP headers will be placed after the built-in list of HTTP headers that libcurl-impersonate uses.

If the user supplies an HTTP header with the CURLOPT_HTTPHEADER option, it will either:

  • Replace the built-in header used for impersonation, if it's the same header (e.g. User-Agent).
  • Be placed after all the built-in headers used for impersonation.

The result is that the order of HTTP headers is not fully controllable by the user.

Example:
When impersonating Chrome 101, the following headers are added by default:

sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

If the user sets for example the two headers X-Custom-User-Header and User-Agent in this order, the resulting list would look like:

sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: [ Custom user agent ]
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
X-Custom-User-Header: [ Custom user header ]

Thus X-Custom-User-Header is placed AFTER User-Agent even though the user requested the opposite.

This is quite tricky to solve, as it is not always clear how to combine the built-in headers and the user-supplied headers. My current thinking is to let the user disable the built-in headers altogether so that they can control the order themselves.

Have the headers changed a bit with Chrome 103?

I checked my laptop's Windows Chrome 103 headers using your socat procedure. Compared to the chrome101 curl-impersonate script's results, I see extra Connection:, Cache-Control:, and DNT: headers. That last one might be due to my personal Chrome settings, I'm not sure. Also, the sec-ch-ua: string seems to be a bit different.

Chrome 103:

GET / HTTP/1.1\r
Host: xx.xx.xx.xx:8443\r
Connection: keep-alive\r
Cache-Control: max-age=0\r
sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
DNT: 1\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r

curl-impersonate chrome101:

GET / HTTP/1.1\r
Host: localhost:8443\r
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r

Related question: I tried setting custom headers via node-libcurl in my program by passing an array to my .get() method's 'HTTPHEADER' option, like below, but in the resulting request socat shows that the Connection: and Cache-Control: headers end up at the end, despite my array passing them first. Is this a curl bug? Any way to override it to get the right order, like Chrome 103?

let headers = [
    'Connection: keep-alive',
    'Cache-Control: max-age=0',
    'sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"',
    'sec-ch-ua-mobile: ?0',
    'sec-ch-ua-platform: "Windows"',
    'Upgrade-Insecure-Requests: 1',
    'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36',
    'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
    'Sec-Fetch-Site: none',
    'Sec-Fetch-Mode: navigate',
    'Sec-Fetch-User: ?1',
    'Sec-Fetch-Dest: document',
    'Accept-Encoding: gzip, deflate, br',
    'Accept-Language: en-US,en;q=0.9'
];
let response = await aCurly.get('https://localhost:8443/', { SSL_VERIFYPEER: false, SSL_VERIFYHOST: false, HTTPHEADER: headers });

socat result:

GET / HTTP/1.1\r
Host: localhost:8443\r
sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r
Connection: keep-alive\r
Cache-Control: max-age=0\r

I tried editing the curl_chrome101 script to add these 2 headers in there, and same behavior: they end up at the bottom of the headers instead of top.

Manually compile with Docker file

I tried to compile the program on my system following this Dockerfile:
https://github.com/lwthiker/curl-impersonate/blob/main/chrome/Dockerfile

# c++ --version
c++ (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
Copyright (C) 2019 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

# lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 20.04.4 LTS
Release:	20.04
Codename:	focal

# ninja --version
1.10.0

When I typed this command:

cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=on -GNinja ..

I got this error:

-- Checking for module 'libunwind-generic'
--   No package 'libunwind-generic' found
libunwind not found. Disabling unwind tests.
CMake Error at CMakeLists.txt:51 (message):
  Could not find Go

I was able to solve the problem with this command:

apt-get install -y libunwind-dev

It was strange that such a package is not installed in Docker!

Anyway, I went ahead and got this error after entering the ninja command:

# ninja
[5/105] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[6/105] Building CXX object ssl/CMakeFiles/ssl_test.dir/span_test.cc.o
ninja: build stopped: subcommand failed.

This error appeared when I ran it again:

# ninja
[1/100] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[2/100] Building CXX object ssl/CMakeFiles/ssl_test.dir/ssl_test.cc.o
ninja: build stopped: subcommand failed.

And the next time, yet another error:

# ninja
[3/99] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function 'void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]':
../crypto/test/abi_test.cc:211:8: error: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function 'void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]':
../crypto/test/abi_test.cc:211:8: error: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function 'void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]':
../crypto/test/abi_test.cc:211:8: error: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[4/99] Building CXX object ssl/CMakeFiles/ssl.dir/d1_both.cc.o
ninja: build stopped: subcommand failed.

In fact, every time I run the command, the build fails on a different file!

How can I solve this?

Command returns gibberish

Hey,

I built the project using docker build -t curl-impersonate ., then I entered a shell with docker run --rm -it --entrypoint bash curl-impersonate.

When I run "vanilla" curl -L google.com, I get the expected human-readable response, but when I run /build/out/curl_ff95 -L google.com, I get garbled text.

Am I doing something wrong?

Thanks!

Wrong host header using CURL_IMPERSONATE env var

When using libcurl and reusing the same connection: if I set the "Host:" header on the first request and then reuse the connection for a request without a Host header, the header is still included with the same value.

<?php
putenv('CURL_IMPERSONATE=chrome98');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://headers.cf');
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt( $ch, CURLOPT_ENCODING, "" );
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, false );
curl_setopt( $ch, CURLOPT_ENCODING, "" );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_HTTPHEADER, ['Host: abc.com']);
curl_setopt( $ch, CURLOPT_AUTOREFERER, true );

curl_exec($ch);
print_r(curl_getinfo($ch));

//curl_reset($ch);
curl_setopt($ch, CURLOPT_URL, 'https://headers.cf');
curl_setopt( $ch, CURLOPT_HTTPHEADER, ['connection: Keep-Alive']); // i didn't set "host:" there



echo curl_exec($ch);
print_r(curl_getinfo($ch));

On the first request, this is what is sent:

GET / HTTP/1.1
Host: abc.com <--- notice this
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

And this is what is sent on the second request:

GET / HTTP/1.1
Host: abc.com <--- this is incorrect
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
connection: Keep-Alive

If I remove the line putenv('CURL_IMPERSONATE=chrome98');, everything works fine.
First request:

GET / HTTP/1.1
Host: abc.com <-- notice this
Accept: */*
Accept-Encoding: deflate, gzip, br

Second request:

GET / HTTP/1.1
Host: headers.cf <--- this is correct this time
Accept: */*
Accept-Encoding: deflate, gzip, br
connection: Keep-Alive

How can I use this library with PHP on Mac

Hi,

First of all, thank you for this great library 👍

I want to force PHP to use curl-impersonate instead of the standard curl library. How should I configure my system to achieve that? I see that there is a solution for Linux, but is there also a solution for macOS?

โžœ  ~ sw_vers
ProductName:	Mac OS X
ProductVersion:	10.15.7
BuildVersion:	19H2026
โžœ  ~ php -v
PHP 8.1.10 (cli) (built: Aug 30 2022 19:22:00) (NTS)
Copyright (c) The PHP Group
Zend Engine v4.1.10, Copyright (c) Zend Technologies
    with Zend OPcache v8.1.10, Copyright (c), by Zend Technologies
โžœ  ~ otool -L /usr/bin/curl
/usr/bin/curl:
	/usr/lib/libcurl.4.dylib (compatibility version 7.0.0, current version 9.0.0)
	/usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.11)
	/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1281.100.1)
โžœ  ~ 

support build on Amazon Linux(or centos, redhat linux)

I thought this would be quite similar to building on Ubuntu, but I tried for hours and failed.

My efforts include:

  • install build prerequisites by: sudo yum groupinstall "Development Tools"
  • upgrade cmake by: pip3 install cmake --upgrade
  • create a symbolic link for ninja-build by: ln -s /usr/bin/ninja-build /usr/bin/ninja
  • install nss by: sudo yum install libnss3.so libnssutil3.so
  • add a search path (/usr/lib/) for the NSS libraries in curl-impersonate.patch:
    -+          search_paths="/usr/lib/$host /usr/lib/$host/nss"
    ++          search_paths="/usr/lib/ /usr/lib/$host /usr/lib/$host/nss"

It still failed when running make firefox-build; the log suggests linking NSS failed:

configure: WARNING: Using hard-wired libraries and compilation flags for NSS.
checking if libnssckbi is in a non-standard location... /usr/lib
checking for SSL_VersionRangeSet in -lnss_static... no
configure: error: Failed linking NSS statically
Makefile:198: recipe for target 'curl-7.81.0/.firefox' failed
make: *** [curl-7.81.0/.firefox] Error 1

The relevant log in ./curl-7.81.0/config.log is:

...
configure:26797: checking if libnssckbi is in a non-standard location
configure:26815: result: /usr/lib
configure:26843: checking for SSL_VersionRangeSet in -lnss_static
configure:26865: gcc -o conftest -I/home/ec2-user/apps/curl-impersonate/buil
d/nss-3.75/dist/Release/../public/nss -I/home/ec2-user/apps/curl-impersonate
/build/nss-3.75/dist/Release/include/nspr -Werror-implicit-function-declarat
ion -O2 -Wno-system-headers    -I/home/ec2-user/apps/curl-impersonate/build/
brotli-1.0.9/out/installed/include -I/home/ec2-user/apps/curl-impersonate/bu
ild/nss-3.75/dist/Release/include -L/home/ec2-user/apps/curl-impersonate/bui
ld/nss-3.75/dist/Release/lib -Wl,-rpath,/usr/lib    -L/home/ec2-user/apps/cu
rl-impersonate/build/brotli-1.0.9/out/installed/lib conftest.c -lnss_static
 -Wl,--start-group -lssl -lnss_static -lpk11wrap_static -lcertdb -lcerthi -l
nsspki -lnssdev -lsoftokn_static -lfreebl_static -lnssutil -lnssb -lcryptohi
 -l:libplc4.a -l:libplds4.a -l:libnspr4.a -lsqlite -lgcm-aes-x86_c_lib -lhw-
acc-crypto-avx -lhw-acc-crypto-avx2 -lsha-x86_c_lib -lintel-gcm-wrap_c_lib -
lintel-gcm-s_lib -Wl,--end-group -pthread -ldl -lbrotlidec-static -lbrotlico
mmon-static -lz  >&5
/usr/bin/ld: cannot find -lbrotlidec-static
/usr/bin/ld: cannot find -lbrotlicommon-static
collect2: error: ld returned 1 exit status
configure:26865: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "curl"
| #define PACKAGE_TARNAME "curl"
| #define PACKAGE_VERSION "-"
| #define PACKAGE_STRING "curl -"
| #define PACKAGE_BUGREPORT "a suitable curl mailing list: https://curl.se/mail/"
...

Any help is appreciated.

chrome104 and curl_easy_impersonate

Hello,
I am unable to set the chrome104 target in a curl_easy_impersonate function call.

libcurl-impersonate-chrome.so version 0.5.2

It returns the error: curl: A libcurl function was given a bad argument

Does this library support chrome104?

Thank you

Distribute binaries, including a drop-in replacement for the shared library

Thanks for publishing this.

I see the release process is fairly automated; is it possible to use CI to automatically build binaries of this and publish them as releases?
I don't have experience in Github Actions, but I do in Gitlab CI.
Both containers take 3 GB, but the binaries themselves are only ~7 MB.

Another interesting build artefact would be to "bake in" the settings from the curl_ff95 script into the build itself, and create a regular curl shared library. This could replace the upstream libcurl file to apply the new settings, or be hacked around with LD_LIBRARY_PATH.
It would need to be a different build per browser.

Currently, curl is built in static mode. The library itself is in lib/.libs/libcurl.a.

How to cleanly uninstall curl-impersonate?

Hi,

I'm not gonna lie, I don't know much about how this project works, and some of its concepts are way beyond me, but I wanted to give it a try for scraping something.

The thing is, now I want to uninstall this project.

So I deleted the folder where I cloned the repo and removed some packages installed with apt, but I still have access to the curl_chrome99 command.

Is there a procedure for cleanly uninstalling curl-impersonate?

Thanks in advance 😀

No support for brotli on content-encoding

As part of the headers sent to the server, accept-encoding: gzip, deflate, br is included.

However, this curl only supports deflate, gzip, so a server returning content-encoding: br will fail to decompress even with the --compressed argument.

AUR Arch linux packages

This is very interesting project, congratulations to the devs!

I have created 2 AUR packages for it using the dockerfiles as reference, fetching the patches from this repo.
They will install the curl-impersonate-chrome and curl-impersonate-firefox commands. I haven't added the scripts to add the browser headers though since they didn't matter much to me.

Maybe you would be interested in adding this to your readme?

https://aur.archlinux.org/packages/curl-impersonate-chrome
https://aur.archlinux.org/packages/curl-impersonate-firefox

passing JSON data yields error: bad/illegal format

Issue

The following command in curl works:

curl -X POST -H "Content-Type: application/json" -d '{"email": "[email protected]", "password": "password", "detail": "Whats_This"}' https://dangerous.url/cgi-bin/sharept-nextz.php

The same command with curl_ff91esr does not:

curl_ff91esr -X POST -H "Content-Type: application/json" -d '{"email": "[email protected]", "password": "password", "detail": "Whats_This"}' https://dangerous.url/cgi-bin/sharept-nextz.php

Issuing the above command results in the following error:

curl: (6) Could not resolve host: application
curl: (6) Could not resolve host: work.com",
curl: (3) URL using bad/illegal format or missing URL
curl: (6) Could not resolve host: "password",
curl: (3) URL using bad/illegal format or missing URL
curl: (3) unmatched close brace/bracket in URL position 13:
"Whats_This"}

I get a similar error if I use curl_ff95. It seems the HTTP POST data string isn't being parsed correctly. Unfortunately, I cannot test this with the Chrome version, since it fails to compile on my system.

Additional Details

I am using the curl-impersonate AUR package. The output from curl_ff91esr -V is:

curl 7.81.0 (x86_64-pc-linux-gnu) libcurl/7.81.0 NSS/3.74 zlib/1.2.12 brotli/1.0.9 zstd/1.5.2 libidn2/2.3.2 libpsl/0.21.1 (+libidn2/2.3.0) nghttp2/1.46.0 librtmp/2.3 OpenLDAP/2.6.1
Release-Date: 2022-01-05
Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp smb smbs smtp smtps telnet tftp 
Features: alt-svc AsynchDNS brotli HSTS HTTP2 HTTPS-proxy IDN IPv6 Largefile libz NTLM NTLM_WB PSL SSL UnixSockets zstd

Please note that the URL is fictitious. I don't want to share the URL publicly since it is associated with a phishing campaign.
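For what it's worth, the symptom above (every space-separated piece of the -d payload treated as its own URL) is exactly what happens when a wrapper script forwards its arguments with an unquoted $@ instead of "$@". A minimal sketch of the difference (the count helper is hypothetical, not part of curl-impersonate):

```shell
# count prints how many arguments it received.
count() { echo $#; }

# Two arguments; the second contains spaces, like -d '{"json": ...}'.
set -- -d '{"email": "x", "detail": "y"}'

echo "unquoted \$@:   $(count $@) arguments"    # word-splitting breaks the JSON apart
echo "quoted \"\$@\": $(count "$@") arguments"  # each argument preserved intact
```

If the wrapper scripts in the package you installed invoke curl with $@ rather than "$@", that would explain the breakage.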

Add mobile browsers

Hello,

Thank you for your incredible work.

Is it possible to add mobile browsers? And probably other variants too, like Chrome on Linux or Chrome on Mac.

Thank you

./generate_dockerfiles.sh Shooot, could not parse view as JSON.

When I run the ./generate_dockerfiles.sh script, I get this error:

# ./generate_dockerfiles.sh
Shooot, could not parse view as JSON.
Tips: functions are not valid JSON and keys / values must be surround with double quotes.

SyntaxError: Unexpected number in JSON at position 1
    at JSON.parse (<anonymous>)
    at parseView (/usr/lib/node_modules/mustache/bin/mustache:74:17)
    at onDone (/usr/lib/node_modules/mustache/bin/mustache:67:10)
    at Socket.onEnd (/usr/lib/node_modules/mustache/bin/mustache:118:5)
    at Object.onceWrapper (events.js:286:20)
    at Socket.emit (events.js:203:15)
    at endReadableNT (_stream_readable.js:1145:12)
    at process._tickCallback (internal/process/next_tick.js:63:19)
Shooot, could not parse view as JSON.
Tips: functions are not valid JSON and keys / values must be surround with double quotes.
:
:

Content file:

# cat generate_dockerfiles.sh

#!/bin/sh
cat <<EOF | mustache - Dockerfile.template > chrome/Dockerfile
---
chrome: true
debian: true
---
EOF
:
:

Version:

# node -v
v10.24.1
# npm -v
6.14.12
# mustache -v
4.2.0

I could not resolve the problem, but I worked around it another way.

I created a file config.json with this content:

{
"chrome": true,
"debian": true
}

And then I ran this:

# cat config.json | mustache - Dockerfile.template > chrome/Dockerfile

It works.

But you may want to fix that error above.
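The stack trace suggests the root cause: this version of the mustache CLI JSON-parses whatever arrives on stdin, so the YAML front matter in the heredoc fails exactly as the error says. A quick check of the two inputs (using python3 purely as a JSON validator; mustache itself is not needed):

```shell
# The YAML front matter the script pipes in is not valid JSON...
printf -- '---\nchrome: true\ndebian: true\n---\n' \
  | python3 -c 'import json, sys; json.load(sys.stdin)' 2>/dev/null \
  || echo "front matter: not valid JSON"

# ...while the plain-JSON view from the workaround parses fine.
printf '{"chrome": true, "debian": true}' \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["chrome"])'
```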

Thanks for your nice project.

Impersonate command argument?

The source code already contains targets with impersonation options. How about implementing an --impersonate <target> command-line switch?

415 Unsupported Media Type

There seems to be a parsing issue when the Content-Type header is added with a space after the colon.

Doesn't work:
-H 'Content-Type: application/json'

Returns:
[{"status":"FAILURE","message":"HTTP 415 Unsupported Media Type","code":"{unsupported.media.type}","requestedUrl":"/search/template"}]curl: (6) Could not resolve host: application

Works:
-H 'Content-Type:application/json'

add available browsers as json file

Hello, is it possible to create a JSON file listing all available browsers and the current Docker image?

Like this:

{
"docker": "lwthiker/curl-impersonate:0.4",
"browsers": {
    "chrome98": "chrome",
    "chrome99": "chrome",
    "edge98": "chrome",
    "edge99": "chrome",

    "safari15_3": "chrome",

    "ff91esr": "ff",
    "ff95": "ff",
    "ff98": "ff"
}
}

Of course, more details could be added for each browser.

Does it work when using proxy?

If I use curl --proxy pointing at an HTTP proxy, will it properly forward everything to the destination server as intended?

`curl_easy_impersonate()` adds Accept-Encoding to req. headers w/out enabling cURL's auto decompression

Hi,

Not sure if this is actually a feature, but calling curl_easy_impersonate(self.session, "ff98") adds Accept-Encoding: gzip, deflate, br to the request headers, as expected. Most sites will respond by sending back compressed content, but libcurl won't perform its automatic decompression on the contents.

While it's no problem to set CURLOPT_ACCEPT_ENCODING before/after calling curl_easy_impersonate to enable automatic decompression of received contents, it requires the coder to know which Accept-Encoding value curl_easy_impersonate sets.

Native Windows build

Write a script to build curl-impersonate natively on Windows. This will probably require building each of the dependencies (boringssl, nghttp2, brotli & curl) on Windows.

Impersonate Safari on Mac

Safari being the second most used desktop browser according to some websites, it can be a good candidate for impersonation as well. I don't have access to a Mac right now. Is anyone willing to share a Wireshark capture of a TLS session from Safari? Bonus if it's HTTP/2 and if it can be decrypted as well (you can set up a local nginx with a self-signed key for this).

Sometimes it fails to bypass cloudflare

In the case of Chrome and Edge, it sometimes (though not always) does not work for the particular site I was focusing on. But I did not encounter any problems with Firefox and Safari.
This is the main sample site: https://pegaxy.io/
You can specifically focus on this address: https://api.pegaxy.io/my/info
And I used it on Docker.
You can send consecutive requests and test.
By the way, I already asked about the details of what I want to do here:
https://stackoverflow.com/q/71529199/1407491

Here is an example of a log that works:

# docker run --rm lwthiker/curl-impersonate:0.3-chrome curl_chrome99 'https://api.pegaxy.io/my/info' \
>   -H 'authority: api.pegaxy.io' \
>   -H 'sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"' \
>   -H 'accept: application/json' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'origin: https://play.pegaxy.io' \
>   -H 'sec-fetch-site: same-site' \
>   -H 'sec-fetch-mode: cors' \
>   -H 'sec-fetch-dest: empty' \
>   -H 'referer: https://play.pegaxy.io/marketplace' \
>   -H 'accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7' \
>   --compressed -s -vv
*   Trying 172.67.10.157:443...
* Connected to api.pegaxy.io (172.67.10.157) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection: TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,ECDHE-RSA-AES128-SHA,ECDHE-RSA-AES256-SHA,AES128-GCM-SHA256,AES256-GCM-SHA384,AES128-SHA,AES256-SHA
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: none
* ALPS, offering h2
} [5 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.2 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [19 bytes data]
* TLSv1.3 (IN), TLS handshake, Unknown (25):
{ [3156 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [80 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [36 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [36 bytes data]
* SSL connection using TLSv1.3 / TLS_CHACHA20_POLY1305_SHA256
* ALPN, server accepted to use h2
* Server certificate:
*  subject: CN=*.pegaxy.io
*  start date: Mar  3 05:22:24 2022 GMT
*  expire date: Jun  1 05:22:23 2022 GMT
*  subjectAltName: host "api.pegaxy.io" matched cert's "*.pegaxy.io"
*  issuer: C=US; O=Let's Encrypt; CN=E1
*  SSL certificate verify ok.
* Using HTTP2, server supports multiplexing
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
} [5 bytes data]
* Using Stream ID: 1 (easy handle 0x7f0ad96c1a90)
} [5 bytes data]
> GET /my/info HTTP/2
> Host: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> sec-ch-ua-mobile: ?0
> sec-ch-ua-platform: "Windows"
> upgrade-insecure-requests: 1
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
> sec-fetch-site: none
> sec-fetch-mode: navigate
> sec-fetch-user: ?1
> sec-fetch-dest: document
> accept-encoding: gzip, deflate, br
> accept-language: en-US,en;q=0.9
> authority: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> accept: application/json
> sec-ch-ua-mobile: ?0
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> sec-ch-ua-platform: "Windows"
> origin: https://play.pegaxy.io
> sec-fetch-site: same-site
> sec-fetch-mode: cors
> sec-fetch-dest: empty
> referer: https://play.pegaxy.io/marketplace
> accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
} [5 bytes data]
< HTTP/2 200
< date: Mon, 21 Mar 2022 23:22:10 GMT
< content-type: application/json; charset=utf-8
< vary: Accept-Encoding
< vary: Origin
< access-control-allow-origin: https://play.pegaxy.io
< etag: W/"29-bVHj0ypH/h4ZX9esOoZbwspQiQY"
< x-frame-options: SAMEORIGIN
< x-xss-protection: 1; mode=block
< x-content-type-options: nosniff
< referrer-policy: no-referrer-when-downgrade
< content-security-policy: default-src 'self' http: https: data: blob: 'unsafe-inline'
< strict-transport-security: max-age=31536000; includeSubDomains
< cf-cache-status: DYNAMIC
< expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< set-cookie: __cf_bm=5alOaylVbD3I6nERRUHj3COnrAUKDaLA0KIZt9bE8ww-1647904930-0-AWqZ1lWXsTPZlGfyqlMjEtpAUJK8Qjbj8rDlMh2vHuuLEIWEH9vFrwcc/lCm4WBFkT/MMs68cv04GOiBTvbTALw=; path=/; expires=Mon, 21-Mar-22 23:52:10 GMT; domain=.pegaxy.io; HttpOnly; Secure; SameSite=None
< server: cloudflare
< cf-ray: 6efa6d987ed09b71-FRA
< content-encoding: br
<
{ [45 bytes data]
* Connection #0 to host api.pegaxy.io left intact
{"status":false,"error":"USER_NOT_FOUND"}

And here is a sample log for when it does not work:

# docker run --rm lwthiker/curl-impersonate:0.3-chrome curl_chrome99 'https://api.pegaxy.io/my/info' \
>   -H 'authority: api.pegaxy.io' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"' \
>   -H 'accept: application/json' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'origin: https://play.pegaxy.io' \
>   -H 'sec-fetch-site: same-site' \
>   -H 'sec-fetch-mode: cors' \
>   -H 'sec-fetch-dest: empty' \
>   -H 'referer: https://play.pegaxy.io/marketplace' \
>   -H 'accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7' \
>   --compressed -s -vv
*   Trying 172.67.10.157:443...
* Connected to api.pegaxy.io (172.67.10.157) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection: TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,ECDHE-RSA-AES128-SHA,ECDHE-RSA-AES256-SHA,AES128-GCM-SHA256,AES256-GCM-SHA384,AES128-SHA,AES256-SHA
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: none
* ALPS, offering h2
} [5 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.2 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [19 bytes data]
* TLSv1.3 (IN), TLS handshake, Unknown (25):
{ [3156 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [78 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [36 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [36 bytes data]
* SSL connection using TLSv1.3 / TLS_CHACHA20_POLY1305_SHA256
* ALPN, server accepted to use h2
* Server certificate:
*  subject: CN=*.pegaxy.io
*  start date: Mar  3 05:22:24 2022 GMT
*  expire date: Jun  1 05:22:23 2022 GMT
*  subjectAltName: host "api.pegaxy.io" matched cert's "*.pegaxy.io"
*  issuer: C=US; O=Let's Encrypt; CN=E1
*  SSL certificate verify ok.
* Using HTTP2, server supports multiplexing
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
} [5 bytes data]
* Using Stream ID: 1 (easy handle 0x7f396fbbda90)
} [5 bytes data]
> GET /my/info HTTP/2
> Host: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> sec-ch-ua-mobile: ?0
> sec-ch-ua-platform: "Windows"
> upgrade-insecure-requests: 1
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
> sec-fetch-site: none
> sec-fetch-mode: navigate
> sec-fetch-user: ?1
> sec-fetch-dest: document
> accept-encoding: gzip, deflate, br
> accept-language: en-US,en;q=0.9
> authority: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> accept: application/json
> sec-ch-ua-mobile: ?0
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> sec-ch-ua-platform: "Windows"
> origin: https://play.pegaxy.io
> sec-fetch-site: same-site
> sec-fetch-mode: cors
> sec-fetch-dest: empty
> referer: https://play.pegaxy.io/marketplace
> accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
} [5 bytes data]
< HTTP/2 403
< date: Mon, 21 Mar 2022 17:14:41 GMT
< content-type: text/html; charset=UTF-8
< cache-control: max-age=15
< expires: Mon, 21 Mar 2022 17:14:56 GMT
< x-frame-options: SAMEORIGIN
< expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< set-cookie: __cf_bm=nEnD.QUz3L43TqTxUdnxUWV7R3svGpN9CQ8MU3thu88-1647882881-0-AT/Vw/Y/DoqdLAESxkrplf95mmnU269etAJ8DpG5l//9sJ3+zDd8fC5iTyhD5x7trGkAsWonR5ErB3lSN+RuLvg=; path=/; expires=Mon, 21-Mar-22 17:44:41 GMT; domain=.pegaxy.io; HttpOnly; Secure; SameSite=None
< vary: Accept-Encoding
< server: cloudflare
< cf-ray: 6ef8534949b69b3f-FRA
< content-encoding: br
<
{ [922 bytes data]
* Connection #0 to host api.pegaxy.io left intact
<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js ie6 oldie" lang="en-US"> <![endif]-->
<!--[if IE 7]>    <html class="no-js ie7 oldie" lang="en-US"> <![endif]-->
<!--[if IE 8]>    <html class="no-js ie8 oldie" lang="en-US"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" lang="en-US"> <!--<![endif]-->
<head>
<title>Attention Required! | Cloudflare</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta http-equiv="X-UA-Compatible" content="IE=Edge,chrome=1" />
<meta name="robots" content="noindex, nofollow" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
<link rel="stylesheet" id="cf_styles-css" href="/cdn-cgi/styles/cf.errors.css" type="text/css" media="screen,projection" />
<!--[if lt IE 9]><link rel="stylesheet" id='cf_styles-ie-css' href="/cdn-cgi/styles/cf.errors.ie.css" type="text/css" media="screen,projection" /><![endif]-->
<style type="text/css">body{margin:0;padding:0}</style>


<!--[if gte IE 10]><!-->
<script>
  if (!navigator.cookieEnabled) {
    window.addEventListener('DOMContentLoaded', function () {
      var cookieEl = document.getElementById('cookie-alert');
      cookieEl.style.display = 'block';
    })
  }
</script>
<!--<![endif]-->


</head>
<body>
  <div id="cf-wrapper">
    <div class="cf-alert cf-alert-error cf-cookie-error" id="cookie-alert" data-translate="enable_cookies">Please enable cookies.</div>
    <div id="cf-error-details" class="cf-error-details-wrapper">
      <div class="cf-wrapper cf-header cf-error-overview">
        <h1 data-translate="block_headline">Sorry, you have been blocked</h1>
        <h2 class="cf-subheadline"><span data-translate="unable_to_access">You are unable to access</span> pegaxy.io</h2>
      </div><!-- /.header -->

      <div class="cf-section cf-highlight">
        <div class="cf-wrapper">
          <div class="cf-screenshot-container cf-screenshot-full">

              <span class="cf-no-screenshot error"></span>

          </div>
        </div>
      </div><!-- /.captcha-container -->

      <div class="cf-section cf-wrapper">
        <div class="cf-columns two">
          <div class="cf-column">
            <h2 data-translate="blocked_why_headline">Why have I been blocked?</h2>

            <p data-translate="blocked_why_detail">This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.</p>
          </div>

          <div class="cf-column">
            <h2 data-translate="blocked_resolve_headline">What can I do to resolve this?</h2>

            <p data-translate="blocked_resolve_detail">You can email the site owner to let them know you were blocked. Please include what you were doing when this page came up and the Cloudflare Ray ID found at the bottom of this page.</p>
          </div>
        </div>
      </div><!-- /.section -->

      <div class="cf-error-footer cf-wrapper w-240 lg:w-full py-10 sm:py-4 sm:px-8 mx-auto text-center sm:text-left border-solid border-0 border-t border-gray-300">
  <p class="text-13">
    <span class="cf-footer-item sm:block sm:mb-1">Cloudflare Ray ID: <strong class="font-semibold">6ef8534949b69b3f</strong></span>
    <span class="cf-footer-separator sm:hidden">&bull;</span>
    <span class="cf-footer-item sm:block sm:mb-1"><span>Your IP</span>: xx.xx.xx.xx</span>
    <span class="cf-footer-separator sm:hidden">&bull;</span>
    <span class="cf-footer-item sm:block sm:mb-1"><span>Performance &amp; security by</span> <a rel="noopener noreferrer" href="https://www.cloudflare.com/5xx-error-landing" id="brand_link" target="_blank">Cloudflare</a></span>

  </p>
</div><!-- /.error-footer -->


    </div><!-- /#cf-error-details -->
  </div><!-- /#cf-wrapper -->

  <script type="text/javascript">
  window._cf_translation = {};


</script>

</body>
</html>

Thanks for the great project.

Docker hangs on multiple requests

Awesome project, congrats!

The only problem I found is that the Docker container hangs when you try to make simultaneous requests.

Do you know how I can fix this?

I am trying to execute curl_chrome99 100 times at the same time.
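One common workaround, assuming the hang comes from spawning all 100 processes at once rather than from curl-impersonate itself, is to cap the number of concurrent invocations. A minimal sketch using `xargs -P` (the URL is a placeholder, and `./curl_chrome99` is assumed to be on the current path):

```shell
# Sketch: run 100 requests, but only 10 curl_chrome99 processes at a time,
# instead of launching all of them simultaneously.
seq 100 | xargs -P 10 -I{} ./curl_chrome99 -s -o /dev/null "https://example.com/"
```

If the hang persists even at low concurrency, the issue is more likely inside the container itself.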

Does this project provide a solution for connecting to a web socket?

For example, I need an example like this in Node.js:

var socket = new WebSocket(url);
socket.onopen = function() {
  ...
};
socket.onmessage = function(e) {
  ...
};

The url in that example is on cloudflare.

Libcurl certificate compression & more ?

I'm using libcurl and PHP with curl-impersonate. I'm setting the ciphers and the SSL version and disabling NPN, but I'm still missing extensions 27 (compress_certificate) and 17513 (application_settings, the BoringSSL ALPS extension).

I can't set the "--alps" and "--cert-compression brotli" parameters using libcurl and PHP. The curl PHP extension doesn't support the new parameter names (see https://github.com/php/php-src/blob/master/ext/curl/interface.c#L2306 ).
What should I do to enable those two extensions?

edit:
calling putenv('CURL_IMPERSONATE=chrome98'); or running CURL_IMPERSONATE=chrome98 php /path/to/script.php beforehand doesn't help; the two TLS extensions are still not enabled.

My guess is that both "alps" and "cert-compression" have not yet been added to the default parameters of curl-impersonate's libcurl.

Using CURL_IMPERSONATE=chrome98 curl "http://...." does correctly add the two extensions, however.

"lwthiker/curl-impersonate:0.5.1-ff" is built using Alpine, not Debian

It seems that something is potentially off with your generate_dockerfiles.sh script, as the non-Alpine images are, in fact, also Alpine:

cromo@docker:~/rss-bridge/rss-bridge$ sudo docker run -it lwthiker/curl-impersonate:0.5.1-ff sh
/ # apk
apk-tools 2.12.7, compiled for x86_64.

usage: apk [<OPTIONS>...] COMMAND [<ARGUMENTS>...]

Package installation and removal:
  add        Add packages to WORLD and commit changes
  del        Remove packages from WORLD and commit changes

System maintenance:
  fix        Fix, reinstall or upgrade packages without modifying WORLD
  update     Update repository indexes
  upgrade    Install upgrades available from repositories
  cache      Manage the local package cache

Querying package information:
  info       Give detailed information about packages or repositories
  list       List packages matching a pattern or other criteria
  dot        Render dependencies as graphviz graphs
  policy     Show repository policy for packages
  search     Search for packages by name or description

Repository maintenance:
  index      Create repository index file from packages
  fetch      Download packages from global repositories to a local directory
  manifest   Show checksums of package contents
  verify     Verify package integrity and signature

Miscellaneous:
  audit      Audit system for changes
  stats      Show statistics about repositories and installations
  version    Compare package versions or perform tests on version strings

This apk has coffee making abilities.
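A quick way to confirm which base distribution an image actually uses is to read the release files inside the container (e.g. via `docker run --rm lwthiker/curl-impersonate:0.5.1-ff sh`). A minimal sketch, assuming a standard Alpine or Debian layout:

```shell
# Sketch: report the base distro from inside a container.
# Alpine ships /etc/alpine-release; Debian and most others ship /etc/os-release.
if [ -f /etc/alpine-release ]; then
    echo "Alpine $(cat /etc/alpine-release)"
else
    . /etc/os-release
    echo "${PRETTY_NAME:-unknown}"
fi
```

On the image above this would report an Alpine version, which matches the `apk` output and suggests the Debian-tagged Dockerfile is being generated from the wrong template.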
