
Erlang mtproto proxy

This code was extracted from @socksy_bot.

Support: https://t.me/erlang_mtproxy .

Features

  • Promoted channels. See tag option.
  • "secure" randomized-packet-size protocol (34-symbol secrets starting with 'dd') to prevent detection by DPI
  • Fake-TLS protocol ('ee'/base64 secrets) - another protocol to prevent DPI detection
  • Secure-only mode (only allow connections with 'dd' or fake-TLS). See allowed_protocols option.
  • Connection limit policies - limit number of connections by IP / tls-domain / port; IP / tls-domain blacklists / whitelists
  • Multiple ports with unique secret and promo tag for each port
  • Very high performance - can handle tens of thousands of connections! Scales to all CPU cores. 1Gbps and 90k connections on a 4-core / 8GB RAM cloud server.
  • Supports multiplexing (many Client -> Proxy connections are wrapped into a small number of Proxy -> Telegram Server connections) - lower pings and better OS network utilization
  • Protection from replay attacks used to detect proxies in some countries
  • Automatic Telegram configuration reload (no need to restart once per day)
  • IPv6 for client connections
  • All configuration options can be updated without a service restart
  • Small codebase compared to the official one; code is covered by automated tests
  • Lots of metrics can be exported (optional)

How to install - one-line interactive installer

This command runs an interactive script that installs and configures the proxy on your Ubuntu / Debian / CentOS server. It will ask whether you want to change the default port/secret/ad-tag/protocols:

curl -L -o mtp_install.sh https://git.io/fj5ru && bash mtp_install.sh

You can also just provide port/secret/ad-tag/protocols/tls-domain as command line arguments:

curl -L -o mtp_install.sh https://git.io/fj5ru && bash mtp_install.sh -p 443 -s d0d6e111bada5511fcce9584deadbeef -t dcbe8f1493fa4cd9ab300891c0b5b326 -a dd -a tls -d s3.amazonaws.com

It does the same as described in How to start OS-install - detailed, but generates the config file for you automatically.

How to start - Docker

To run with default settings

docker run -d --network=host seriyps/mtproto-proxy

To run on single port with custom port, secret and ad-tag

docker run -d --network=host seriyps/mtproto-proxy -p 443 -s d0d6e111bada5511fcce9584deadbeef -t dcbe8f1493fa4cd9ab300891c0b5b326

or via environment variables

docker run -d --network=host -e MTP_PORT=443 -e MTP_SECRET=d0d6e111bada5511fcce9584deadbeef -e MTP_TAG=dcbe8f1493fa4cd9ab300891c0b5b326 seriyps/mtproto-proxy

Where

  • -p 443 / MTP_PORT=… proxy port
  • -s d0d6e111bada5511fcce9584deadbeef / MTP_SECRET=… proxy secret (don't append dd! it should be 32 chars long!)
  • -t dcbe8f1493fa4cd9ab300891c0b5b326 / MTP_TAG=… ad-tag that you get from @MTProxybot
  • -a dd / MTP_DD_ONLY=t only allow "secure" connections (dd-secrets)
  • -a tls / MTP_TLS_ONLY=t only allow "fake-TLS" connections (base64 secrets)

It's ok to provide both -a dd -a tls to allow both protocols. If no -a option is provided, all protocols are allowed.
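The 32-character secret is just 16 random bytes in hex. A sketch for generating one locally with standard tools (any source of 16 random bytes works):

```shell
# Generate 16 random bytes and hex-encode them -> a 32-character secret.
# Do not prepend "dd" here; pass the bare secret via -s / MTP_SECRET.
secret=$(head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "$secret"
```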

To run with custom config-file

  1. Get the code git clone https://github.com/seriyps/mtproto_proxy.git && cd mtproto_proxy/
  2. Copy config templates cp config/{vm.args.example,prod-vm.args}; cp config/{sys.config.example,prod-sys.config}
  3. Edit configs. See Settings.
  4. Build docker build -t mtproto-proxy-erl .
  5. Start docker run -d --network=host mtproto-proxy-erl

Installation via Docker works well for small setups (10-20k connections), but for more heavily loaded setups it's recommended to install the proxy directly into your server's OS (see below).

How to start OS-install - quick

You need at least Erlang version 20! Recommended OS is Ubuntu 18.04.

sudo apt install erlang-nox erlang-dev build-essential
git clone https://github.com/seriyps/mtproto_proxy.git
cd mtproto_proxy/
cp config/vm.args.example config/prod-vm.args
cp config/sys.config.example config/prod-sys.config
# configure your port, secret, ad_tag. See [Settings](#settings) below.
nano config/prod-sys.config
make && sudo make install
sudo systemctl enable mtproto-proxy
sudo systemctl start mtproto-proxy

How to start OS-install - detailed

Install deps

Ubuntu 18.xx / Ubuntu 19.xx / Debian 10:

sudo apt install erlang-nox erlang-dev make sed diffutils tar

CentOS 7

# Enable "epel" and "Erlang solutions" repositories
sudo yum install wget \
             https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm \
             https://packages.erlang-solutions.com/erlang-solutions-1.0-1.noarch.rpm
# Install Erlang
sudo yum install erlang-compiler erlang-erts erlang-kernel erlang-stdlib erlang-syntax_tools \
     erlang-crypto erlang-inets erlang-sasl erlang-ssl

You need Erlang version 20 or higher! If your version is older, please check the Erlang Solutions esl-erlang package or use kerl.

Get the code:

git clone https://github.com/seriyps/mtproto_proxy.git
cd mtproto_proxy/

Create config file

See Settings.

Build and install

make && sudo make install

This will:

  • install proxy into /opt/mtp_proxy
  • create a system user
  • install systemd service
  • create a directory for logs in /var/log/mtproto-proxy
  • configure the max-open-files ulimit and CAP_NET_BIND_SERVICE via systemd

Try to start in foreground mode

This step is optional, but it can be useful to check that everything works as expected:

./start.sh

Try running ./start.sh -h to learn some useful options.

Start in background and enable start on system start-up

sudo systemctl enable mtproto-proxy
sudo systemctl start mtproto-proxy

Done! The proxy is up and ready to serve!

Stop / uninstall

Stop:

sudo systemctl stop mtproto-proxy

Uninstall:

sudo systemctl stop mtproto-proxy
sudo systemctl disable mtproto-proxy
sudo make uninstall

Logs can be found at

/var/log/mtproto-proxy/application.log

Settings

All available configuration options are documented in src/mtproto_proxy.app.src. Do not edit this file!

To change configuration, edit config/prod-sys.config:

Comments in this file start with %%. Default port is 1443 and default secret is d0d6e111bada5511fcce9584deadbeef.

Secret key and proxy URLs will be printed on start.
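For reference, Telegram proxy links have the following shape (the host and values below are placeholders; use the URLs the proxy prints on start):

```shell
host=203.0.113.1   # your server's public IP or hostname (placeholder)
port=1443
secret=d0d6e111bada5511fcce9584deadbeef
echo "https://t.me/proxy?server=${host}&port=${port}&secret=${secret}"
# → https://t.me/proxy?server=203.0.113.1&port=1443&secret=d0d6e111bada5511fcce9584deadbeef
```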

Apply config changes without restart

It's possible to reload the config file without a service restart (but if you update the ad_tag on an existing port, all clients of that port will be disconnected).

This method doesn't work for Docker!

To do that, make changes in config/prod-sys.config and run the following command:

sudo make update-sysconfig && sudo systemctl reload mtproto-proxy

Change default port / secret / ad tag

To change the default settings, edit the mtproto_proxy section of prod-sys.config:

 {mtproto_proxy,
  %% see src/mtproto_proxy.app.src for examples.
  [
   {ports,
    [#{name => mtp_handler_1,
       listen_ip => "0.0.0.0",
       port => 1443,
       secret => <<"d0d6e111bada5511fcce9584deadbeef">>,
       tag => <<"dcbe8f1493fa4cd9ab300891c0b5b326">>}
    ]}
   ]},

 {lager,
<...>

(that is, remove the %%s) and replace port / secret / tag with your own.

Listen on multiple ports / IPs

You can start the proxy on multiple IP addresses or ports with different secrets/ad tags. To do so, just add more entries to the ports section, separated by commas, eg:

 {mtproto_proxy,
  %% see src/mtproto_proxy.app.src for examples.
  [
   {ports,
    [#{name => mtp_handler_1,
       listen_ip => "0.0.0.0",
       port => 1443,
       secret => <<"d0d6e111bada5511fcce9584deadbeef">>,
       tag => <<"dcbe8f1493fa4cd9ab300891c0b5b326">>},
     #{name => mtp_handler_2,
       listen_ip => "0.0.0.0",
       port => 2443,
       secret => <<"100000000000000000000000000000001">>,
       tag => <<"cf8e6baff125ed5f661a761e69567711">>}
    ]}
   ]},

 {lager,
<...>

Each section should have a unique name!

Only allow connections with 'dd'-secrets

This protocol uses randomized packet sizes, so it's harder for DPI to detect by packet size. It might be useful in Iran, where proxies are detected by DPI. You should disable all protocols other than mtp_secure by providing the allowed_protocols option:

  {mtproto_proxy,
   [
    {allowed_protocols, [mtp_secure]},
    {ports,
     [#{name => mtp_handler_1,
      <..>
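Clients connecting in this mode use the same secret with dd prepended (the 34-symbol form mentioned in Features), for example:

```shell
secret=d0d6e111bada5511fcce9584deadbeef  # the 32-char secret from the config
dd_secret="dd${secret}"                  # what users paste into their clients
echo "$dd_secret"
# → ddd0d6e111bada5511fcce9584deadbeef
```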

Only allow fake-TLS connections with ee/base64-secrets

Another censorship circumvention technique. The MTProto proxy protocol pretends to be HTTPS web traffic (technically speaking, TLSv1.3 + HTTP/2). It's possible to only allow connections with this protocol by changing allowed_protocols to a list containing only mtp_fake_tls.

  {mtproto_proxy,
   [
    {allowed_protocols, [mtp_fake_tls]},
    {ports,
     [#{name => mtp_handler_1,
      <..>
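A fake-TLS secret encodes the fake domain after the base secret. A sketch of the hex ("ee") form, assuming the default secret and s3.amazonaws.com as the fake domain (clients also accept a base64 encoding of the same bytes):

```shell
secret=d0d6e111bada5511fcce9584deadbeef
domain=s3.amazonaws.com
# Hex-encode the domain and append it to "ee" + secret.
hexdomain=$(printf '%s' "$domain" | od -An -tx1 | tr -d ' \n')
echo "ee${secret}${hexdomain}"
# → eed0d6e111bada5511fcce9584deadbeef73332e616d617a6f6e6177732e636f6d
```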

Connection limit policies

The proxy supports flexible connection limit rules. It's possible to limit the number of connections from a single IP, to a single fake-TLS domain or to a single port name, or any combination of these. It also supports whitelists and blacklists: you can allow or forbid connections from certain IPs or IP subnets, or with certain TLS domains.

The policy is set as the value of the policy config key; the value is a list of policy structures. If the list is empty, no limits are checked.

The following policies are supported:

  • {in_table, KEY, TABLE_NAME} - only allow connections if KEY is present in TABLE_NAME (whitelist)
  • {not_in_table, KEY, TABLE_NAME} - only allow connections if KEY is not present in TABLE_NAME (blacklist)
  • {max_connections, KEYS, NUMBER} - EXPERIMENTAL! If there are more than NUMBER connections with KEYS to the proxy, new connections with those KEYS will be rejected. Note: the number of connections is not the same as the number of unique "users". When someone connects to the proxy with a Telegram client, Telegram opens from 3 to 8 connections! So you need to set this to at least 8 × the number of unique users.

Where:

  • KEY is one of:
    • port_name - proxy port name
    • client_ipv4 - client's IPv4 address; ignored on IPv6 ports!
    • client_ipv6 - client's IPv6 address; ignored on IPv4 ports!
    • {client_ipv4_subnet, MASK} - client's IPv4 subnet; mask is from 8 to 32
    • {client_ipv6_subnet, MASK} - client's IPv6 subnet; mask is from 32 to 128
    • tls_domain - lowercase domain name from fake-TLS secret; ignored if connection with non-fake-TLS protocol
  • KEYS is a list of one or more KEY, eg, [port_name, tls_domain]
  • TABLE_NAME is a free-form name of a special internal database table, eg, my_table. Tables are created automatically when the proxy starts; data in tables is not preserved across proxy restarts! You can add or remove values dynamically at any moment with commands like:
    • /opt/mtp_proxy/bin/mtp_proxy eval 'mtp_policy_table:add(my_table, tls_domain, "google.com").' to add
    • /opt/mtp_proxy/bin/mtp_proxy eval 'mtp_policy_table:del(my_table, tls_domain, "google.com").' to remove

Some policy recipes / examples are given below.

Limit max connections to proxy port from single IP

Here we allow a maximum of 100 concurrent connections from a single IP to a proxy port (as noted earlier, this is not the same as 100 unique "users"! Each Telegram client opens up to 8 connections; usually 3):

{mtproto_proxy,
 [
  {policy,
    [{max_connections, [port_name, client_ipv4], 100}]},
  {ports,
    <..>

Disallow connections from some IPs

{mtproto_proxy,
 [
   {policy,
     [{not_in_table, client_ipv4, ip_blacklist}]},
   {ports,
     <..>

And then add IPs to the blacklist with the command:

/opt/mtp_proxy/bin/mtp_proxy eval '
mtp_policy_table:add(ip_blacklist, client_ipv4, "203.0.113.1").'

Remove from blacklist:

/opt/mtp_proxy/bin/mtp_proxy eval '
mtp_policy_table:del(ip_blacklist, client_ipv4, "203.0.113.1").'

Personal proxy / multi-secret proxy

We can limit the number of connections with a single fake-TLS domain and only allow connections with fake-TLS domains from a whitelist.

{mtproto_proxy,
 [
   {policy,
     [{max_connections, [port_name, tls_domain], 15},
      {in_table, tls_domain, customer_domains}]},
   {ports,
     <..>

Now we can assign each customer a unique fake-TLS domain, eg, my-client1.example.com, and give them a unique TLS secret. Because we only allow 15 connections per fake-TLS secret, they will not be able to share their credentials with others. To add a client's fake domain to the whitelist:

/opt/mtp_proxy/bin/mtp_proxy eval '
mtp_policy_table:add(customer_domains, tls_domain, "my-client1.example.com").'

And then use http://seriyps.ru/mtpgen.html to generate a unique link for them. Be aware that the domains table is reset when the proxy restarts! Make sure you re-add the domains on restart (eg, via a systemd hook script).
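A minimal sketch of such a hook script (the domain list and the script itself are hypothetical; it only prints the eval commands so you can review them, eg before wiring the script into an ExecStartPost= drop-in):

```shell
# Hypothetical whitelist; replace with your customers' fake-TLS domains.
domains="my-client1.example.com my-client2.example.com"
for d in $domains; do
  cmd="/opt/mtp_proxy/bin/mtp_proxy eval 'mtp_policy_table:add(customer_domains, tls_domain, \"$d\").'"
  echo "$cmd"   # pipe this output to sh to actually run the commands
done
```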

IPv6

Currently the proxy supports client connections over IPv6, but it can only connect to Telegram servers over IPv4.

To enable IPv6, put an IPv6 address in the listen_ip config key. If you want the proxy to accept clients on the same port over both IPv4 and IPv6, you should have 2 ports sections with the same port, secret and tag, but with different names and different listen_ip values (one v4 and one v6):

 {mtproto_proxy,
  %% see src/mtproto_proxy.app.src for examples.
  [
   {ports,
    [#{name => mtp_handler_all_ipv4,
       listen_ip => "0.0.0.0",  % IPv4 address, eg 203.0.113.1
       port => 1443,
       secret => <<"d0d6e111bada5511fcce9584deadbeef">>,
       tag => <<"dcbe8f1493fa4cd9ab300891c0b5b326">>},
     #{name => mtp_handler_all_ipv6,
       listen_ip => "::",  % IPv6 address, eg "2001:db8:85a3::8a2e:370:7334"
       port => 1443,
       secret => <<"d0d6e111bada5511fcce9584deadbeef">>,
       tag => <<"dcbe8f1493fa4cd9ab300891c0b5b326">>}
    ]}
   ]},

 {lager,
<...>

Tune resource consumption

If your server has a low amount of RAM, try setting:

{upstream_socket_buffer_size, 5120},
{downstream_socket_buffer_size, 51200},
{replay_check_session_storage, off},
{init_timeout_sec, 10},
{hibernate_timeout_sec, 30},
{ready_timeout_sec, 120},  % close connection after 2min of inactivity

This may make the proxy slower and consume a bit more CPU, and it will be vulnerable to replay attacks, but it will use less RAM. You should also avoid the max_connections policy because it uses RAM to track connections.

If your server has lots of RAM, you can make the proxy faster (users will get higher upload/download speeds); it will use less CPU and be better protected from replay attacks, but it will use more RAM:

{max_connections, 128000},
{upstream_socket_buffer_size, 20480},
{downstream_socket_buffer_size, 512000},
{replay_check_session_storage, on},
{replay_check_session_storage_opts,
  #{max_memory_mb => 2048,
    max_age_minutes => 1440}},

One more option to decrease CPU usage is to disable the CRC32 checksum check:

{mtp_full_check_crc32, false},

Also, for high-load setups it's recommended to increase some sysctl parameters:

sudo sysctl net.ipv4.tcp_max_orphans=128000
sudo sysctl 'net.ipv4.tcp_mem=179200 256000 384000'

Values for tcp_mem are in pages. The size of one page can be found with getconf PAGESIZE and is most likely 4096 bytes.
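For example, to turn a byte budget into a page count for the third tcp_mem value (the 1 GiB budget here is just an illustration):

```shell
page=$(getconf PAGESIZE)        # commonly 4096 bytes
budget=$((1024 * 1024 * 1024))  # e.g. allow TCP to use up to 1 GiB
pages=$((budget / page))
echo "$pages"                   # with 4096-byte pages this prints 262144
```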

If you installed the proxy via Docker or use NAT firewall settings, you may also want to increase the netfilter conntrack limit to at least the maximum number of connections you expect:

sudo sysctl net.netfilter.nf_conntrack_max=128000

Helpers

Number of connections

/opt/mtp_proxy/bin/mtp_proxy eval 'lists:sum([proplists:get_value(all_connections, L) || {_, L} <- ranch:info()]).'

mtproto_proxy's People

Contributors

kianmeng, seriyps, t1me


mtproto_proxy's Issues

Problems uploading pictures

Hello,
I installed the fake-tls branch and everything works fine, except that with fake-TLS I cannot upload photos (but I can with secure mode, dd).
Also, the alexbers source works correctly.

limit on secret

How can I limit connections per secret?
Or is it possible to see the connection count per secret in the logs?

Explicitly use ipv4 when querying ip_lookup_services

Right now the proxy doesn't specify which IP family should be used when connecting to IP lookup services:

{ok, {{_, 200, _}, _, Body}} =
    httpc:request(get, {Url, Headers}, [{timeout, 3000}], []),

Because of that, if:

  1. The proxy host supports IPv6
  2. The proxy host's DNS is configured such that AAAA DNS records have higher priority
  3. The IP lookup service has both A and AAAA records for its URL

the proxy may connect to the IP lookup service over IPv6 and get an IPv6 address back. Right now the proxy doesn't support IPv6 for backend connections, so it will fail to connect to Telegram servers.

It should be possible to do it like this:

httpc:request(get, {"http://v6.ident.me/", []}, [{timeout, 3000}], [{socket_opts,[{ipfamily, inet6}]}]).
httpc:request(get, {"http://v4.ident.me/", []}, [{timeout, 3000}], [{socket_opts,[{ipfamily, inet}]}]).

but this option seems broken in OTP below 20.3.4: erlang/otp@2dc08b4#diff-466315c4bb1fe3a262041717e0f9c37f

What is "storage cleaned"?

Hi,
At regular intervals I receive a message like this in the log:
[info] <0.633.0>@mtp_session_storage:handle_info:112 storage cleaned: []; remaining: 3129
I got suspicious of it because the remaining number only goes up and never decreases.
Also, the [] seems to show an empty array!
Can you please explain what exactly it does, and whether this behavior is normal?

Failed to start mtproto-proxy.service: Unit epmd.service not found.

Ubuntu 16.04 ships Erlang 17 by default, so I installed Erlang from binaries.

Result:

Failed to start mtproto-proxy.service: Unit epmd.service not found.

Complete installation log:

# cd /opt
# wget https://packages.erlang-solutions.com/erlang/esl-erlang/FLAVOUR_1_general/esl-erlang_21.2.4-1~ubuntu~xenial_amd64.deb 

# dpkg -i esl-erlang_21.2.4-1\~ubuntu\~xenial_amd64.deb

Selecting previously unselected package esl-erlang.
(Reading database ... 165794 files and directories currently installed.)
Preparing to unpack esl-erlang_21.2.4-1~ubuntu~xenial_amd64.deb ...
Unpacking esl-erlang (1:21.2.4-1) ...
dpkg: dependency problems prevent configuration of esl-erlang:
 esl-erlang depends on libwxbase2.8-0 | libwxbase3.0-0 | libwxbase3.0-0v5; however:
  Package libwxbase2.8-0 is not installed.
  Package libwxbase3.0-0 is not installed.
  Package libwxbase3.0-0v5 is not installed.
 esl-erlang depends on libwxgtk2.8-0 | libwxgtk3.0-0 | libwxgtk3.0-0v5; however:
  Package libwxgtk2.8-0 is not installed.
  Package libwxgtk3.0-0 is not installed.
  Package libwxgtk3.0-0v5 is not installed.
 esl-erlang depends on libsctp1; however:
  Package libsctp1 is not installed.

dpkg: error processing package esl-erlang (--install):
 dependency problems - leaving unconfigured
Errors were encountered while processing:
 esl-erlang

# apt install -f

Reading package lists... Done
Building dependency tree       
Reading state information... Done
Correcting dependencies... Done
The following additional packages will be installed:
  libdrm-amdgpu1 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libgl1-mesa-dri libgl1-mesa-glx libglapi-mesa libllvm6.0 libnotify4
  libpciaccess0 libsctp1 libsensors4 libwxbase3.0-0v5 libwxgtk3.0-0v5 libx11-xcb1 libxcb-dri2-0 libxcb-dri3-0 libxcb-glx0 libxcb-present0
  libxcb-sync1 libxshmfence1 libxxf86vm1
Suggested packages:
  lksctp-tools lm-sensors
Recommended packages:
  libtxc-dxtn-s2tc | libtxc-dxtn-s2tc0 | libtxc-dxtn0 notification-daemon
The following NEW packages will be installed:
  libdrm-amdgpu1 libdrm-intel1 libdrm-nouveau2 libdrm-radeon1 libgl1-mesa-dri libgl1-mesa-glx libglapi-mesa libllvm6.0 libnotify4
  libpciaccess0 libsctp1 libsensors4 libwxbase3.0-0v5 libwxgtk3.0-0v5 libx11-xcb1 libxcb-dri2-0 libxcb-dri3-0 libxcb-glx0 libxcb-present0
  libxcb-sync1 libxshmfence1 libxxf86vm1
0 upgraded, 22 newly installed, 0 to remove and 0 not upgraded.
1 not fully installed or removed.
Need to get 26.1 MB of archives.
After this operation, 226 MB of additional disk space will be used.
Do you want to continue? [Y/n] 
...

# dpkg -i esl-erlang_21.2.4-1\~ubuntu\~xenial_amd64.deb
(Reading database ... 170229 files and directories currently installed.)
Preparing to unpack esl-erlang_21.2.4-1~ubuntu~xenial_amd64.deb ...
Unpacking esl-erlang (1:21.2.4-1) over (1:21.2.4-1) ...
Setting up esl-erlang (1:21.2.4-1) ...

# erl --version
Erlang/OTP 21 [erts-10.2.3] [source] [64-bit] [smp:1:1] [ds:1:1:10] [async-threads:1] [hipe]

Eshell V10.2.3  (abort with ^G)

# git clone https://github.com/seriyps/mtproto_proxy.git
# cd mtproto_proxy/
# cp config/vm.args.example config/prod-vm.args
# cp config/sys.config.example config/prod-sys.config
# nano config/prod-sys.config 

# make && sudo make install

# sudo systemctl start mtproto-proxy

Failed to start mtproto-proxy.service: Unit epmd.service not found.

Originally posted by @KarelWintersky in #4 (comment)

process crash on Debian 9

"Kernel pid terminated",application_controller,"{application_start_failure,mtproto_proxy,{{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;

Fake TLS Client

How can a client connect with the fake-TLS protocol enabled? Everything works fine, but when I only allow the fake-TLS protocol, no version of Telegram can connect; I have tested it even on beta versions.

Unknown upstream & upstream sock error: etimedout

I have set up a new proxy and used it for a few minutes of testing, and I can already see some errors like these in application.log:

[info] <0.623.0>@mtp_down_conn:handle_upstream_closed:240 Unknown upstream <0.721.0>
[warning] <0.719.0>@mtp_handler:handle_info:195 upstream sock error: etimedout

When it says upstream, does it mean connections to Telegram proxy servers?
Is this normal?

Crash on Debian 9 and Debian 10 (tried both)

Exec: /opt/mtp_proxy/erts-10.3.5.6/bin/erlexec -noshell -noinput +Bd -boot /opt/mtp_proxy/releases/0.1.0/mtp_proxy -mode embedded -boot_var ERTS_LIB_DIR /opt/mtp_proxy/lib -config /opt/mtp_proxy/releases/0.1.0/sys.config -args_file /opt/mtp_proxy/releases/0.1.0/vm.args -- foreground -mtproto_proxy allowed_protocols [mtp_fake_tls,mtp_secure] -mtproto_proxy ports [#{name => mtproto_proxy, port => 5555, secret => <<"5e0f1916a79433f3bc26cca1799dff63">>, tag => <<"837e719d9d328a07105dad7997367a46">>}]
Root: /opt/mtp_proxy
/opt/mtp_proxy
=SUPERVISOR REPORT==== 17-Dec-2019::16:51:21.359214 ===
supervisor: {local,mtproto_proxy_sup}
errorContext: start_error
reason: {{badmatch,
{error,
{failed_connect,
[{to_address,{"core.telegram.org",443}},
{inet,[inet],timeout}]}}},
[{mtp_config,http_get,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,277}]},
{mtp_config,update_key,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,192}]},
{mtp_config,update,2,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,181}]},
{mtp_config,init,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,144}]},
{gen_server,init_it,2,[{file,"gen_server.erl"},{line,374}]},
{gen_server,init_it,6,[{file,"gen_server.erl"},{line,342}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,249}]}]}
offender: [{pid,undefined},
{id,mtp_config},
{mfargs,{mtp_config,start_link,[]}},
{restart_type,permanent},
{shutdown,5000},
{child_type,worker}]
=CRASH REPORT==== 17-Dec-2019::16:51:21.358782 ===
crasher:
initial call: mtp_config:init/1
pid: <0.587.0>
registered_name: []
exception error: no match of right hand side value
{error,
{failed_connect,
[{to_address,{"core.telegram.org",443}},
{inet,[inet],timeout}]}}
in function mtp_config:http_get/1 (/build/mtproto_proxy/src/mtp_config.erl, line 277)
in call from mtp_config:update_key/1 (/build/mtproto_proxy/src/mtp_config.erl, line 192)
in call from mtp_config:update/2 (/build/mtproto_proxy/src/mtp_config.erl, line 181)
in call from mtp_config:init/1 (/build/mtproto_proxy/src/mtp_config.erl, line 144)
in call from gen_server:init_it/2 (gen_server.erl, line 374)
in call from gen_server:init_it/6 (gen_server.erl, line 342)
ancestors: [mtproto_proxy_sup,<0.583.0>]
message_queue_len: 0
messages: []
links: [<0.584.0>]
dictionary: []
trap_exit: false
status: running
heap_size: 4185
stack_size: 27
reductions: 22919
neighbours:

=CRASH REPORT==== 17-Dec-2019::16:51:21.366260 ===
crasher:
initial call: application_master:init/4
pid: <0.582.0>
registered_name: []
exception exit: {bad_return,
{{mtproto_proxy_app,start,[normal,[]]},
{'EXIT',
{{badmatch,
{error,
{shutdown,
{failed_to_start_child,mtp_config,
{{badmatch,
{error,
{failed_connect,
[{to_address,{"core.telegram.org",443}},
{inet,[inet],timeout}]}}},
[{mtp_config,http_get,1,
[{file,
"/build/mtproto_proxy/src/mtp_config.erl"},
{line,277}]},
{mtp_config,update_key,1,
[{file,
"/build/mtproto_proxy/src/mtp_config.erl"},
{line,192}]},
{mtp_config,update,2,
[{file,
"/build/mtproto_proxy/src/mtp_config.erl"},
{line,181}]},
{mtp_config,init,1,
[{file,
"/build/mtproto_proxy/src/mtp_config.erl"},
{line,144}]},
{gen_server,init_it,2,
[{file,"gen_server.erl"},{line,374}]},
{gen_server,init_it,6,
[{file,"gen_server.erl"},{line,342}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,249}]}]}}}}},
[{mtproto_proxy_app,start,2,
[{file,
"/build/mtproto_proxy/src/mtproto_proxy_app.erl"},
{line,34}]},
{application_master,start_it_old,4,
[{file,"application_master.erl"},{line,277}]}]}}}}
in function application_master:init/4 (application_master.erl, line 138)
ancestors: [<0.581.0>]
message_queue_len: 1
messages: [{'EXIT',<0.583.0>,normal}]
links: [<0.581.0>,<0.479.0>]
dictionary: []
trap_exit: true
status: running
heap_size: 1598
stack_size: 27
reductions: 275
neighbours:

=INFO REPORT==== 17-Dec-2019::16:51:21.370606 ===
application: mtproto_proxy
exited: {bad_return,
{{mtproto_proxy_app,start,[normal,[]]},
{'EXIT',
{{badmatch,
{error,
{shutdown,
{failed_to_start_child,mtp_config,
{{badmatch,
{error,
{failed_connect,
[{to_address,{"core.telegram.org",443}},
{inet,[inet],timeout}]}}},
[{mtp_config,http_get,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,277}]},
{mtp_config,update_key,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,192}]},
{mtp_config,update,2,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,181}]},
{mtp_config,init,1,
[{file,"/build/mtproto_proxy/src/mtp_config.erl"},
{line,144}]},
{gen_server,init_it,2,
[{file,"gen_server.erl"},{line,374}]},
{gen_server,init_it,6,
[{file,"gen_server.erl"},{line,342}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,249}]}]}}}}},
[{mtproto_proxy_app,start,2,
[{file,"/build/mtproto_proxy/src/mtproto_proxy_app.erl"},
{line,34}]},
{application_master,start_it_old,4,
[{file,"application_master.erl"},{line,277}]}]}}}}
type: permanent
=SUPERVISOR REPORT==== 17-Dec-2019::16:51:21.410506 ===
supervisor: {local,gr_counter_sup}
errorContext: child_terminated
reason: killed
offender: [{pid,<0.550.0>},
{id,gr_lager_default_tracer_counters},
{mfargs,{gr_counter,start_link,
[gr_lager_default_tracer_counters]}},
{restart_type,transient},
{shutdown,brutal_kill},
{child_type,worker}]
=SUPERVISOR REPORT==== 17-Dec-2019::16:51:21.410904 ===
supervisor: {local,gr_param_sup}
errorContext: child_terminated
reason: killed
offender: [{pid,<0.549.0>},
{id,gr_lager_default_tracer_params},
{mfargs,{gr_param,start_link,[gr_lager_default_tracer_params]}},
{restart_type,transient},
{shutdown,brutal_kill},
{child_type,worker}]
{"Kernel pid terminated",application_controller,"{application_start_failure,mtproto_proxy,{bad_return,{{mtproto_proxy_app,start,[normal,[]]},{'EXIT',{{badmatch,{error,{shutdown,{failed_to_start_child,mtp_config,{{badmatch,{error,{failed_connect,[{to_address,{"core.telegram.org",443}},{inet,[inet],timeout}]}}},[{mtp_config,http_get,1,[{file,"/build/mtproto_proxy/src/mtp_config.erl"},{line,277}]},{mtp_config,update_key,1,[{file,"/build/mtproto_proxy/src/mtp_config.erl"},{line,192}]},{mtp_config,update,2,[{file,"/build/mtproto_proxy/src/mtp_config.erl"},{line,181}]},{mtp_config,init,1,[{file,"/build/mtproto_proxy/src/mtp_config.erl"},{line,144}]},{gen_server,init_it,2,[{file,"gen_server.erl"},{line,374}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,342}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,249}]}]}}}}},[{mtproto_proxy_app,start,2,[{file,"/build/mtproto_proxy/src/mtproto_proxy_app.erl"},{line,34}]},{application_master,start_it_old,4,[{file,"application_master.erl"},{line,277}]}]}}}}}"}
Kernel pid terminated (application_controller) ({application_start_failure,mtproto_proxy,{bad_return,{{mtproto_proxy_app,start,[normal,[]]},{'EXIT',{{badmatch,{error,{shutdown,{failed_to_start_child,m

Crash dump is being written to: erl_crash.dump...done

fake tls for ipv6

Hey, I edited the config file and put in my IPv6 address, but I cannot connect to the proxy over IPv6 (IPv4 is working).
How can I create fake TLS for IPv6?

Server does not start

When deploying a new server, a problem arises that I have not encountered before.

I provide the logs below:

crash.log 2019-01-18 01:19:38 =CRASH REPORT==== crasher: initial call: mtp_config:init/1 pid: <0.532.0> registered_name: [] exception exit: {{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.37:80;\nproxy_for 2 149.154.162.31:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.213:8888;\nproxy_for 4 91.108.4.166:8888;\nproxy_for 4 91.108.4.177:8888;\nproxy_for 4 91.108.4.214:8888;\nproxy_for 4 91.108.4.219:8888;\nproxy_for 4 91.108.4.189:8888;\nproxy_for 4 91.108.4.157:8888;\nproxy_for 4 91.108.4.152:8888;\nproxy_for 4 91.108.4.215:8888;\nproxy_for 4 91.108.4.164:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for 5 91.108.56.180:8888;\nproxy_for 5 91.108.56.107:8888;\nproxy_for -5 91.108.56.180:8888;\nproxy_for -5 91.108.56.107:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]},[{gen_server,init_it,6,[{file,"gen_server.erl"},{line,352}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]} ancestors: [mtproto_proxy_sup,<0.530.0>] messages: [] links: [<0.531.0>] dictionary: [] trap_exit: false status: running heap_size: 4185 stack_size: 27 reductions: 1118 neighbours: 2019-01-18 01:19:38 =SUPERVISOR 
REPORT==== Supervisor: {local,mtproto_proxy_sup} Context: start_error Reason: {undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.37:80;\nproxy_for 2 149.154.162.31:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.213:8888;\nproxy_for 4 91.108.4.166:8888;\nproxy_for 4 91.108.4.177:8888;\nproxy_for 4 91.108.4.214:8888;\nproxy_for 4 91.108.4.219:8888;\nproxy_for 4 91.108.4.189:8888;\nproxy_for 4 91.108.4.157:8888;\nproxy_for 4 91.108.4.152:8888;\nproxy_for 4 91.108.4.215:8888;\nproxy_for 4 91.108.4.164:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for 5 91.108.56.180:8888;\nproxy_for 5 91.108.56.107:8888;\nproxy_for -5 91.108.56.180:8888;\nproxy_for -5 91.108.56.107:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]} Offender: [{pid,undefined},{id,mtp_config},{mfargs,{mtp_config,start_link,[]}},{restart_type,permanent},{shutdown,5000},{child_type,worker}]

[the same =CRASH REPORT==== / =SUPERVISOR REPORT==== pair repeats every 2-3 seconds from 01:19:41 through 01:20:01 as mtproto_proxy_sup keeps restarting mtp_config; only the timestamps and the Telegram DC addresses in the downloaded proxy_for config differ]

2019-01-18 01:20:03 =CRASH REPORT====
crasher:
initial call: mtp_config:init/1
pid: <0.532.0>
registered_name: []
exception exit: {{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.21:80;\nproxy_for 2 149.154.162.25:80;\nproxy_for -2 149.154.162.21:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.164:8888;\nproxy_for 4 91.108.4.189:8888;\nproxy_for 4 91.108.4.136:8888;\nproxy_for 4 91.108.4.151:8888;\nproxy_for 4 91.108.4.130:8888;\nproxy_for 4 91.108.4.214:8888;\nproxy_for 4 91.108.4.135:8888;\nproxy_for 4 91.108.4.191:8888;\nproxy_for 4 91.108.4.212:8888;\nproxy_for 4 91.108.4.140:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for 5 91.108.56.196:8888;\nproxy_for 5 91.108.56.133:8888;\nproxy_for -5 91.108.56.196:8888;\nproxy_for -5 91.108.56.133:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]},[{gen_server,init_it,6,[{file,"gen_server.erl"},{line,352}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}
ancestors: [mtproto_proxy_sup,<0.530.0>]
messages: []
links: [<0.531.0>]
dictionary: []
trap_exit: false
status: running
heap_size: 4185
stack_size: 27
reductions: 1118
neighbours:
2019-01-18 01:20:03 =SUPERVISOR REPORT====
Supervisor: {local,mtproto_proxy_sup}
Context: start_error
Reason: {undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.21:80;\nproxy_for 2 149.154.162.25:80;\nproxy_for -2 149.154.162.21:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.164:8888;\nproxy_for 4 91.108.4.189:8888;\nproxy_for 4 91.108.4.136:8888;\nproxy_for 4 91.108.4.151:8888;\nproxy_for 4 91.108.4.130:8888;\nproxy_for 4 91.108.4.214:8888;\nproxy_for 4 91.108.4.135:8888;\nproxy_for 4 91.108.4.191:8888;\nproxy_for 4 91.108.4.212:8888;\nproxy_for 4 91.108.4.140:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for 5 91.108.56.196:8888;\nproxy_for 5 91.108.56.133:8888;\nproxy_for -5 91.108.56.196:8888;\nproxy_for -5 91.108.56.133:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}
Offender: [{pid,undefined},{id,mtp_config},{mfargs,{mtp_config,start_link,[]}},{restart_type,permanent},{shutdown,5000},{child_type,worker}]

2019-01-18 01:20:06 =CRASH REPORT====
crasher:
initial call: mtp_config:init/1
pid: <0.532.0>
registered_name: []
exception exit: {{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 149.154.162.23:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.23:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.184:8888;\nproxy_for 4 91.108.4.149:8888;\nproxy_for 4 91.108.4.201:8888;\nproxy_for 4 91.108.4.179:8888;\nproxy_for 4 91.108.4.202:8888;\nproxy_for 4 91.108.4.195:8888;\nproxy_for 4 91.108.4.162:8888;\nproxy_for 4 91.108.4.211:8888;\nproxy_for 4 91.108.4.205:8888;\nproxy_for 4 91.108.4.187:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for 5 91.108.56.183:8888;\nproxy_for 5 91.108.56.136:8888;\nproxy_for -5 91.108.56.183:8888;\nproxy_for -5 91.108.56.136:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]},[{gen_server,init_it,6,[{file,"gen_server.erl"},{line,352}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}
ancestors: [mtproto_proxy_sup,<0.530.0>]
messages: []
links: [<0.531.0>]
dictionary: []
trap_exit: false
status: running
heap_size: 4185
stack_size: 27
reductions: 1118
neighbours:
2019-01-18 01:20:06 =SUPERVISOR REPORT====
Supervisor: {local,mtproto_proxy_sup}
Context: start_error
Reason: {undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 149.154.162.23:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.23:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.184:8888;\nproxy_for 4 91.108.4.149:8888;\nproxy_for 4 91.108.4.201:8888;\nproxy_for 4 91.108.4.179:8888;\nproxy_for 4 91.108.4.202:8888;\nproxy_for 4 91.108.4.195:8888;\nproxy_for 4 91.108.4.162:8888;\nproxy_for 4 91.108.4.211:8888;\nproxy_for 4 91.108.4.205:8888;\nproxy_for 4 91.108.4.187:8888;\nproxy_for -4 149.154.165.250:8888;\nproxy_for -4 149.154.165.109:8888;\nproxy_for 5 91.108.56.183:8888;\nproxy_for 5 91.108.56.136:8888;\nproxy_for -5 91.108.56.183:8888;\nproxy_for -5 91.108.56.136:8888;\n","\n"],[]},{mtp_config,parse_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,138}]},{mtp_config,update_config,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,132}]},{mtp_config,update,2,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,116}]},{mtp_config,init,1,[{file,"/root/mtproto_proxy/mtproto_proxy/_build/prod/lib/mtproto_proxy/src/mtp_config.erl"},{line,83}]},{gen_server,init_it,6,[{file,"gen_server.erl"},{line,328}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}
Offender: [{pid,undefined},{id,mtp_config},{mfargs,{mtp_config,start_link,[]}},{restart_type,permanent},{shutdown,5000},{child_type,worker}]

application.log 2019-01-18 01:19:37.949 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:38.179 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:38.179 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:38.187 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.37:80;\nproxy_for 2 149.154.162.31:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.213:8888;\nproxy_for 4 91.108.4.166:8888;\nproxy_for 4 91.108.4.177:8888;\nproxy_for 4 91.108.4.214:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:19:38.188 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.37:80;\nproxy_for 2 149.154.162.31:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.213:8888;\nproxy_for 4 91.108.4.166:8888;\nproxy_for 4 
91.108.4.177:8888;\nproxy_for 4 91.108.4.214:8888;...",...],...},...]}}},...} 2019-01-18 01:19:40.821 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:41.053 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:41.054 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:41.059 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.30:80;\nproxy_for 2 149.154.162.22:80;\nproxy_for -2 149.154.162.30:80;\nproxy_for -2 149.154.162.22:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.152:8888;\nproxy_for 4 91.108.4.188:8888;\nproxy_for 4 91.108.4.162:8888;\nproxy_for 4 91.108.4.215:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:19:41.060 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.30:80;\nproxy_for 2 149.154.162.22:80;\nproxy_for -2 149.154.162.30:80;\nproxy_for -2 149.154.162.22:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 
91.108.4.152:8888;\nproxy_for 4 91.108.4.188:8888;\nproxy_for 4 91.108.4.162:8888;\nproxy_for 4 91.108.4.215:8888;...",...],...},...]}}},...} 2019-01-18 01:19:43.541 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:43.770 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:43.770 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:45.943 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:46.171 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:46.172 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:46.179 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.31:80;\nproxy_for 2 
149.154.162.34:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for -2 149.154.162.34:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.200:8888;\nproxy_for 4 91.108.4.194:8888;\nproxy_for 4 91.108.4.225:8888;\nproxy_for 4 91.108.4.205:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:19:46.180 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.31:80;\nproxy_for 2 149.154.162.34:80;\nproxy_for -2 149.154.162.31:80;\nproxy_for -2 149.154.162.34:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.200:8888;\nproxy_for 4 91.108.4.194:8888;\nproxy_for 4 91.108.4.225:8888;\nproxy_for 4 91.108.4.205:8888;...",...],...},...]}}},...} 2019-01-18 01:19:48.446 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:48.671 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:48.672 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:50.947 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:51.172 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# 
force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:51.172 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:51.182 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.38:80;\nproxy_for 2 149.154.162.36:80;\nproxy_for -2 149.154.162.38:80;\nproxy_for -2 149.154.162.36:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.169:8888;\nproxy_for 4 91.108.4.178:8888;\nproxy_for 4 91.108.4.199:8888;\nproxy_for 4 91.108.4.140:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:19:51.183 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.38:80;\nproxy_for 2 149.154.162.36:80;\nproxy_for -2 149.154.162.38:80;\nproxy_for -2 149.154.162.36:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.169:8888;\nproxy_for 4 91.108.4.178:8888;\nproxy_for 4 91.108.4.199:8888;\nproxy_for 4 91.108.4.140:8888;...",...],...},...]}}},...} 2019-01-18 01:19:53.443 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:53.680 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours 
exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:53.681 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:55.953 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:56.183 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:56.183 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:19:56.187 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.26:80;\nproxy_for 2 149.154.162.30:80;\nproxy_for -2 149.154.162.26:80;\nproxy_for -2 149.154.162.30:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.135:8888;\nproxy_for 4 91.108.4.187:8888;\nproxy_for 4 91.108.4.170:8888;\nproxy_for 4 91.108.4.197:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 
2019-01-18 01:19:56.188 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.26:80;\nproxy_for 2 149.154.162.30:80;\nproxy_for -2 149.154.162.26:80;\nproxy_for -2 149.154.162.30:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.135:8888;\nproxy_for 4 91.108.4.187:8888;\nproxy_for 4 91.108.4.170:8888;\nproxy_for 4 91.108.4.197:8888;...",...],...},...]}}},...} 2019-01-18 01:19:58.461 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:19:58.691 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:19:58.691 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:20:00.934 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:20:01.162 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:20:01.163 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function 
string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:20:01.171 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 149.154.162.37:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.193:8888;\nproxy_for 4 91.108.4.190:8888;\nproxy_for 4 91.108.4.179:8888;\nproxy_for 4 91.108.4.221:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:20:01.172 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 149.154.162.37:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.37:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.193:8888;\nproxy_for 4 91.108.4.190:8888;\nproxy_for 4 91.108.4.179:8888;\nproxy_for 4 91.108.4.221:8888;...",...],...},...]}}},...} 2019-01-18 01:20:03.472 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:20:03.701 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:20:03.701 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with 
mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:20:05.939 [info] <0.442.0> Application lager started on node '[email protected]' 2019-01-18 01:20:06.169 [error] <0.532.0>@string:lexemes CRASH REPORT Process <0.532.0> with 0 neighbours exited with reason: call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in gen_server:init_it/6 line 352 2019-01-18 01:20:06.170 [error] <0.531.0>@string:lexemes Supervisor mtproto_proxy_sup had child mtp_config started with mtp_config:start_link() at undefined exit with reason call to undefined function string:lexemes("# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175....", "\n") in context start_error 2019-01-18 01:20:06.180 [error] <0.529.0> CRASH REPORT Process <0.529.0> with 0 neighbours exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 149.154.162.23:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.23:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.184:8888;\nproxy_for 4 91.108.4.149:8888;\nproxy_for 4 91.108.4.201:8888;\nproxy_for 4 91.108.4.179:8888;...",...],...},...]}}},...} in application_master:init/4 line 134 2019-01-18 01:20:06.181 [info] <0.442.0> Application mtproto_proxy exited with reason: {{shutdown,{failed_to_start_child,mtp_config,{undef,[{string,lexemes,["# force_probability 10 10\ndefault 2;\nproxy_for 1 149.154.175.50:8888;\nproxy_for -1 149.154.175.50:8888;\nproxy_for 2 149.154.162.25:80;\nproxy_for 2 
149.154.162.23:80;\nproxy_for -2 149.154.162.25:80;\nproxy_for -2 149.154.162.23:80;\nproxy_for 3 149.154.175.100:8888;\nproxy_for -3 149.154.175.100:8888;\nproxy_for 4 91.108.4.184:8888;\nproxy_for 4 91.108.4.149:8888;\nproxy_for 4 91.108.4.201:8888;\nproxy_for 4 91.108.4.179:8888;...",...],...},...]}}},...}

crypto:rand_uniform/2 is deprecated

===> Compiling mtproto_proxy
src/mtp_config.erl:49: Warning: crypto:rand_uniform/2 is deprecated and will be removed in a future release; use rand:uniform/1
src/mtp_config.erl:60: Warning: crypto:rand_uniform/2 is deprecated and will be removed in a future release; use rand:uniform/1
src/mtp_config.erl:123: Warning: erlang:get_stacktrace/0: deprecated; use the new try/catch syntax for retrieving the stack backtrace
src/mtp_config.erl:183: Warning: erlang:get_stacktrace/0: deprecated; use the new try/catch syntax for retrieving the stack backtrace
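As the warnings suggest, `crypto:rand_uniform/2` can be replaced with the `rand` module. A minimal drop-in sketch (the wrapper name `rand_uniform/2` is ours, not from the repo):

```erlang
%% crypto:rand_uniform(Lo, Hi) returned an integer N with Lo =< N < Hi.
%% rand:uniform(K) returns an integer in 1..K, so shift the range:
rand_uniform(Lo, Hi) when Hi > Lo ->
    Lo + rand:uniform(Hi - Lo) - 1.
```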

Erlang 21.2.4, installed via deb file

question about performance?

e.g., how many users can this handle with a 1-core CPU and 1 GB of RAM?
Is there any way to run it on a Windows server?

limit on users speed

Hi
Is there any way I can limit the speed of each connection? Or each user? Or set one overall maximum speed limit?

buffer sizes

In the "Tune resource consumption" section of the Readme, the following settings are recommended if you have a lot of RAM:

{upstream_socket_buffer_size, 20480},
{downstream_socket_buffer_size, 512000}

That is 20 KB and 512 KB.
In mtproto_proxy.app.src we have these settings (commented out):

%% {upstream_socket_buffer_size, 51200},   %50kb
%% {downstream_socket_buffer_size, 512000},   %500kb

So is the Readme recommending to reduce upstream_socket_buffer_size and leave downstream_socket_buffer_size unchanged if I have a lot of RAM?
Or are the defaults something else? I searched the code but couldn't find any other default values for the buffer sizes.
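For reference, the buffer sizes are ordinary keys in the `mtproto_proxy` application environment; a sketch of a `prod-sys.config` fragment using the Readme's suggested values (whether unset options simply fall back to OS socket defaults is our assumption, not verified against the code):

```erlang
[
 {mtproto_proxy,
  [
   %% bytes; client-side sockets can stay small
   {upstream_socket_buffer_size, 20480},     % 20 KB
   %% bytes; telegram-side sockets are multiplexed, so a bigger buffer helps
   {downstream_socket_buffer_size, 512000}   % 512 KB
  ]}
].
```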

change config file without reinstall script

Hello
you said :

There are other ways as well. It's even possible to update configuration options without service restart / without downtime, but it's a bit trickier.

Please explain this method.
Also, how can I limit or view the number of connections (or client IPs) on each port?
Can we use one port with multiple secrets?

thanks

User's limitation

Hello,

I use this setting in prod-sys.config but it doesn't block anything; there were 28 connections on my server with the same fake-TLS secret from my 4 devices:

%% -*- mode: erlang -*-
[
{mtproto_proxy,
%% see src/mtproto_proxy.app.src for examples.
[
{policy,
 [{max_connections, [port_name, client_ipv4], 6},
  {max_connections, [port_name, tls_domain], 6},
  {in_table, tls_domain, customer_domains}]},
{allowed_protocols, [mtp_fake_tls,mtp_secure]},
{ports,
[#{name => mtp_handler_1,
listen_ip => "0.0.0.0",
port => 7777,
secret => <<"432b9eadffb518f40901f25570c56ce3">>,
tag => <<"8b081275ec12abd306faeb2f13efbdcb">>}
]}
]},

Did I do something wrong?

Crash on launch on macOS

I'm trying to run mtproto_proxy in Docker on macOS Catalina as described in the instructions, and I'm getting an error. Please see the crash report in the attachment.

mtproto_proxy.log

Failed to set ntp: NTP not supported.

Tried to install on CentOS 7 using the interactive script. The script was failing with Failed to set ntp: NTP not supported. and exiting.
Installing the ntp package resolved the issue: yum install ntp
ntp should be added to the prerequisite packages.

I have some problem

Hi, I have a problem with the proxy.
Do you have a WhatsApp or Telegram ID for chat?

problem in start service

Hey, when I run sudo systemctl start mtproto-proxy I get this problem:
Warning: The unit file, source configuration file or drop-ins of mtproto-proxy.service changed on disk. Run 'systemctl daemon-reload' to reload units.
and when I run systemctl daemon-reload the problem still exists.
Can you tell me how I can fix it?

TLS Fake

hi
The proxy link with Fake TLS (hash and base64) does not work on some Iranian ISPs; clients cannot connect,
right from the start, after configuring the server.
And the server itself is not blocked.
Where is the problem?

Re-create downstream connection if failed

{Pid, DsM1} ->
Pending1 = lists:delete(Pid, Pending),
Ds1 = ds_remove(Pid, Ds),
?log(error, "Downstream=~p is down. reason=~p", [Pid, Reason]),
St#state{pending_downstreams = Pending1,
downstreams = Ds1,
downstream_monitors = DsM1};

If dc_pool is notified about a failed downstream connection, it does not restart it.

Expected behaviour:

It should re-create the connection if the number of currently open connections is below init_dc_conns.
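A rough sketch of that expected behaviour, to be applied after the clause quoted above (the names `maybe_reconnect/1`, `ds_count/1` and `connect/1` are assumptions for illustration, not the repo's actual API):

```erlang
%% after removing the dead downstream, top the pool back up if it
%% dropped below the configured minimum number of connections
maybe_reconnect(#state{downstreams = Ds} = St) ->
    Min = application:get_env(mtproto_proxy, init_dc_connections, 2),
    case ds_count(Ds) < Min of
        true  -> connect(St);   %% open one replacement connection
        false -> St
    end.
```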

Tls

Fake TLS doesn't work with the first script (one-line interactive installer).

How to create multi port and multi secret proxies in fake tls ?

How can I create multi-port, multi-secret proxies with fake TLS? Just one server.

Sample from the official MTProxy source:

group1
cd ~/MTProxy/objs/bin && ./mtproto-proxy -u group1 -p 7777 -H 6160,1369,5183,1406,3730,9571,3670,8225,1227,2893,1526 -S 373020c2a6f1b0b6brtb2998b22b02cd8 -P deb769a5ee810d8dd732b2f79a4d69283 --aes-pwd proxy-secret proxy-multi.conf -M 1 -C 1000 -c 1000

group2

cd ~/MTProxy/objs/bin && ./mtproto-proxy -u group2 -p 7777 -H 5699,4962,7238,4127,2148,4669,5127,9938,1277,1666,5588,7825,1645,3822,3932,7053,1368,5773,8005,8297,7863,8747,6009 -S ccb5131df07469912c910df1d5321dbb -P deb769a5a810d8dd732bff79a4d69283 --aes-pwd proxy-secret proxy-multi.conf -M 1 -C 1000 -c 1000

Please help me.

best regards
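In mtproto_proxy, the equivalent of the two MTProxy command lines above is the `ports` option: per the Readme, each port gets its own secret and promo tag. A sketch (names, secrets and tags below are placeholders; note that each entry listens on its own port, and two secrets on one port is not something the Readme describes as supported):

```erlang
{ports,
 [#{name => mtp_handler_group1,
    port => 7777,
    secret => <<"00000000000000000000000000000001">>,  %% placeholder
    tag => <<"00000000000000000000000000000002">>},    %% placeholder
  #{name => mtp_handler_group2,
    port => 8888,
    secret => <<"00000000000000000000000000000003">>,
    tag => <<"00000000000000000000000000000004">>}
 ]}
```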

wrong proxy link in the log file when listen_ip is set

I have two IP addresses set on eth0 (x.x.x.x) and eth0:0 (y.y.y.y).
I have configured the proxy to listen on eth0:0: listen_ip => "y.y.y.y"
But in application.log it prints the proxy link as:
https://t.me/proxy?server=x.x.x.x&port=443&secret=aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
This link will not work because the proxy is not listening on x.x.x.x
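The printed link takes the server address from the detected external IP; the configs that appear elsewhere in these reports show an `external_ip` option that can pin it. A sketch, assuming that option controls the advertised address (secret and tag are placeholders):

```erlang
{mtproto_proxy,
 [{external_ip, "y.y.y.y"},      %% address to advertise in the t.me link
  {ports,
   [#{name => mtp_handler_1,
      listen_ip => "y.y.y.y",    %% address to bind
      port => 443,
      secret => <<"0123456789abcdef0123456789abcdef">>,
      tag => <<"0123456789abcdef0123456789abcdef">>}
   ]}
 ]}
```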

policy

Hello
How can I define a policy for each port name?
Thanks

Node is not running!

Hi
I tried to run the following command but it returns this error: Node is not running! What should I do to fix this? 👍

"/opt/mtp_proxy/bin/mtp_proxy eval 'lists:sum([proplists:get_value(all_connections, L) || {_, L} <- ranch:info()]).' "

Base64 Secret Problem

hi
On Apple phones, users can establish a connection through base64 secrets containing "/", but not on Android. Is it possible to achieve this on Android?
Actually, users can swap "_" and "/" to establish a connection, but I want them to connect using "/" directly.

error

===> Compiling src/mtp_down_conn.erl failed
src/mtp_down_conn.erl:37: syntax error before: ':='

src/mtp_down_conn.erl:24: type upstream_opts() undefined

Makefile:10: recipe for target 'all' failed
make: *** [all] Error 1
--------------------------------------
src/gen_timeout.erl:33: syntax error before: ':='

src/gen_timeout.erl:39: type opts() undefined

Makefile:10: recipe for target 'all' failed
make: *** [all] Error 1
-------------------------------------------

???
@seriyps

Connection issues

When I connect to the proxy, the server log records: protocol_error tls_invalid_digest.
The client secret I use is e.g. ee(+)d1d908dd913f9b15cacf3e89ec7de74ac2351d4435d8a0215f75e1b62eadfd45 (generated after container start).
How do I make a correct secret?
An old working secret was e.g. 9a1be70073b89885774e6617e9870ce0.
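For reference, a hex fake-TLS secret is conventionally the string "ee" followed by the 32 hex characters of the plain 16-byte secret and the hex encoding of the TLS domain. A sketch of building one in an Erlang shell; the secret and domain values are examples only, and `binary:encode_hex/1` requires OTP 24+:

```erlang
%% Build a hex fake-TLS ("ee") secret from a plain hex secret and the
%% domain the proxy is configured to mimic. Example values only.
Secret = <<"9a1be70073b89885774e6617e9870ce0">>,          % 32 hex chars
Domain = <<"s3.amazonaws.com">>,
HexDomain = string:lowercase(binary:encode_hex(Domain)),  % OTP 24+
TlsSecret = <<"ee", Secret/binary, HexDomain/binary>>.
```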

Problem on CentOS 7.6

Hello, I adapted your bash installer to work on CentOS, but only the "dd" secret works. I want to use fake TLS. What should I do?

make && make install won't work

I get this error when running "make && make install":

./rebar3 as prod release
===> Rebar dependency crypto could not be loaded for reason {"no such file or directory",
"crypto.app"}
Makefile:11: recipe for target 'all' failed
make: *** [all] Error 1

Extra random port listened on by the process

Hello. I'm not familiar with Erlang; I just configured and set up the proxy.
I did an OS install with a custom config.
Here is what I changed in the config:

[
 {mtproto_proxy,
  %% see src/mtproto_proxy.app.src for examples.
  %% DO NOT EDIT src/mtproto_proxy.app.src!!!
  [
    {external_ip, "172.26.1.1"},
    {allowed_protocols, [mtp_fake_tls]},
    {ports,
     [#{name => mtp_handler1,
        listen_ip => "172.26.1.1",
        port => 12343,
        secret => <<"64a7d74f4847a0249f88f316d6dd076b">>,
        tag => <<"f5f685f3e5bda7b62eaba0a4ace4d440">>}
     ]}
   ]}
].

Starting up with mtp_proxy start. Everything works fine, but some extra random port is listened on across all interfaces.

mtp_proxy eval '[{proplists:get_value(port, L), proplists:get_value(all_connections, L)} || {Name, L} <- ranch:info()].'
[{12343,3}]
mtproxy@server:~/git/MTProxy-Erlang/src$ netstat -pln|grep mtp_proxy
(Not all processes could be identified, non-owned process info
 will not be shown, you would have to be root to see it all.)
tcp        0      0 0.0.0.0:60374           0.0.0.0:*               LISTEN      22656/mtp_proxy
tcp        0      0 172.26.1.1:12343        0.0.0.0:*               LISTEN      22656/mtp_proxy

What is the purpose of port 60374 here? If it's for some kind of monitoring, could we bind it to 127.0.0.1?
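That extra listener is most likely the Erlang distribution port (used for remote shell / `eval` commands), not anything proxy-specific. If so, it can be restricted to loopback with the standard `kernel` application parameter `inet_dist_use_interface`; a sketch for sys.config, assuming the release reads kernel settings from there:

```erlang
%% sys.config fragment: bind the Erlang distribution listener to
%% loopback only, so it is not reachable from other hosts.
[
 {kernel, [{inet_dist_use_interface, {127,0,0,1}}]}
].
```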

Proper way to update mtproto_proxy

First, I would like to thank you for this great software. I switched to your proxy because I was having connection and performance issues with mtprotoproxy, and yours is working great.
What is the proper way to update mtproto_proxy to the latest version? I used the installer script.

config

Hi,

{max_connections, 128000},
{upstream_socket_buffer_size, 20480},
{downstream_socket_buffer_size, 512000},
{replay_check_session_storage, on},
{replay_check_session_storage_opts,
 #{max_memory_mb => 25048,
   max_age_minutes => 1440}},

Solved.
