erlanglab / erlangpl

Tool for developers working with systems running on the Erlang VM (BEAM). It helps with performance analysis.

Home Page: http://www.erlang.pl/

License: Apache License 2.0

Makefile 0.24% Erlang 51.06% Shell 4.15% Batchfile 1.55% JavaScript 35.67% HTML 0.36% CSS 5.77% Elm 1.19%
erlang erlang-vm performance-visualization performance-dashboard visualization elixir beam elm erlang-performance-lab react

erlangpl's Introduction

Erlang Performance Lab

Tool for developers working with systems running on the Erlang VM (BEAM). It helps with performance analysis.

Getting started

The Erlang Performance Lab tool (erlangpl for short) can be started using escript or as a regular Erlang release.

Quick start guide

1. Download the prebuilt erlangpl script from the releases page, or run:

wget https://github.com/erlanglab/erlangpl/releases/download/0.9.0/erlangpl.tar.gz

2. Untar the downloaded archive using any GUI program, or run:

tar -zxvf erlangpl.tar.gz

3. Run the script

./erlangpl -n NODE_NAME -c COOKIE

For erlangpl to work, a few conditions must be met:

  • The node to be monitored must be run in distributed mode (using the -name or -sname flag).
  • The node's and erlangpl's cookies must match. You can set the node's cookie by starting it with the -setcookie flag. Keep in mind that when you do not set a cookie explicitly, it is read from $HOME/.erlang.cookie. See the official Erlang documentation to learn more.
  • The erlangpl script must run on OTP 19.3 or higher.
Example

If the node you want to monitor is [email protected] and has a cookie set to test (erl -name [email protected] -setcookie test) then you should run the script as follows:

./erlangpl -n [email protected] -c test

Download prebuilt script

The easiest way to get started is to download a prebuilt erlangpl script from the releases page.

Build it manually

Prerequisites

To build the UI, you need the following dependencies installed: Node.js with npm or yarn (see the "User Interface" section below).

Be aware that building the UI can take some time: around 1 minute on a stock 2015 MacBook, plus the dependency download on the first run. On subsequent builds the dependencies are cached.

$ git clone https://github.com/erlanglab/erlangpl.git
$ cd erlangpl
$ make release

Running erlangpl script

The erlangpl shell script is a self-contained escript, which can be started from a command line as long as you have Erlang/OTP installed.

$ ./erlangpl -h

Usage: erlangpl [-n <node>] [-c <cookie>] [-p <plugin>] [-h]
                [-v <verbose>] [-P <port>] [-V] [-s <sname>] [-l <name>]

  -n, --node     Monitored node name
  -c, --cookie   Overwrite ~/.erlang.cookie
  -p, --plugin   Path to plugins
  -h, --help     Show the program options
  -v, --verbose  Verbosity level (-v, -vv, -vvv)
  -P, --port     HTTP and WS port number
  -V, --version  Show version information
  -s, --sname    Start with a shortname
  -l, --name     Start with a longname, default [email protected]

$ ./erlangpl -n [email protected] -c YOURCOOKIE

Once started, try visiting http://localhost:37575/

Examples

Connecting to an Elixir iex session

$ iex --name [email protected] -S mix
$ ./erlangpl --node [email protected]

Mnesia cluster

You can generate messages between nodes by querying Mnesia, a distributed database.

To set up a Mnesia cluster, start several Erlang nodes with unique names, e.g. a@, b@, c@, etc., and start the database on all of them:

erl -name [email protected]
([email protected])1> mnesia:start().

Then create a test_table and configure it to be replicated on all nodes:

([email protected])2> mnesia:change_config(extra_db_nodes, ['[email protected]']).
([email protected])3> mnesia:change_config(extra_db_nodes, ['[email protected]']).
([email protected])4> mnesia:change_config(extra_db_nodes, ['[email protected]']).
([email protected])5> mnesia:create_table(test_table, []).
([email protected])6> [mnesia:add_table_copy(test_table, Node, ram_copies) || Node <- nodes()].

Here are some behaviours you can test:

[begin mnesia:transaction(fun() -> mnesia:write({test_table, Key, "value"}) end), timer:sleep(10) end || Key <- lists:seq(1,2000)].
[begin mnesia:sync_dirty(fun() -> mnesia:write({test_table, Key, "value"}) end), timer:sleep(10) end || Key <- lists:seq(1,2000)].
[begin mnesia:dirty_write({test_table, Key, "value"}), timer:sleep(10) end || Key <- lists:seq(1,2000)].

Videos from those experiments were posted on YouTube.

Developing

Erlang

Running development release

You can also start the tool as a regular Erlang release and connect to its console to debug the tool itself.

$ make
$ rebar -f generate
$ ./rel/erlangpl/bin/erlangpl console [email protected] cookie=YOURCOOKIE

User Interface

Running standalone

erlangpl-ui can be started standalone using Node with npm or yarn. We recommend yarn for that.

yarn && yarn start

The application can now be found at localhost:3000 and will listen for messages from localhost:37575, where erlangpl must be running.

Writing Elm code

Although erlangpl-ui is written in React, we believe in Elm's power. Because of that, we support Elm in our build process. This is possible thanks to react-elm-components and elm-webpack.

You can write any separate component in Elm and then wrap it in a React component that can be integrated with the whole application. Elm code should be placed in ui/src/elm; every component should have its main file in this directory, with all files related to the component in a directory of the same name. The React wrapper file should have the same name as the Elm component, and flow should be disabled for this file.

-- ui/src/elm/About.elm

module About exposing (..)

import Html exposing (text)

main =
    text "Hello world from Elm component"

// ui/src/about/components/About.js

import React from 'react';
import Elm from 'react-elm-components';
import { About } from '../../elm/About.elm';

import './About.css';

const AboutWrapper = () => {
  return (
    <Elm src={About} />
  );
};

export default AboutWrapper;

Have fun!

erlangpl's People

Contributors

arkgil, baransu, getong, gomoripeti, hajto, mkacper, rockwood, tmr08c


erlangpl's Issues

Integrating new UI

I'm doing research on adding erlangpl-ui as a new frontend. The only thing I need from the backend is serving the static folder at localhost:8000, as it does right now.
I'll add erlangpl-ui as a subrepo which will be fetched during the build process and included in place of the current UI.

The only thing I have trouble understanding is how erl_3d is included in the current frontend, because the plugin provides a frontend as well. With the new version this would be hard or impossible, and we're thinking about replacing it with a 2nd or 3rd level view of vizceral. But having a 3D visualization is nice, and I can work on a new version of the 3D renderer, which would be available at /messages within the new UI. In that case erl_3d would serve only as a backend plugin, and all client code could be placed in erlangpl-ui or a new repo.

What has to be done within this and the epl_3d repo to provide the new UI?

/ cc @michalslaski @arkgil

Cache state in localStorage

When we reload the page for some reason, we lose all state and in the worst case have to wait 5s for new state from the server. My idea is to cache the state in localStorage so that when the page is reloaded, it can be restored from the cache. If the observed node crashes and we reload the page to reconnect, we don't want to display old, cached state, so we have to wipe the cache when the WebSocket connection crashes.
It's a small change but it can be a nice quality-of-life improvement.
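The idea could be sketched roughly as follows. This is a minimal sketch, not erlangpl-ui code: the cache key name, the state shape, and the injectable `storage` parameter are all assumptions made for illustration.

```javascript
// Minimal sketch of state caching. `storage` is anything with the
// localStorage interface (getItem/setItem/removeItem); the key name
// 'erlangpl-state' is hypothetical.
const CACHE_KEY = 'erlangpl-state';

function saveState(storage, state) {
  storage.setItem(CACHE_KEY, JSON.stringify(state));
}

function loadState(storage) {
  const raw = storage.getItem(CACHE_KEY);
  if (raw == null) return null;      // nothing cached yet
  try {
    return JSON.parse(raw);
  } catch (e) {
    return null;                     // corrupted cache: ignore it
  }
}

// Called when the WebSocket connection to the node drops, so that a
// reload after a node crash never shows stale data.
function wipeState(storage) {
  storage.removeItem(CACHE_KEY);
}
```

In the browser these would be called with window.localStorage, e.g. `ws.onclose = () => wipeState(window.localStorage);`.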

Make the plugins Erlang/Elixir applications

Are there any advantages to treating *EPL.beam files as plugins over making plugins valid Elixir/Erlang applications? The WebSocket handlers could be configured via the config file.

Cluster traffic view not rendering

I was playing with Phoenix and ErlangPL in this video and could not get the traffic on the cluster to render. I also tried following the Mnesia sample in the readme with no luck. How do I troubleshoot or debug this issue?

[screenshot from 2017-04-07, 10:36 am]

Release more often

As of today, your last published release is 44 commits behind master.
This is somewhat annoying, as none of the links provided lead to the most up-to-date escript.

How about setting up your CI to publish the escript when building tags?
Then you'd need to cut tags more often.
You could also do that automatically by having your CI push a tag when a new commit lands on the master branch.

Thoughts and problems with connecting to nodes after erlangpl starts

As a follow-up to #11, a couple of things which in my opinion need to be considered:

  1. epl_tracer process needs to trap exits, to handle exit messages from the remote tracer process (there is no spawn_monitor/4 API 😢)
  2. the node name cannot be put into epl's config ETS table at startup, but should rather be part of epl_tracer's state, retrieved by calling this process
  3. when epl_tracer sees that the remote process exited, it should notify all subscribers and exit normally
  4. I don't know if it is possible to query epmd for full node names. I know we can retrieve short node names via the command-line interface to epmd, but maybe there is some internal API for doing that
  5. we probably need some "core" websocket handler, which won't be a part of any plugin, but will be used solely for selecting a node to connect to, disconnecting from a node, listing nodes, etc.
  6. and of course there is a problem with the epl:command/2 API. If epl_tracer is currently dead, the calling process will crash with noproc and, if it is a WebSocket handler, close the WebSocket connection. One way to solve this is to introduce a "proxy" process which is always available, monitors epl_tracer, and forwards all command requests. The other solution is to Let It Crash™.

Weird issue when visualizing GenStage pipelines

Hi guys,

this project is really cool! I started playing with it to visualize GenStage pipelines and I bumped into a weird behaviour. You can find my code here: https://github.com/Arkham/genstaged

My GenStage setup is:

  • one producer which generates numbers
  • one producer-consumer which filters only even numbers
  • two consumers

So I booted the app with EPL and, amazing, everything works perfectly! But then I commented out these lines (https://github.com/Arkham/genstaged/blob/master/lib/genstaged/consumer.ex#L16-L18), where I used to print the received events, and suddenly I don't see anything anymore in my performance lab. Here's a gif:

[animated gif omitted]

Crashing under large load

My app uses about 1000 processes per machine for controlling and distributing tasks across a cluster. epl_st times out when creating the supervisor tree to send to the web GUI. Is there a way I can either specify only certain processes to be tracked, or (better) just extend that timeout and accept a bit of lag?

Better erlangpl-ui provide method

Followup to: #18 (comment)

Right now we have the erlangpl-ui submodule. It's not ideal and we can do better.
Because the UI is not a part of the erlangpl codebase, we should treat it as a build dependency and create some kind of script to check the UI version and update it automatically.
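The version-check part of such a script could look something like this. It's a sketch under assumptions: where the pinned and vendored version strings come from is left open, and only a simple dotted-number comparison is shown.

```javascript
// Sketch of a UI version check: decide whether the vendored UI is
// older than the version the build pins. Inputs are plain version
// strings like '0.9.0'; how they are obtained is out of scope here.
function parseVersion(v) {
  return v.trim().split('.').map(Number); // '1.2.3' -> [1, 2, 3]
}

// Returns true when `wanted` is newer than `current`.
function needsUpdate(current, wanted) {
  const a = parseVersion(current);
  const b = parseVersion(wanted);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] || 0;
    const y = b[i] || 0;
    if (y > x) return true;
    if (y < x) return false;
  }
  return false; // equal versions: nothing to do
}
```

A build script would then do something like `if (needsUpdate(vendoredVersion, pinnedVersion)) { /* fetch the new UI */ }`.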

Crashes wobserver

When wobserver is running on the node being connected to, a crash happens:

=ERROR REPORT==== 5-Apr-2017::14:26:45 ===
** Generic server epl_st terminating
** Last message in was {data,
                           {'[email protected]',{1491,424005,703000}},
                           [{process_count,337},
                            {memory_total,61696912},
                            {spawn,[]},
                            {exit,[]},
                            {send,
                                [{{#Port<6777.19116>,<6777.421.0>},15,0},
                                 {{#Port<6777.19117>,<6777.420.0>},15,0},
                                 {{#Port<6777.19119>,<6777.419.0>},15,0},
                                 {{#Port<6777.19120>,<6777.418.0>},15,0},
                                 {{#Port<6777.19121>,<6777.417.0>},15,0},
                                 {{#Port<6777.19123>,<6777.416.0>},15,0},
                                 {{#Port<6777.19124>,<6777.415.0>},15,0},
                                 {{#Port<6777.19126>,<6777.414.0>},15,0},
                                 {{#Port<6777.19127>,<6777.413.0>},15,0},
                                 {{#Port<6777.19128>,<6777.412.0>},15,0},
                                 {{#Port<6777.23262>,<6777.567.0>},2,0},
                                 {{<6777.45.0>,<6777.47.0>},0,1},
                                 {{<6777.45.0>,<6777.567.0>},1,0},
                                 {{<6777.567.0>,<6777.572.0>},1,0}]},
                            {send_self,[]},
                            {'receive',
                                [{#Port<6777.19116>,10,130},
                                 {#Port<6777.19117>,10,130},
                                 {#Port<6777.19119>,10,130},
                                 {#Port<6777.19120>,10,130},
                                 {#Port<6777.19121>,10,130},
                                 {#Port<6777.19123>,10,130},
                                 {#Port<6777.19124>,10,130},
                                 {#Port<6777.19126>,10,130},
                                 {#Port<6777.19127>,10,130},
                                 {#Port<6777.19128>,10,130},
                                 {#Port<6777.23262>,2,24},
                                 {<6777.45.0>,1,0},
                                 {<6777.47.0>,1,0},
                                 {<6777.274.0>,5,0},
                                 {<6777.412.0>,15,75},
                                 {<6777.413.0>,15,75},
                                 {<6777.414.0>,15,75},
                                 {<6777.415.0>,15,75},
                                 {<6777.416.0>,15,75},
                                 {<6777.417.0>,15,75},
                                 {<6777.418.0>,15,75},
                                 {<6777.419.0>,15,75},
                                 {<6777.420.0>,15,75},
                                 {<6777.421.0>,15,75},
                                 {<6777.543.0>,3,0},
                                 {<6777.546.0>,3,0},
                                 {<6777.549.0>,4,0},
                                 {<6777.567.0>,2,6}]},
                            {trace,[]}]}
** When Server state == {state,[]}
** Reason for termination ==
** {function_clause,
       [{lists,map,
            [#Fun<epl_st.2.5273273>,
             {'EXIT',
                 {{#{'__exception__' => true,
                     '__struct__' => 'Elixir.RuntimeError',
                     message => <<"attempted to call GenServer :wobserver_metrics but no handle_call/3 clause was provided">>},
                   [{'Elixir.Agent.Server','handle_call (overridable 1)',3,
                        [{file,"lib/gen_server.ex"},{line,559}]},
                    {gen_server,try_handle_call,4,
                        [{file,"gen_server.erl"},{line,615}]},
                    {gen_server,handle_msg,5,
                        [{file,"gen_server.erl"},{line,647}]},
                    {proc_lib,init_p_do_apply,3,
                        [{file,"proc_lib.erl"},{line,247}]}]},
                  {gen_server,call,[<6777.404.0>,which_children,infinity]}}}],
            [{file,"lists.erl"},{line,1238}]},
        {epl_st,generate_sup_tree,1,[{file,"src/epl_st.erl"},{line,105}]},
        {epl_st,'-handle_info/2-fun-0-',2,[{file,"src/epl_st.erl"},{line,76}]},
        {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
        {epl_st,handle_info,2,[{file,"src/epl_st.erl"},{line,75}]},
        {gen_server,try_dispatch,4,[{file,"gen_server.erl"},{line,601}]},
        {gen_server,handle_msg,5,[{file,"gen_server.erl"},{line,667}]},
        {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}

=INFO REPORT==== 5-Apr-2017::14:26:45 ===
    application: epl_st
    exited: shutdown
    type: temporary

It is probably wobserver's fault that this happens, but erlangpl should still be hardy enough to withstand crashes like this.

Cluster node view too busy

When you have a lot of processes on a node, the cluster view of it gets far too busy to see anything useful. For example:

Along with this, the page changes periodically to show far fewer processes, i.e. some go missing for a few seconds, then reappear. The following screenshot is from the same node a few seconds later; loads of processes have disappeared!

It would be nice to see better names for the processes too...

Refuse to build/start erlangpl when UI is not built

As reported by @sashaafm in #30, and by other people on other communication channels, building and starting erlangpl without building the UI first throws a rather cryptic error.

The most reliable solution would be to refuse to compile (make) or bootstrap when we can't find the prebuilt UI files. We could, for example, check whether the apps/epl/priv/htdocs/index.html file exists when bootstrapping.

Process list based on busyness

A list of processes sorted by reductions, or by other measures describing throughput, would be a nice addition. It could help people find the busiest processes, which may be bottlenecks.

Crashing after connection

I'm using

  • erlangpl 0.6
  • linux
  • OTP release 19
  • ERTS version 8.2.2

I'm currently developing an Elixir toy project (https://github.com/DrPandemic/MasDB). When I connect erlangpl to my cluster with ./erlangpl -n [email protected], it generates some errors.

In the overview, memory usage, processes and throughput are loading indefinitely. Same thing for the sup-tree page. The traffic page seems to be working correctly.

I don't know if the problem is from my project. If it is, do you know how I could fix it?

This tool looks amazing!

Those are the errors

ERROR: timed out while collecting data from node

=ERROR REPORT==== 10-Apr-2017::17:47:57 ===
** Generic server epl_st terminating 
** Last message in was {data,
                        {'[email protected]',{1491,860872,638202}},
                        [{process_count,110},
                         {memory_total,28066008},
                         {spawn,
                          [{<6880.842.0>,
                            {proc_lib,init_p,
                             ['Elixir.Masdb.Node.DistantSupervisor',
                              [<6880.177.0>,<6880.176.0>],
                              'Elixir.Task.Supervised',reply,
                              [<8002.191.0>,monitor,
                               {'[email protected]','Elixir.Masdb.Gossip.Server'},
                               {'Elixir.Masdb.Gossip.Server',received_gossip,
                                [#{'__struct__' => 'Elixir.Masdb.Register',
                                   schemas => [],
                                   stores => [],
                                   synced => true,
                                   tables => []}]}]]},
                            {1491,860868,806979}},
                           {<6880.843.0>,
                            {proc_lib,init_p,
                             ['Elixir.Masdb.Node.DistantSupervisor',
                              [<6880.177.0>,<6880.176.0>],
                              'Elixir.Task.Supervised',reply,
                              [<8003.191.0>,monitor,
                               {'[email protected]','Elixir.Masdb.Gossip.Server'},
                               {'Elixir.Masdb.Gossip.Server',received_gossip,
                                [#{'__struct__' => 'Elixir.Masdb.Register',
                                   schemas => [],
                                   stores => [],
                                   synced => true,
                                   tables => []}]}]]},
                            {1491,860868,822536}},
                           {<6880.844.0>,
                            {proc_lib,init_p,
                             ['Elixir.Masdb.Node.DistantSupervisor',
                              [<6880.177.0>,<6880.176.0>],
                              'Elixir.Task.Supervised',reply,
                              [<8002.191.0>,monitor,
                               {'[email protected]','Elixir.Masdb.Gossip.Server'},
                               {'Elixir.Masdb.Gossip.Server',received_gossip,
                                [#{'__struct__' => 'Elixir.Masdb.Register',
                                   schemas => [],
                                   stores => [],
                                   synced => true,
                                   tables => []}]}]]},
                            {1491,860871,812235}},
                           {<6880.845.0>,
                            {proc_lib,init_p,
                             ['Elixir.Masdb.Node.DistantSupervisor',
                              [<6880.177.0>,<6880.176.0>],
                              'Elixir.Task.Supervised',reply,
                              [<8003.191.0>,monitor,
                               {'[email protected]','Elixir.Masdb.Gossip.Server'},
                               {'Elixir.Masdb.Gossip.Server',received_gossip,
                                [#{'__struct__' => 'Elixir.Masdb.Register',
                                   schemas => [],
                                   stores => [],
                                   synced => true,
                                   tables => []}]}]]},
                            {1491,860871,827102}}]},
                         {exit,
                          [{<6880.842.0>,normal,{1491,860868,810505}},
                           {<6880.843.0>,normal,{1491,860868,822966}},
                           {<6880.844.0>,normal,{1491,860871,813505}},
                           {<6880.845.0>,normal,{1491,860871,828388}}]},
                         {send,
                          [{{'Elixir.Masdb.Register.Server',<6880.842.0>},0,1},
                           {{'Elixir.Masdb.Register.Server',<6880.843.0>},0,1},
                           {{'Elixir.Masdb.Register.Server',<6880.844.0>},0,1},
                           {{'Elixir.Masdb.Register.Server',<6880.845.0>},0,1},
                           {{#Port<6880.6636>,<6880.192.0>},1,0},
                           {{#Port<6880.6656>,<6880.197.0>},1,0},
                           {{#Port<6880.6699>,<6880.836.0>},1,0},
                           {{<6880.178.0>,<6880.188.0>},2,2},
                           {{<6880.185.0>,<8003.191.0>},2,0},
                           {{<6880.185.0>,<8002.191.0>},2,0},
                           {{<6880.186.0>,<6880.188.0>},4,4},
                           {{<6880.188.0>,<8003.815.0>},1,0},
                           {{<6880.188.0>,<8003.817.0>},1,0},
                           {{<6880.188.0>,<8002.2077.0>},1,0},
                           {{<6880.188.0>,<8002.2079.0>},1,0},
                           {{<6880.188.0>,
                             {'Elixir.Masdb.Node.DistantSupervisor',
                              '[email protected]'}},
                            2,0},
                           {{<6880.188.0>,
                             {'Elixir.Masdb.Node.DistantSupervisor',
                              '[email protected]'}},
                            2,0},
                           {{<8003.191.0>,<6880.843.0>},0,1},
                           {{<8003.191.0>,<6880.845.0>},0,1},
                           {{<8002.191.0>,<6880.842.0>},0,1},
                           {{<8002.191.0>,<6880.844.0>},0,1},
                           {{<6880.192.0>,<6880.841.0>},1,0},
                           {{<6880.197.0>,<6880.841.0>},1,0},
                           {{<6880.836.0>,<6880.841.0>},1,0}]},
                         {send_self,[]},
                         {'receive',
                          [{#Port<6880.6636>,1,12},
                           {#Port<6880.6656>,1,12},
                           {#Port<6880.6699>,1,12},
                           {<6880.178.0>,2,20},
                           {<6880.185.0>,8,232},
                           {<6880.186.0>,8,120},
                           {<6880.188.0>,22,200},
                           {<6880.192.0>,1,3},
                           {<6880.197.0>,1,3},
                           {<6880.836.0>,1,3},
                           {<6880.842.0>,1,12},
                           {<6880.843.0>,1,12},
                           {<6880.844.0>,1,12},
                           {<6880.845.0>,1,12}]},
                         {trace,[]}]}
** When Server state == {state,[<0.199.0>]}
** Reason for termination == 
** {{timeout,{gen_server,call,
                         [epl_tracer,
                          {command,#Fun<supervisor.which_children.1>,
                                   [<6880.163.0>]}]}},
    [{gen_server,call,2,[{file,"gen_server.erl"},{line,204}]},
     {epl_st,command,2,[{file,"src/epl_st.erl"},{line,120}]},
     {epl_st,generate_sup_tree,1,[{file,"src/epl_st.erl"},{line,104}]},
     {epl_st,'-handle_info/2-fun-0-',2,[{file,"src/epl_st.erl"},{line,76}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {epl_st,handle_info,2,[{file,"src/epl_st.erl"},{line,75}]},
     {gen_server,try_dispatch,4,[{file,"gen_server.erl"},{line,601}]},
     {gen_server,handle_msg,5,[{file,"gen_server.erl"},{line,667}]}]}

=INFO REPORT==== 10-Apr-2017::17:47:57 ===
    application: epl_st
    exited: shutdown
    type: temporary
ERROR: timed out while collecting data from node
ERROR: timed out while collecting data from node

Random crash reports while clicking on nodes in the supervisor tree view

Occasionally, and seemingly randomly, I get an error report while clicking on supervisors in the graph view:

=ERROR REPORT==== 5-Apr-2017::14:23:59 ===
** Cowboy handler epl_st_EPL terminating in websocket_handle/3
   for the reason error:badarg
** Message was {text,<<"<6777.435.0>">>}
** Options were []
** Handler state was undefined_state

ERROR: timed out while collecting data from node

Hi guys,

I was trying out EPL to visualize a Flow experiment. You can find the code here: https://github.com/Arkham/flow_lightning

If I try to connect EPL to the running node, I get this error many times:

ERROR: timed out while collecting data from node

And after that the epl_tracer crashes with this message

=ERROR REPORT==== 22-May-2017::16:13:12 ===
** Generic server epl_tracer terminating
** Last message in was {#Ref<0.0.4.274>,
                        [{process_count,107},
                         {memory_total,1712777168},
                         {spawn,
                          [{<6853.185.0>,
                            {proc_lib,init_p,
                             [<6853.175.0>,
                              [<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.175.0>,self,
                               'Elixir.Flow.Coordinator',
                               {<6853.175.0>,
                                #{'__struct__' => 'Elixir.Flow',
                                  operations =>
                                   [{reduce,
                                     #Fun<Elixir.FlowLightning.1.121366654>,
                                     #Fun<Elixir.FlowLightning.2.121366654>}],
                                  options => [{stages,4}],
                                  producers =>
                                   {flows,
                                    [#{'__struct__' => 'Elixir.Flow',
                                       operations =>
                                        [{mapper,flat_map,
                                          [#Fun<Elixir.FlowLightning.0.121366654>]}],
                                       options => [{stages,4}],
                                       producers =>
                                        {enumerables,
                                         [#{'__struct__' =>
                                             'Elixir.File.Stream',
                                            line_or_bytes => line,
                                            modes => [raw,read_ahead,binary],
                                            path => <<"priv/words_by_8.txt">>,
                                            raw => true}]},
                                       window =>
                                        #{'__struct__' =>
                                           'Elixir.Flow.Window.Global',
                                          periodically => [],
                                          trigger => nil}}]},
                                  window =>
                                   #{'__struct__' =>
                                      'Elixir.Flow.Window.Global',
                                     periodically => [],trigger => nil}},
                                producer_consumer,[],
                                [{demand,accumulate}]},
                               [{demand,accumulate}]]]},
                            {1495,465965,217124}},
                           {<6853.186.0>,
                            {proc_lib,init_p,
                             [<6853.185.0>,
                              [<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.185.0>,<6853.185.0>,
                               supervisor,
                               {self,'Elixir.Supervisor.Default',
                                {ok,
                                 {{simple_one_for_one,0,5},
                                  [{'Elixir.GenStage',
                                    {'Elixir.GenStage',start_link,[]},
                                    transient,5000,worker,
                                    ['Elixir.GenStage']}]}}},
                               []]]},
                            {1495,465965,217175}},
                           {<6853.187.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.GenStage.Streamer',
                                {#{'__struct__' => 'Elixir.File.Stream',
                                   line_or_bytes => line,
                                   modes => [raw,read_ahead,binary],
                                   path => <<"priv/words_by_8.txt">>,
                                   raw => true},
                                 [{consumers,permanent},{demand,accumulate}]}},
                               [{consumers,permanent},{demand,accumulate}]]]},
                            {1495,465965,223182}},
                           {<6853.188.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.187.0>,[{partition,0}]}]},
                                  {dispatcher,
                                   {'Elixir.GenStage.PartitionDispatcher',
                                    [{partitions,
                                      #{'__struct__' => 'Elixir.Range',
                                        first => 0,last => 3}},
                                     {hash,
                                      #Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
                                 {0,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.Flow.Materialize.32.45510982>,
                                 #Fun<Elixir.Flow.Materialize.33.45510982>}},
                               []]]},
                            {1495,465965,227357}},
                           {<6853.189.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.187.0>,[{partition,1}]}]},
                                  {dispatcher,
                                   {'Elixir.GenStage.PartitionDispatcher',
                                    [{partitions,
                                      #{'__struct__' => 'Elixir.Range',
                                        first => 0,last => 3}},
                                     {hash,
                                      #Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
                                 {1,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.Flow.Materialize.32.45510982>,
                                 #Fun<Elixir.Flow.Materialize.33.45510982>}},
                               []]]},
                            {1495,465965,230667}},
                           {<6853.190.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.187.0>,[{partition,2}]}]},
                                  {dispatcher,
                                   {'Elixir.GenStage.PartitionDispatcher',
                                    [{partitions,
                                      #{'__struct__' => 'Elixir.Range',
                                        first => 0,last => 3}},
                                     {hash,
                                      #Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
                                 {2,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.Flow.Materialize.32.45510982>,
                                 #Fun<Elixir.Flow.Materialize.33.45510982>}},
                               []]]},
                            {1495,465965,230748}},
                           {<6853.191.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.187.0>,[{partition,3}]}]},
                                  {dispatcher,
                                   {'Elixir.GenStage.PartitionDispatcher',
                                    [{partitions,
                                      #{'__struct__' => 'Elixir.Range',
                                        first => 0,last => 3}},
                                     {hash,
                                      #Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
                                 {3,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.Flow.Materialize.32.45510982>,
                                 #Fun<Elixir.Flow.Materialize.33.45510982>}},
                               []]]},
                            {1495,465965,230815}},
                           {<6853.192.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.188.0>,[{partition,0}]},
                                    {<6853.189.0>,[{partition,0}]},
                                    {<6853.190.0>,[{partition,0}]},
                                    {<6853.191.0>,[{partition,0}]}]}],
                                 {0,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.FlowLightning.1.121366654>,
                                 #Fun<Elixir.Flow.Materialize.3.45510982>}},
                               []]]},
                            {1495,465965,230883}},
                           {<6853.193.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.188.0>,[{partition,1}]},
                                    {<6853.189.0>,[{partition,1}]},
                                    {<6853.190.0>,[{partition,1}]},
                                    {<6853.191.0>,[{partition,1}]}]}],
                                 {1,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.FlowLightning.1.121366654>,
                                 #Fun<Elixir.Flow.Materialize.3.45510982>}},
                               []]]},
                            {1495,465965,231011}},
                           {<6853.194.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.188.0>,[{partition,2}]},
                                    {<6853.189.0>,[{partition,2}]},
                                    {<6853.190.0>,[{partition,2}]},
                                    {<6853.191.0>,[{partition,2}]}]}],
                                 {2,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.FlowLightning.1.121366654>,
                                 #Fun<Elixir.Flow.Materialize.3.45510982>}},
                               []]]},
                            {1495,465965,231123}},
                           {<6853.195.0>,
                            {proc_lib,init_p,
                             [<6853.186.0>,
                              [<6853.185.0>,<6853.175.0>,<6853.77.0>],
                              gen,init_it,
                              [gen_server,<6853.186.0>,<6853.186.0>,
                               'Elixir.GenStage',
                               {'Elixir.Flow.MapReducer',
                                {producer_consumer,
                                 [{subscribe_to,
                                   [{<6853.188.0>,[{partition,3}]},
                                    {<6853.189.0>,[{partition,3}]},
                                    {<6853.190.0>,[{partition,3}]},
                                    {<6853.191.0>,[{partition,3}]}]}],
                                 {3,4},
                                 #Fun<Elixir.Flow.Window.Global.2.31322697>,
                                 #Fun<Elixir.FlowLightning.1.121366654>,
                                 #Fun<Elixir.Flow.Materialize.3.45510982>}},
                               []]]},
                            {1495,465965,231231}},
                           {<6853.196.0>,
                            {erlang,apply,
                             [#Fun<Elixir.GenStage.13.64651076>,[]]},
                            {1495,465965,231361}}]},
                         {spawn_,true},
                         {exit,[{<6853.176.0>,normal,{1495,465965,205064}}]},
                         {exit_,true},
                         {send,
                          [{{code_server,<6853.175.0>},0,9},
                           {{code_server,<6853.185.0>},0,2},
                           {{code_server,<6853.186.0>},0,1},
                           {{code_server,<6853.187.0>},0,3},
                           {{code_server,<6853.188.0>},0,2},
                           {{#Port<6853.674>,<6853.54.0>},2,6},
                           {{#Port<6853.6204>,<6853.179.0>},1,0},
                           {{#Port<6853.6217>,<6853.4.0>},1,0},
                           {{#Port<6853.6218>,<6853.4.0>},1,0},
                           {{#Port<6853.6219>,<6853.4.0>},1,0},
                           {{#Port<6853.6220>,<6853.4.0>},1,0},
                           {{#Port<6853.6221>,<6853.4.0>},1,0},
                           {{#Port<6853.6222>,<6853.4.0>},1,0},
                           {{#Port<6853.6223>,<6853.4.0>},1,0},
                           {{#Port<6853.6224>,<6853.4.0>},1,0},
                           {{#Port<6853.6225>,<6853.4.0>},1,0},
                           {{#Port<6853.6226>,<6853.4.0>},1,0},
                           {{#Port<6853.6227>,<6853.4.0>},1,0},
                           {{#Port<6853.6228>,<6853.4.0>},1,0},
                           {{#Port<6853.6229>,<6853.4.0>},1,0},
                           {{#Port<6853.6230>,<6853.4.0>},1,0},
                           {{#Port<6853.6231>,<6853.4.0>},1,0},
                           {{#Port<6853.6232>,<6853.4.0>},1,0},
                           {{#Port<6853.6233>,<6853.4.0>},1,0},
                           {{#Port<6853.6234>,<6853.4.0>},1,0},
                           {{#Port<6853.6235>,<6853.4.0>},1,0},
                           {{#Port<6853.6236>,<6853.4.0>},1,0},
                           {{#Port<6853.6237>,<6853.4.0>},1,0},
                           {{#Port<6853.6238>,<6853.4.0>},1,0},
                           {{#Port<6853.6239>,<6853.4.0>},1,0},
                           {{#Port<6853.6240>,<6853.4.0>},1,0},
                           {{#Port<6853.6241>,<6853.4.0>},1,0},
                           {{#Port<6853.6242>,<6853.4.0>},1,0},
                           {{#Port<6853.6243>,<6853.4.0>},1,0},
                           {{#Port<6853.6244>,<6853.4.0>},1,0},
                           {{#Port<6853.6245>,<6853.4.0>},1,0},
                           {{#Port<6853.6246>,<6853.4.0>},1,0},
                           {{#Port<6853.6247>,<6853.4.0>},1,0},
                           {{#Port<6853.6248>,<6853.4.0>},1,0},
                           {{#Port<6853.6249>,<6853.4.0>},1,0},
                           {{#Port<6853.6250>,<6853.4.0>},1,0},
                           {{#Port<6853.6251>,<6853.4.0>},1,0},
                           {{#Port<6853.6252>,<6853.4.0>},1,0},
                           {{#Port<6853.6253>,<6853.4.0>},1,0},
                           {{#Port<6853.6254>,<6853.4.0>},1,0},
                           {{#Port<6853.6255>,<6853.4.0>},1,0},
                           {{#Port<6853.6256>,<6853.4.0>},1,0},
                           {{#Port<6853.6257>,<6853.4.0>},1,0},
                           {{#Port<6853.6258>,<6853.4.0>},1,0},
                           {{#Port<6853.6259>,<6853.4.0>},1,0},
                           {{#Port<6853.6260>,<6853.4.0>},1,0},
                           {{#Port<6853.6261>,<6853.4.0>},1,0},
                           {{#Port<6853.6262>,<6853.4.0>},1,0},
                           {{#Port<6853.6263>,<6853.4.0>},1,0},
                           {{#Port<6853.6264>,<6853.4.0>},1,0},
                           {{#Port<6853.6265>,<6853.4.0>},1,0},
                           {{#Port<6853.6266>,<6853.4.0>},1,0},
                           {{#Port<6853.6267>,<6853.4.0>},1,0},
                           {{#Port<6853.6268>,<6853.4.0>},1,0},
                           {{#Port<6853.6269>,<6853.4.0>},1,0},
                           {{#Port<6853.6270>,<6853.4.0>},1,0},
                           {{#Port<6853.6271>,<6853.4.0>},1,0},
                           {{#Port<6853.6272>,<6853.4.0>},1,0},
                           {{#Port<6853.6273>,<6853.4.0>},1,0},
                           {{#Port<6853.6274>,<6853.4.0>},1,0},
                           {{#Port<6853.6275>,<6853.4.0>},1,0},
                           {{#Port<6853.6276>,<6853.4.0>},1,0},
                           {{#Port<6853.6277>,<6853.4.0>},1,0},
                           {{#Port<6853.6278>,<6853.4.0>},1,0},
                           {{#Port<6853.6279>,<6853.4.0>},1,0},
                           {{#Port<6853.6280>,<6853.4.0>},1,0},
                           {{#Port<6853.6281>,<6853.4.0>},1,0},
                           {{#Port<6853.6282>,<6853.4.0>},1,0},
                           {{#Port<6853.6283>,<6853.187.0>},52809,0},
                           {{<6853.4.0>,<6853.36.0>},66,66},
                           {{<6853.36.0>,<6853.175.0>},9,0},
                           {{<6853.36.0>,<6853.185.0>},2,0},
                           {{<6853.36.0>,<6853.186.0>},1,0},
                           {{<6853.36.0>,<6853.187.0>},3,0},
                           {{<6853.36.0>,<6853.188.0>},2,0},
                           {{<6853.54.0>,<6853.56.0>},2,4},
                           {{<6853.56.0>,<6853.176.0>},1,0},
                           {{<6853.77.0>,<6853.175.0>},1,0},
                           {{<6853.77.0>,<6853.176.0>},0,1},
                           {{<6853.175.0>,<6853.185.0>},1,2},
                           {{<6853.175.0>,<6853.187.0>},1,0},
                           {{<6853.175.0>,<6853.196.0>},1,1},
                           {{<6853.179.0>,<6853.184.0>},1,0},
                           {{<6853.185.0>,<6853.186.0>},9,10},
                           {{<6853.186.0>,<6853.187.0>},0,1},
                           {{<6853.186.0>,<6853.188.0>},0,1},
                           {{<6853.186.0>,<6853.189.0>},0,1},
                           {{<6853.186.0>,<6853.190.0>},0,1},
                           {{<6853.186.0>,<6853.191.0>},0,1},
                           {{<6853.186.0>,<6853.192.0>},0,1},
                           {{<6853.186.0>,<6853.193.0>},0,1},
                           {{<6853.186.0>,<6853.194.0>},0,1},
                           {{<6853.186.0>,<6853.195.0>},0,1},
                           {{<6853.187.0>,<6853.188.0>},26,29},
                           {{<6853.187.0>,<6853.189.0>},26,28},
                           {{<6853.187.0>,<6853.190.0>},26,28},
                           {{<6853.187.0>,<6853.191.0>},23,26},
                           {{<6853.188.0>,<6853.192.0>},107,41},
                           {{<6853.188.0>,<6853.193.0>},96,47},
                           {{<6853.188.0>,<6853.194.0>},104,43},
                           {{<6853.188.0>,<6853.195.0>},102,44},
                           {{<6853.189.0>,<6853.192.0>},110,44},
                           {{<6853.189.0>,<6853.193.0>},88,39},
                           {{<6853.189.0>,<6853.194.0>},94,44},
                           {{<6853.189.0>,<6853.195.0>},94,43},
                           {{<6853.190.0>,<6853.192.0>},73,37},
                           {{<6853.190.0>,<6853.193.0>},83,43},
                           {{<6853.190.0>,<6853.194.0>},78,43},
                           {{<6853.190.0>,<6853.195.0>},77,43},
                           {{<6853.191.0>,<6853.192.0>},88,39},
                           {{<6853.191.0>,<6853.193.0>},88,38},
                           {{<6853.191.0>,<6853.194.0>},88,39},
                           {{<6853.191.0>,<6853.195.0>},83,38},
                           {{<6853.192.0>,<6853.196.0>},0,2},
                           {{<6853.193.0>,<6853.196.0>},0,2},
                           {{<6853.194.0>,<6853.196.0>},0,2},
                           {{<6853.195.0>,<6853.196.0>},0,2}]},
                         {send_,true},
                         {send_self,[]},
                         {send_self_,true},
                         {'receive',
                          [{#Port<6853.674>,6,61},
                           {#Port<6853.6204>,1,12},
                           {#Port<6853.6217>,2,25},
                           {#Port<6853.6218>,2,25},
                           {#Port<6853.6219>,2,25},
                           {#Port<6853.6220>,2,25},
                           {#Port<6853.6221>,2,25},
                           {#Port<6853.6222>,2,25},
                           {#Port<6853.6223>,2,25},
                           {#Port<6853.6224>,2,25},
                           {#Port<6853.6225>,2,25},
                           {#Port<6853.6226>,2,25},
                           {#Port<6853.6227>,2,25},
                           {#Port<6853.6228>,2,25},
                           {#Port<6853.6229>,2,25},
                           {#Port<6853.6230>,2,25},
                           {#Port<6853.6231>,2,25},
                           {#Port<6853.6232>,2,25},
                           {#Port<6853.6233>,2,25},
                           {#Port<6853.6234>,2,25},
                           {#Port<6853.6235>,2,25},
                           {#Port<6853.6236>,2,25},
                           {#Port<6853.6237>,2,25},
                           {#Port<6853.6238>,2,25},
                           {#Port<6853.6239>,2,25},
                           {#Port<6853.6240>,2,25},
                           {#Port<6853.6241>,2,25},
                           {#Port<6853.6242>,2,25},
                           {#Port<6853.6243>,2,25},
                           {#Port<6853.6244>,2,25},
                           {#Port<6853.6245>,2,25},
                           {#Port<6853.6246>,2,25},
                           {#Port<6853.6247>,2,25},
                           {#Port<6853.6248>,2,25},
                           {#Port<6853.6249>,2,25},
                           {#Port<6853.6250>,2,25},
                           {#Port<6853.6251>,2,25},
                           {#Port<6853.6252>,2,25},
                           {#Port<6853.6253>,2,25},
                           {#Port<6853.6254>,2,25},
                           {#Port<6853.6255>,2,25},
                           {#Port<6853.6256>,2,25},
                           {#Port<6853.6257>,2,25},
                           {#Port<6853.6258>,2,25},
                           {#Port<6853.6259>,2,25},
                           {#Port<6853.6260>,2,25},
                           {#Port<6853.6261>,2,25},
                           {#Port<6853.6262>,2,25},
                           {#Port<6853.6263>,2,25},
                           {#Port<6853.6264>,2,25},
                           {#Port<6853.6265>,2,25},
                           {#Port<6853.6266>,2,25},
                           {#Port<6853.6267>,2,25},
                           {#Port<6853.6268>,2,25},
                           {#Port<6853.6269>,2,25},
                           {#Port<6853.6270>,2,25},
                           {#Port<6853.6271>,2,25},
                           {#Port<6853.6272>,2,25},
                           {#Port<6853.6273>,2,25},
                           {#Port<6853.6274>,2,25},
                           {#Port<6853.6275>,2,25},
                           {#Port<6853.6276>,2,25},
                           {#Port<6853.6277>,2,25},
                           {#Port<6853.6278>,2,25},
                           {#Port<6853.6279>,2,25},
                           {#Port<6853.6280>,2,25},
                           {#Port<6853.6281>,2,25},
                           {#Port<6853.6282>,2,25},
                           {#Port<6853.6283>,52809,739326},
                           {<6853.4.0>,198,13996},
                           {<6853.36.0>,83,3499},
                           {<6853.54.0>,6,203},
                           {<6853.56.0>,2,20},
                           {<6853.77.0>,1,10},
                           {<6853.175.0>,13,418},
                           {<6853.176.0>,1,13},
                           {<6853.179.0>,1,3},
                           {<6853.185.0>,13,110},
                           {<6853.186.0>,19,1149},
                           {<6853.187.0>,52924,1585733},
                           {<6853.188.0>,203,110571},
                           {<6853.189.0>,196,110494},
                           {<6853.190.0>,192,110442},
                           {<6853.191.0>,177,98256},
                           {<6853.192.0>,380,826291},
                           {<6853.193.0>,357,827477},
                           {<6853.194.0>,366,825743},
                           {<6853.195.0>,358,828007},
                           {<6853.196.0>,1,6}]},
                         {receive_,true},
                         {trace,[]},
                         {trace_,true}]}
** When Server state == {state,[<0.194.0>,<0.86.0>,<0.85.0>],
                               #Ref<0.0.4.274>,<6853.184.0>,3}
** Reason for termination ==
** {{not_implemented,
     {#Ref<0.0.4.274>,
      [{process_count,107},
       {memory_total,1712777168},
       {spawn,
        [{<6853.185.0>,
          {proc_lib,init_p,
           [<6853.175.0>,
            [<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.175.0>,self,'Elixir.Flow.Coordinator',
             {<6853.175.0>,
              #{'__struct__' => 'Elixir.Flow',
                operations =>
                 [{reduce,#Fun<Elixir.FlowLightning.1.121366654>,
                   #Fun<Elixir.FlowLightning.2.121366654>}],
                options => [{stages,4}],
                producers =>
                 {flows,
                  [#{'__struct__' => 'Elixir.Flow',
                     operations =>
                      [{mapper,flat_map,
                        [#Fun<Elixir.FlowLightning.0.121366654>]}],
                     options => [{stages,4}],
                     producers =>
                      {enumerables,
                       [#{'__struct__' => 'Elixir.File.Stream',
                          line_or_bytes => line,
                          modes => [raw,read_ahead,binary],
                          path => <<"priv/words_by_8.txt">>,raw => true}]},
                     window =>
                      #{'__struct__' => 'Elixir.Flow.Window.Global',
                        periodically => [],trigger => nil}}]},
                window =>
                 #{'__struct__' => 'Elixir.Flow.Window.Global',
                   periodically => [],trigger => nil}},
              producer_consumer,[],
              [{demand,accumulate}]},
             [{demand,accumulate}]]]},
          {1495,465965,217124}},
         {<6853.186.0>,
          {proc_lib,init_p,
           [<6853.185.0>,
            [<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.185.0>,<6853.185.0>,supervisor,
             {self,'Elixir.Supervisor.Default',
              {ok,
               {{simple_one_for_one,0,5},
                [{'Elixir.GenStage',
                  {'Elixir.GenStage',start_link,[]},
                  transient,5000,worker,
                  ['Elixir.GenStage']}]}}},
             []]]},
          {1495,465965,217175}},
         {<6853.187.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.GenStage.Streamer',
              {#{'__struct__' => 'Elixir.File.Stream',line_or_bytes => line,
                 modes => [raw,read_ahead,binary],
                 path => <<"priv/words_by_8.txt">>,raw => true},
               [{consumers,permanent},{demand,accumulate}]}},
             [{consumers,permanent},{demand,accumulate}]]]},
          {1495,465965,223182}},
         {<6853.188.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,[{<6853.187.0>,[{partition,0}]}]},
                {dispatcher,
                 {'Elixir.GenStage.PartitionDispatcher',
                  [{partitions,
                    #{'__struct__' => 'Elixir.Range',first => 0,last => 3}},
                   {hash,#Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
               {0,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.Flow.Materialize.32.45510982>,
               #Fun<Elixir.Flow.Materialize.33.45510982>}},
             []]]},
          {1495,465965,227357}},
         {<6853.189.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,[{<6853.187.0>,[{partition,1}]}]},
                {dispatcher,
                 {'Elixir.GenStage.PartitionDispatcher',
                  [{partitions,
                    #{'__struct__' => 'Elixir.Range',first => 0,last => 3}},
                   {hash,#Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
               {1,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.Flow.Materialize.32.45510982>,
               #Fun<Elixir.Flow.Materialize.33.45510982>}},
             []]]},
          {1495,465965,230667}},
         {<6853.190.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,[{<6853.187.0>,[{partition,2}]}]},
                {dispatcher,
                 {'Elixir.GenStage.PartitionDispatcher',
                  [{partitions,
                    #{'__struct__' => 'Elixir.Range',first => 0,last => 3}},
                   {hash,#Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
               {2,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.Flow.Materialize.32.45510982>,
               #Fun<Elixir.Flow.Materialize.33.45510982>}},
             []]]},
          {1495,465965,230748}},
         {<6853.191.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,[{<6853.187.0>,[{partition,3}]}]},
                {dispatcher,
                 {'Elixir.GenStage.PartitionDispatcher',
                  [{partitions,
                    #{'__struct__' => 'Elixir.Range',first => 0,last => 3}},
                   {hash,#Fun<Elixir.Flow.Materialize.18.45510982>}]}}],
               {3,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.Flow.Materialize.32.45510982>,
               #Fun<Elixir.Flow.Materialize.33.45510982>}},
             []]]},
          {1495,465965,230815}},
         {<6853.192.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,
                 [{<6853.188.0>,[{partition,0}]},
                  {<6853.189.0>,[{partition,0}]},
                  {<6853.190.0>,[{partition,0}]},
                  {<6853.191.0>,[{partition,0}]}]}],
               {0,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.FlowLightning.1.121366654>,
               #Fun<Elixir.Flow.Materialize.3.45510982>}},
             []]]},
          {1495,465965,230883}},
         {<6853.193.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,
                 [{<6853.188.0>,[{partition,1}]},
                  {<6853.189.0>,[{partition,1}]},
                  {<6853.190.0>,[{partition,1}]},
                  {<6853.191.0>,[{partition,1}]}]}],
               {1,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.FlowLightning.1.121366654>,
               #Fun<Elixir.Flow.Materialize.3.45510982>}},
             []]]},
          {1495,465965,231011}},
         {<6853.194.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,
                 [{<6853.188.0>,[{partition,2}]},
                  {<6853.189.0>,[{partition,2}]},
                  {<6853.190.0>,[{partition,2}]},
                  {<6853.191.0>,[{partition,2}]}]}],
               {2,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.FlowLightning.1.121366654>,
               #Fun<Elixir.Flow.Materialize.3.45510982>}},
             []]]},
          {1495,465965,231123}},
         {<6853.195.0>,
          {proc_lib,init_p,
           [<6853.186.0>,
            [<6853.185.0>,<6853.175.0>,<6853.77.0>],
            gen,init_it,
            [gen_server,<6853.186.0>,<6853.186.0>,'Elixir.GenStage',
             {'Elixir.Flow.MapReducer',
              {producer_consumer,
               [{subscribe_to,
                 [{<6853.188.0>,[{partition,3}]},
                  {<6853.189.0>,[{partition,3}]},
                  {<6853.190.0>,[{partition,3}]},
                  {<6853.191.0>,[{partition,3}]}]}],
               {3,4},
               #Fun<Elixir.Flow.Window.Global.2.31322697>,
               #Fun<Elixir.FlowLightning.1.121366654>,
               #Fun<Elixir.Flow.Materialize.3.45510982>}},
             []]]},
          {1495,465965,231231}},
         {<6853.196.0>,
          {erlang,apply,[#Fun<Elixir.GenStage.13.64651076>,[]]},
          {1495,465965,231361}}]},
       {spawn_,true},
       {exit,[{<6853.176.0>,normal,{1495,465965,205064}}]},
       {exit_,true},
       {send,
        [{{code_server,<6853.175.0>},0,9},
         {{code_server,<6853.185.0>},0,2},
         {{code_server,<6853.186.0>},0,1},
         {{code_server,<6853.187.0>},0,3},
         {{code_server,<6853.188.0>},0,2},
         {{#Port<6853.674>,<6853.54.0>},2,6},
         {{#Port<6853.6204>,<6853.179.0>},1,0},
         {{#Port<6853.6217>,<6853.4.0>},1,0},
         {{#Port<6853.6218>,<6853.4.0>},1,0},
         {{#Port<6853.6219>,<6853.4.0>},1,0},
         {{#Port<6853.6220>,<6853.4.0>},1,0},
         {{#Port<6853.6221>,<6853.4.0>},1,0},
         {{#Port<6853.6222>,<6853.4.0>},1,0},
         {{#Port<6853.6223>,<6853.4.0>},1,0},
         {{#Port<6853.6224>,<6853.4.0>},1,0},
         {{#Port<6853.6225>,<6853.4.0>},1,0},
         {{#Port<6853.6226>,<6853.4.0>},1,0},
         {{#Port<6853.6227>,<6853.4.0>},1,0},
         {{#Port<6853.6228>,<6853.4.0>},1,0},
         {{#Port<6853.6229>,<6853.4.0>},1,0},
         {{#Port<6853.6230>,<6853.4.0>},1,0},
         {{#Port<6853.6231>,<6853.4.0>},1,0},
         {{#Port<6853.6232>,<6853.4.0>},1,0},
         {{#Port<6853.6233>,<6853.4.0>},1,0},
         {{#Port<6853.6234>,<6853.4.0>},1,0},
         {{#Port<6853.6235>,<6853.4.0>},1,0},
         {{#Port<6853.6236>,<6853.4.0>},1,0},
         {{#Port<6853.6237>,<6853.4.0>},1,0},
         {{#Port<6853.6238>,<6853.4.0>},1,0},
         {{#Port<6853.6239>,<6853.4.0>},1,0},
         {{#Port<6853.6240>,<6853.4.0>},1,0},
         {{#Port<6853.6241>,<6853.4.0>},1,0},
         {{#Port<6853.6242>,<6853.4.0>},1,0},
         {{#Port<6853.6243>,<6853.4.0>},1,0},
         {{#Port<6853.6244>,<6853.4.0>},1,0},
         {{#Port<6853.6245>,<6853.4.0>},1,0},
         {{#Port<6853.6246>,<6853.4.0>},1,0},
         {{#Port<6853.6247>,<6853.4.0>},1,0},
         {{#Port<6853.6248>,<6853.4.0>},1,0},
         {{#Port<6853.6249>,<6853.4.0>},1,0},
         {{#Port<6853.6250>,<6853.4.0>},1,0},
         {{#Port<6853.6251>,<6853.4.0>},1,0},
         {{#Port<6853.6252>,<6853.4.0>},1,0},
         {{#Port<6853.6253>,<6853.4.0>},1,0},
         {{#Port<6853.6254>,<6853.4.0>},1,0},
         {{#Port<6853.6255>,<6853.4.0>},1,0},
         {{#Port<6853.6256>,<6853.4.0>},1,0},
         {{#Port<6853.6257>,<6853.4.0>},1,0},
         {{#Port<6853.6258>,<6853.4.0>},1,0},
         {{#Port<6853.6259>,<6853.4.0>},1,0},
         {{#Port<6853.6260>,<6853.4.0>},1,0},
         {{#Port<6853.6261>,<6853.4.0>},1,0},
         {{#Port<6853.6262>,<6853.4.0>},1,0},
         {{#Port<6853.6263>,<6853.4.0>},1,0},
         {{#Port<6853.6264>,<6853.4.0>},1,0},
         {{#Port<6853.6265>,<6853.4.0>},1,0},
         {{#Port<6853.6266>,<6853.4.0>},1,0},
         {{#Port<6853.6267>,<6853.4.0>},1,0},
         {{#Port<6853.6268>,<6853.4.0>},1,0},
         {{#Port<6853.6269>,<6853.4.0>},1,0},
         {{#Port<6853.6270>,<6853.4.0>},1,0},
         {{#Port<6853.6271>,<6853.4.0>},1,0},
         {{#Port<6853.6272>,<6853.4.0>},1,0},
         {{#Port<6853.6273>,<6853.4.0>},1,0},
         {{#Port<6853.6274>,<6853.4.0>},1,0},
         {{#Port<6853.6275>,<6853.4.0>},1,0},
         {{#Port<6853.6276>,<6853.4.0>},1,0},
         {{#Port<6853.6277>,<6853.4.0>},1,0},
         {{#Port<6853.6278>,<6853.4.0>},1,0},
         {{#Port<6853.6279>,<6853.4.0>},1,0},
         {{#Port<6853.6280>,<6853.4.0>},1,0},
         {{#Port<6853.6281>,<6853.4.0>},1,0},
         {{#Port<6853.6282>,<6853.4.0>},1,0},
         {{#Port<6853.6283>,<6853.187.0>},52809,0},
         {{<6853.4.0>,<6853.36.0>},66,66},
         {{<6853.36.0>,<6853.175.0>},9,0},
         {{<6853.36.0>,<6853.185.0>},2,0},
         {{<6853.36.0>,<6853.186.0>},1,0},
         {{<6853.36.0>,<6853.187.0>},3,0},
         {{<6853.36.0>,<6853.188.0>},2,0},
         {{<6853.54.0>,<6853.56.0>},2,4},
         {{<6853.56.0>,<6853.176.0>},1,0},
         {{<6853.77.0>,<6853.175.0>},1,0},
         {{<6853.77.0>,<6853.176.0>},0,1},
         {{<6853.175.0>,<6853.185.0>},1,2},
         {{<6853.175.0>,<6853.187.0>},1,0},
         {{<6853.175.0>,<6853.196.0>},1,1},
         {{<6853.179.0>,<6853.184.0>},1,0},
         {{<6853.185.0>,<6853.186.0>},9,10},
         {{<6853.186.0>,<6853.187.0>},0,1},
         {{<6853.186.0>,<6853.188.0>},0,1},
         {{<6853.186.0>,<6853.189.0>},0,1},
         {{<6853.186.0>,<6853.190.0>},0,1},
         {{<6853.186.0>,<6853.191.0>},0,1},
         {{<6853.186.0>,<6853.192.0>},0,1},
         {{<6853.186.0>,<6853.193.0>},0,1},
         {{<6853.186.0>,<6853.194.0>},0,1},
         {{<6853.186.0>,<6853.195.0>},0,1},
         {{<6853.187.0>,<6853.188.0>},26,29},
         {{<6853.187.0>,<6853.189.0>},26,28},
         {{<6853.187.0>,<6853.190.0>},26,28},
         {{<6853.187.0>,<6853.191.0>},23,26},
         {{<6853.188.0>,<6853.192.0>},107,41},
         {{<6853.188.0>,<6853.193.0>},96,47},
         {{<6853.188.0>,<6853.194.0>},104,43},
         {{<6853.188.0>,<6853.195.0>},102,44},
         {{<6853.189.0>,<6853.192.0>},110,44},
         {{<6853.189.0>,<6853.193.0>},88,39},
         {{<6853.189.0>,<6853.194.0>},94,44},
         {{<6853.189.0>,<6853.195.0>},94,43},
         {{<6853.190.0>,<6853.192.0>},73,37},
         {{<6853.190.0>,<6853.193.0>},83,43},
         {{<6853.190.0>,<6853.194.0>},78,43},
         {{<6853.190.0>,<6853.195.0>},77,43},
         {{<6853.191.0>,<6853.192.0>},88,39},
         {{<6853.191.0>,<6853.193.0>},88,38},
         {{<6853.191.0>,<6853.194.0>},88,39},
         {{<6853.191.0>,<6853.195.0>},83,38},
         {{<6853.192.0>,<6853.196.0>},0,2},
         {{<6853.193.0>,<6853.196.0>},0,2},
         {{<6853.194.0>,<6853.196.0>},0,2},
         {{<6853.195.0>,<6853.196.0>},0,2}]},
       {send_,true},
       {send_self,[]},
       {send_self_,true},
       {'receive',
        [{#Port<6853.674>,6,61},
         {#Port<6853.6204>,1,12},
         {#Port<6853.6217>,2,25},
         {#Port<6853.6218>,2,25},
         {#Port<6853.6219>,2,25},
         {#Port<6853.6220>,2,25},
         {#Port<6853.6221>,2,25},
         {#Port<6853.6222>,2,25},
         {#Port<6853.6223>,2,25},
         {#Port<6853.6224>,2,25},
         {#Port<6853.6225>,2,25},
         {#Port<6853.6226>,2,25},
         {#Port<6853.6227>,2,25},
         {#Port<6853.6228>,2,25},
         {#Port<6853.6229>,2,25},
         {#Port<6853.6230>,2,25},
         {#Port<6853.6231>,2,25},
         {#Port<6853.6232>,2,25},
         {#Port<6853.6233>,2,25},
         {#Port<6853.6234>,2,25},
         {#Port<6853.6235>,2,25},
         {#Port<6853.6236>,2,25},
         {#Port<6853.6237>,2,25},
         {#Port<6853.6238>,2,25},
         {#Port<6853.6239>,2,25},
         {#Port<6853.6240>,2,25},
         {#Port<6853.6241>,2,25},
         {#Port<6853.6242>,2,25},
         {#Port<6853.6243>,2,25},
         {#Port<6853.6244>,2,25},
         {#Port<6853.6245>,2,25},
         {#Port<6853.6246>,2,25},
         {#Port<6853.6247>,2,25},
         {#Port<6853.6248>,2,25},
         {#Port<6853.6249>,2,25},
         {#Port<6853.6250>,2,25},
         {#Port<6853.6251>,2,25},
         {#Port<6853.6252>,2,25},
         {#Port<6853.6253>,2,25},
         {#Port<6853.6254>,2,25},
         {#Port<6853.6255>,2,25},
         {#Port<6853.6256>,2,25},
         {#Port<6853.6257>,2,25},
         {#Port<6853.6258>,2,25},
         {#Port<6853.6259>,2,25},
         {#Port<6853.6260>,2,25},
         {#Port<6853.6261>,2,25},
         {#Port<6853.6262>,2,25},
         {#Port<6853.6263>,2,25},
         {#Port<6853.6264>,2,25},
         {#Port<6853.6265>,2,25},
         {#Port<6853.6266>,2,25},
         {#Port<6853.6267>,2,25},
         {#Port<6853.6268>,2,25},
         {#Port<6853.6269>,2,25},
         {#Port<6853.6270>,2,25},
         {#Port<6853.6271>,2,25},
         {#Port<6853.6272>,2,25},
         {#Port<6853.6273>,2,25},
         {#Port<6853.6274>,2,25},
         {#Port<6853.6275>,2,25},
         {#Port<6853.6276>,2,25},
         {#Port<6853.6277>,2,25},
         {#Port<6853.6278>,2,25},
         {#Port<6853.6279>,2,25},
         {#Port<6853.6280>,2,25},
         {#Port<6853.6281>,2,25},
         {#Port<6853.6282>,2,25},
         {#Port<6853.6283>,52809,739326},
         {<6853.4.0>,198,13996},
         {<6853.36.0>,83,3499},
         {<6853.54.0>,6,203},
         {<6853.56.0>,2,20},
         {<6853.77.0>,1,10},
         {<6853.175.0>,13,418},
         {<6853.176.0>,1,13},
         {<6853.179.0>,1,3},
         {<6853.185.0>,13,110},
         {<6853.186.0>,19,1149},
         {<6853.187.0>,52924,1585733},
         {<6853.188.0>,203,110571},
         {<6853.189.0>,196,110494},
         {<6853.190.0>,192,110442},
         {<6853.191.0>,177,98256},
         {<6853.192.0>,380,826291},
         {<6853.193.0>,357,827477},
         {<6853.194.0>,366,825743},
         {<6853.195.0>,358,828007},
         {<6853.196.0>,1,6}]},
       {receive_,true},
       {trace,[]},
       {trace_,true}]}},
    [{epl_tracer,handle_info,2,[{file,"src/epl_tracer.erl"},{line,271}]},
     {gen_server,try_dispatch,4,[{file,"gen_server.erl"},{line,601}]},
     {gen_server,handle_msg,5,[{file,"gen_server.erl"},{line,667}]},
     {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,247}]}]}
ERROR: timed out while collecting data from node

Add erlangpl to software repositories

Why don't we make it easier to install by publishing erlangpl to different kinds of package managers, so people could run things like:

brew install erlangpl
apt-get install erlangpl

And so forth and so on.

remote node

Great project! Is erlangpl meant to run on the same server as the node it is monitoring? Currently I'm running a VM on my Mac that I'm able to run erlangpl on. I can then create an SSH tunnel so that I can access erlangpl in my browser on my Mac. I was wondering how I would go about running erlangpl on my Mac and then connecting to the VM. If I should elaborate, please let me know. More documentation would be awesome.

Failure after connecting to an Elixir node

** Cowboy handler epl_3d_EPL terminating in websocket_handle/3
   for the reason error:undef
** Message was {text,<<"<6238.1374.0>">>}
** Options were []
** Handler state was undefined_state
** Request was [{socket,#Port<0.3495>},
                {transport,ranch_tcp},
                {connection,keepalive},
                {pid,<0.184.0>},
                {method,<<"GET">>},
                {version,'HTTP/1.1'},
                {peer,{{127,0,0,1},40159}},
                {host,<<"localhost">>},
                {host_info,undefined},
                {port,8000},
                {path,<<"/epl_3d_EPL">>},
                {path_info,undefined},
                {qs,<<>>},
                {qs_vals,undefined},
                {bindings,[]},
                {headers,[{<<"host">>,<<"localhost:8000">>},
                          {<<"connection">>,<<"Upgrade">>},
                          {<<"pragma">>,<<"no-cache">>},
                          {<<"cache-control">>,<<"no-cache">>},
                          {<<"upgrade">>,<<"websocket">>},
                          {<<"origin">>,<<"http://localhost:8000">>},
                          {<<"sec-websocket-version">>,<<"13">>},
                          {<<"user-agent">>,
                           <<"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36">>},
                          {<<"accept-encoding">>,
                           <<"gzip, deflate, sdch, br">>},
                          {<<"accept-language">>,
                           <<"en-US,en;q=0.8,pl;q=0.6">>},
                          {<<"cookie">>,
                           <<"user_id=a9cf3ecd-6b04-42bb-9e94-8b05c81f1c6f; _tide_key=SFMyNTY.g3QAAAABbQAAAAtfY3NyZl90b2tlbm0AAAAYdnc1RnJ0cCtZVVhnMHFhZ01RbFM4QT09.K1vMyMqaQDdK-gTu5_7NV-E2vsJFHS83WxgFKQl7qjA; menuPosition=right-menu">>},
                          {<<"sec-websocket-key">>,
                           <<"7UxEOOT70mx0CRJLhQfXLQ==">>},
                          {<<"sec-websocket-extensions">>,
                           <<"permessage-deflate; client_max_window_bits">>}]},
                {p_headers,[{<<"upgrade">>,[<<"websocket">>]},
                            {<<"connection">>,[<<"upgrade">>]}]},
                {cookies,undefined},
                {meta,[{websocket_version,13}]},
                {body_state,waiting},
                {multipart,undefined},
                {buffer,<<>>},
                {resp_compress,false},
                {resp_state,done},
                {resp_headers,[]},
                {resp_body,<<>>},
                {onresponse,undefined}]
** Stacktrace: [{epl,proplist_to_json,
                    [[{pid,<6238.1374.0>},
                      {current_function,{gen_server,loop,6}},
                      {initial_call,{proc_lib,init_p,5}},
                      {status,waiting},
                      {message_queue_len,0},
                      {messages,[]},
                      {links,
                          [<6238.1366.0>,<6238.1367.0>,#Port<6238.168573>,
                           <6238.1327.0>]},
                      {dictionary,
                          [{'$ancestors',
                               [<6238.1367.0>,'Elixir.Tide.Repo.Pool',
                                'Elixir.Tide.Repo','Elixir.Tide.Supervisor',
                                <6238.1363.0>]},
                           {'$initial_call',
                               {'Elixir.DBConnection.Connection',init,1}}]},
                      {trap_exit,false},
                      {error_handler,error_handler},
                      {priority,normal},
                      {group_leader,<6238.1362.0>},
                      {total_heap_size,4184},
                      {heap_size,2586},
                      {stack_size,9},
                      {reductions,536457},
                      {garbage_collection,
                          [{min_bin_vheap_size,46422},
                           {min_heap_size,233},
                           {fullsweep_after,65535},
                           {minor_gcs,1622}]},
                      {suspending,[]}]],
                    []},
                {epl_3d_EPL,websocket_handle,3,
                    [{file,"src/epl_3d_EPL.erl"},{line,60}]},
                {cowboy_websocket,handler_call,7,
                    [{file,"src/cowboy_websocket.erl"},{line,528}]},
                {cowboy_protocol,execute,4,
                    [{file,"src/cowboy_protocol.erl"},{line,522}]}]

Unable to connect to iex session

Hi!

I'm trying to get erlangpl running, but I'm not able to connect to a local node. I could be doing something stupid. Thanks for the help!

» iex --sname foo
Erlang/OTP 19 [erts-8.2] [source] [64-bit] [smp:4:4] [async-threads:10] [hipe] [kernel-poll:false]

Interactive Elixir (1.4.0) - press Ctrl+C to exit (type h() ENTER for help)
iex(foo@rockmac)1>
» erlangpl --node [email protected]
ERROR: failed to connect to remote node '[email protected]'
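A likely culprit here (a guess based on the transcript above): the node was started with a short name (`--sname foo` gives `foo@rockmac`), while erlangpl was pointed at a long name (`[email protected]`), and Erlang distribution refuses to mix the two naming modes. A hedged sketch of matching invocations, assuming the short hostname `rockmac` from the iex prompt:

```shell
# Option 1: short names on both sides
iex --sname foo
./erlangpl --node foo@rockmac

# Option 2: long names on both sides
iex --name [email protected]
./erlangpl --node [email protected]
```

Either way, the cookie must also match (see the `-c`/`--cookie` flag in the README).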

Why rebar2?

Curious if there is a reason you have stuck with rebar2. I'd be happy to look at sending a PR for moving to rebar3 but wanted to check first if you were interested or not.

Make erlangpl more fault tolerant

Right now we connect to an existing node on startup, via CLI arguments. Would it be possible to move this process to a WebSocket interaction instead? It would be nice to start erlangpl as normal, go to the browser, and then select a node. We could keep the current startup behaviour behind a special flag, for example.
This idea comes from experiments with monitoring my poorly written Elixir app. When the app crashes, or we cannot connect to the node because of some error, we have to restart erlangpl, which means losing our collected data (charts, for example, or maybe some views in the future). Keeping the application alive all the time and trying to reconnect would be a better idea. Right now the whole process is not fault tolerant enough 😉

Would it also be possible to look for existing nodes on the same machine (127.0.0.1)? We could show a list of nodes available to connect to. We could also keep a list of recently connected nodes and add them to favourites. It would make the whole usage experience easier and smoother.
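Listing local nodes like this is in principle already possible: every distributed node registers with epmd (the Erlang Port Mapper Daemon), which can be queried. A hedged sketch of what such a node picker could build on (output will differ per machine):

```shell
# Ask the local epmd which node names are registered on this machine.
# Each line reports a node name and the port its distribution listens on.
epmd -names
```

The same information is available programmatically from Erlang via `net_adm:names/0`, which erlangpl's backend could call to populate a node list in the UI.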

I'd love to send a PR with some help from you, because I have no idea about Erlang and only a little about OTP. So if you could point me in the right direction, it would be much easier.

/ cc @michalslaski @arkgil

Is this library still alive?

Hi, I just want to ask whether you are going to maintain this library, and if not, why not?

It looks so nice, but the last update was 4 years ago. Also, the issue with cowboy and the removed sha1 function (#93) seems not to be resolved.

Elixir integration, escripts, and build tools madness

So I really wanted to make it possible to write plugins in Elixir; I hoped this way we could encourage the community to get involved and try to extend EPL.

I thought to myself: "Ha! Since Mix can build Erlang projects, it's going to be a piece of cake. I'll just add an umbrella mix.exs, and EPL's own mix.exs, and everything should work like a charm".

It did, and it compiled. Mix could even share dependencies with rebar in the same directory, so no garbage was produced except a separate directory for compilation artefacts. We can live with that.

Wait, did I forget anything? Riiiight, we need to build an escript. Which is currently built using hacks probably found on the Internet by Michal during his university years, sprinkled on top with black magic. It looks like it just loads all beam files and the contents of all priv directories it can find, reads them (!) and writes them to a zip file (!!!). Apparently prepending all this stuff with an escript shebang makes it a working escript.

I thought to myself once again: "Hmm, if Mix is such a cool build tool for such a cool language, it can probably build escripts, right?". And indeed it can. It can even build escripts for Erlang applications, with or without Elixir included. "Phew, problem solved **(wipes forehead)**". But...

Mix can't package priv directories into escripts, so with our current setup there is no way to build an escript with Elixir using Mix. We can't use our current method either, because that would require compiled Elixir to be present in our build directory, and I'm pretty sure rebar won't handle Elixir compilation.

In my opinion the only viable, should-be-working option is to allow users to pass the path to their Elixir installation's ebin directory, so that we can package it manually in the bootstrap script. Once packaged, we would need to optionally start it in erlangpl.erl and probably log a message if we couldn't do that.

I wrote all that because I hoped for rubber-duck-like inspiration, and it worked: I figured out the last paragraph while writing.

But as always I need opinions, so comments and feedback are welcome.

Impact of ErlangPL on the monitored system

Hello! ErlangPL looks amazing, but I couldn't find any mention of its impact on the monitored system. Erlang tools like cprof, eprof and fprof take a toll on the system and aren't recommended to be kept running in production.

I guess ErlangPL uses the :sys module and :proc_lib to retrieve runtime info from the system. Therefore, I'd like to know whether ErlangPL is intended to be kept running to monitor a production system, or whether it's only meant for development and debugging.

Thanks!

Throughput calculation seems to be wrong

Either I'm not getting it, or the message throughput is calculated/presented incorrectly. Look at the screenshots: the number says ~200 while the graph shows ~100.
screen shot 2017-03-01 at 17 28 45
screen shot 2017-03-01 at 17 28 50
screen shot 2017-03-01 at 17 28 54

Get rid of `start_link` callback of plugin modules

I have noticed that each plugin module needs to implement a start_link callback, which for every plugin looks like this:

start_link(_Options) ->
    {ok, spawn_link(fun() -> receive _ -> ok end end)}.

I was wondering whether we really need this callback. If I checked correctly, the only place it is used is here, and it basically adds an infinitely waiting child to the epl supervisor.

I think we could remove this from all core plugins, and of course remove the code which calls these functions. It could be a step towards creating a friendlier plugin system 🙂

While we're at it: in erlanglab/epl_counter#1 @michalslaski mentioned that the init callback is also not used anymore, so maybe it's time to get rid of it too. Or at least rename it, so that it's not confused with the cowboy_websocket_handler init callback.

Can't analyze Canillita

When I try to analyze inaka/canillita, I get an error.
These are the steps I follow:

$ git clone https://github.com/inaka/canillita.git
$ cd canillita
$ make rel
$ _rel/canillita/bin/canillita console

…then, in another shell…

$ ./erlangpl --node [email protected] --cookie canillita -vvv

…where I get the following output…

DEBUG: Plain args: ["./erlangpl","--node","[email protected]","--cookie",
                    "canillita","-vvv"]
DEBUG: Args: [{node,"[email protected]"},
              {cookie,"canillita"},
              {verbose,3},
              "./erlangpl"]
DEBUG: Loading: epl/priv/.gitignore
DEBUG: Loading: epl/priv/htdocs/asset-manifest.json
DEBUG: Loading: epl/priv/htdocs/favicon.ico
DEBUG: Loading: epl/priv/htdocs/index.html
DEBUG: Loading: epl/priv/htdocs/static/css/main.b813327a.css
DEBUG: Loading: epl/priv/htdocs/static/css/main.b813327a.css.map
DEBUG: Loading: epl/priv/htdocs/static/js/main.1ce390f5.js
DEBUG: Loading: epl/priv/htdocs/static/media/fontawesome-webfont.674f50d2.eot
DEBUG: Loading: epl/priv/htdocs/static/media/fontawesome-webfont.912ec66d.svg
DEBUG: Loading: epl/priv/htdocs/static/media/fontawesome-webfont.af7ae505.woff2
DEBUG: Loading: epl/priv/htdocs/static/media/fontawesome-webfont.b06871f2.ttf
DEBUG: Loading: epl/priv/htdocs/static/media/fontawesome-webfont.fee66e71.woff
DEBUG: Loading: epl/priv/htdocs/static/media/glyphicons-halflings-regular.448c34a5.woff2
DEBUG: Loading: epl/priv/htdocs/static/media/glyphicons-halflings-regular.89889688.svg
DEBUG: Loading: epl/priv/htdocs/static/media/glyphicons-halflings-regular.e18bbf61.ttf
DEBUG: Loading: epl/priv/htdocs/static/media/glyphicons-halflings-regular.f4769f9b.eot
DEBUG: Loading: epl/priv/htdocs/static/media/glyphicons-halflings-regular.fa277232.woff
DEBUG: Loading: epl/priv/htdocs/static/media/header_copy.4bb83c59.png
INFO:  Plugins []
INFO:  Started plugins: [{epl_dashboard_EPL,{ok,<0.90.0>}},
                         {epl_st_EPL,{ok,<0.91.0>}},
                         {epl_version_EPL,{ok,<0.92.0>}},
                         {epl_traffic_EPL,{ok,<0.93.0>}}]
Visit http://localhost:8000/

but once I open my browser at localhost:8000, I get the following error report in that console:

ERROR: timed out while collecting data from node

=ERROR REPORT==== 3-Apr-2017::10:53:07 ===
** Generic server epl_st terminating
** Last message in was {data,
                           {'[email protected]',{1491,227582,456213}},
                           [{process_count,135},
                            {memory_total,37707288},
                            {spawn,[]},
                            {exit,[]},
                            {send,
                                [{{#Port<10020.7107>,<10020.254.0>},1,0},
                                 {{<10020.254.0>,<10020.259.0>},1,0}]},
                            {send_self,[]},
                            {'receive',
                                [{#Port<10020.7107>,1,12},
                                 {<10020.158.0>,5,0},
                                 {<10020.254.0>,1,3}]},
                            {trace,[]}]}
** When Server state == {state,[]}
** Reason for termination ==
** {{timeout,{gen_server,call,
                         [epl_tracer,
                          {command,#Fun<supervisor.which_children.1>,
                                   [<10020.184.0>]}]}},
    [{gen_server,call,2,[{file,"gen_server.erl"},{line,204}]},
     {epl_st,command,2,[{file,"src/epl_st.erl"},{line,120}]},
     {epl_st,generate_sup_tree,1,[{file,"src/epl_st.erl"},{line,104}]},
     {epl_st,'-handle_info/2-fun-0-',2,[{file,"src/epl_st.erl"},{line,76}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {epl_st,handle_info,2,[{file,"src/epl_st.erl"},{line,75}]},
     {gen_server,try_dispatch,4,[{file,"gen_server.erl"},{line,601}]},
     {gen_server,handle_msg,5,[{file,"gen_server.erl"},{line,667}]}]}

=INFO REPORT==== 3-Apr-2017::10:53:07 ===
    application: epl_st
    exited: shutdown
    type: temporary
ERROR: timed out while collecting data from node
ERROR: timed out while collecting data from node
ERROR: timed out while collecting data from node
ERROR: timed out while collecting data from node
ERROR: timed out while collecting data from node

Move core plugins to apps directory

Since we're using an umbrella layout, I thought it might be a good idea to move epl_traffic and epl_dashboard into the apps/ directory as standalone applications. We could also merge epl_st into this repo and move it there.

This way we get a nice and clean project structure.

(We could rename the epl app to epl_core to distinguish it from the other plugins.)

Global traffic

I'm really unsure where this question should be asked, so I'm posting it here.

On the traffic page, is it possible to see the communication between all nodes? If I'm not mistaken, currently we can only see messages between the inspected node and the others.

If it's not currently possible, would it necessitate a lot of modifications in the erlangpl core to achieve this?

OTP 20 + old cowboy + crypto

In OTP 20, functions deprecated in crypto-3.0 (first released in OTP R16B01) will be removed.

The currently used cowboy version, 0.8.6, calls crypto:sha in its websocket code, which won't work from OTP 20 onwards.
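For reference, the change involved is a move from the removed one-shot function to the generic hash API, which has been available since R16B01. A sketch of the substitution (upgrading cowboy itself is the real fix; this only illustrates the API difference):

```erlang
%% Before (removed in OTP 20):
%%   Digest = crypto:sha(Data).
%%
%% After, using the generic hash API:
Data = <<"some websocket handshake key">>,
Digest = crypto:hash(sha, Data).
```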

Crashdump when trying to connect to an Elixir or Erlang node

Hello, I just cloned your repo and followed your steps to get ErlangPL running and play with it. I ran both an Elixir and an Erlang emulator like so:

iex --name [email protected] -S mix
erl -name [email protected]

and then

./erlangpl --node [email protected]

and I got the following crash:

=INFO REPORT==== 4-Apr-2017::18:40:46 ===
    application: epl
    exited: {bad_return,
                {{epl_app,start,[normal,[]]},
                 {'EXIT',
                     {{badmatch,[]},
                      [{epl_app,insert_node_name,1,
                           [{file,"src/epl_app.erl"},{line,462}]},
                       {epl_app,start,2,[{file,"src/epl_app.erl"},{line,50}]},
                       {application_master,start_it_old,4,
                           [{file,"application_master.erl"},{line,273}]}]}}}}
    type: temporary
escript: exception error: no match of right hand side value
                 {error,
                     {bad_return,
                         {{epl_app,start,[normal,[]]},
                          {'EXIT',
                              {{badmatch,[]},
                               [{epl_app,insert_node_name,1,
                                    [{file,"src/epl_app.erl"},{line,462}]},
                                {epl_app,start,2,
                                    [{file,"src/epl_app.erl"},{line,50}]},
                                {application_master,start_it_old,4,
                                    [{file,"application_master.erl"},
                                     {line,273}]}]}}}}}
  in function  erlangpl:main/1 (src/erlangpl.erl, line 19)
  in call from escript:run/2 (escript.erl, line 760)
  in call from escript:start/1 (escript.erl, line 277)
  in call from init:start_em/1
  in call from init:do_boot/3

I'm on Mac OS Sierra (latest) with Elixir 1.4.2 and Erlang/OTP 19 (erts-8.3). Am I missing anything? Thanks

Crash after connection (possible duplicate)

I wanted to lodge a new issue, since the other one referencing this (#43) was closed and appeared to have been fixed.

I'm getting this same issue. I've updated credo to use runtime: false, I've removed wobserver from my application (an issue I saw mentioned elsewhere), and I've used the latest binary from the releases page.

** {{timeout,{gen_server,call,
                         [epl_tracer,
                          {command,#Fun<supervisor.which_children.1>,
                                   [<6914.2686.0>]}]}},
    [{gen_server,call,2,[{file,"gen_server.erl"},{line,204}]},
     {epl_st,command,2,[{file,"src/epl_st.erl"},{line,129}]},
     {epl_st,generate_sup_tree,1,[{file,"src/epl_st.erl"},{line,104}]},
     {epl_st,'-handle_info/2-fun-0-',2,[{file,"src/epl_st.erl"},{line,76}]},
     {lists,foldl,3,[{file,"lists.erl"},{line,1263}]},
     {epl_st,handle_info,2,[{file,"src/epl_st.erl"},{line,75}]},
     {gen_server,try_dispatch,4,[{file,"gen_server.erl"},{line,601}]},
     {gen_server,handle_msg,5,[{file,"gen_server.erl"},{line,667}]}]}

=INFO REPORT==== 16-Jun-2017::17:23:38 ===
    application: epl_st
    exited: shutdown
    type: temporary

Elixir v 1.4.2
OTP 19

Any thoughts or assistance is greatly appreciated.

Help tooltips in dashboard system info

I think that adding tooltips to dashboard/system would be a good idea. Some users might still be new to Erlang/OTP/Elixir and may not know much about some of the settings.

screenshot 2017-08-05 at 23 17 29

For example, when the user hovers over kernel support, the tooltip could explain what the option does, maybe how to enable it (I personally have no idea what it does), and what the consequences could be.

Or, for example, when hovering over process limit, we could tell the user how many processes the system is already using.
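The numbers needed for such a tooltip are already exposed by the VM, so the dashboard backend could fetch them with two system_info calls. A minimal sketch (how the data reaches the frontend is left out; this only shows where the values come from):

```erlang
%% Current number of processes versus the configured maximum.
Used  = erlang:system_info(process_count),
Limit = erlang:system_info(process_limit),
io:format("~p of ~p processes in use~n", [Used, Limit]).
```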

jsone failing on node info parsing

I'm playing with Phoenix 1.3 channels right now and stumbled into this error while trying to get node info about a probably simple Phoenix.Channel process.
https://github.com/Baransu/e_game/blob/master/lib/e_game/web/channels/room_channel.ex

You can reproduce it by running the project from the repo (with the new Phoenix 1.3 commands) and trying to inspect this node:
screen shot 2017-07-07 at 00 41 04

From what I can see, jsone is failing and causing the crash, but I don't know what exactly is failing.
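Reading the stacktrace below, jsone_encode raises badarg in object_key with a raw pid as the argument: pids are not a JSON type, so they cannot be used as object keys without conversion. A sketch of the usual pre-processing step (format_pid is a hypothetical helper name, not part of the codebase):

```erlang
%% Convert a pid into a printable binary before handing the structure
%% to jsone, e.g. as part of walking the proplist to be encoded.
format_pid(Pid) when is_pid(Pid) ->
    list_to_binary(pid_to_list(Pid)).
```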

Error in process <0.227.0> on node '[email protected]' with exit value:
{[{reason,badarg},
  {mfa,{epl_st_EPL,websocket_handle,3}},
  {stacktrace,
      [{jsone_encode,object_key,
           [<7275.394.0>,
            [{object_value,<<"removed">>,
                 [{<7275.404.0>,<<"removed">>},{<7275.425.0>,<<"removed">>}]},
             {object_members,
                 [{error_handler,<<"error_handler">>},
                  {garbage_collection,
                      <<"[{max_heap_size,#{error_logger => true,kill => true,size => 0}},\n {min_bin_vheap_size,46422},\n {min_heap_size,233},\n {fullsweep_after,65535},\n {minor_gcs,11}]">>},
                  {group_leader,<<"<7275.236.0>">>},
                  {heap_size,<<"1598">>},
                  {initial_call,
                      #{<<"arity">> => 5,<<"function">> => init_p,
                        <<"module">> => proc_lib}},
                  {links,
                      <<"[<7275.394.0>,<7275.425.0>,<7275.404.0>,<7275.267.0>]">>},
                  {message_queue_len,<<"0">>},
                  {messages,<<"[]">>},
                  {priority,<<"normal">>},
                  {reductions,<<"4180">>},
                  {stack_size,<<"16">>},
                  {status,<<"waiting">>},
                  {suspending,<<"[]">>},
                  {total_heap_size,<<"1974">>},
                  {trap_exit,<<"true">>}]},
             {object_members,[]},
             {object_members,[{<<"topic">>,<<"node-info">>}]}],
            <<"{\"data\":{\"id\":\"<7275.268.0>\",\"info\":{\"current_function\":{\"arity\":4,\"function\":\"loop\",\"module\":\"ranch_conns_sup\"},\"dictionary\":{\"$ancestors\":[\"<7275.267.0>\",\"Elixir.EGame.Web.Endpoint.Server\",\"Elixir.EGame.Web.Endpoint\",\"Elixir.EGame.Supervisor\",\"<7275.237.0>\"],\"$initial_call\":{\"arity\":7,\"function\":\"init\",\"module\":\"ranch_conns_sup\"},">>,
            {encode_opt_v2,false,false,
                [{scientific,20}],
                {iso8601,0},
                string,0,0,false}],
           [{line,220}]},
       {jsone,encode,2,[{file,"src/jsone.erl"},{line,339}]},
       {epl_st_EPL,websocket_handle,3,[{file,"src/epl_st_EPL.erl"},{line,47}]},
       {cowboy_websocket,handler_call,7,
           [{file,"src/cowboy_websocket.erl"},{line,588}]},
       {cowboy_protocol,execute,4,
           [{file,"src/cowboy_protocol.erl"},{line,442}]}]},
  {msg,{text,<<"<7275.268.0>">>}},
  {req,
      [{socket,#Port<0.9929>},
       {transport,ranch_tcp},
       {connection,keepalive},
       {pid,<0.227.0>},
       {method,<<"GET">>},
       {version,'HTTP/1.1'},
       {peer,{{127,0,0,1},57094}},
       {host,<<"localhost">>},
       {host_info,undefined},
       {port,37575},
       {path,<<"/epl_st_EPL">>},
       {path_info,undefined},
       {qs,<<>>},
       {qs_vals,undefined},
       {bindings,[]},
       {headers,
           [{<<"host">>,<<"localhost:37575">>},
            {<<"connection">>,<<"Upgrade">>},
            {<<"pragma">>,<<"no-cache">>},
            {<<"cache-control">>,<<"no-cache">>},
            {<<"upgrade">>,<<"websocket">>},
            {<<"origin">>,<<"http://localhost:37575">>},
            {<<"sec-websocket-version">>,<<"13">>},
            {<<"user-agent">>,
             <<"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36">>},
            {<<"accept-encoding">>,<<"gzip, deflate, br">>},
            {<<"accept-language">>,<<"pl-PL,pl;q=0.8,en-US;q=0.6,en;q=0.4">>},
            {<<"cookie">>,
             <<"connect.sid=s%3AyVekaxk8_VvRRRNKaDbfIYLbffH4wSLq.FdUkg1%2BiOxjbY1cFGK7z9kyTlUYXkJbhhSsxC85w43Y; _patron_league_key=g3QAAAABbQAAABBndWFyZGlhbl9kZWZhdWx0bQAAAV9leUpoYkdjaU9pSklVelV4TWlJc0luUjVjQ0k2SWtwWFZDSjkuZXlKaGRXUWlPaUpWYzJWeU9qRWlMQ0psZUhBaU9qRTBOelEwTlRnM05USXNJbWxoZENJNk1UUTNNVGcyTmpjMU1pd2lhWE56SWpvaVVHRjBjbTl1VEdWaFozVmxMbVJsZGlJc0ltcDBhU0k2SWprM1pHUm1ZemN6TFRNMFlUQXRORE01T1MwNVlUbGpMVEppWlRBeVlXWmlaVEUxTVNJc0luQmxiU0k2ZXlKa1pXWmhkV3gwSWpvMk0zMHNJbk4xWWlJNklsVnpaWEk2TVNJc0luUjVjQ0k2SW5SdmEyVnVJbjAudi1lTENhQmZyNzRRZndubzhoWkZSY09FWWpDMGwyQ1c0RHRXUTN5ZFQ2emU0bzg3VkNFYWx0QnJ3TzNSZm9YOFZrOUdWTG5FRmpUcFhHQXpNRUdxRHc=##sp3rArdIpCTu16bcaqK7imFTS-c=; lastUpload={%22last%22:1472135802443%2C%22images%22:0}; TawkConnectionTime=0; io=8UAEb8_-jdpGtBhCAAAA; menuSize=0; menuPosition=left-menu; intercom-id-iynax7cx=39ea4b13-97dd-4240-a674-94a73108a7fd; __ar_v4=V2QOTBOTS5COHGQVEMOZMZ%3A20170521%3A8%7CIH3MINJP7JCV7CCTYJYSZA%3A20170521%3A8%7C7TSF5BMKKVACDED7TVROJD%3A20170521%3A8; _ga=GA1.1.154117981.1475505405; mp_1931ca691c8e604805b5832c30f07d71_mixpanel=%7B%22distinct_id%22%3A%20%2215c3126b6931e5-086fea71166d83-3060750a-13c680-15c3126b694580%22%2C%22mp_lib%22%3A%20%22Segment%3A%20web%22%2C%22%24initial_referrer%22%3A%20%22http%3A%2F%2Flocalhost%3A8080%2Fc6580820%2Fships%2F57988b8103777d17690002e6%2Fcustomize%3Fpane%3Droot%22%2C%22%24initial_referring_domain%22%3A%20%22localhost%3A8080%22%7D; amplitude_id=eyJkZXZpY2VJZCI6IjVmOGI0ZTgzLWE0YTAtNDYzNy05ODQ3LWRiYWUyYWZhOTFiNVIiLCJ1c2VySWQiOm51bGwsIm9wdE91dCI6ZmFsc2UsInNlc3Npb25JZCI6MTQ5ODgxNjEwNDQwNywibGFzdEV2ZW50VGltZSI6MTQ5ODgxNjEwNDQwNCwiZXZlbnRJZCI6MCwiaWRlbnRpZnlJZCI6NSwic2VxdWVuY2VOdW1iZXIiOjV9; intercom-id-a9981292337788423d2b9798ad23aa0ca7143b10=a3a9e314-a13d-49a9-ae36-d3f870928095; 
hull_558979b4f59837f6160003c9=eyJIdWxsLUF1dGgtU2NvcGUiOiJVc2VyOjU5NDhkOTQ5OGU3YTEyZTlhNzAwNDVkNCIsIkh1bGwtVXNlci1JZCI6IjU5NDhkOTQ5OGU3YTEyZTlhNzAwNDVkNCIsIkh1bGwtVXNlci1TaWciOiIxNDk4ODE5Mjc0LjgxNmY5ZDYwNzQxYzc0NzZkMTY5Mjk3MzM2MmY1NWVmNzkxYjg3YTgifQ==; hull_53175bb2635c78c8790032cd=eyJIdWxsLUF1dGgtU2NvcGUiOiJVc2VyOjU5NDhkOTQ5OGU3YTEyZTlhNzAwNDVkNCIsIkh1bGwtVXNlci1JZCI6IjU5NDhkOTQ5OGU3YTEyZTlhNzAwNDVkNCIsIkh1bGwtVXNlci1TaWciOiIxNDk4ODE5NDUzLmUyMDExYjkyZGMzODZhNDJhZjA1NGE5OGE5NmNkYjU5ZTI3NWJmMzYifQ==; ajs_anonymous_id=%224794d025-d231-43ff-9b1e-36f1f9555c36%22; ajs_group_id=null; hull_52fb86bedea4dfd8de000003=eyJIdWxsLUF1dGgtU2NvcGUiOiJVc2VyOjU5MjZiN2E1NjM3ZmRkZDg4NjAwMDA2NSIsIkh1bGwtVXNlci1JZCI6IjU5MjZiN2E1NjM3ZmRkZDg4NjAwMDA2NSIsIkh1bGwtVXNlci1TaWciOiIxNDk4ODM3MTAyLjJjOTUyZmY0YTk2YzViODg1ZTFkOTQ0Mjc0M2Y3YmU4MjQzMGVkNWYifQ==; ajs_user_id=%225926b7a5637fddd886000065%22; mp_3f3f1551de3571ba4858431cda92aafc_mixpanel=%7B%22distinct_id%22%3A%20%225926b7a5637fddd886000065%22%2C%22mp_lib%22%3A%20%22Segment%3A%20web%22%2C%22%24initial_referrer%22%3A%20%22%24direct%22%2C%22%24initial_referring_domain%22%3A%20%22%24direct%22%2C%22mp_name_tag%22%3A%20%22tomasz.cichocinski%40codeheroes.io%22%2C%22id%22%3A%20%225926b7a5637fddd886000065%22%2C%22name%22%3A%20null%2C%22%24created%22%3A%20%222017-05-25T10%3A53%3A25.000Z%22%2C%22%24email%22%3A%20%22tomasz.cichocinski%40codeheroes.io%22%2C%22company%22%3A%20%7B%22id%22%3A%20%22561fb665450f34b1cf00000a%22%7D%7D">>},
            {<<"sec-websocket-key">>,<<"vJx1NDD6rkyCMtPp1hwOQA==">>},
            {<<"sec-websocket-extensions">>,
             <<"permessage-deflate; client_max_window_bits">>}]},
       {p_headers,
           [{<<"sec-websocket-extensions">>,
             [{<<"permessage-deflate">>,[<<"client_max_window_bits">>]}]},
            {<<"upgrade">>,[<<"websocket">>]},
            {<<"connection">>,[<<"upgrade">>]}]},
       {cookies,undefined},
       {meta,[{websocket_version,13},{websocket_compress,false}]},
       {body_state,waiting},
       {buffer,<<>>},
       {multipart,undefined},
       {resp_compress,false},
       {resp_state,done},
       {resp_headers,[]},
       {resp_body,<<>>},
       {onresponse,undefined}]},
  {state,undefined_state}],
 [{cowboy_websocket,handler_call,7,
      [{file,"src/cowboy_websocket.erl"},{line,642}]},
  {cowboy_protocol,execute,4,[{file,"src/cowboy_protocol.erl"},{line,442}]}]}


runtime error

Hi,

I'm facing a runtime error related to the websocket component.

Error in process <0.226.0> on node '[email protected]' with exit value:
{undef,[{crypto,sha,
[<<"DhX4I0c+u60EPDe8ETlV2w==258EAFA5-E914-47DA-95CA-C5AB0DC85B11">>],
[]},
{cowboy_websocket,websocket_handshake,3,
[{file,"src/cowboy_websocket.erl"},{line,142}]},
{cowboy_protocol,execute,4,
[{file,"src/cowboy_protocol.erl"},{line,522}]}]}

For your information, the node is running on CentOS 7 with Erlang/OTP 21.

I've tried to recompile the tool, but it fails with the following error:

/production/factory/erlangpl/node_modules/elm-format: Command failed.
Exit signal: SIGABRT
Command: binwrap-install
Arguments:
Directory: /production/factory/erlangpl/node_modules/elm-format
Output:
/usr/bin/node[1466]: ../src/node_contextify.cc:626:static void node::contextify::ContextifyScript::New(const v8::FunctionCallbackInfo<v8::Value>&): Assertion `args[1]->IsString()' failed.
1: 0x8dc510 node::Abort() [/usr/bin/node]
2: 0x8dc5e5 [/usr/bin/node]
3: 0x91081e node::contextify::ContextifyScript::New(v8::FunctionCallbackInfo<v8::Value> const&) [/usr/bin/node]
4: 0xb6166b [/usr/bin/node]
5: 0xb63602 v8::internal::Builtin_HandleApiCall(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/bin/node]

Any suggestion appreciated.

Thank you

Add download links to a README

Many people see our repository first, not the website, and they end up building EPL manually instead of downloading the prebuilt escript. We should make it clear in the README that they can download it.

ETS node view

Guys,

I've been working on the ETS node view for a few days and I've just finished the first prototype. It shows all the ETS tables present on a particular node. Each table is connected to its owner process. Tables which "can cause problems" are coloured red or yellow. The tables to be coloured are chosen based on the amount of memory allocated for a particular table: the five biggest tables are coloured red and the next five are coloured yellow.
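The colouring rule above could be sketched roughly like this on the UI side. The data shape is a hypothetical example; the real view reads table info sent from the Erlang node:

```javascript
// Rank ETS tables by allocated memory and colour the top ones:
// five biggest red, next five yellow, the rest uncoloured.
// `tables` is assumed to be an array of {name, memory} objects.
function colourTables(tables) {
  const sorted = [...tables].sort((a, b) => b.memory - a.memory);
  return sorted.map((t, i) => ({
    ...t,
    colour: i < 5 ? 'red' : i < 10 ? 'yellow' : 'normal',
  }));
}
```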

Here are some screenshots:
ets_node_view

ets_node_view_detail

Here is the code: https://github.com/mkacper/erlangpl/tree/ets-node-view-proto

I encourage you to build this branch and check it out. Please let me know what you think :)

The general idea for this view is to make it possible to analyse ETS tables from different perspectives and help users find suspicious tables. I think it would be nice to add some kind of switch in the UI to allow users to change the view's mode (i.e. change what we analyse in terms of ETS, e.g. memory usage, size of the table, etc.).

What to analyse:

  • Allocated memory usage - already prototyped. It shows the amount of memory allocated for a particular table. I talked to @michalslaski about it and we agreed that it would probably be better to show this kind of information in a sorted table instead of on a Vizceral graph using colours. So why did I do this prototype? Because I talked to some Erlangers and they said that ETS memory size can often tell us that something is wrong. I decided to show this data on a Vizceral graph because it gives us nice context (especially if we could double-click on a suspicious table and see who communicates with the table and how) and I had Vizceral at hand :) What do you think about that?

  • Memory utilization/fragmentation - I mentioned it during the ErlangPL meetup. I have some doubts about this metric. Again, I've talked to some Erlangers and none of them (except @michalslaski :)) has ever run into this kind of problem with ETS tables. Moreover, getting this metric doesn't seem easy. I don't know if it's worth the effort at this stage.

  • Size - the number of rows in a table. Some people say this metric can be helpful. Again, does it make sense to show it on a Vizceral graph?

  • Lock statistics - statistics about the locks protecting ETS tables (I don't know exactly which ones yet). This requires compiling the Erlang VM with the --enable-lock-counter flag (unless you use OTP 20). I heard that even when the lock profiler is disabled, it degrades Erlang VM performance by about 30%.

  • Mean access time - how much time you need (on average) to access the table.

The most important questions (in my opinion) are:

  • Which of the aforementioned ideas make sense?
  • How should we visualise particular metrics?
  • What else could we measure (in terms of ETS performance)?

I would be very grateful for your feedback/suggestions/ideas :)

websocket port is hard coded

The command line parameter '-P' does not change the port the web UI uses to connect to epl's websocket. So if you change the port via '-P', you can reach the GUI with a browser on that port, but no data will be displayed.

The websocket port is hardcoded in web-ui/src/sockets.js:

index ef7b989..c90257f 100644
--- a/src/sockets.js
+++ b/src/sockets.js
@@ -92,7 +92,7 @@ let socketsArray = [];
 export const createSockets = (sockets: any) => {
   socketsArray = Object.keys(sockets).map(route => {
     const { hostname } = window.location;
-    let ws = new WebSocket(`ws://${hostname}:8000/${route}`);
+    let ws = new WebSocket(`ws://${hostname}:9080/${route}`);

     const handlers = Object.keys(sockets[route].topics).reduce(
      (acc, topic) => {
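One way to avoid the hardcoded value would be to derive the websocket port from the page location itself. This is only a sketch, and it assumes the UI is served from the same port as the websocket endpoint; the `wsUrl` helper and the 8000 fallback are hypothetical:

```javascript
// Build the websocket URL from the page location instead of hardcoding
// the port; fall back to 8000 when the location has no explicit port.
function wsUrl(location, route) {
  const port = location.port || 8000;
  return `ws://${location.hostname}:${port}/${route}`;
}

// In createSockets this would replace the hardcoded URL:
// let ws = new WebSocket(wsUrl(window.location, route));
```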

Update traffic data format

Netflix/vizceral#59 was merged into Vizceral recently. It adds an entryNode property and removes the requirement for an INTERNET node. It also allows two-way message passing in our cluster view.

Things we have to change to update to version 4.4.0:

  • change focused -> focusedChild
  • for every graph, add an entryNode property describing the graph's entry node
  • replace the single one-way connection to the entryNode (with warnings) with two one-way connections without them
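A rough sketch of what the updated traffic data could look like after these changes. The node names and metric values are made up for illustration; the Vizceral documentation is the authoritative source for the format:

```javascript
// Hypothetical cluster graph in the post-4.4.0 format: an explicit
// entryNode replaces the old INTERNET node, and traffic between two
// nodes is modelled as two separate one-way connections.
const graph = {
  renderer: 'global',
  name: 'cluster',
  entryNode: 'node1@host', // new in 4.4.0
  nodes: [{ name: 'node1@host' }, { name: 'node2@host' }],
  connections: [
    { source: 'node1@host', target: 'node2@host', metrics: { normal: 10 } },
    { source: 'node2@host', target: 'node1@host', metrics: { normal: 7 } },
  ],
};
```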
