elk-stuff's Issues

With a variable in the output file name: "The starting part of the path should not be dynamic"

When I add a file output with a variable in the path in pipelines.conf

path => "%{type}_%{+yyyy_MM_dd}.log"

I get the following Logstash error:

The starting part of the path should not be dynamic

It works fine when I remove the path line or use a path without a variable:

path => "access.log"

Here is pipelines.conf:

    input {
        file {
            path => "/mylogstash/data/apache_access.log"
            start_position => "beginning"
            type => "access"
        }
        http {
            type => "access"
        }
    }
    filter {
        grok {
        match => { "message" => '%{HTTPD_COMMONLOG} "%{GREEDYDATA:referrer}" "%{GREEDYDATA:agent}"' }
        }
        mutate {
            convert => {
                "response" => "integer"
                "bytes" => "integer"
            }
        }
    }
    output {
        stdout {
            codec => rubydebug
        }
        file {
            path => "%{type}_%{+yyyy_MM_dd}.log"
        }
    }
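
A possible fix, assuming the goal is to keep the per-type, per-day file name: the file output wants the leading part of the path to be static, so putting the dynamic sprintf fields only after a fixed directory prefix should satisfy it (this is the same form used in the http-input issue further down):

    output {
        file {
            # Static directory prefix first; dynamic fields only in the file name
            path => "/mylogstash/data/%{type}_%{+yyyy_MM_dd}.log"
        }
    }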

Logstash Flapping listener: logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5001"} ... logstash.inputs.udp - UDP listener died

I am trying to listen on UDP port 5001, but the listener keeps flapping:

16:34:33.991 [[main]<udp] INFO logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5001"}
16:34:33.992 [[main]<udp] WARN logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}

16:34:38.992 [[main]<udp] INFO logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5001"}
16:34:38.993 [[main]<udp] WARN logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}

16:34:43.995 [[main]<udp] INFO logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5001"}
16:34:43.996 [[main]<udp] WARN logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}

The error is NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr (a missing constant).
I tried setting the host:

 udp {
 port => 5001
 host => "51.15.239.154"
 type => "cisco"
 }

Still the same outcome:

16:37:37.844 [[main]<udp] INFO  logstash.inputs.udp - Starting UDP listener {:address=>"51.15.239.154:5001"}
16:37:37.845 [[main]<udp] WARN  logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}

16:37:42.848 [[main]<udp] INFO  logstash.inputs.udp - Starting UDP listener {:address=>"51.15.239.154:5001"}
16:37:42.849 [[main]<udp] WARN  logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}
16:37:47.849 [[main]<udp] INFO  logstash.inputs.udp - Starting UDP listener {:address=>"51.15.239.154:5001"}
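
This looks like a defect in the bundled logstash-input-udp 3.3.1 plugin rather than a configuration problem: udp.rb references IPAddr and the constant is apparently never loaded, so changing host or port in the input will not help. A hedged workaround, assuming a later release of the plugin fixes the missing require:

    # Run inside the Logstash container (path to the binary may differ per install)
    bin/logstash-plugin update logstash-input-udp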

I am running the ELK stack in containers with docker-compose:

root@scw-1f59eb:~# docker-compose ps
            Name                         Command                         State                          Ports             
-------------------------------------------------------------------------------------------------------------------------
root_elasticsearch_1           /docker-entrypoint.sh elas     Up                             0.0.0.0:9200->9200/tcp,      
                               ...                                                           0.0.0.0:9300->9300/tcp       
root_kibana_1                  /docker-entrypoint.sh kibana   Up                             0.0.0.0:5601->5601/tcp       
root_mylogstash_1              /docker-entrypoint.sh bash     Up                             0.0.0.0:5000->5000/tcp,      
                                                                                             0.0.0.0:5000->5000/udp,      
                                                                                             0.0.0.0:514->514/tcp,        
                                                                                             0.0.0.0:514->514/udp,        
                                                                                             0.0.0.0:8080->8080/tcp       

Using the following docker-compose file:

version: '2'
services:
    elasticsearch:
        image: elasticsearch
        ports:
            - "9200:9200"
            - "9300:9300"
        volumes:
            - ./data:/usr/share/elasticsearch/data
    mylogstash:
        build:
            context: .
            dockerfile: Dockerfile
        volumes:
            - ./config-dir:/mylogstash/config-dir:rw
            - ./data:/mylogstash/data:rw
            - ./plugin:/mylogstash/plugin:rw
        ports:
            - "5000:5000/tcp"
            - "5000:5000/udp"
            - "8080:8080"
            - "514:514/udp"
            - "514:514/tcp"
        tty: true
        command: ["bash"]
        links:
            - elasticsearch
    kibana:
        image: kibana
        ports:
            - "5601:5601"
        links:
            - elasticsearch

Logstash: Reading a MySQL database: "You have an error in your SQL syntax"

I have a fake MySQL database that I would like to read as input to Logstash and then send the data to Elasticsearch.

The database IP, password, port, database name, and table are correct, as is the query:

From the database host

$ mysql -h 127.0.0.1 -u root -p -Bse "USE employees; SELECT * FROM employees;"

...
|26859|1955-11-09|Jaana|Fadgyas|M|1989-09-19|
|26860|1959-11-21|Indrajit|Molberg|M|1988-10-14|
|26861|1964-06-17|Candida|Falck|F|1989-10-09|
|26862|1954-08-25|Sadegh|Hebert|F|1989-12-06|
|26863|1954-08-14|Badri|Shinomoto|M|1989-08-29|
|26864|1964-09-04|Yinghua|Strooper|M|1995-06-05|
|26865|1953-01-20|Qiwen|Ventosa|M|1988-11-05|
|26866|1962-02-03|Denis|Gecseg|F|1985-10-18|
|26867|1959-11-26|Qunsheng|Kitsuregawa|F|1988-09-11|
|26868|1956-11-22|Elrique|Khalil|F|1987-02-26|
|26869|1955-04-26|Jasminko|Vecchio|F|1985-10-29|
|26870|1960-12-12|Stepehn|Peot|M|1991-07-23|
...

From logstash

$ logstash -f mylogstash/config-dir/pipelines.conf --config.reload.automatic

12:49:00.100 [Ruby-0-Thread-18: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:283] ERROR logstash.inputs.jdbc - Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ';) AS `t1` LIMIT 1' at line 1: SELECT count(*) AS `count` FROM (SELECT * FROM employees;) AS `t1` LIMIT 1

$ cat pipelines.conf

input {
	jdbc { 
		jdbc_driver_library => "/mylogstash/plugin/mysql-connector-java-5.1.23-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://192.168.0.146:3306/employees"
		jdbc_user => "root"
		jdbc_password => "passwd"
		statement => "SELECT * FROM employees;"
		schedule => "* * * * *"
		jdbc_paging_enabled => "true"
		jdbc_page_size => "50000"
	 }
}
filter {
}
    

output {

    elasticsearch {
        hosts => [ "elasticsearch:9200" ]
        document_type => "default"
        http_compression => true
    }
}

The query looks OK.
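
The query itself is likely fine; a more probable cause, judging from the wrapped query in the error, is the trailing semicolon. With jdbc_paging_enabled the plugin wraps the statement in an outer query of the form SELECT count(*) AS `count` FROM ( <statement> ) AS `t1` LIMIT 1, so a ";" inside the statement lands in the middle of that outer query and breaks it. A sketch of the same input with the semicolon removed:

    input {
        jdbc {
            jdbc_driver_library => "/mylogstash/plugin/mysql-connector-java-5.1.23-bin.jar"
            jdbc_driver_class => "com.mysql.jdbc.Driver"
            jdbc_connection_string => "jdbc:mysql://192.168.0.146:3306/employees"
            jdbc_user => "root"
            jdbc_password => "passwd"
            # No trailing ";" -- the paging wrapper appends its own SQL after the statement
            statement => "SELECT * FROM employees"
            schedule => "* * * * *"
            jdbc_paging_enabled => "true"
            jdbc_page_size => "50000"
        }
    }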

logstash UDP listener died ()

16:08:17.319 [[main]<udp] WARN logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}

Using a very simple config:

input {
    udp {
        port => 5001
        type => "cisco"
    }

}

filter {
}

output {
    stdout {
        codec => rubydebug
    }
}

# logstash -f mylogstash/config-dir/cisco.conf -r

[LogStash::Runner] ERROR logstash.agent - Cannot create pipeline {:reason=>"Expected one of #, => at line 3, column 8 (byte 25) after input {\n\tinput {\n\t jdbc "}

Here is mylogstash/config-dir/pipelines.conf:

input {
	input {
	 jdbc {
	   jdbc_driver_library => "/mylogstash/plugin/mysql-connector-java-5.1.23-bin.jar"
	   jdbc_driver_class => "com.mysql.jdbc.Driver"
	   jdbc_connection_string => "jdbc:mysql://192.168.0.146:3306/employees"
	   jdbc_user => "root"
	   jdbc_password => "passwd"
	   statement => "SELECT * FROM employees;"
	   schedule => "* * * * *"              
	   jdbc_paging_enabled => "true"
	   jdbc_page_size => "50000"
	 }
	}
}
filter {
}
    
output {

    elasticsearch {
        hosts => [ "elasticsearch:9200" ]
        document_type => "default"
        http_compression => true
    }
}
root@c3681470d383:/# logstash -f mylogstash/config-dir/pipelines.conf --config.reload.automatic
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
09:32:34.593 [main] INFO  logstash.modules.scaffold - Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
09:32:34.595 [main] INFO  logstash.modules.scaffold - Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
09:32:34.672 [LogStash::Runner] ERROR logstash.agent - Cannot create pipeline {:reason=>"Expected one of #, => at line 3, column 8 (byte 25) after input {\n\tinput {\n\t jdbc "}
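
The parse error points at the nesting: input appears twice (input { input { jdbc ... }), and the pipeline config only allows plugin blocks directly inside input. A sketch with the inner input removed (and, per the jdbc issue above, the trailing ";" dropped from the statement):

    input {
        jdbc {
            jdbc_driver_library => "/mylogstash/plugin/mysql-connector-java-5.1.23-bin.jar"
            jdbc_driver_class => "com.mysql.jdbc.Driver"
            jdbc_connection_string => "jdbc:mysql://192.168.0.146:3306/employees"
            jdbc_user => "root"
            jdbc_password => "passwd"
            statement => "SELECT * FROM employees"
            schedule => "* * * * *"
            jdbc_paging_enabled => "true"
            jdbc_page_size => "50000"
        }
    }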

Cannot make Logstash read syslog

I cannot make Logstash receive syslog traffic on port 514.
The ELK stack runs successfully and listens on port 514:

# docker-compose ps
          Name                        Command               State                                  Ports                                 
----------------------------------------------------------------------------------------------------------------------------------------
logstash_elasticsearch_1   /docker-entrypoint.sh elas ...   Up      0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp                       
logstash_kibana_1          /docker-entrypoint.sh kibana     Up      0.0.0.0:5601->5601/tcp                                               
logstash_mylogstash_1      /docker-entrypoint.sh bash       Up      0.0.0.0:5000->5000/tcp, 0.0.0.0:514->514/tcp, 0.0.0.0:8080->8080/tcp 

I have tried both the syslog plugin and the tcp/udp plugins listening on port 514:

1- Syslog plugin:

input {
  syslog { }
}

filter {
}

output {
    stdout {
        codec => rubydebug
    }
}

Result from inside the Logstash container ==> Logstash does not react to syslog traffic:

21:03:29.310 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
21:03:29.344 [Ruby-0-Thread-12: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-syslog-3.4.1/lib/logstash/inputs/syslog.rb:109] INFO logstash.inputs.syslog - Starting syslog udp listener {:address=>"0.0.0.0:514"}
21:03:29.357 [Ruby-0-Thread-14: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-syslog-3.4.1/lib/logstash/inputs/syslog.rb:113] INFO logstash.inputs.syslog - Starting syslog tcp listener {:address=>"0.0.0.0:514"}
21:03:29.533 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}

2- TCP/UDP plugin:

input {
    tcp {
        port => 514
        type => syslog
    }
    udp {
        port => 514
        type => syslog
    }
}

filter {
}

output {
    stdout {
        codec => rubydebug
    }
}

Result from inside the Logstash container ==> it looks like something is missing:

16:27:55.840 [[main]<udp] WARN logstash.inputs.udp - UDP listener died {:exception=>#<NameError: uninitialized constant LogStash::Inputs::Udp::IPAddr>, :backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:87:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.3.1/lib/logstash/inputs/udp.rb:57:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:470:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:463:in `start_input'"]}
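
Two observations, offered as a sketch rather than a confirmed diagnosis. First, the UDP listener death is the same uninitialized IPAddr plugin error as in the other UDP issues above, so the udp input on 514 will keep dying until that plugin is fixed or updated, regardless of the config. Second, most syslog senders default to UDP, and the docker-compose ps output for this stack only shows 0.0.0.0:514->514/tcp published, so UDP syslog from outside the container would never arrive; publishing the UDP mapping as well may be needed (service name as in the compose file shown earlier):

    mylogstash:
        ports:
            - "514:514/tcp"
            - "514:514/udp"   # most syslog senders use UDP by default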

Elasticsearch Unreachable: [http://localhost:9200/] Connection refused (Connection refused)

15:53:08.813 [Ruby-0-Thread-5: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:228] WARN logstash.outputs.elasticsearch - Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)"}
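
In this docker-compose setup Elasticsearch runs in its own container, so localhost inside the Logstash container is not where Elasticsearch listens; the pipelines elsewhere in this repo already point at the linked service name instead. A sketch of the same approach for whichever pipeline is still using the default localhost:9200:

    output {
        elasticsearch {
            # Use the compose service/link name rather than localhost
            hosts => [ "elasticsearch:9200" ]
        }
    }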

Cannot send query to test http input plugin

I am running Logstash in a container with docker-compose, and it works fine when reading a file.

Logstash is ready for http input:

[screenshot: selection_002_14_04]

But I cannot send a request; Postman always gets stuck on "Sending".
For testing I remove the sincedb file each time, which is a bit tedious.

I suspect something is wrong with the request:

[screenshot: selection_004_14_04]

The http plugin is in the pipeline file:

root@03451a9958b8:/# cat mylogstash/config-dir/pipelines.conf 
input {
    file {
        path => "/mylogstash/data/apache_access.log"
        start_position => "beginning"
        type => "access"
    }
    http {
        type => "access"

    }
}
filter {
    grok {
        #match => {"message" => "%{IP:ip_address} %{USER:identity} %{USER:auth} \[%{HTTPDATE:req_ts}\] \"%{WORD:http_verb} %{URIPATHPARAM:req_path}\\" %{INT:http_status}"}
    	match => { "message" => '%{HTTPD_COMMONLOG} "%{GREEDYDATA:referrer}" "%{GREEDYDATA:agent}"' }
    }

    mutate {
        convert => {
            "response" => "integer"
            "bytes" => "integer"
        }
    }
    date {
        match => ["timestamp","dd/MMM/yyyy:HH:mm:ss Z" ]
        remove_field => [ "timestamp" ]
    }
}
output {
    stdout {
        codec => rubydebug
    }
    file {
        path => "/mylogstash/data/%{type}_%{+yyyy_MM_dd}.log"
        #path => "access.log"
    }
}
root@03451a9958b8:/# 

Here is the Logstash container IP address: [screenshot: selection_001_14_04]

And it is reachable from the host sending the request:

$ docker inspect grokpattern_mylogstash_1 | grep -i ipaddress
            "SecondaryIPAddresses": null,
            "IPAddress": "",
                    "IPAddress": "172.23.0.2",
ajn@~/github/elk-stuff/udemy-course/grok-pattern$ ping -c 3 172.23.0.2
PING 172.23.0.2 (172.23.0.2) 56(84) bytes of data.
64 bytes from 172.23.0.2: icmp_seq=1 ttl=64 time=0.111 ms
64 bytes from 172.23.0.2: icmp_seq=2 ttl=64 time=0.056 ms
64 bytes from 172.23.0.2: icmp_seq=3 ttl=64 time=0.060 ms

--- 172.23.0.2 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 1998ms
rtt min/avg/max/mdev = 0.056/0.075/0.111/0.026 ms
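
Two things worth making explicit, as a sketch only: the http input in this pipeline does not set a host or port, so the request has to match the plugin's defaults. Pinning them in the config removes the guesswork, and 8080 is already published in the docker-compose file. The curl line below is a hypothetical test against the container IP shown above, with a made-up Apache-style log line as the payload:

    input {
        http {
            host => "0.0.0.0"   # listen on all interfaces inside the container
            port => 8080        # explicit port; 8080 is the port published by docker-compose
            type => "access"
        }
    }
    # Hypothetical test from the docker host:
    # curl -XPOST "http://172.23.0.2:8080" -H "Content-Type: text/plain" \
    #      -d '127.0.0.1 - - [14/Apr/2018:10:00:00 +0000] "GET / HTTP/1.1" 200 123 "-" "curl"'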
