
logstash's Introduction

Hi there 👋

logstash's People

Contributors

gregretkowski, jordansissel


logstash's Issues

Packaging

Randomly building and publishing gems seems suboptimal.

I'd like to, for each component/release, publish gems, debs, and rpms.

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 5:55

  • Merged into: #18

JRuby?

1.8/1.9 are failures of engineering quality.

JRuby might be a nice escape and also give us:

* proper threads (for multicore)
* full access to java (useful if ruby fails us)
* a boatload of libraries (ruby + java)
* great debugging tools (jvm)
* good performance (hotspot)

We'll probably want to decide this before 1.0.

Original issue reported on code.google.com by [email protected] on 14 Feb 2011 at 3:28

Grok parsing results in all fields being arrays

What steps will reproduce the problem?
1. Upgrade jls-grok to 0.4.1 and restart logstash
2. Have an event get parsed by grok
3. Notice logstash dies with error

What is the expected output? What do you see instead?
D, [2011-02-18T04:15:01.396308 #27316] DEBUG -- logstash/filters/grok.rb:78#filter: ["Event now: ", {"@timestamp"=>"2011-02-18T04:15:01.395170Z", "@tags"=>[], "@type"=>"tello-app-log", "@fields"=>{"MINUTE"=>["15"], "verb"=>["GET"], "SECOND"=>["01"], "timestamp"=>["Fri Feb 18 04:15:01 +0000 2011"], "URIPARAM"=>[], "HOUR"=>["04"], "TIME"=>["04:15:01"], "MONTH"=>["Feb"], "DAY"=>["Fri"], "request"=>["/login"], "ZONE"=>["+0000"], "IP"=>[], "MONTHDAY"=>["18"], "URIPATH"=>["/login"], "YEAR"=>["2011"], "xforwardedfor"=>["75.161.43.150"], "HOSTNAME"=>["75.161.43.150"]}, "@message"=>"Started GET \"/login\" for 75.161.43.150 at Fri Feb 18 04:15:01 +0000 2011"}]

vs.
D, [2011-02-18T21:52:34.158169 #20954] DEBUG -- logstash/filters/grok.rb:78#filter: ["Event now: ", {"@timestamp"=>"2011-02-18T21:52:34.157106Z", "@tags"=>[], "@type"=>"tello-app-log", "@fields"=>{"MINUTE"=>[["52"]], "SECOND"=>[["34"]], "DATE_EU"=>[["2011-02-18"]], "loglevel"=>[["INFO"]], "TIME"=>[["21:52:34"]], "HOUR"=>[["21"]], "MONTHNUM"=>[["02"]], "MONTHDAY"=>[["18"]], "YEAR"=>[["2011"]]}, "@message"=>"[2011-02-18 21:52:34] INFO : "}]

With Error:
/usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:93:in `match': can't convert Array into String (TypeError)
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:93:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:86:in `each'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:86:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:77:in `each'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:77:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:68:in `each'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/filters/grep.rb:68:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:114:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:113:in `each'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:113:in `filter'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:129:in `receive'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:62:in `register'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/inputs/file.rb:35:in `call'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/inputs/file.rb:35:in `receive'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/inputs/file.rb:49:in `receive_data'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/inputs/file.rb:48:in `each'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/inputs/file.rb:48:in `receive_data'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/eventmachine-tail-0.5.20101204110840/lib/em/filetail.rb:256:in `read'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/eventmachine-tail-0.5.20101204110840/lib/em/filetail.rb:238:in `schedule_next_read'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `call'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/lib/logstash/agent.rb:95:in `run'
    from /usr/local/rvm/rubies/ree-1.8.7-2010.02/lib/ruby/gems/1.8/gems/logstash-0.2.20110112115018/bin/logstash:86
    from /usr/local/rvm/gems/ree-1.8.7-2010.02/bin/logstash:19:in `load'
    from /usr/local/rvm/gems/ree-1.8.7-2010.02/bin/logstash:19

What version of the product are you using? On what operating system?
Logstash version doesn't matter, but jls-grok-0.4.1 has the problem while jls-grok-0.2.3104 does not. OS is CentOS.

Please provide any additional information below.
- grok:
    tello-app-log:
      patterns:
      - "Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:xforwardedfor} at %{DATESTAMP_RAILS:timestamp}"
      - "INFO :   Processing by %{DATA:controller}#%{DATA:action} as %{NOTSPACE:format}"
      - "INFO : Completed %{NUMBER:response} %{NOTSPACE:response_desc} in %{INT:requesttime}ms"
      - "\[%{DATE_EU} %{TIME}\] %{NOTSPACE:loglevel}\s?: "


Original issue reported on code.google.com by [email protected] on 18 Feb 2011 at 9:56

OOM issue in logstash with tcp input

What steps will reproduce the problem?
1. Create a logstash server with four TCP inputs and a MongoDB output.
2. Run it for a while.
3. Observe OOM errors on the system.

I, [2011-01-13T21:13:14.210438 #13633]  INFO -- logstash: Starting tcp listener for tcp://0.0.0.0:5565/
I, [2011-01-13T21:13:14.211069 #13633]  INFO -- logstash: Starting tcp listener for tcp://0.0.0.0:5578/
I, [2011-01-13T21:13:14.211703 #13633]  INFO -- logstash: Starting tcp listener for tcp://0.0.0.0:5568/
I, [2011-01-13T21:13:14.212318 #13633]  INFO -- logstash: Starting tcp listener for tcp://0.0.0.0:5569/
tcmalloc: large alloc 1499643904 bytes == (nil) @
tcmalloc: large alloc 2699038720 bytes == (nil) @
tcmalloc: large alloc 4857946112 bytes == (nil) @
tcmalloc: large alloc 8743981056 bytes == (nil) @
tcmalloc: large alloc 15738843136 bytes == (nil) @
tcmalloc: large alloc 28329598976 bytes == (nil) @
tcmalloc: large alloc 50992955392 bytes == (nil) @
What is the expected output? What do you see instead?


What version of the product are you using? On what operating system?
CentOS 5.4 x86_64.

Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 2:40

Grok filter would benefit from type hinting in captures


Numerical values would benefit from some kind of type hinting. Hints like 'int' or 'float' would enable better range queries in elasticsearch and allow other tools in the logstash pipeline to handle numeric values properly.

On IRC we discussed options, and it was proposed that we abuse the grok pattern naming syntax %{PATTERN:name} and use %{PATTERN:name:type}, where 'type' is 'int' or 'float'.
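For example, under the proposed syntax a pattern could mark its captures as numeric inline (a hypothetical fragment; the field names are illustrative):

%{NUMBER:response:int} (?:%{NUMBER:bytes:int}|-) %{NUMBER:duration:float}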

Original issue reported on code.google.com by [email protected] on 24 Feb 2011 at 6:45

grep filter should allow dynamic fields


{{{
add_field:
  foo: bar
}}}

The above is useful, but it would be more useful if we could inject data from an existing field (such as the hostname, or any other value), as sketched below.
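A hypothetical sketch of what that could look like, reusing add_field with a %{fieldname} substitution (the substitution syntax here is an assumption borrowed from grok, not an existing feature):

{{{
add_field:
  source: "%{@source_host}"
}}}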

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 4:49

Ruby 'DateTime#sec_fraction' is inconsistent across versions.


In 1.8.7, DateTime#sec_fraction is fractions of a day. In 1.9.2, sec_fraction 
is fractions of a second.

Code:
  require "date"
  p [RUBY_PLATFORM, RUBY_VERSION, DateTime.parse("2001-01-01T00:00:00.1234").sec_fraction]

Output:
["x86_64-linux", "1.9.2", (617/5000)]
["java", "1.8.7", Rational(617, 432000000)]
["x86_64-linux", "1.8.7", Rational(617, 432000000)]

1.8.6 fails completely to parse this time format.

I may need to reimplement some/all of the time parsing we do so it becomes 
reliable across ruby versions.
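A minimal normalization sketch, assuming all we need is a consistent fraction-of-a-second value (the 86400 multiplier converts 1.8's fraction-of-a-day; illustrative only):

  require "date"
  frac = DateTime.parse("2001-01-01T00:00:00.1234").sec_fraction
  frac *= 86_400 if RUBY_VERSION < "1.9"  # 1.8.x reports fractions of a day
  p frac  # => 617/5000 of a second on both 1.8.7 and 1.9.2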

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 9:15

agents continually disconnect from rabbitmq

I'm using the amqp output transport for logstash, and I'm seeing my agents 
disconnecting from RabbitMQ with a "connection_closed_abruptly" log message
over and over.

=WARNING REPORT==== 6-Feb-2011::22:49:58 ===
exception on TCP connection <0.3910.0> from 129.21.49.17:54628
connection_closed_abruptly.

=INFO REPORT==== 6-Feb-2011::22:49:58 ===
closing TCP connection <0.3910.0> from 129.21.49.17:54628

=INFO REPORT==== 6-Feb-2011::22:49:58 ===
accepted TCP connection on 0.0.0.0:5672 from 129.21.49.17:57163

Original issue reported on code.google.com by [email protected] on 11 Feb 2011 at 9:29

Standardize logstash-web internal search API


Currently, search is provided by web/lib/elasticsearch, and a single call, search(), returns both results and facets.

In order to support other search backends more easily, I want a better API, with at least separate result and facet calls.

Original issue reported on code.google.com by [email protected] on 9 Feb 2011 at 7:40

Current gem (logstash-0.2.20110331121235) can't parse URI query strings

What steps will reproduce the problem?
1. gem install logstash
2. Create a logstash.yaml config file with a URI with a query string, e.g. 
amqp://localhost/direct/nlog?durable_exchange=y&durable_queue=y
3. run logstash -f logstash.yaml

What is the expected output? What do you see instead?

Expected: running logstash.
Instead: 
/var/lib/gems/1.9.1/gems/logstash-0.2.20110331121235/lib/logstash/inputs/base.rb:21:in `initialize': uninitialized constant LogStash::Inputs::Base::CGI (NameError)
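The NameError suggests CGI is referenced without ever being loaded. A minimal sketch of a likely fix, assuming inputs/base.rb parses the URI query string with CGI.parse:

  require "cgi"  # CGI is not autoloaded; without this, any CGI.* call raises NameError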

What version of the product are you using? On what operating system?

logstash-0.2.20110331121235 on debian squeeze, under ruby1.9.1 (which is really 
1.9.2 -- thanks, debian!).

Please provide any additional information below.

Looks like the config system in git HEAD is completely different, and had been for some time before the 20110331 timestamp at which the gem appears to have been generated. How does code make its way from git into the gem?


Original issue reported on code.google.com by [email protected] on 1 Apr 2011 at 12:54

Logstash-web doesn't work with Rack 1.2.1

I'm using Ruby 1.8.6p399 on CentOS 5. Installing logstash brings in Rack 1.2.1. However, when launching logstash-web I get the following error:

$ logstash-web
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:27: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:39: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:43: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:54: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:99: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/rack-1.2.1/lib/rack/utils.rb:138:in `union': can't convert Array into String (TypeError)
        from /usr/lib/ruby/gems/1.8/gems/rack-1.2.1/lib/rack/utils.rb:138
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
        from /usr/lib/ruby/gems/1.8/gems/rack-1.2.1/lib/rack/request.rb:1
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
        from /usr/lib/ruby/gems/1.8/gems/rack-1.2.1/lib/rack/showexceptions.rb:3
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
         ... 12 levels...
        from /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
        from /usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/bin/logstash-web:6
        from /usr/bin/logstash-web:19:in `load'
        from /usr/bin/logstash-web:19

I downgraded to 1.1.0, i.e.:

gem install rack --version 1.1.0

Now it starts fine:

$ logstash-web
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:27: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:39: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:43: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:54: warning: parenthesize argument(s) for future version
/usr/lib/ruby/gems/1.8/gems/logstash-0.2.20101208111718/lib/logstash/web/server.rb:99: warning: parenthesize argument(s) for future version
>> Thin web server (v1.2.7 codename No Hup)
>> Maximum connections set to 1024
>> Listening on 0.0.0.0:9292, CTRL+C to stop

Original issue reported on code.google.com by [email protected] on 9 Dec 2010 at 3:24

Consistent terminology

Review the code and documentation, remove synonyms, and try to ensure consistent use of terms.

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 6:06

Clear text password in logfile

When running logstash and using a stomp connector as input/output, the full URL 
including the clear text password is logged.

Would it be possible to turn off this verbosity?

Thanks,
M

Original issue reported on code.google.com by [email protected] on 12 Jan 2011 at 11:17

eventmachine-tail: Does not pick up cifs mounted files

What steps will reproduce the problem?
1. mount a remote directory with cifs
2. input a remote file under that mounted directory in logstash.yaml
3. restart logstash

What is the expected output? What do you see instead?
I expect to see the remote log file tailed; instead, it never is.

What version of the product are you using? On what operating system?
debian lenny 32bit, logstash-0.2.20101222161645

Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 12 Jan 2011 at 9:21

Memory usage in jruby under high load

Config:

input {
  stdin { 
    type => "foo"
  }
}

output {
  stdout { }
} 


---

After shipping in a few thousand events quickly, memory usage climbs to 400-500 MB, at which point java/jruby start doing aggressive GC, and eventually the process exits with code 1 and no error message.

jmap -histo output:

 num     #instances         #bytes  class name
----------------------------------------------
   1:       1498246       83901776  org.jruby.RubyString
   2:        512791       77903760  [Lorg.jruby.runtime.builtin.IRubyObject;
   3:       1144161       73226304  org.jruby.RubyHash$RubyHashEntry
   4:        962203       53883368  org.jruby.util.ByteList
   5:        969937       45393160  [B
   6:        513009       32832576  org.jruby.RubyArray
   7:        180358       20201024  [Lorg.jruby.RubyHash$RubyHashEntry;
   8:        180356       20199872  org.jruby.RubyHash
   9:         33657        5239592  <constMethodKlass>
  10:        100897        4956992  [Ljava.lang.Object;
  11:        180388        4329312  java.util.concurrent.atomic.AtomicInteger
  12:        180358        4328592  org.jruby.RubyHash$33
  13:        180358        4328592  org.jruby.RubyHash$34

Original issue reported on code.google.com by [email protected] on 22 Feb 2011 at 4:20

logstash-web: Simple ACLs would be nice

Simple IP/CIDR ACLs would be nice for the simple logstash-web frontend.

Complex ACLs, AAA, and other extensive access control features are out of scope for now. Those are likely best served by putting a real webserver in front to do the filtering (nginx, apache, etc.).


Original issue reported on code.google.com by [email protected] on 26 Jan 2011 at 7:36

Need key-value parser


Some logs are kind enough to log with 'foo=bar' or 'foo:bar' patterns. We 
should make parsing this simple.
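A minimal sketch of such a parser in Ruby (the key pattern and separators are assumptions; quoted values would need more care):

  def parse_kv(message)
    fields = {}
    # capture key=value and key:value pairs; keys are word-like, values run to whitespace
    message.scan(/([A-Za-z0-9_-]+)[=:](\S+)/) { |key, value| fields[key] = value }
    fields
  end

  parse_kv("foo=bar baz:quux")  # => {"foo"=>"bar", "baz"=>"quux"}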

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 4:46

Crash in grok (jruby)

Reproduced this crash. In master, the script test/jruby/blah.sh should trigger 
it.

Here's the relevant backtrace:

#12 0x00007f9dda45c85c in tctreesplay (tree=0x7f9dd45b1128, kbuf=0x7f9dd45b1128, ksiz=4) at tokyocabinet_all.c:3769
#13 0x00007f9dda45c97a in tctreeiternext (tree=0x7f9dd45b1128, sp=0x7f9dd9d0650c) at tokyocabinet_all.c:3349
#14 0x00007f9ddab27dd7 in grok_capture_walk_next () from /usr/lib/libgrok.so
#15 0x00007f9ddab2a4c9 in grok_match_walk_next () from /usr/lib/libgrok.so
#16 0x00007f9ddad633e0 in ffi_call_unix64 () from /home/jls/.rvm/rubies/jruby-1.5.3/lib/native/x86_64-Linux/libjffi-1.0.so
#17 0x00007f9ddad62f4f in ffi_call () from /home/jls/.rvm/rubies/jruby-1.5.3/lib/native/x86_64-Linux/libjffi-1.0.so

Original issue reported on code.google.com by [email protected] on 22 Feb 2011 at 3:49

grok not linking to tokyocabinet

Hello,

I've been running logstash in a debian environment for a while and have been 
loving it.

I've since been forced to move it to a CentOS box, which has resulted in a few minor complications, most of which I've been able to sort out.

When running grok I run into a problem with the tokyocabinet lib.

When I run ldd libgrok.so I get:

ldd libgrok.so 
    libdl.so.2 => /lib64/libdl.so.2 (0x00002afd76bef000)
    libpcre.so.0 => /lib64/libpcre.so.0 (0x00002afd76df3000)
    libevent-1.4.so.2 => /usr/lib64/libevent-1.4.so.2 (0x00002afd77012000)
    libtokyocabinet.so.8 => not found
    libc.so.6 => /lib64/libc.so.6 (0x00002afd7722d000)
    /lib64/ld-linux-x86-64.so.2 (0x000000383b000000)
    libnsl.so.1 => /lib64/libnsl.so.1 (0x00002afd77584000)
    librt.so.1 => /lib64/librt.so.1 (0x00002afd7779d000)
    libresolv.so.2 => /lib64/libresolv.so.2 (0x00002afd779a6000)
    libpthread.so.0 => /lib64/libpthread.so.0 (0x00002afd77bbb000)

So obviously grok can't find libtokyocabinet.so.8, but how do I fix this? I have the lib installed and I know where it's located, but when I try to recompile grok it can never find it.
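One likely fix, assuming the library was installed outside the default linker path (/usr/local/lib is a guess): tell the runtime linker about that directory and rebuild its cache:

echo "/usr/local/lib" > /etc/ld.so.conf.d/tokyocabinet.conf
ldconfig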


Original issue reported on code.google.com by [email protected] on 14 Apr 2011 at 6:33

Logstash doesn't support saved queries and batch run reports.

What steps will reproduce the problem?

N/A.

What is the expected output? What do you see instead?

Many commercial log analysis tools (e.g. splunk, logrhythm, loglogic) offer a 
reporting front end and the ability to save queries or reports for either later 
running or regularly scheduled events.

What version of the product are you using? On what operating system?
I'm not using the product, just had a chat with Jordan on HN.

Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 16 Nov 2010 at 11:25

/usr/lib/libgrok.so: undefined symbol: tccmpint32 (LoadError)

Starting logstash fails; the server does not start, giving the following error:


I, [2011-03-28T17:33:01.273675 #20283]  INFO -- logstash: Registering file://<server>/var/log/messages
/usr/lib/ruby/gems/1.9.1/gems/ffi-0.6.3/lib/ffi/library.rb:61:in `block in ffi_lib': Could not open library 'libgrok': libgrok: cannot open shared object file: No such file or directory. Could not open library 'libgrok.so': /usr/lib/libgrok.so: undefined symbol: tccmpint32 (LoadError)
        from /usr/lib/ruby/gems/1.9.1/gems/ffi-0.6.3/lib/ffi/library.rb:43:in `map'
        from /usr/lib/ruby/gems/1.9.1/gems/ffi-0.6.3/lib/ffi/library.rb:43:in `ffi_lib'
        from /usr/lib/ruby/gems/1.9.1/gems/jls-grok-0.4.6/lib/grok.rb:7:in `<module:CGrok>'
        from /usr/lib/ruby/gems/1.9.1/gems/jls-grok-0.4.6/lib/grok.rb:5:in `<class:Grok>'
        from /usr/lib/ruby/gems/1.9.1/gems/jls-grok-0.4.6/lib/grok.rb:4:in `<top (required)>'
        from <internal:lib/rubygems/custom_require>:29:in `require'
        from <internal:lib/rubygems/custom_require>:29:in `require'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/filters/grok.rb:5:in `<top (required)>'
        from <internal:lib/rubygems/custom_require>:29:in `require'
        from <internal:lib/rubygems/custom_require>:29:in `require'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/filters.rb:11:in `from_name'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/agent.rb:74:in `block in register'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/agent.rb:71:in `each'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/agent.rb:71:in `register'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/agent.rb:96:in `block in run'
        from /usr/lib/ruby/gems/1.9.1/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `call'
        from /usr/lib/ruby/gems/1.9.1/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
        from /usr/lib/ruby/gems/1.9.1/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/lib/logstash/agent.rb:95:in `run'
        from /usr/lib/ruby/gems/1.9.1/gems/logstash-0.2.20110206003556/bin/logstash:86:in `<top (required)>'
        from /usr/bin/logstash:19:in `load'
        from /usr/bin/logstash:19:in `<main>'


This is using grok-1.20110308.1 on a 64-bit system. Grok compiles and installs fine, so I am not sure whether the error is with grok or logstash.

Original issue reported on code.google.com by [email protected] on 28 Mar 2011 at 4:35

Allow dynamic outputs (for sharding/partitioning)

ElasticSearch and other backends might benefit from sharding, that is, being able to write to

elasticsearch://localhost:9200/logstash-YYYY-mm-dd/mylogs

In the above example, this would effectively partition your logs by day.

Further, each log type should probably get its own type in elasticsearch rather 
than being stored in the same /logstash/all index+type.

Some format string would be required here. Something similar to grok's syntax 
like:

elasticsearch://localhost:9200/logs-%{date}/%{type}

We'd have to figure out how to let folks specify the date format, but %{type} would become the LogStash::Event#type ('@type' in the event json).

This would be beneficial to other possible outputs like mongodb, mysql, hdfs, 
local files, etc.
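A minimal sketch of the substitution itself, assuming event fields are exposed as a hash (the %{...} syntax follows the proposal above):

  def interpolate(template, event)
    # replace each %{name} with the corresponding event field
    template.gsub(/%\{(\w+)\}/) { event[$1].to_s }
  end

  event = { "type" => "apache-access", "date" => "2011-02-15" }
  interpolate("logs-%{date}/%{type}", event)  # => "logs-2011-02-15/apache-access"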

Original issue reported on code.google.com by [email protected] on 15 Feb 2011 at 8:25

Corrupt data coming out of logstash grok filter

I'm using the following configuration:

inputs:
  all:
  - amqp://REDACTED/logstash/fanout/raw_logs
filters:
- date:
    syslog:
      timestamp: "%b %e %H:%M:%S"
      timestamp8601: ISO8601
    apache-access:
      timestamp: "%d/%b/%Y:%H:%M:%S %Z"
    apache-error:
      timestamp: "%a %b %d %H:%M:%S %Y"
    fanforce-request:
      timestamp: "%Y-%m-%dT%H:%M:%S.%f"
      timestamp8601: ISO8601
- grok:
    syslog:
      patterns:
      - %{SYSLOGLINE}
    apache-error:
      patterns:
      - %{APACHE_ERROR_LOG}
    apache-combined:
      patterns:
      - %{COMBINEDAPACHELOG_TIPPR}
    myapp-request:
      patterns:
      - %{MYAPP_STATS_EVENT}
      - %{MYAPP_LOG_GENERAL}
outputs:
- elasticsearch://REDACTED/logstash/events_river?method=river&type=rabbitmq&host=REDACTED&user=logstash&pass=REDACTED&vhost=logstash&queue=elasticsearch&exchange=parsed_logs&exchange_type=fanout&durable=true

...and the following nondefault grok patterns:


URIPATH_LOCAL (?:/[A-Za-z0-9$.+!*'(),~:#%_=-]*)+
URIPARAM_LOCAL \?[?A-Za-z0-9$.+!*'(),~#%&/=:;_-]*
URIPATHPARAM_LOCAL %{URIPATH_LOCAL}(?:%{URIPARAM_LOCAL})?
URI_LOCAL %{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM_LOCAL})?

APACHE_LOG_LEVEL (?:emerg|alert|crit|error|warn|notice|info|debug)
APACHE_ERROR_LOG \[%{DATESTAMP_OTHER:timestamp}\] \[%{APACHE_LOG_LEVEL:level}\] %{GREEDYDATA:message}
COMBINEDAPACHELOG_LOCAL %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{URIPATHPARAM_LOCAL:request} HTTP/%{NUMBER:httpversion:float}" %{NUMBER:response:int} (?:%{NUMBER:bytes:int}|-) "(?:%{URI_LOCAL:referrer}|-?)" %{QS:agent}(?: <%{NUMBER:bytes_in:int} >%{NUMBER:bytes_out:int} %{NUMBER:response_time:int}ms "%{HOSTNAME:domain}"(?: %{NUMBER:seconds_time:int})?)?(?: FF:"(?:%{IPORHOST:forwarded_for}|-)")?

MYAPP_LEVELNAME (?:DEBUG|INFO|WARNING|ERROR|CRITICAL)
MYAPP_TIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY}T%{HOUR}:%{MINUTE}:%{SECOND}[.][0-9]+
MYAPP_MODULE fanforce[.a-z]+
MYAPP_EMAIL [^ ]+@[^ ]+

MYAPP_REQUEST %{MYAPP_TIMESTAMP} %{MYAPP_MODULE:module} %{MYAPP_LEVELNAME:level} (?:%{IPORHOST:source_address}|-) (?:%{HOSTNAME:domain}|-) %{URIPATH:path} (?:%{MYAPP_EMAIL:email}|anonymous) %{GREEDYDATA:message}

MYAPP_NAME [a-zA-Z0-9_-]+
MYAPP_LOG_HEADER %{MYAPP_TIMESTAMP} %{MYAPP_MODULE:module} %{MYAPP_LEVELNAME:level} (?:%{IPORHOST:source_address}|-) (?:%{HOSTNAME:domain}|-) %{URIPATH:path} (?:%{MYAPP_EMAIL:email}|anonymous)
MYAPP_LOG_GENERAL %{MYAPP_LOG_HEADER} %{GREEDYDATA:message}
MYAPP_STATS_EVENT %{MYAPP_LOG_HEADER} stats-event %{MYAPP_NAME:stats_event_type}(?: \[amount:%{NUMBER:amount:float}\])?(?: \[channel:(?:%{MYAPP_NAME:channel})\])?(?: \[geography:(?:%{MYAPP_NAME:geography})?\])?(?: \[offer:%{MYAPP_NAME:offer}\])?(?: \[publisher:%{MYAPP_NAME:publisher}\])?


A substantial subset of my apache-combined logs leave the grok filter with data 
which not only fails to match the pattern in question, but which is not even 
valid UTF-8. An example of such an invalid message:

{"@source_host":"www8.corp.myapp.com","@source_path":"/var/log/myapp/request.log
","@tags":[],"@timestamp":"2011-03-03T19:41:07.860691Z","@type":"myapp-request",
"@source":"file://www8.corp.myapp.com/var/log/myapp/request.log","@message":"201
1-03-03T19:41:07.860103 myapp.core.publisher.views INFO 128.83.44.159 
redactedpartnerdomain.com /a/channel/subscribe/ anonymous attempting to 
subscribe [email protected] to 
redactedpartner-austin","@fields":{"message":["attempting to subscribe 
[email protected] to 
redactedpartner-austin"],"HOSTNAME":["128.83.44.159"],"module":["myapp.core.publ
isher.views"],"SECOND":["07"],"domain":["redactedpartnerdomain.com"],"level":["I
NFO"],"HOUR":["19"],"FANFORCE_TIMESTAMP":["\xc0\xbdJ\\npAn\xb703T19:41:07.860103
"],"FANFORCE_LOG_HEADER":["2011-03-03T19:41:07.860103 
myapp.core.publisher.views INFO 128.83.44.159 redactedpartnerdomain.com 
/a/channel/subscribe/ 
anonymous"],"source_address":["128.83.44.159"],"MONTHDAY":["03"],"MONTHNUM":["An
"],"IP":[],"YEAR":["\xc0\xbdJ\\n"],"path":["/a/channel/subscribe/"],"MINUTE":["4
1"],"email":[]}}

Clearly, many of the fields do not in any way match the message.

Please contact me directly if unredacted data is necessary for debugging.

Original issue reported on code.google.com by [email protected] on 4 Mar 2011 at 1:43

mqrpc: duplicate responses




When running the 'normal' throughput test with 3 servers and 1 client, the
below occurs:

W, [2009-11-22T00:32:18.052361 #3119/#<Thread:0x301f4e8>]  WARN -- : Tracking HelloRequest#767 to hello
W, [2009-11-22T00:32:18.441956 #3119/receiver]  WARN -- : Got response to 767
W, [2009-11-22T00:32:18.456098 #3119/receiver]  WARN -- : Got response to 767/HelloResponse but wasn't waiting on it?

Seems like our queue isn't a proper queue, or messages are being requeued
unexpectedly and two servers end up getting the request.

Original issue reported on code.google.com by [email protected] on 22 Nov 2009 at 8:35

Cant us [ or ] in a query

What steps will reproduce the problem?
1. Run logstash-web
2. Go to the URL 
3. Type [foo in the "query" field and hit submit

What is the expected output? What do you see instead?
Everything matching [foo; instead, the server crashes with:

/var/lib/gems/1.8/gems/logstash-0.1.523/lib/logstash/web/lib/elasticsearch.rb:50:in `search': undefined method `[]' for nil:NilClass (NoMethodError)
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/em/deferrable.rb:134:in `call'
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/em/deferrable.rb:134:in `set_deferred_status'
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/em/deferrable.rb:173:in `succeed'
        from /var/lib/gems/1.8/gems/em-http-request-0.2.14/lib/em-http/client.rb:499:in `unbind'
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:1417:in `event_callback'
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
        from /var/lib/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
        from /var/lib/gems/1.8/gems/thin-1.2.7/lib/thin/backends/base.rb:57:in `start'
        from /var/lib/gems/1.8/gems/thin-1.2.7/lib/thin/server.rb:156:in `start'
        from /var/lib/gems/1.8/gems/rack-1.2.1/lib/rack/handler/thin.rb:14:in `run'
        from /var/lib/gems/1.8/gems/logstash-0.1.523/lib/logstash/web/server.rb:86
        from /usr/lib/ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /usr/lib/ruby/1.8/rubygems/custom_require.rb:31:in `require'
        from /var/lib/gems/1.8/gems/logstash-0.1.523/bin/logstash-web:6
        from /var/lib/gems/1.8/bin/logstash-web:19:in `load'
        from /var/lib/gems/1.8/bin/logstash-web:19



What version of the product are you using? On what operating system?
logstash-0.1.532 (installed from gem)

Please provide any additional information below.
No grok installed, no filters, just using the basic tutorial from the site. 

Original issue reported on code.google.com by [email protected] on 17 Nov 2010 at 8:57

Clean out the wiki

We have lots of old documentation and notes about the original design of logstash, which no longer exists. We should clean these up.

Original issue reported on code.google.com by [email protected] on 10 Feb 2011 at 7:29

jls-grok's libffi dependency requires a version unsupported on Ruby 1.8.x

libffi stopped supporting Ruby 1.8 as of the 1.0 release. Consequently, other projects which use libffi (e.g. Vagrant) and support Ruby 1.8.x have locked themselves to version 0.6.3.

This manifests itself as follows:

/usr/lib/ruby/gems/1.8/gems/ffi-1.0.5/lib/ffi/memorypointer.rb:27:in `from_string': undefined method `bytesize' for #<String:0x2b7aeb9ee6c0> (NoMethodError)
        from /usr/lib/ruby/gems/1.8/gems/jls-grok-0.3.3209/lib/grok.rb:62:in `add_patterns_from_file'
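A common workaround, following what the issue describes other projects doing, is to pin the older gem until 1.8 support is sorted out:

gem install ffi --version 0.6.3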

Original issue reported on code.google.com by [email protected] on 16 Feb 2011 at 4:29

ElasticSearch logs an error indicating no mapping found for @timestamp

I get this error when using the river to index and performing any search through logstash-web. One example:
@timestamp:[2011-01-27 TO 2011-01-29] 
loggerclass:"org.apache.hadoop.hdfs.server.datanode.DataNode"


[2011-01-27 18:11:31,353][DEBUG][action.search.type       ] [Mahkizmo] [_river][0], node[YKdyz3QiSdyjZb93oZbl7A], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@2ff530fc]
org.elasticsearch.search.SearchParseException: [_river][0]: from[0],size[50]: Parse Failure [Failed to parse source [{"size":50,"from":0,"sort":[{"@timestamp":"desc"}],"facets":{"by_hour":{"histogram":{"time_interval":"1h","field":"@timestamp"}}},"query":{"query_string":{"default_operator":"AND","query":"timestamp:[2011-01-27 TO 2011-01-29] loggerclass:\"org.apache.hadoop.hdfs.server.datanode.DataNode\""}}}]]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:420)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:335)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:169)
        at org.elasticsearch.search.action.SearchServiceTransportAction.sendExecuteQuery(SearchServiceTransportAction.java:131)
        at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction$AsyncAction.sendExecuteFirstPhase(TransportSearchQueryThenFetchAction.java:76)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.performFirstPhase(TransportSearchTypeAction.java:193)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.access$000(TransportSearchTypeAction.java:77)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$1.run(TransportSearchTypeAction.java:152)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:619)
Caused by: org.elasticsearch.search.SearchParseException: [_river][0]: from[0],size[50]: Parse Failure [No mapping found for [@timestamp]]
        at org.elasticsearch.search.sort.SortParseElement.addSortField(SortParseElement.java:139)
        at org.elasticsearch.search.sort.SortParseElement.addCompoundSortField(SortParseElement.java:96)
        at org.elasticsearch.search.sort.SortParseElement.parse(SortParseElement.java:68)
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:407)
        ... 10 more

Original issue reported on code.google.com by deinspanjer on 28 Jan 2011 at 2:15

Duplicate metadata when drilling into logs on logstash-web

(metadata) @source_host:carrera.databits.net,
(metadata) @source_host:carrera.databits.net,
(metadata) @source_path:/home/jls/logs/access_log,
(metadata) @source_path:/home/jls/logs/access_log,
(metadata) @tags:
(metadata) @tags:
(metadata) @timestamp:2011-01-12T15:30:30.000000-0800,
(metadata) @timestamp:2011-01-12T15:30:30.000000-0800,
(metadata) @type:apache-access,
(metadata) @type:apache-access,

Original issue reported on code.google.com by [email protected] on 13 Feb 2011 at 11:36

Add 'exec' input format



Should be similar to grok's "exec" feature:

{{{
exec "command" {
  restart-on-exit: true/false
  minimum-restart-delay: seconds
  run-interval: seconds
  read-stderr: true/false
}
}}}

This would allow folks to easily hook in any program output.
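For example, polling a command on an interval might look like this (a hypothetical config following the proposed syntax above):

{{{
exec "vmstat 1" {
  restart-on-exit: true
  read-stderr: false
}
}}}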

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 4:45

Finalize logstash messaging protocol and version it

The LogStash::Event stuff needs to be formalized and documented with protocol 
versioning so other systems can expect to speak it reliably.

We could try to align with GELF, but it seems too fixated on files and syslog. 
It is also a structure and serialization format together, not just structure.

Original issue reported on code.google.com by [email protected] on 18 Jan 2011 at 6:05

websocket blank page

What steps will reproduce the problem?
1. Start logstash with websocket output enabled
2. Browse to destination:3232 using Google Chrome
3. A blank page appears; nothing displays

What is the expected output? What do you see instead?
I should see real-time events, but I just get a blank page.

What version of the product are you using? On what operating system?
SLES 10SP2

*** LOCAL GEMS ***

actionmailer (2.3.5)
actionpack (2.3.5)
activerecord (2.3.5)
activeresource (2.3.5)
activesupport (2.3.5)
addressable (2.2.2)
amqp (0.7.0.pre)
async_sinatra (0.4.0)
daemons (1.0.10)
em-http-request (0.2.15)
em-websocket (0.2.0)
eventmachine (0.12.10)
eventmachine-tail (0.5.20101204110840)
fastthread (1.0.7)
gem_plugin (0.2.3)
haml (3.0.25)
jls-grok (0.2.3104)
json (1.4.6)
logstash (0.2.20101222161645)
mongrel (1.1.5)
rack (1.2.1, 1.1.0)
rails (2.3.5)
rake (0.8.7)
sinatra (1.1.2)
thin (1.2.7)
tilt (1.2.1)
uuidtools (2.1.1)

Please provide any additional information below.

logstash.yaml:

inputs:
  syslog: 
  - /var/log/messages
  - /var/log/syslog
  - /var/log/*.log
  apache-access:
  - /var/log/apache2/access.log
  apache-error:
  - /var/log/apache2/error.log

outputs:
 - websocket:///

I run logstash as follows:

root@host:~ # logstash -f logstash.yaml 
I, [2011-01-06T15:47:31.272393 #6142]  INFO -- logstash: Registering file://host/var/log/messages
I, [2011-01-06T15:47:31.273292 #6142]  INFO -- logstash: Registering file://host/var/log/syslog
I, [2011-01-06T15:47:31.274007 #6142]  INFO -- logstash: Registering file://host/var/log/*.log
I, [2011-01-06T15:47:31.274718 #6142]  INFO -- logstash: Registering file://host/var/log/apache2/access.log
I, [2011-01-06T15:47:31.275589 #6142]  INFO -- logstash: Registering file://host/var/log/apache2/error.log
I, [2011-01-06T15:47:31.359762 #6142]  INFO -- logstash: Registering websocket on websocket://0.0.0.0:3232/

When I browse the destination host on port 3232, I get a blank page.

Original issue reported on code.google.com by [email protected] on 6 Jan 2011 at 2:48

Scheduled queries

It would be great to be able to save queries to be run at a later time and 
generate an email/alert given some parameters or thresholds.

Original issue reported on code.google.com by [email protected] on 16 Nov 2010 at 10:57

  • Merged into: #3

Negation via grok input filters?

I want to ignore events that match a specific grok pattern. Apparently, only 
the 'grep' filter supports negation at the moment.

Original issue reported on code.google.com by [email protected] on 26 Jan 2011 at 7:39

Special chars fail to get transformed into UTF-8

- What steps will reproduce the problem?
1. Get logstash to read a syslog message with 'special chars' like è é “ ” (not sure which ones do or do not trigger the bug; a full check should be done)

- What is the expected output? What do you see instead?
ISO -> UTF-8 conversion should work, or at least not make logstash stop working (maybe send the message in ISO if conversion fails, or just drop the message and emit an error), as sketched below.
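A minimal sketch of the conversion on Ruby 1.8, assuming the input is ISO-8859-1 ('//IGNORE' makes Iconv drop unconvertible bytes instead of raising):

  require "iconv"
  utf8 = Iconv.conv("UTF-8//IGNORE", "ISO-8859-1", message)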


- What version of the product are you using? On what operating system?
logstash from "gem install logstash"

logstash -v
logstash: version unknown

Output of the error can be seen here: http://pastebin.com/WuXFTFuX

Original issue reported on code.google.com by [email protected] on 23 Dec 2010 at 11:43
