
logstasher's Introduction

Logstasher

Awesome Logging for Rails!!

This gem is heavily inspired by lograge, but it focuses on one thing and one thing only: making your logs awesome, like this:

Awesome Logs

How is it done?

By using these awesome tools:

  • Logstash - Store and index your logs
  • Kibana - for awesome visualization (optional; you can use any other visualizer)

Update: Logstash now includes Kibana built in, so there is no need to install it separately. Logstasher has been tested with Logstash using the file input and json codec.

See quickstart for quickly setting up logstash

About logstasher

This gem purely focuses on generating logstash-compatible logs, i.e. the logstash JSON event format, without any overhead. In fact, logstasher logs to a separate log file named logstash_<environment>.log. The reasons for this separation:

  • To have a pure JSON log file
  • To prevent any logger messages (e.g. info) from getting into our pure JSON logs

Before logstasher:

Started GET "/login" for 10.109.10.135 at 2013-04-30 08:59:01 -0400
Processing by SessionsController#new as HTML
[ActiveJob] [TestJob] [61d6e87a-875d-4255-9424-cab7d5ff208c] Performing TestJob from Test(default) with arguments: 0, 1
  Rendered sessions/new.html.haml within layouts/application (4.3ms)
  Rendered shared/_javascript.html.haml (0.6ms)
  Rendered shared/_flashes.html.haml (0.2ms)
  Rendered shared/_header.html.haml (52.9ms)
  Rendered shared/_title.html.haml (0.2ms)
  Rendered shared/_footer.html.haml (0.2ms)
Completed 200 OK in 532ms (Views: 62.4ms | ActiveRecord: 0.0ms | ND API: 0.0ms)

After logstasher:

{"job_id":"61d6e87a-875d-4255-9424-cab7d5ff208c","queue_name":"Test(default)","job_class":"ExampleJob","job_args":[1,0],
"exception":["ZeroDivisionError","divided by 0"],"duration":3.07,"request_id":"61d6e87a-875d-4255-9424-cab7d5ff208c",
"source":"unknown","tags":["job","perform","exception"],"@timestamp":"2016-03-29T16:14:32.837Z","@version":"1"}

{"@source":"unknown","@tags":["request"],"@fields":{"method":"GET","path":"/","format":"html","controller":"file_servers"
,"action":"index","status":200,"duration":28.34,"view":25.96,"db":0.88,"ip":"127.0.0.1","route":"file_servers#index",
"parameters":"","ndapi_time":null,"uuid":"e81ecd178ed3b591099f4d489760dfb6","user":"[email protected]",
"site":"internal"},"@timestamp":"2013-04-30T13:00:46.354500+00:00"}

By default, the older format rails request logs are disabled, though you can enable them.

Installation

In your Gemfile:

gem 'logstasher'

Configure your <environment>.rb, e.g. development.rb:

# Enable the logstasher logs for the current environment
config.logstasher.enabled = true

# Each of the following lines is optional, if you want to selectively disable log subscribers.
config.logstasher.controller_enabled = false
config.logstasher.mailer_enabled = false
config.logstasher.record_enabled = false
config.logstasher.view_enabled = false
config.logstasher.job_enabled = false

# This line is optional; set to false if you do not want to suppress app logs in your <environment>.log
config.logstasher.suppress_app_log = false

# This line is optional, it allows you to set a custom value for the @source field of the log event
config.logstasher.source = 'your.arbitrary.source'

# This line is optional; set to false if you do not want to log the backtrace of exceptions
config.logstasher.backtrace = false

# This line is optional; if you want to filter the backtrace, the result of the callable is logged as the backtrace
config.logstasher.backtrace_filter = ->(bt){ Rails.backtrace_cleaner.clean(bt) }

# This line is optional, defaults to log/logstasher_<environment>.log
config.logstasher.logger_path = 'log/logstasher.log'

# This line is optional, loaded only if the value is truthy
config.logstasher.field_renaming = {
    old_field_name => new_field_name,
}
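As a sketch of what this renaming amounts to (the field names below are hypothetical examples, and this is not the gem's internal code), each old key in the event hash is replaced by its new name:

```ruby
# Hypothetical renaming map: "duration" and "view" are example field names.
renaming = { duration: :request_duration, view: :view_runtime }

event = { duration: 28.34, view: 25.96, status: 200 }

# Apply the renaming: keys found in the map get the new name,
# everything else passes through unchanged.
renamed = event.each_with_object({}) do |(key, value), out|
  out[renaming.fetch(key, key)] = value
end

renamed  # => {:request_duration=>28.34, :view_runtime=>25.96, :status=>200}
```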

Optionally use config/logstasher.yml (overrides <environment>.rb)

It has the same optional fields as <environment>.rb. You can specify common configurations that are then overridden by environment-specific configurations:

controller_enabled: true
mailer_enabled: false
record_enabled: false
job_enabled: false
view_enabled: true
suppress_app_log: false
development:
  enabled: true
  record_enabled: true
production:
  enabled: true
  mailer_enabled: true
  view_enabled: false
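The override behaviour can be sketched like this (a simplified illustration, not the gem's actual loading code):

```ruby
require 'yaml'

# Common settings with an environment-specific override, mirroring the
# YAML layout above.
config = YAML.safe_load(<<~YML)
  record_enabled: false
  view_enabled: true
  development:
    record_enabled: true
YML

env = 'development'

# Start from the top-level (common) keys, then merge the current
# environment's section on top so its values win.
settings = config.reject { |k, _| %w[development production].include?(k) }
settings = settings.merge(config.fetch(env, {}))

settings['record_enabled']  # => true (overridden by the development section)
settings['view_enabled']    # => true (inherited from the common section)
```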

Logging params hash

Logstasher can be configured to log the contents of the params hash. When enabled, the contents of the params hash (minus ActionController's internal params) are added to the log as a deep hash. This can cause conflicts within the Elasticsearch mappings, though, so it should be enabled with care. Conflicts occur if different actions (or even different applications logging to the same Elasticsearch cluster) use the same params key with a different data type (e.g. a string vs. a hash). This can lead to lost log entries. Enabling this can also significantly increase the size of the Elasticsearch indexes.
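The type conflict described can be illustrated with two hypothetical log entries that use the same params key with different JSON types:

```ruby
require 'json'

# Two requests log the same key, "user", with different types.
entry_a = JSON.generate(parameters: { user: 'alice' })    # string
entry_b = JSON.generate(parameters: { user: { id: 1 } })  # object

JSON.parse(entry_a)['parameters']['user'].class  # => String
JSON.parse(entry_b)['parameters']['user'].class  # => Hash

# Elasticsearch infers the mapping for "parameters.user" from whichever
# document is indexed first; documents with the other shape are then
# rejected, and those log entries are lost.
```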

To enable this, add the following to your <environment>.rb

# Enable logging of controller params
config.logstasher.log_controller_parameters = true

Adding custom fields to the log

Some fields, e.g. user_name, are very specific to your application, so adding them is left up to you. Here's how to add those fields to the logs:

# Create a file - config/initializers/logstasher.rb

if LogStasher.enabled?
  LogStasher.add_custom_fields do |fields|
    # This block is run in application_controller context,
    # so you have access to all controller methods
    fields[:user] = current_user && current_user.mail
    fields[:site] = request.path =~ /^\/api/ ? 'api' : 'user'

    # If you are using custom instrumentation, just add it to logstasher custom fields
    LogStasher.custom_fields << :myapi_runtime
  end

  LogStasher.add_custom_fields_to_request_context do |fields|
    # This block is run in application_controller context,
    # so you have access to all controller methods
    # You can log custom request fields using this block
    fields[:user] = current_user && current_user.mail
    fields[:site] = request.path =~ /^\/api/ ? 'api' : 'user'
  end
end

Logging ActionMailer events

Logstasher can easily log messages from ActionMailer, such as incoming/outgoing e-mails and e-mail content generation (Rails >= 4.1). This functionality is enabled automatically. Since the relationship between a concrete HTTP request and a mailer invocation is lost inside an ActionMailer instance method, global (per-request) state is kept to correlate HTTP requests with events from other parts of Rails, such as ActionMailer. Every time a request is invoked, a request_id key is added, which is present on every ActionMailer event.

Note: Since mailers are executed within the lifetime of a request, they will show up in logs prior to the actual request.

Logging ActiveJob events

Logstasher can also easily log messages from ActiveJob (Rails >= 5.2). This functionality is automatically enabled. The request_id is set to the Job ID when the job is performed, and then reverted back to its previous value once the job is complete. Imagine this scenario:

  • Web request starts (sets request_id to some value)
  • Job is enqueued because of the web request (the same web request_id is used)
  • Job performing starts (assume a non-asynchronous adapter or perform_now was used)
  • request_id is set to the job id. This is important because for asynchronous jobs, there's no way to remember the original request_id
  • Now, you can add your own detailed logging to the job, and the request_id can be used
  • Once the job completes, the request_id is reverted and other SQL and View log lines will use that same old request_id again.
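The save/restore behaviour in the steps above can be sketched like this (a simplified stand-in, not the gem's actual implementation):

```ruby
# Minimal stand-in for a per-process request context.
class RequestContext
  class << self
    attr_accessor :request_id

    # Temporarily swap in a new request_id (e.g. the job ID) and
    # restore the previous one when the block finishes.
    def with_request_id(id)
      previous = request_id
      self.request_id = id
      yield
    ensure
      self.request_id = previous
    end
  end
end

RequestContext.request_id = 'web-abc123'        # set by the web request

inside = nil
RequestContext.with_request_id('job-xyz789') do # job perform starts
  inside = RequestContext.request_id            # 'job-xyz789' during the job
end

RequestContext.request_id                       # restored to 'web-abc123'
```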

Listening to ActiveSupport::Notifications events

It is possible to listen to any ActiveSupport::Notifications events and store arbitrary data to be included in the final JSON log entry:

# In config/initializers/logstasher.rb

# Watch calls the block with the same arguments as any ActiveSupport::Notification, plus a store
LogStasher.watch('some.activesupport.notification') do |name, start, finish, id, payload, store|
  # Do something
  store[:count] = 42
end

Would change the log entry to:

{"@source":"unknown","@tags":["request"],"@fields":{"method":"GET","path":"/","format":"html","controller":"file_servers","action":"index","status":200,"duration":28.34,"view":25.96,"db":0.88,"ip":"127.0.0.1","route":"file_servers#index", "parameters":"","ndapi_time":null,"uuid":"e81ecd178ed3b591099f4d489760dfb6","user":"[email protected]", "site":"internal","some.activesupport.notification":{"count":42}},"@timestamp":"2013-04-30T13:00:46.354500+00:00"}

The store exposed to the block passed to watch is thread-safe, and reset after each request. By default, the store is only shared between occurrences of the same event. You can easily share the same store between different types of notifications by assigning them to the same event group:

# In config/initializers/logstasher.rb

LogStasher.watch('foo.notification', event_group: 'notification') do |*args, store|
  # Shared store with 'bar.notification'
end

LogStasher.watch('bar.notification', event_group: 'notification') do |*args, store|
  # Shared store with 'foo.notification'
end

Quick Setup for Logstash

Follow the instructions in the logstash documentation to set up logstash. Start logstash with the following command:

bin/logstash -f quickstart.conf

Versions

All versions require Rails 5.2 or higher (tested up to 6.1.x) and Ruby 2.6+.

Development

  • Install dependencies: RAILS_VERSION=5.2.0 bundle install --without guard --path=${BUNDLE_PATH:-vendor/bundle}
  • Run tests - rake
  • Generate test coverage report - rake coverage. Coverage report path - coverage/index.html

Copyright

Copyright (c) 2016 Shadab Ahmed, released under the MIT license

logstasher's People

Contributors

achempion, afbroman, alext, bitdeli-chef, etiennedepaulis, gouravtiwari, jasiek, jdurand, kovyrin, kristianhildebrandt, kyrylo, m-barthelemy, marcgrimme, mikesea, mtwentyman, petergoldstein, quixoten, ravbaker, roolo, shadabahmed, shaicoleman, simonecarriero, streetlogics, syndbg, tfausak, tijmenb, tusharmaroo, valtri, yuc-zhu, zerobearing2


logstasher's Issues

Memory leak with latest version of logstasher

Hi folks,
for me it looks like the current logstasher gem has a serious memory leak.
How come:
This commit removed the clearing of all previous custom_fields (LogStasher.custom_fields.clear). That clearing used to happen even mid-request, which was the problem, so the commit above fixed that; but now the custom_fields are never cleaned up. This means that over the lifetime of a process (unicorn, whatever) of your Rails app, the memory consumption grows. To make it worse, the same fields are added to custom_fields over and over again, which increases the memory pressure even more.
We've seen this in production 3 times now. It's fix time, I guess.
So this is the issue; hopefully the fix comes soon.
Fix here #110
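The growth pattern described can be sketched as appending to a process-global array on every request without ever clearing it (a stand-in illustration, not the gem's code):

```ruby
# Stand-in for a process-global custom_fields array.
module LogContext
  def self.custom_fields
    @custom_fields ||= []
  end
end

# Each simulated request pushes the same field again; nothing clears it,
# so the array (and memory use) grows for the lifetime of the process.
3.times { LogContext.custom_fields << :myapi_runtime }

LogContext.custom_fields.size  # => 3, and one more per request
```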

Question: Request parameters format?

Is there a reason you are not leaving parameters as a "Hash"?
https://github.com/shadabahmed/logstasher/blob/master/lib/logstasher.rb#L36-L37

payload[:parameters] = payload[:params]
  .except(*ActionController::LogSubscriber::INTERNAL_PARAMS)
  .inject("") { |s, (k, v)| s += "#{k}=#{v}\n" }

Would seem more useful to remain a Hash and allow the log entry to be JSON encoded, no?

payload[:parameters] = payload[:params].except(*ActionController::LogSubscriber::INTERNAL_PARAMS)

LogStasher.log('info', 'log info text') is not working

I can't get this method to work. It logs correctly when using 'warn' or 'error'. When run from the command line, it returns nil for 'info' but true for 'warn'/'error'. When used with 'info' it doesn't throw an error, just nothing happens.

Followed your instructions for installation and config - am I missing something?

Customizing log file name and path?

Just seeking comments here. Would it be a good idea to allow customizing the log filename and path via an app.config option like the one below?

# If you want logstasher to log to a different file, uncomment and modify the line below.
# Defaults to "#{Rails.root}/log/logstash_#{Rails.env}.log"
config.logstasher.log_file = "/path/to/log/file/filename.log"

In my use case, I am sort of needing to do that.

Comments appreciated.

Thanks.

utime_failed error in Windows

Someone on a mac just utilized this gem for our project. We really like it, but our Windows developers are hitting the following error when Rails.application.initialize! is executed

Errno::EACCES: Permission denied @ utime_failed - log/test.log
/ruby/lib/ruby/2.1.0/fileutils.rb:1163:in `utime'
/ruby/lib/ruby/2.1.0/fileutils.rb:1163:in `block in touch'
/ruby/lib/ruby/2.1.0/fileutils.rb:1160:in `each'
/ruby/lib/ruby/2.1.0/fileutils.rb:1160:in `touch'
/ruby/lib/ruby/gems/2.1.0/gems/logstasher-0.9.0/lib/logstasher.rb:218:in `new_logger'
/ruby/lib/ruby/gems/2.1.0/gems/logstasher-0.9.0/lib/logstasher.rb:103:in `setup'
/ruby/lib/ruby/gems/2.1.0/gems/logstasher-0.9.0/lib/logstasher/railtie.rb:38:in `block (2 levels) in <class:Railtie>'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/lazy_load_hooks.rb:36:in `call'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/lazy_load_hooks.rb:36:in `execute_hook'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/lazy_load_hooks.rb:45:in `block in run_load_hooks'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/lazy_load_hooks.rb:44:in `each'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/lazy_load_hooks.rb:44:in `run_load_hooks'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/application/finisher.rb:62:in `block in <module:Finisher>'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/initializable.rb:30:in `instance_exec'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/initializable.rb:30:in `run'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/initializable.rb:55:in `block in run_initializers'
/ruby/lib/ruby/2.1.0/tsort.rb:226:in `block in tsort_each'
/ruby/lib/ruby/2.1.0/tsort.rb:348:in `block (2 levels) in each_strongly_connected_component'
/ruby/lib/ruby/2.1.0/tsort.rb:427:in `each_strongly_connected_component_from'
/ruby/lib/ruby/2.1.0/tsort.rb:347:in `block in each_strongly_connected_component'
/ruby/lib/ruby/2.1.0/tsort.rb:345:in `each'
/ruby/lib/ruby/2.1.0/tsort.rb:345:in `call'
/ruby/lib/ruby/2.1.0/tsort.rb:345:in `each_strongly_connected_component'
/ruby/lib/ruby/2.1.0/tsort.rb:224:in `tsort_each'
/ruby/lib/ruby/2.1.0/tsort.rb:205:in `tsort_each'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/initializable.rb:54:in `run_initializers'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/application.rb:352:in `initialize!'
/src/config/environment.rb:5:in `<top (required)>'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/dependencies.rb:274:in `require'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/dependencies.rb:274:in `block in require'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/dependencies.rb:240:in `load_dependency'
/ruby/lib/ruby/gems/2.1.0/gems/activesupport-4.2.6/lib/active_support/dependencies.rb:274:in `require'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/application.rb:328:in `require_environment!'
/ruby/lib/ruby/gems/2.1.0/gems/railties-4.2.6/lib/rails/application.rb:457:in `block in run_tasks_blocks'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:240:in `call'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:240:in `block in execute'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:235:in `each'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:235:in `execute'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:179:in `block in invoke_with_call_chain'
/ruby/lib/ruby/2.1.0/monitor.rb:211:in `mon_synchronize'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:172:in `invoke_with_call_chain'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:201:in `block in invoke_prerequisites'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:199:in `each'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:199:in `invoke_prerequisites'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:178:in `block in invoke_with_call_chain'
/ruby/lib/ruby/2.1.0/monitor.rb:211:in `mon_synchronize'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:172:in `invoke_with_call_chain'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/task.rb:165:in `invoke'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:150:in `invoke_task'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:106:in `block (2 levels) in top_level'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:106:in `each'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:106:in `block in top_level'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:115:in `run_with_threads'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:100:in `top_level'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:78:in `block in run'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:176:in `standard_exception_handling'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/lib/rake/application.rb:75:in `run'
/ruby/lib/ruby/gems/2.1.0/gems/rake-10.5.0/bin/rake:33:in `<top (required)>'
/ruby/bin/rake:23:in `load'
/ruby/bin/rake:23:in `<main>'
Tasks: TOP => db:reset => environment

I'm not sure how to file a ticket against fileutils so thought I'd start here...? We're really stuck as we can't even get the server up with this issue.

Obviously Windows doesn't have a touch command but I've seen some reports saying the method still works on Windows.

Windows 7
Ruby 2.1.5p273
Rails 4.2.6
Logstasher 0.9.0

Unable to visualize data on Kibana.

Hi, I have successfully managed to mount Kibana in my application after following the instructions on this page. I have also connected Kibana to Elasticsearch without any issues.

In my Rails application I am generating logs with the help of the https://github.com/shadabahmed/logstasher gem.
I am just replacing config.kibana_default_route with the path of the file generated by the logstasher gem.

config.kibana_default_route = '/dashboard/file/default.json'

But after opening localhost:3000/kibana I get no data and no error.
(screenshot: 2016-03-31 16:27:31)

What am I missing?

File Size

It appears as though the logfile created by logstasher will grow in size to the point where it crashes the server because it has run out of disk space. Rails has a log rotation feature built in which can be configured as such:

config.logger = Logger.new("#{Rails.root}/log/#{Rails.env}.log", 10, 100.megabytes)  

This creates up to 10 log files each up to 100 megabytes in size for a maximum of 1GB of log files.

Is there a feature in logstasher that addresses this issue?

Is there a way to skip binary files?

I have this in env file

# Enable logging of controller params
config.logstasher.log_controller_parameters = true

and some image uploads are serialised to logstash index.

Is there a way to skip these files?

Elasticsearch cannot accept documents within arrays (or so)

If, for example, logstasher generates this log line (real numbers replaced with 0es for privacy):

{"@source":"unknown","@tags":["request"],"@fields":{"method":"POST","path":"/0/","format":"*/*","controller":"0/0","action":"update","status":200,"duration":100.1,"view":0.2,"db":25.75,"ip":"0.0.0.0","route":"0/0#0","request_id":"30af3016-8f26-4764-8a48-bf3e201c7492","parameters":{"cells":{"c;0;0;2015-02-03":["w",[{"rt":"g","ri":0,"id":0},{"rt":"r","ri":0,"id":0,"amt":0},{"rt":"w","st":0,"et":0,"color":"0","description":null,"breaks":0,"amt":0}]]},"pp":0,"ctx_token":"0"},"client_id":0,"shop_id":0,"user_id":0,"site":"app"},"@timestamp":"2015-02-03T14:48:31.588Z","@version":"1","type":"rails","file":"/home/deployer/apps/xxx/shared/log/logstash_lol.log","host":"in2","offset":"104057368"}

It will not be accepted by Elasticsearch and it will throw an exception:

[2015-02-03 18:49:13,963][DEBUG][action.bulk              ] [X-Man] [logstash-2015.02.03][1] failed to execute bulk item (index) index {[logstash-2015.02.03][rails][1SZRSpULSwW4-FHNwwORSQ], source[{"@source":"unknown","@tags":["request"],"@fields":{"method":"POST","path":"/0/","format":"*/*","controller":"0/0","action":"update","status":200,"duration":100.1,"view":0.2,"db":25.75,"ip":"0.0.0.0","route":"0/0#0","request_id":"30af3016-8f26-4764-8a48-bf3e201c7492","parameters":{"cells":{"c;0;0;2015-02-03":["w",[{"rt":"g","ri":0,"id":0},{"rt":"r","ri":0,"id":0,"amt":0},{"rt":"w","st":0,"et":0,"color":"0","description":null,"breaks":0,"amt":0}]]},"pp":0,"ctx_token":"0"},"client_id":0,"shop_id":0,"user_id":0,"site":"app"},"@timestamp":"2015-02-03T14:48:31.588Z","@version":"1","type":"rails","file":"/home/deployer/apps/xxx/shared/log/logstash_lol.log","host":"in2","offset":"24970"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [@fields.parameters.cells.c;0;0;2015-02-03]
        at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:415)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:555)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeNonDynamicArray(ObjectMapper.java:686)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeArray(ObjectMapper.java:605)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeNonDynamicArray(ObjectMapper.java:688)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeArray(ObjectMapper.java:605)
        at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:492)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:555)
        at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:490)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:555)
        at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:490)
        at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:555)
        at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:490)
        at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:541)
        at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:490)
        at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:392)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:444)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:150)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:511)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:419)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.ElasticsearchIllegalArgumentException: unknown property [rt]
        at org.elasticsearch.index.mapper.core.StringFieldMapper.parseCreateFieldForString(StringFieldMapper.java:331)
        at org.elasticsearch.index.mapper.core.StringFieldMapper.parseCreateField(StringFieldMapper.java:277)
        at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:405)
        ... 22 more

Custom store data from ActiveSupport::Notifications not saving in release 0.7.1

Hello, upgrading from 0.7.0 to 0.7.1 breaks the ActiveSupport::Notifications watch behaviour for me.

From the docs:

# Watch calls the block with the same arguments than any ActiveSupport::Notification, plus a store
LogStasher.watch('some.activesupport.notification') do |name, start, finish, id, payload, store|
  # Do something
  store[:count] = 42
end

After the 0.7.1 upgrade, the custom store key/value pair does not get output in the final output log. This is working fine in 0.7.0.

Comparison:
https://github.com/shadabahmed/logstasher/compare/6506a39f6bb71662f954af3f59d36da2f21cc7ec...8206937595a5a61d1e3ab697a60e777abfdd9e7e?diff=split&name=8206937595a5a61d1e3ab697a60e777abfdd9e7e

logging json errors

Hi. I want to save all response json errors into @fields in initializer.

My logstasher.rb

if Rails.env.staging? || Rails.env.production?
  LogStasher.add_custom_fields do |fields|
    fields[:node] = `hostname`[0..-2]
    fields[:user] = current_user
    fields[:env] = Rails.env
    fields[:json_errors] = request.json_erros (??? HOW)
    LogStasher.custom_fields << :myapi_runtime
  end
end

custom fields don't show up in log entries

There's a bug in 0.8 where, if you set custom_fields, they will be cleared before they are really used. The problem is that all three log subscribers (ActiveRecord, ActionView, ActionController) have this line:

LogStasher.custom_fields.clear

Practically, this means that you will only see custom fields on your ActionController log entries if you have no view rendering or AR calls, since those calls cause their associated log subscribers to clear the custom_fields array. This isn't very useful.

A temporary workaround is to just disable the new log subscribers like so in your config/logstasher.rb

::ActiveSupport::LogSubscriber.log_subscribers.each do |subscriber|
  case subscriber.class.name
  when 'LogStasher::ActionView::LogSubscriber'
    LogStasher.unsubscribe(:action_view, subscriber)
  when 'LogStasher::ActiveRecord::LogSubscriber'
    LogStasher.unsubscribe(:active_record, subscriber)
  end
end

Adding Custom fields when request is Exception doesn't work

When adding custom fields in the logstasher initializer:

LogStasher.add_custom_fields do |fields|
  fields[:uid] = request.uuid
end

I see that it adds the new field on regular requests (@tags: ["request"]).

But when I have an exception (@tags: ["request","exception"]), the new field is not added to the log entry. Even when I try to debug, it doesn't even stop in the initializer code.

private method `logger'

I got this error with 0.6.0

Rack app error: #<NoMethodError: private method `logger' called for #<LogStasher::MailerLogSubscriber:0x0000000bd79370>>

Any plans to release f89b365?

Formatting ALL logs as JSON for logstash?

Great gem! Thank you for this.

I am noting something, however, that's counter-intuitive. It seems this only replaces Rails' default request logging, not all logging across the board. For instance, arbitrary log messages like log.debug 'Hello, World!' don't get formatted as JSON and still go to the usual .log file.

The documentation also alludes to this limitation in a list of project goals:

Prevent any logger messages(e.g. info) getting into our pure json logs

I am thinking, however, that there are probably many folks like myself who would like all application logs to go to the same place. In many cases, the sort of log messages that logstasher is not currently bothering with might be much more valuable than the request logs, which are often dialed down in production anyway to reduce noise.

What would be your thoughts on updating the gem to at least offer the option of replacing Rails.logger with a LogStasher logger? (Note: I have tried this already and it does not work at present.)

I'd be happy to help out with this a little, but have to admit I'm a pretty new to the internals of Rails' logging- which is why I am raising this as a question/comment first instead of jumping straight to a pull request.

Can you add an option to disable tags?

We currently use beaver to generate our own tags to show the environment and application category, but those tags are overwritten by the tags logstasher generates when the log is parsed by logstash. If logstasher had an option to disable its own tags, such as 'request', it would be more flexible to use.

Below is an example log that goes to logstash. I use logstash json filter to parse message field.

{"tags": ["staging-someserver-com", "staging"], "@Version": 1, "@timestamp": "2015-05-11T23:15:53.290Z", "host": "staging-someserver.com", "file": "/somedirectory/log/logstasher.log", "message": "{\"method\":\"GET\",\"path\":\"/\",\"forma
t\":\"html\",\"controller\":\"home\",\"action\":\"index\",\"status\":200,\"duration\":322.64,\"view\":274.75,\"db\":21.63,\"ip\":\"199.199.199
.199\",\"route\":\"home#index\",\"request_id\":\"some-random-id\",\"source\":\"something\",\"tags\":[\"request\"],\"@timestamp\":
\"2015-05-11T23:15:53.259Z\",\"@Version\":\"1\"}", "type": "rails-logstasher"}

Could not log "process_action.action_controller" event

On Exceptions I get the following error:
Could not log "process_action.action_controller" event. NoMethodError: undefined method `<<' for nil:NilClass, coming from logstasher-0.2.8/lib/logstasher/log_subscriber.rb:17
I'm using rails 4.1.4.

Option suppress_app_log doesn't work

I tried setting suppress_app_log to true, but it still added new content to my default log file.

  # Enable the logstasher logs for the current environment
  config.logstasher.enabled = true
  config.logstasher.suppress_app_log = true

Tried it on rails 3.2.15, setting this in development environment.

I just had to create a scaffold for blog_posts (rails g scaffold blog_posts), and here's what I get...

Running rails server:

[13:36:43] kouno:rails-test $ be rails s
=> Booting WEBrick
=> Rails 3.2.15 application starting in development on http://0.0.0.0:3000
=> Call with -d to detach
=> Ctrl-C to shutdown server
[2013-10-24 13:36:45] INFO  WEBrick 1.3.1
[2013-10-24 13:36:45] INFO  ruby 1.9.3 (2012-04-20) [x86_64-darwin12.3.0]
[2013-10-24 13:36:45] INFO  WEBrick::HTTPServer#start: pid=32951 port=3000
Connecting to database specified by database.yml
  BlogPost Load (0.1ms)  SELECT "blog_posts".* FROM "blog_posts"
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /application.css - 304 Not Modified (4ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /blog_posts.css - 304 Not Modified (1ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /scaffolds.css - 304 Not Modified (1ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /jquery.js - 304 Not Modified (4ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /jquery_ujs.js - 304 Not Modified (1ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /blog_posts.js - 304 Not Modified (1ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
Served asset /application.js - 304 Not Modified (3ms)
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true
[2013-10-24 13:36:50] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true

Content of development.log:

[13:36:51] kouno:rails-test $ cat log/development.log
Connecting to database specified by database.yml
  BlogPost Load (0.1ms)  SELECT "blog_posts".* FROM "blog_posts"
Served asset /application.css - 304 Not Modified (4ms)
Served asset /blog_posts.css - 304 Not Modified (1ms)
Served asset /scaffolds.css - 304 Not Modified (1ms)
Served asset /jquery.js - 304 Not Modified (4ms)
Served asset /jquery_ujs.js - 304 Not Modified (1ms)
Served asset /blog_posts.js - 304 Not Modified (1ms)
Served asset /application.js - 304 Not Modified (3ms)

LogStasher.watch doesn't work

app/controllers/application_controller.rb

def logger_before
  ActiveSupport::Notifications.instrument('errors', errors: "before_action client_ip=#{session[:client_ip]} mdc=#{session[:mdc]} params=[#{params}]")
end

config/initializers/logstasher.rb

if LogStasher.enabled?

  LogStasher.watch('errors') do |name, start, finish, id, payload, store|
    # DOESN'T WORK <<<<<<<<<<<<<<<<<<<<<<<<-------------------
    store[:errors] = payload[:errors]
    # DOESN'T WORK <<<<<<<<<<<<<<<<<<<<<<<<-------------------
  end

  LogStasher.add_custom_fields do |fields|
    fields[:mdc] = session[:mdc]
    # If you are using custom instrumentation, just add it to logstasher custom fields
    LogStasher.custom_fields << :myapi_runtime
  end

end

log/logstash_development.log

{"method":"GET","path":"/extract_json","format":"html","controller":"extracts","action":"show_json","status":200,"duration":36679.02,"view":0.13,"db":7.69,"ip":"127.0.0.1","route":"extracts#show_json","request_id":"6381913f-5f28-4509-9244-47e337c51d34","parameters":{},"source":"bonusesfera-dev","tags":["request"],"@timestamp":"2016-03-09T20:37:58.937Z","@version":"1"}

Can anyone help me?
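One thing worth ruling out is a name mismatch between the `instrument` call and the `watch` subscription. The mechanics can be reproduced in plain Ruby with a minimal pub/sub stand-in (this is an illustration, not the real ActiveSupport::Notifications API):

```ruby
# Minimal pub/sub stand-in for ActiveSupport::Notifications, illustrating
# that a subscriber only fires when the instrumented name matches exactly.
class Notifier
  def initialize
    @subs = Hash.new { |h, k| h[k] = [] }
  end

  def subscribe(name, &blk)
    @subs[name] << blk
  end

  def instrument(name, payload = {})
    @subs[name].each { |b| b.call(name, payload) }
  end
end

n = Notifier.new
captured = nil
n.subscribe('errors') { |_name, payload| captured = payload[:errors] }

n.instrument('errors', errors: 'before_action failed')  # exact match: fires
n.instrument('error',  errors: 'ignored')               # mismatch: dropped silently
```

If the names do match, the next thing to check is whether the `watch` block is registered before the first `instrument` call fires.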

Logstasher not logging RoutingError

Hi there,

In production.log I have:

ActionController::RoutingError (No route matches [GET] "/asdf"):
  actionpack (4.2.1) lib/action_dispatch/middleware/debug_exceptions.rb:21:in `call'
  actionpack (4.2.1) lib/action_dispatch/middleware/show_exceptions.rb:30:in `call'
  railties (4.2.1) lib/rails/rack/logger.rb:38:in `call_app'
  railties (4.2.1) lib/rails/rack/logger.rb:20:in `block in call'

However this is not reflected in logstash_production.log.

Best,
Ivan

Way to install logstash

The way to run logstash has changed:

http://www.logstash.net/docs/1.4.2/release-notes

This line no longer works: java -jar logstash-1.3.3-flatjar.jar agent -f quickstart.conf -- web

# Old way:
% java -jar logstash-1.3.3-flatjar.jar agent -f logstash.conf

# New way:
% bin/logstash agent -f logstash.conf

@shadabahmed: Should I submit this as a pull request?

Also, how do you keep the server permanently running so that it even starts on a restart?

Logging / not logging controller params hash

Since #20 and the README indicate there might be issues with logging the params hash, I'm a bit concerned: most of the problems I encounter have needed the params hash to discover the source of the errors. I don't want to explode my logs or lose log records; neither is useful.

Is there an alternative I can use that lets me still view the params hash but doesn't cause either of those problems? For example, could I somehow have it logged simply as a custom value converted to a string?
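One hedged option, sketched below as plain Ruby rather than a logstasher feature: serialize the params hash yourself into a single capped string field, so one oversized request cannot blow up the log entry or the index mapping. `params_as_string` is a hypothetical helper name:

```ruby
require 'json'

# Hypothetical helper: render params as one truncated JSON string so a
# single oversized request can't explode the log entry.
def params_as_string(params, limit: 512)
  s = params.to_json
  s.length > limit ? "#{s[0, limit]}...(truncated)" : s
end

params_as_string({ controller: 'home', action: 'index' })
# => "{\"controller\":\"home\",\"action\":\"index\"}"
```

The resulting string could then be attached via `add_custom_fields` (exact wiring to be confirmed against the README).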

Best way to make logstash_<env>.log buffered

I notice that log messages are immediately written to my logstash_<env>.log. I tail this log, make requests in my app, and see lines written immediately. I'd like to buffer these messages in memory and write to disk less often. This is how the default Rails BufferedLogger works.

Does logstasher allow a simple way to configure the buffering of log messages? If not, I'll dig through the code and see what I can do.
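Logstasher doesn't appear to expose a buffering option directly (worth confirming in the source). A sketch of the general pattern, assuming the logger only ever calls `<<` on its device, wraps the device and flushes every N lines:

```ruby
require 'stringio'

# Sketch of a buffering device wrapper (assumption: the logger only calls <<).
# Collects lines in memory and writes them out once `threshold` is reached.
class BufferedDevice
  def initialize(device, threshold: 50)
    @device = device
    @threshold = threshold
    @buffer = []
  end

  def <<(line)
    @buffer << line
    flush if @buffer.size >= @threshold
    self
  end

  def flush
    @device << @buffer.join
    @buffer.clear
  end
end

io = StringIO.new
dev = BufferedDevice.new(io, threshold: 3)
dev << "a\n" << "b\n"   # still only in memory
dev << "c\n"            # third line hits the threshold and flushes
io.string               # => "a\nb\nc\n"
```

Anything like this needs a flush on shutdown (e.g. `at_exit { dev.flush }`), or the tail of the log can be lost when the process exits.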

Occasional empty fields in log entries (ip, route, request_id, hostname, application, environment, system)

Hi, sometimes (actually quite often), a log entry is partially empty, i.e. many of its fields are null. This has been observed both in the log file logstash_development.log, and in logstash itself. I am using logstasher (0.9.0) and rails 3.2.22.2. Here is a sample problematic message (notice the "ip" and the rest of the fields are 'null'):

{"method":"GET","path":"/beta/test","format":"html","controller":"beta/test","action":"show","status":404,"duration":57.59,"view":0.71,"db":6.58,"ip":null,"route":null,"request_id":null,"hostname":null,"application":null,"environment":null,"system":null,"source":"logstasher","tags":["request"],"@timestamp":"2016-04-22T11:35:58Z","@version":"1"}

While I see many bad log entries like this, there are also many valid ones, that have all those fields filled properly. Even issuing one and the same request several times in a row (e.g. by clicking a button), leads to sometimes complete and sometimes incomplete log entries. Can't find a particular pattern, though...

Any ideas what could be the reason for this strange behaviour? Any tips for troubleshooting?

Clean install throws nil object

When enabling logstasher the following errors lines appear (both in the 'normal' log file and in the logstash log file):
2016-01-28 05:03:27 ERROR: Could not log "start_processing.action_controller" event. NoMethodError: You have a nil object when you didn't expect it!

How can I forward this to stdout?

I'm trying to use this gem inside a docker container, and docker logs work best with stdout. I tried configuring config.logstasher.logger_path to /dev/stdout, but I would get the following error:

Message from application: No such device or address @ rb_sysopen - /dev/stdout (Errno::ENXIO)
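/dev/stdout is a character device, and opening it as a file can fail with ENXIO depending on how the container wires up the standard streams. If logstasher can be handed a logger object instead of a path (an assumption to verify against the README; the config key is not shown here), pointing a stdlib Logger at `$stdout` avoids opening the device file at all. A StringIO stands in for `$stdout` below so the output can be inspected:

```ruby
require 'logger'
require 'stringio'

out = StringIO.new            # in the app this would be $stdout
logger = Logger.new(out)
logger << %({"message":"hello from docker"}\n)  # << writes the raw line, no prefix
out.string                    # => "{\"message\":\"hello from docker\"}\n"
```

`Logger#<<` bypasses the severity formatter entirely, which matches how logstasher writes its pure-JSON lines.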

Missing license text

Linux distributions like licenses, and the license text is missing from the logstasher source code. Most variants of the MIT license even require distributing the license text with the sources. :-)

I'm not sure which variant you would prefer. Open Source Initiative presents "expat" variant:
http://opensource.org/licenses/MIT
https://fedoraproject.org/wiki/Licensing:MIT?rd=Licensing/MIT#Modern_Style_with_sublicense
but there are many others (as seen on the Fedora wiki).

Could a gem.license field also be added to the gemspec? Currently the MIT license is mentioned only in the README file.

Kibana can't read Logstasher logs

Edit: I'm using logstash 1.3.2

I can search the logs inside of ES, they are there, but whenever i try and search for them via kibana I am getting nothing.

The big difference I have noticed is that with the logstasher logs, the values are wrapped inside a @fields property object:

{
  "_index": "logstash-2014.01.06",
  "_type": "logs",
  "_id": "mWr5s74vQYmk4pVnT5Xrlg",
  "_score": 2.9078493,
  "_source": {
    "@source": "unknown",
    "@tags": [
      "request"
    ],
    "@fields": {
      "method": "GET",
      "path": "/health_check",
      "format": "html",
      "controller": "health_checks",
      "action": "show",
      "status": 200,
      "duration": 3.94,
      "view": 0.79,
      "db": 0,
      "ip": "127.0.0.1",
      "route": "health_checks#show",
      "parameters": {}
    },
    "@timestamp": "2014-01-06T00:31:46.838Z",
    "@version": "1",
    "type": "rails",
    "host": "p-hub-web-2",
    "path": "/srv/tuts-hub/shared/log/logstash_production.log"
  }
}

Do I have to do anything special? The mappings shouldn't matter because the data is already stored inside of elasticsearch, and I can even query it via the API; I just can't get it displayed inside of Kibana.

Please help?

Adding custom fields to the log is not working with rails 3.2.13

Adding custom fields to the log is not working with rails 3.2.13 and logstasher 0.2.5.
I have followed all the steps and created config/initializers/logstasher.rb. Here is the code inside logstasher.rb:

if LogStasher.enabled
  LogStasher.add_custom_fields do |fields|
    fields[:country_code] = "US"
  end
end

After that I restarted the WEBrick server, but LogStasher.enabled returns nil and the country code is not stored in the log file.

My requirement: I need to save location information to the log file along with the other information, in order to display the Kibana map.

Can you please help me?

How to format Grape logs as well

Thanks a lot for providing this gem. It helps so much with integrating Rails logs and logstash.

I am also using Grape on top of Rails for the API. That part of the log is not formatted by the logstasher gem; is there any way to include Grape as well?

Thanks a lot.

Best Regards.
Larry

Logstasher is not Logging Errors

Hi,

I'm running a rails 4 application in development mode.

Here is my config/environments/development.rb =>

  # Logstasher Config
  # Enable the logstasher logs for the current environment
  config.logstasher.enabled = true

  # This line is optional if you do not want to suppress app logs in your <environment>.log
  config.logstasher.suppress_app_log = false

  # This line is optional, it allows you to set a custom value for the @source field of the log event
  config.logstasher.source = 'foobar'

  # This line is optional if you do not want to log the backtrace of exceptions
  config.logstasher.backtrace = false

  # This line is optional, defaults to log/logstasher_<environment>.log
  config.logstasher.logger_path = 'log/logstasher.log'

The problem is, when an application error occurs (like a 404 Not Found), I can see the error in log/development.log but not in log/logstasher.log.

How can I solve this issue?

Invalid gemspec

source 'https://rubygems.org'
gem 'logstasher',
  github: 'shadabahmed/logstasher',
  ref: 'bd4235e'
$ bundle install --path vendor/bundle
Using logstash-event 1.1.5
Using request_store 1.1.0

logstasher at /Users/taylor/Documents/GitHub/logstasher/tmp/vendor/bundle/ruby/2.0.0/bundler/gems/logstasher-bd4235e59c4f did not have a valid gemspec.
This prevents bundler from installing bins or native extensions, but that may not affect its functionality.
The validation message from Rubygems was:
  ["lib/logstasher.rb
lib/logstasher/device/redis.rb
lib/logstasher/log_subscriber.rb
lib/logstasher/rails_ext/action_controller/metal/instrumentation.rb
lib/logstasher/rails_ext/rack/logger.rb
lib/logstasher/railtie.rb
lib/logstasher/version.rb
"] are not files
Using logstasher 0.6.0 from git://github.com/shadabahmed/logstasher.git (at bd4235e)
Using bundler 1.7.0
Your bundle is complete!
It was installed into ./vendor/bundle
$ ruby --version
ruby 2.0.0p451 (2014-02-24 revision 45167) [universal.x86_64-darwin13]
$ gem --version
2.0.14
$ bundle --version
Bundler version 1.7.0

uninitialized constant ActionMailer::LogSubscriber

My simple project didn't need to send email, so I didn't configure any email settings.
But when I used logstasher, I got this error:

lib/logstasher.rb:24:in `block in remove_existing_log_subscriptions': uninitialized constant ActionMailer::LogSubscriber (NameError)

I tried many settings; when I added ActionMailer::Base.delivery_method = :smtp to my environment file before the logstasher config, everything worked well.

config.logstasher.log_level is ignored

We're using Logstasher while also keeping the regular log for our Rails 4 application.

I saw that there seems to be a "config.logstasher.log_level" option, defaulting to warn.
However, leaving it unconfigured or overriding it to fatal (for example) doesn't seem to actually do anything: the logstash JSON log always logs everything >= info.

Is this the intended behavior?

Custom fields to payload after action is called no longer working?

I believe that inserting custom fields in a before_filter action no longer works (see PR #24 for previous work).

I tried making a test to replicate it, but it seems there are no tests for custom field insertion in the payload, only for context ones. Commit e0f0312 (by @frankel) seems to be the culprit; this situation conflicts with the exception use case.

Any ideas on how we can make both use cases coexist? Thanks in advance, great lib.

LogStasher.log not 100% JSON

I'm writing an application that outputs a lot of debugging and informational logs, and have noticed an issue when using the LogStasher.log function. While it generates JSON log output and writes it to the logstash log file, I've noticed that instead of being pure JSON, each line contains some plain text before the JSON, causing logstash to interpret the messages incorrectly.

For example, if I call LogStasher.log('info', 'Hello World'), I then see that in my logstash_development.log file the following line has been added

I, [2014-07-24T17:59:04.226000 #34589]  INFO -- : {"@source":"myApp","@tags":["log"],"@fields":{"message":"Hello World","level":"info"},"@timestamp":"2014-07-24T16:59:04.226Z"}

The JSON is perfect, but the beginning of the output is not quite right.
When I launch logstash and kibana, I can then see that this output has been interpreted as the following JSON

{
  "_index": "logstash-2014.07.24",
  "_type": "rails logs",
  "_id": "FNNtNHzdTFOMaI9aQho3wg",
  "_score": null,
  "_source": {
    "message": "I, [2014-07-24T17:03:00.220000 #34589]  INFO -- : {\"@source\":\"myApp\",\"@tags\":[\"log\"],\"@fields\":{\"message\":\"Skipping alert processing as it is currently disabled\",\"level\":\"info\"},\"@timestamp\":\"2014-07-24T16:03:00.219Z\"}",
    "@version": "1",
    "@timestamp": "2014-07-24T16:54:19.928Z",
    "type": "rails logs",
    "host": "makalu.home",
    "path": "/Users/Aidy/Documents/Work/Web/Thoughtified/aiAlerts/aiAlerts/log/logstash_development.log"
  },
  "sort": [
    1406220859928,
    1406220859928
  ]
}

Logstash has done a relatively good job here in that it hasn't simply rejected it, but as you can see, the JSON output from LogStasher has been interpreted as plain text and put into the message field, rather than being parsed as an object.

I've dug through the code, and the only difference I can find is that for rails requests, the message is logged via

LogStasher.logger << event.to_json + "\n"

While the log method is logging via

self.logger.send severity, event.to_json

I did a quick monkey patch changing the log method to work in the same way, and the output is now correct. Calling LogStasher.log('info', 'Hello World') now results in logstash parsing the following JSON

{
  "_index": "logstash-2014.07.24",
  "_type": "rails logs",
  "_id": "6nRQE21JQnKQYnP-k6a2oA",
  "_score": null,
  "_source": {
    "@source": "myApp",
    "@tags": [
      "log"
    ],
    "@fields": {
      "message": "Hello World",
      "level": "error"
    },
    "@timestamp": "2014-07-24T17:11:32.505Z",
    "@version": "1",
    "type": "rails logs",
    "host": "makalu.home",
    "path": "/Users/Aidy/Documents/Work/Web/Thoughtified/aiAlerts/aiAlerts/log/logstash_development.log"
  },
  "sort": [
    1406221892505,
    1406221892505
  ]
}

As you can see, the message is no longer in an unparsed plain text state, but is part of the actual JSON which is what I'd expect.

The log method I've got in my monkey patch is now

def log(severity, msg)
  if self.logger && self.logger.send("#{severity}?")
    event = LogStash::Event.new('@source' => self.source, '@fields' => {:message => msg, :level => severity}, '@tags' => ['log'])
    # self.logger.send severity, event.to_json
    self.logger << event.to_json + "\n"
  end
end
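An alternative sketch that keeps the original `logger.send severity, ...` call is to install a bare formatter on the stdlib Logger, which drops the "I, [timestamp] INFO -- :" prefix entirely (shown here against a StringIO standing in for the log file):

```ruby
require 'logger'
require 'stringio'

out = StringIO.new   # stands in for the log file
logger = Logger.new(out)
# Replace the default formatter so only the message (plus newline) is written.
logger.formatter = proc { |_severity, _time, _progname, msg| "#{msg}\n" }
logger.info('{"@fields":{"message":"Hello World"}}')
out.string  # => "{\"@fields\":{\"message\":\"Hello World\"}}\n"
```

Whether the gem should do this globally is debatable, since the same logger instance would then strip prefixes from every severity-level call, not just JSON events.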

0.8 broken? Got spam in log

Hello! Last evening I updated logstasher from 0.6.5 to 0.8 (without even thinking about it, just bundle update), and this morning I happened to check my Kibana instance that gathers logs from production, and ... WOW, 10 million log entries! After the update, logstasher started writing some **** to the logs, like "null" and template render messages. That was an amazing experience. Please investigate :)

Here is my production.rb

  config.logstasher.enabled = true
  config.logstasher.suppress_app_log = false
  config.logstasher.source = 'randewoo_production'
  config.logstasher.logger_path = 'log/logstasher_production.log'
  config.logstasher.backtrace = false

and initializer:

if LogStasher.enabled
  LogStasher.add_custom_fields do |fields|
    fields[:request_path] = request.path
    fields[:site] = request.path =~ /^\/api/ ? 'api' : 'frontend'

    controller = "#{params[:controller]}_params".gsub(/\//, '_')
    fields[controller.to_sym] = params.reject { |k| ['controller', 'action'].include? k }
  end
end

Logging the file name and line number

Hey, I want to log call-site information for the logger.debug, logger.info, etc. calls in controllers, such as the caller, __FILE__, and __LINE__, so I can trace the source code location from the log file.
How can I do that? Please help.
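Ruby can capture the call site itself via Kernel#caller_locations. A sketch of a hypothetical helper (not part of logstasher) that merges file and line into a payload hash:

```ruby
# Hypothetical helper: look one frame up the stack to record where the
# log call was made.
def log_with_location(message)
  loc = caller_locations(1, 1).first
  { message: message, file: File.basename(loc.path), line: loc.lineno }
end

entry = log_with_location('user signed in')
entry[:message]  # => "user signed in"
```

A payload like this could then feed `add_custom_fields` or `LogStasher.log`; the exact hookup depends on the gem's API and should be checked against the README.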
