
logstash-output-google_pubsub's Introduction


Logstash Output Google Pubsub

A Logstash plugin to upload log events to Google Cloud Pub/Sub. Events are batched and uploaded in the background for the sake of efficiency. Message payloads are serialized JSON representations of the events.

Example use-cases:

  • Stream events to Dataproc via Pub/Sub for real-time analysis.
  • Forward events from an on-prem datacenter to Logstash in the cloud.
  • Use Pub/Sub as a scalable buffer to even out event flow between processing steps.

Note: While this project is partially maintained by Google, this is not an official Google product.
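
For reference, a minimal output configuration looks like this (the project, topic, and key-file path are placeholders):

output {
  google_pubsub {
    project_id => "my-project"
    topic => "my-topic"
    json_key_file => "/path/to/service_account_key.json"
  }
}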

Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code are first converted into asciidoc and then into HTML. All plugin documentation is placed under one central location.

Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

Developing

1. Plugin Development and Testing

Code

  • To get started, you'll need JRuby with the Bundler gem installed.

  • You'll also need a Logstash installation to build the plugin against.

  • Create a new plugin or clone an existing one from the GitHub logstash-plugins organization. We also provide example plugins.

  • export LOGSTASH_SOURCE=1 and point LOGSTASH_PATH to a local Logstash, e.g. export LOGSTASH_PATH=/opt/local/logstash-8.7.0

  • Install Ruby dependencies

bundle install
  • Install Java dependencies - this regenerates the lib/logstash-output-google_pubsub_jars.rb script used to load the .jar dependencies when the plugin starts.
./gradlew vendor

NOTE: This step is necessary whenever build.gradle is updated.

Test

  • Update your dependencies
bundle install
  • Run Ruby tests
bundle exec rspec

2. Running your unpublished Plugin in Logstash

2.1 Run in a local Logstash clone

  • Edit Logstash Gemfile and add the local plugin path, for example:
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
  • Install plugin
bin/logstash-plugin install --no-verify
  • Run Logstash with your plugin
bin/logstash -e 'filter {awesome {}}'

At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

2.2 Run in an installed Logstash

You can use the same method as in 2.1 to run your plugin in an installed Logstash by editing its Gemfile and pointing the :path to your local plugin development directory, or you can build the gem and install it using:

  • Build your plugin gem
gem build logstash-filter-awesome.gemspec
  • Install the plugin from the Logstash home
bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
  • Start Logstash and proceed to test the plugin

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the CONTRIBUTING file.

logstash-output-google_pubsub's People

Contributors

acchen97, andsel, colinsurprenant, ivantsepp, josephlewis42, jsvd, karenzone, kares, robbavey, yaauie


logstash-output-google_pubsub's Issues

PERMISSION_DENIED: User not authorized to perform this action

Hi, I am trying to use this plugin to publish to Pub/Sub from outside the Google environment. I have created a service account with the Pub/Sub Publisher role and referenced the JSON key file in the config, but I get a permission-denied error when I restart Logstash (I can confirm a 403 code from the GCP API console for that service account). I tried giving the service account the Pub/Sub Admin role with no luck (I even tried Project Owner just to test, still no luck).
I can also confirm Logstash picks up the key on initialization:
Initializing Google API client on projects/my-project/topics/logstash-input key: /etc/logstash/gcp-cloud-423478239.json
Here is the error I get:
:error=>"io.grpc.StatusRuntimeException: PERMISSION_DENIED: User not authorized to perform this action."}
Not sure what I am missing here. Any help would be appreciated.
Here is my Logstash config:

output {
  google_pubsub {
    project_id => "my-project"
    topic => "logstash-input"
    json_key_file => "/etc/logstash/gcp-cloud-423478239.json"
  }
}
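
As a sanity check (my suggestion, not from the plugin docs), you can exercise the same key with the gcloud CLI to separate key/IAM problems from plugin problems:

gcloud auth activate-service-account --key-file=/etc/logstash/gcp-cloud-423478239.json
gcloud pubsub topics publish logstash-input --message="test" --project=my-project

If the publish command also returns PERMISSION_DENIED, the problem is with the key or the IAM binding rather than with Logstash.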

Plugin crashes when used alongside logstash-input-google_pubsub

Hi,

I'm trying to replace Kafka with Pub/Sub in our logging stack. Right now I am unable to read from Pub/Sub in a pipeline that also outputs to Pub/Sub.

Whenever I use another input, e.g. stdin or Kafka, it works. But as soon as I use logstash-input-google_pubsub (1.2.0) alongside logstash-output-google_pubsub (v1.0.0), Logstash crashes.

I am currently testing it in the official Logstash 5.6.10 docker image.

Working config using stdin as input:

input {
  stdin {
    id => "stdin-input"
  }
}

output {
  google_pubsub {
    id => "pubsub-output"
    project_id => "project_name"
    topic => "output_topic"
    json_key_file => "/path/to/service_account.json"
  }
}

Config that crashes:

input {
  google_pubsub {
    id => "pubsub-input"
    project_id => "project_name"
    topic => "input_topic"
    subscription => "logstash_processor"
    json_key_file => "/path/to/service_account.json"
  }
}

output {
  google_pubsub {
    id => "pubsub-output"
    project_id => "project_name"
    topic => "output_topic"
    json_key_file => "/path/to/service_account.json"
  }
}

Here is the stack trace:

[2018-07-30T18:04:53,695][ERROR][logstash.agent           ] Pipeline aborted due to error {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyString to class com.google.pubsub.v1.TopicName>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/pubsub/client.rb:64:in `initialize_google_client'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/pubsub/client.rb:14:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/google_pubsub.rb:48:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:310:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:408:in `start_pipeline'"]}
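
For context, Publisher.newBuilder accepts either a plain String or a com.google.pubsub.v1.TopicName, and the JRuby coercion from RubyString to TopicName is what fails here. Below is a sketch of constructing the TopicName explicitly before handing it to the builder, assuming your client version provides TopicName.of; this illustrates the type involved, not a confirmed fix for the underlying dependency clash between the two plugins:

# Build the fully qualified topic name explicitly instead of
# relying on JRuby to coerce a Ruby String into a TopicName.
topic = com.google.pubsub.v1.TopicName.of("project_name", "output_topic")
publisher = com.google.cloud.pubsub.v1.Publisher.newBuilder(topic).build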

Issues running on the openjdk:8-jre-alpine docker image

@fkoclas mentions issues running the 1.0.0 version of this plugin on Alpine. See #5 for more details.

Config which works on the openjdk:8-jre docker image but not openjdk:8-jre-alpine

input {
  stdin {
  }
}

output {
  google_pubsub {
    project_id => "<snip>"
    topic => "elasticsearch_indexing"
    json_key_file => "/etc/secrets/serviceaccount.json"
  }
}

The error is:

[ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyString to class com.google.pubsub.v1.TopicName>, :backtrace=>["/opt/logstash/vendor/local_gems/510e3a38/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/pubsub/client.rb:80:in `initialize_google_client'", "/opt/logstash/vendor/local_gems/510e3a38/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/pubsub/client.rb:30:in `initialize'", "/opt/logstash/vendor/local_gems/510e3a38/logstash-output-google_pubsub-1.0.0-java/lib/logstash/outputs/google_pubsub.rb:114:in `register'", "/opt/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'", "/opt/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `register'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:310:in `start_workers'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:408:in `start_pipeline'"]}

Want to include My effort here

Hello Alvin Chen,

I have already set up a Pub/Sub output plugin for Logstash and published it to RubyGems, so any change on your side of this repo might affect my existing usage too.

It seems I don't have access to change the code here, so I request that you include the repo below.

The following are taken care of in https://github.com/niravshah2705/logstash-output-google_pubsub:

  1. Retry with exponential backoff
  2. Dynamic topic - based on message field data
  3. exclude_fields - push all JSON data except the listed fields
  4. include_fields - only push the specified JSON fields
  5. include_field - push the raw value of a single field rather than the JSON document

Please reply on the same.

Enable keyless access to GCP pubsub with workload Identity Federation

Support workload identity federation for authenticating with GCP Pub/Sub.
At present, logstash-output-google_pubsub uses a service account key to access GCP Pub/Sub, which introduces the risks associated with managing long-lived keys for an application. We could avoid this risk by using GCP workload identity federation instead of a service account key. Can we get this feature for the logstash-output-google_pubsub plugin, so that it uses workload identity federation to access GCP Pub/Sub?

Details of GCP workload identity federation are provided at this link:
https://cloud.google.com/blog/products/identity-security/enable-keyless-access-to-gcp-with-workload-identity-federation

Proposed solution:
In the logstash-output-google_pubsub plugin, the Pub/Sub client is defined in logstash-output-google_pubsub/lib/logstash/outputs/pubsub/client.rb, where the following code block (lines 51 to 69 in the original file) shows how the service account key is used to authenticate with Pub/Sub.


        def initialize_google_client(json_key_file, topic_name, batch_settings)
          @logger.info("Initializing Google API client on #{topic_name} key: #{json_key_file}")

          if use_default_credential? json_key_file
            credentials = com.google.cloud.pubsub.v1.TopicAdminSettings.defaultCredentialsProviderBuilder().build()
          else
            raise_key_file_error(json_key_file)

            key_file = java.io.FileInputStream.new(json_key_file)
            sac = com.google.auth.oauth2.ServiceAccountCredentials.fromStream(key_file)
            credentials = com.google.api.gax.core.FixedCredentialsProvider.create(sac)
          end

          com.google.cloud.pubsub.v1.Publisher.newBuilder(topic_name)
             .setCredentialsProvider(credentials)
             .setHeaderProvider(construct_headers)
             .setBatchingSettings(batch_settings)
             .build
        end



A possible solution would be to replace the ServiceAccountCredentials class (https://googleapis.dev/java/google-auth-library/latest/com/google/auth/oauth2/ServiceAccountCredentials.html) with the ExternalAccountCredentials class (https://googleapis.dev/java/google-auth-library/latest/com/google/auth/oauth2/ExternalAccountCredentials.html). The ExternalAccountCredentials class uses a credentials.json file generated from workload identity federation to access GCP Pub/Sub.

ExternalAccountCredentials credentials = 
    ExternalAccountCredentials.fromStream(new FileInputStream("/path/to/credentials.json"));
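
A related sketch (my assumption, not part of the current plugin code): com.google.auth.oauth2.GoogleCredentials.fromStream inspects the "type" field of the JSON file and returns the matching credential class, so it would cover service account keys, external-account (workload identity federation) files, and authorized-user files without branching, assuming the bundled google-auth-library is recent enough:

# Hypothetical replacement for the ServiceAccountCredentials branch above.
# GoogleCredentials.fromStream auto-detects service_account,
# external_account and authorized_user JSON key files.
key_file = java.io.FileInputStream.new(json_key_file)
creds = com.google.auth.oauth2.GoogleCredentials.fromStream(key_file)
credentials = com.google.api.gax.core.FixedCredentialsProvider.create(creds)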

I am happy to make this contribution if everyone agrees on the proposed solution.

Message batching issues

It seems in the docs you list an option for high volume: increasing message_count_threshold.

High Volume
If you find that uploads are going too slowly, you can increase the message batching:

output {
  google_pubsub {
    project_id => "my_project"
    topic => "my_topic"
    json_key_file => "service_account_key.json"

    # Options for configuring the upload
    message_count_threshold => 10000
    delay_threshold_secs => 10
    request_byte_threshold => 5MB
  }
}

Yet in the description of message_count_threshold you state that the value must be at most 1000, and indeed it does not operate over a 1000 message count:
INVALID_ARGUMENT: The value for 1107 is too large. You passed message_count in the request, but the maximum value is 1000.
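
For reference, a batching block that stays within that 1000-message cap would look like this (the values are illustrative, not tuned recommendations):

output {
  google_pubsub {
    project_id => "my_project"
    topic => "my_topic"
    json_key_file => "service_account_key.json"

    # Pub/Sub rejects batches larger than 1000 messages, so stay at or below it.
    message_count_threshold => 1000
    delay_threshold_secs => 10
    request_byte_threshold => 5MB
  }
}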

The output engine is much slower than expected, even with message_count_threshold at 1000.

Your feedback would be much appreciated

Compatibility with GKE Workload Identity

I deployed this in a cluster with Workload Identity enabled but got a permissions error when I tried to publish to a topic that the associated service account had permissions for.

By explicitly creating a key for the service account and providing it as the json_key_file (as I would do in a cluster without Workload Identity) it worked, so I don't think there was anything wrong with the permissions themselves; the plugin just isn't "Workload Identity-aware".

The workaround is simple, so the impact is just a little extra work to get it up and running and some extra Kubernetes cruft in our Terraform for provisioning the cluster, which was a shame because Workload Identity had previously done away with that.

Add support for retries

We've noticed an interesting edge case where Pub/Sub encountered an io.grpc.StatusRuntimeException for messages in a minute-long window. What I'm guessing happened here is that Pub/Sub had a hiccup for a moment; however, the messages that were attempted in that window were dropped.

We depend on both Logstash and Pub/Sub for an ingestion pipeline with guaranteed at-least-once delivery, so this edge case is quite concerning for us.

From what I can tell, this failure is retryable, so similarly to how the retry mechanism works on the Kafka output, can we get one for Pub/Sub that handles these edge cases?
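
For illustration, the Java Publisher the plugin wraps already exposes retry tuning, so wiring it in might look roughly like this (a sketch of the idea, not current plugin behavior; the values and the plumbing into the plugin are assumptions):

# Hypothetical: attach gax RetrySettings to the Publisher the plugin builds.
retry_settings = com.google.api.gax.retrying.RetrySettings.newBuilder
  .setMaxAttempts(5)
  .setInitialRetryDelay(org.threeten.bp.Duration.ofMillis(100))
  .setRetryDelayMultiplier(2.0)
  .setMaxRetryDelay(org.threeten.bp.Duration.ofSeconds(10))
  .build

com.google.cloud.pubsub.v1.Publisher.newBuilder(topic_name)
   .setCredentialsProvider(credentials)
   .setRetrySettings(retry_settings)
   .build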

Last batch of events is not published to Google Pub/Sub

Using Logstash 7.1.1

While using this plugin, my team and I discovered that the last batch of events is not published.
We retrieve data using the JDBC input plugin, process the data with some JSON filters, and output JSON documents to Pub/Sub using this plugin.

What happens?
It looks like when a pipeline finishes, Logstash triggers a shutdown phase, but that is not communicated to the plugin so it can send its remaining events.

How to fix?
The fix I implemented was to override this plugin and add a method to the main class:

def close
  @logger.info("Received shutdown command. Sending all outstanding messages!")
  stop
end

The original stop method of the plugin is not triggered by Logstash, but close is!
I hope it helps if anyone else has this issue.

Dynamic Attributes

We want to add a correlation ID to the attributes on the messages we are publishing, but we don't see a way to dynamically add attributes to the messages. The ability to do that would be really helpful. Thanks for considering 👍
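
Hypothetically, this could reuse Logstash's sprintf field-reference syntax inside an attributes map; note that the attributes option shown here and its per-event resolution are illustrative assumptions, not documented plugin behavior:

output {
  google_pubsub {
    project_id => "my-project"
    topic => "my-topic"
    json_key_file => "/path/to/key.json"

    # Hypothetical: resolve the field reference against each event.
    attributes => { "correlation_id" => "%{[correlation_id]}" }
  }
}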

Messages are not published to PubSub when using the google_cloud_storage input plugin.

Problem: Messages are not published to PubSub properly when using the google_cloud_storage input plugin. Messages are published fine when using file{} as the input source.

Steps to reproduce:
The following config file will reproduce the condition:

input {
  google_cloud_storage {
    interval => 30
    bucket_id => "redacted"
    file_matches => "log.txt"
    json_key_file => "/etc/logstash/service_account_key.json"
  }
}

output {
  google_pubsub {
    project_id => "redacted"
    topic => "record_processing"
    json_key_file => "/etc/logstash/service_account_key.json"
  }
  stdout { codec => rubydebug }
}

I can tell the contents of log.txt are being processed, and I can even see the following debug message:

[DEBUG] 2020-05-12 09:24:22.741 [[main]>worker15] googlepubsub - Sending message {"message":"{"timestamp":"1586767575","name":"zzz-XXX.domain.com","type":"a","value":"123.44.67.89"}\n","@Version":"1","@timestamp":"2020-05-12T13:24:22.491Z"}

Jetty NPN/ALPN unavailable for /logstash/logstash:7.12.0-arm64

Logstash information:

Please include the following information:

  1. Logstash version: 7.12.0
  2. Logstash installation source: docker
  3. How is Logstash being run: kubernetes
  4. How was the Logstash Plugin installed:

Inside our Dockerfile:

RUN logstash-plugin install logstash-input-sqs && \
    logstash-plugin install logstash-input-google_pubsub && \
    logstash-plugin install logstash-output-google_pubsub

OS version: Amazon Linux 2 ARM64

Description of the problem including expected versus actual behavior:

Steps to reproduce:

  1. Make sure you're using an ARM image of Logstash; in our case, it's /logstash/logstash:7.12.0-arm64
  2. Create a sample pipeline with google_pubsub output
  3. Start logstash
  4. See the following error:

More info:

  • Upon changing the image to Logstash 7.12.0 on the classic x86 arch, everything runs OK

Provide logs (if relevant):

[INFO ] 2021-06-17 13:57:32.698 [[sqs_pipeline]-pipeline-manager] googlepubsub - Registering Google PubSub Output plugin: projects/bi-prd-brlm/topics/boitatics-events
[INFO ] 2021-06-17 13:57:32.704 [[sqs_pipeline]-pipeline-manager] googlepubsub - Initializing Google API client on projects/bi-prd-brlm/topics/boitatics-events key: /etc/logstash/datalab_prod.json
[ERROR] 2021-06-17 13:57:32.707 [[sqs_pipeline]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"sqs_pipeline", :exception=>java.lang.IllegalArgumentException: SunJSSE selected, but Jetty NPN/ALPN unavailable, :backtrace=>["io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.configure(io/grpc/netty/shaded/io/grpc/netty/GrpcSslContexts.java:223)", "io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.configure(io/grpc/netty/shaded/io/grpc/netty/GrpcSslContexts.java:189)", "io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.configure(io/grpc/netty/shaded/io/grpc/netty/GrpcSslContexts.java:171)", "io.grpc.netty.shaded.io.grpc.netty.GrpcSslContexts.forClient(io/grpc/netty/shaded/io/grpc/netty/GrpcSslContexts.java:120)", "io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$NettyTransportFactory$DefaultNettyTransportCreationParamsFilterFactory.<init>(io/grpc/netty/shaded/io/grpc/netty/NettyChannelBuilder.java:558)", "io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$NettyTransportFactory$DefaultNettyTransportCreationParamsFilterFactory.<init>(io/grpc/netty/shaded/io/grpc/netty/NettyChannelBuilder.java:551)", "io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(io/grpc/netty/shaded/io/grpc/netty/NettyChannelBuilder.java:489)", "io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.buildTransportFactory(io/grpc/netty/shaded/io/grpc/netty/NettyChannelBuilder.java:337)", "io.grpc.internal.AbstractManagedChannelImplBuilder.build(io/grpc/internal/AbstractManagedChannelImplBuilder.java:405)", "com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(com/google/api/gax/grpc/InstantiatingGrpcChannelProvider.java:206)", "com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(com/google/api/gax/grpc/InstantiatingGrpcChannelProvider.java:157)", "com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(com/google/api/gax/grpc/InstantiatingGrpcChannelProvider.java:149)", "com.google.api.gax.rpc.ClientContext.create(com/google/api/gax/rpc/ClientContext.java:151)", "com.google.cloud.pubsub.v1.stub.GrpcPublisherStub.create(com/google/cloud/pubsub/v1/stub/GrpcPublisherStub.java:161)", "com.google.cloud.pubsub.v1.Publisher.<init>(com/google/cloud/pubsub/v1/Publisher.java:154)", "com.google.cloud.pubsub.v1.Publisher.<init>(com/google/cloud/pubsub/v1/Publisher.java:83)", "com.google.cloud.pubsub.v1.Publisher$Builder.build(com/google/cloud/pubsub/v1/Publisher.java:607)", "jdk.internal.reflect.GeneratedMethodAccessor94.invoke(jdk/internal/reflect/GeneratedMethodAccessor94)", "jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:441)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:305)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_google_pubsub_minus_1_dot_0_dot_2_minus_java.lib.logstash.outputs.pubsub.client.initialize_google_client(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_pubsub-1.0.2-java/lib/logstash/outputs/pubsub/client.rb:64)", 
"usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_google_pubsub_minus_1_dot_0_dot_2_minus_java.lib.logstash.outputs.pubsub.client.initialize(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_pubsub-1.0.2-java/lib/logstash/outputs/pubsub/client.rb:14)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_google_pubsub_minus_1_dot_0_dot_2_minus_java.lib.logstash.outputs.google_pubsub.register(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_pubsub-1.0.2-java/lib/logstash/outputs/google_pubsub.rb:48)", "org.jruby.RubyClass.finvoke(org/jruby/RubyClass.java:572)", "org.jruby.RubyBasicObject.callMethod(org/jruby/RubyBasicObject.java:354)", "org.logstash.config.ir.compiler.OutputStrategyExt$SimpleAbstractOutputStrategyExt.reg(org/logstash/config/ir/compiler/OutputStrategyExt.java:275)", "org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.register(org/logstash/config/ir/compiler/OutputStrategyExt.java:131)", "org.logstash.config.ir.compiler.OutputDelegatorExt.doRegister(org/logstash/config/ir/compiler/OutputDelegatorExt.java:117)", "org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.register(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:68)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.maybe_setup_out_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:585)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$maybe_setup_out_plugins$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.run(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$run$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)", "java.lang.Thread.run(java/lang/Thread.java:834)"], "pipeline.sources"=>["central pipeline management"], :thread=>"#<Thread:0x30d6d959 run>"}
[INFO ] 2021-06-17 13:57:32.708 [[sqs_pipeline]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>"sqs_pipeline"}
[ERROR] 2021-06-17 13:57:32.717 [Converge PipelineAction::Create<sqs_pipeline>] agent - Failed to execute action {:id=>:sqs_pipeline, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<sqs_pipeline>, action_result: false", :backtrace=>nil}
