Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2023-06-13T17:28:06,517][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2023-06-13T17:28:06,532][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.8.1", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.7+7 on 17.0.7+7 +indy +jit [x86_64-linux]"}
[2023-06-13T17:28:06,537][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-06-13T17:28:06,551][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2023-06-13T17:28:06,553][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2023-06-13T17:28:06,750][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"2445bebd-b123-4e67-8940-0aa4175758fc", :path=>"/usr/share/logstash/data/uuid"}
[2023-06-13T17:28:07,340][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-06-13T17:28:07,674][INFO ][org.reflections.Reflections] Reflections took 109 ms to scan 1 urls, producing 132 keys and 464 values
[2023-06-13T17:28:07,939][INFO ][logstash.codecs.json ] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-06-13T17:28:07,960][INFO ][org.logstash.ackedqueue.QueueUpgrade] No PQ version file found, upgrading to PQ v2.
[2023-06-13T17:28:08,027][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-06-13T17:28:08,133][INFO ][logstash.outputs.dynatrace][main] Client {:client=>"#<Net::HTTP <snipped>.live.dynatrace.com:443 open=false>"}
[2023-06-13T17:28:08,169][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x7a18fb44@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-06-13T17:28:09,232][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.06}
[2023-06-13T17:28:09,244][INFO ][logstash.codecs.json ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-06-13T17:28:09,330][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-06-13T17:28:09,333][INFO ][logstash.inputs.http ][main][f0db974808adf9a8e2d0c031d75e3a8e944a03568b45634309250e38e174b6dd] Starting http input listener {:address=>"0.0.0.0:8080", :ssl=>"false"}
[2023-06-13T17:28:09,338][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-06-13T17:29:45,275][INFO ][logstash.codecs.json ][main][f0db974808adf9a8e2d0c031d75e3a8e944a03568b45634309250e38e174b6dd] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-06-13T17:29:51,879][INFO ][logstash.codecs.json ][main][f0db974808adf9a8e2d0c031d75e3a8e944a03568b45634309250e38e174b6dd] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-06-13T17:29:53,056][INFO ][logstash.codecs.json ][main][f0db974808adf9a8e2d0c031d75e3a8e944a03568b45634309250e38e174b6dd] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-06-13T17:29:53,742][ERROR][logstash.outputs.dynatrace][main][dynatrace_output] Unknown error raised {:error=>"#<OpenSSL::SSL::SSLErrorWaitReadable: read would block>"}
[2023-06-13T17:29:53,745][ERROR][logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(SSLErrorWaitReadable) read would block", :exception=>Java::OrgJrubyExceptions::StandardError, :backtrace=>[], :thread=>"#<Thread:0x7a18fb44@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 sleep>"}
[2023-06-13T17:29:55,490][ERROR][logstash.outputs.dynatrace][main][dynatrace_output] Unknown error raised {:error=>"#<OpenSSL::SSL::SSLErrorWaitReadable: read would block>"}
[2023-06-13T17:29:55,493][ERROR][logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(SSLErrorWaitReadable) read would block", :exception=>Java::OrgJrubyExceptions::StandardError, :backtrace=>[], :thread=>"#<Thread:0x7a18fb44@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 sleep>"}
[2023-06-13T17:29:55,508][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2023-06-13T17:29:55,595][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2023-06-13T17:29:55,598][INFO ][logstash.runner ] Logstash shut down.
This happens when running the container in Rancher as a Kubernetes deployment (Kubernetes version v1.24.10+k3s1).
I do not see this behavior when running the same Docker image in Docker Desktop, or as a Kubernetes deployment in Rancher Desktop, on an Intel MacBook Pro.
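For context on the error itself (my reading, not a confirmed diagnosis): Ruby's `OpenSSL::SSL::SSLErrorWaitReadable` is OpenSSL's WANT_READ condition — a non-blocking TLS read was attempted before the peer's data had arrived, which is normally retried rather than treated as fatal. The Python analogue, `ssl.SSLWantReadError`, can be provoked offline with in-memory BIOs:

```python
import ssl

# Ruby's OpenSSL::SSL::SSLErrorWaitReadable corresponds to OpenSSL's
# WANT_READ: a non-blocking TLS operation needs more bytes from the peer.
# Python raises ssl.SSLWantReadError for the same condition.
ctx = ssl.create_default_context()
ctx.check_hostname = False          # no real peer in this sketch
ctx.verify_mode = ssl.CERT_NONE
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing)

try:
    tls.do_handshake()              # no server bytes in `incoming` yet
except ssl.SSLWantReadError:
    print("WANT_READ: handshake needs more data from the peer")
```

A client that hits this condition is expected to wait for socket readability and retry, so the pipeline stopping on it (rather than retrying) is what stands out in the logs above.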
# logstash.yml
allow_superuser: false
queue.type: persisted
path.data: /usr/share/logstash/data
path.queue: /usr/share/logstash/data/queue
api.http.host: "0.0.0.0"
# logstash.conf
input {
  http {
    port => 8080
  }
}
output {
  dynatrace {
    id => "dynatrace_output"
    ingest_endpoint_url => "${LOGSTASH_DYNATRACE_INGEST_URL:false}"
    api_key => "${LOGSTASH_DYNATRACE_API_KEY:false}"
  }
}
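To trigger the failure, any JSON object POSTed to the http input will do. A minimal client sketch (the endpoint assumes a local port-forward, e.g. `kubectl -n logstash port-forward svc/logstash 8080:80`; the field names are placeholders, since the json codec accepts any JSON object):

```python
import json
from urllib import request

# Illustrative event; the http input with the json codec accepts any
# JSON object, so these field names are arbitrary.
event = {"message": "test event", "service": "demo"}
body = json.dumps(event).encode()

req = request.Request(
    "http://localhost:8080",   # http input port from logstash.conf
    data=body,
    headers={"Content-Type": "application/json"},
)
# request.urlopen(req)  # uncomment with Logstash running / port-forwarded
```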
# Dockerfile
FROM docker.elastic.co/logstash/logstash-oss:8.8.1
RUN rm -f /usr/share/logstash/pipeline/logstash.conf
RUN /usr/share/logstash/bin/logstash-plugin install logstash-input-http
RUN /usr/share/logstash/bin/logstash-plugin install logstash-output-dynatrace
COPY logstash.conf /usr/share/logstash/pipeline/
COPY logstash.yml /usr/share/logstash/config/
# Kubernetes definitions
---
apiVersion: v1
kind: Service
metadata:
  name: logstash
spec:
  type: ClusterIP
  sessionAffinity: None
  ports:
    - port: 80
      name: http
      protocol: TCP
      targetPort: 8080
  selector:
    app: logstash
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: logstash
  namespace: logstash
spec:
  serviceName: logstash
  replicas: 1
  selector:
    matchLabels:
      app: logstash
  template:
    metadata:
      labels:
        app: logstash
    spec:
      securityContext:
        seccompProfile:
          type: RuntimeDefault
      containers:
        - name: logstash
          image: custom-logstash-image:8.8.1
          imagePullPolicy: IfNotPresent
          resources:
            limits:
              cpu: 2000m
              memory: 2Gi
          ports:
            - name: http
              containerPort: 8080
              protocol: TCP
            - name: monitoring
              containerPort: 9600
              protocol: TCP
          startupProbe:
            httpGet:
              port: monitoring
              path: /
            initialDelaySeconds: 30
            periodSeconds: 1
            failureThreshold: 300
            timeoutSeconds: 10
          env:
            - name: LOGSTASH_DYNATRACE_INGEST_URL
              value: "https://<snipped>.live.dynatrace.com/api/v2/logs/ingest"
            - name: LOGSTASH_DYNATRACE_API_KEY
              value: "<api key>"
            - name: TZ
              value: "America/Kentucky/Louisville"