**I'm trying to use this plugin with an ELK Docker image, but nothing happens, and I can't find any clue in the logs...

My config:**
```
input {
  kinesis {
    kinesis_stream_name => "my_stream_name"
    region => "eu-central-1"
    codec => json { }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "tando123"
  }
  stdout { codec => rubydebug }
}
```
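Before blaming credentials, it may be worth confirming the output side in isolation. The pipeline below is only a sketch of my own test idea, not part of the setup above: it swaps the kinesis input for the bundled generator input and reuses the same output, so if a document lands in the `tando123` index, the Elasticsearch side is fine and the problem is on the Kinesis/credentials side.

```
# Hypothetical sanity-check pipeline (not from the original setup):
# the generator input emits one fake JSON event into the same output.
input {
  generator {
    message => '{"hello":"kinesis-debug"}'
    count   => 1
    codec   => json { }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "tando123"
  }
  stdout { codec => rubydebug }
}
```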
My `docker run` command:

```
sudo docker run -p 5601:5601 -p 9200:9200 -e "AWS_ACCESS_KEY=any_key" -e AWS_SECRET_KEY=any_value_here -v /var/log/logstash/:/var/log/logstash -it ron
```
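One thing I'm not sure about, so treat it as an assumption rather than a fix: as far as I can tell, the AWS SDK for Java (which the KCL inside this plugin uses) primarily documents `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` as the environment variable names, with `AWS_ACCESS_KEY` / `AWS_SECRET_KEY` only accepted as legacy alternates. A variant of the same command with the standard names would look like this (`ron` is the image tag from above):

```
# Same run as above, only the variable names changed to the standard
# AWS SDK for Java names; everything else is unchanged.
sudo docker run -p 5601:5601 -p 9200:9200 \
  -e "AWS_ACCESS_KEY_ID=any_key" \
  -e "AWS_SECRET_ACCESS_KEY=any_value_here" \
  -v /var/log/logstash/:/var/log/logstash \
  -it ron
```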
**I notice that no matter what I put in AWS_SECRET_KEY and AWS_ACCESS_KEY, I don't see any event in the logs telling me whether authentication succeeded...

My Logstash logs:**

```
[2020-07-06T14:42:47,330][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/opt/logstash/data/queue"}
[2020-07-06T14:42:47,402][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/logstash/data/dead_letter_queue"}
[2020-07-06T14:42:47,574][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 9.2.11.1 (2.5.7) 2020-03-25 b1f55b1a40 OpenJDK 64-Bit Server VM 11.0.7+10-post-Ubuntu-2ubuntu218.04 on 11.0.7+10-post-Ubuntu-2ubuntu218.04 +indy +jit [linux-x86_64]"}
[2020-07-06T14:42:47,585][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"9c297e4f-9b6a-4e53-8f73-34413878be68", :path=>"/opt/logstash/data/uuid"}
[2020-07-06T14:42:48,754][INFO ][org.reflections.Reflections] Reflections took 18 ms to scan 1 urls, producing 21 keys and 41 values
[2020-07-06T14:42:49,359][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-07-06T14:42:49,453][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-07-06T14:42:49,485][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-07-06T14:42:49,487][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-07-06T14:42:49,497][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2020-07-06T14:42:49,505][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-07-06T14:42:49,511][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-07-06T14:42:49,516][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-07-06T14:42:49,517][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-07-06T14:42:49,519][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2020-07-06T14:42:49,541][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-07-06T14:42:49,543][INFO ][logstash.outputs.elasticsearch][main] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
[2020-07-06T14:42:49,559][INFO ][logstash.outputs.elasticsearch][main] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
[2020-07-06T14:42:49,560][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-07-06T14:42:49,571][INFO ][logstash.outputs.elasticsearch][main] Installing elasticsearch template to _template/logstash
[2020-07-06T14:42:49,659][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/04-tcp-input.conf", "/etc/logstash/conf.d/10-syslog.conf", "/etc/logstash/conf.d/11-nginx.conf", "/etc/logstash/conf.d/30-output.conf"], :thread=>"#<Thread:0x7ba78e38 run>"}
[2020-07-06T14:42:50,258][INFO ][logstash.inputs.beats ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-07-06T14:42:50,510][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-07-06T14:42:50,552][INFO ][org.logstash.beats.Server][main][0aa22c99d1706de5aa2424832fa6aff42eccda4a9c4ecd9f902b8d0b1452e70d] Starting server on port: 5044
[2020-07-06T14:42:50,578][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-07-06T14:42:50,735][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
```
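None of the lines above come from the kinesis input itself, so they don't show whether it started or authenticated at all. One hedged way to dig further (the logger name is my guess based on the plugin class, and `<container_id>` is a placeholder) is to raise that input's log level at runtime through the Logstash API, which the last line says is listening on port 9600:

```
# Hypothetical: bump the kinesis input's logger to DEBUG via the
# Logstash monitoring API inside the running container.
docker exec -it <container_id> curl -XPUT 'localhost:9600/_node/logging' \
  -H 'Content-Type: application/json' \
  -d '{"logger.logstash.inputs.kinesis": "DEBUG"}'
```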
Any suggestions?
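In the meantime, one thing I can rule out myself is the stream simply being empty while Logstash is attached: push a test record with the AWS CLI and watch the `stdout { codec => rubydebug }` output. This is just a generic check; with AWS CLI v2 the `--data` value is treated as base64 unless `--cli-binary-format raw-in-base64-out` is passed (on CLI v1 that flag does not exist and should be dropped).

```
# Generic test event for the stream named in the config above.
aws kinesis put-record \
  --stream-name my_stream_name \
  --partition-key test \
  --cli-binary-format raw-in-base64-out \
  --data '{"hello":"from-cli"}' \
  --region eu-central-1
```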