I am having trouble pushing a relatively large (about 200KB) JSON file through the http output. We use Filebeat to push the data into Logstash, and the http output to push it to a Cloudant DB.
To rule out an issue with Filebeat, I modified the configuration to point the input at a plain file. Even then I see the same issue. When we reduce the size of the JSON to about 100KB, things go through.
To rule out an issue with Cloudant, I used Postman and performed a POST with the same ~200KB JSON content. The POST was successful and I could see the data in the Cloudant DB. In fact I increased the size up to 900KB and saw no issues; the POST done via Postman succeeded and the data appeared in Cloudant. Cloudant has a 1MB size limit per JSON document, while in my use case the JSON documents are all smaller than 400KB.
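For reference, the Postman test can be reproduced from the shell with curl. This is only a sketch of the request I made; the endpoint is masked the same way as in the config below, and performance.json stands in for my actual document:
# POST the same ~200KB JSON document directly to the masked endpoint.
curl -X POST "xxxxx/performance" \
  -H "Content-Type: application/json" \
  --data-binary @performance.json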
logstash version: 6.2.2
http plugin: 5.2.0
Using the logstash docker image:
bash-4.2$ uname -a
Linux xxxxx 4.4.0-139-generic #165-Ubuntu SMP Wed Oct 24 10:58:50 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
bash-4.2$
Config file:
input {
  file {
    path => "/usr/share/logstash/pipeline/benchmark/data/performance.json"
    start_position => "beginning"
    type => "performance"
  }
}

filter {
  if [type] == "performance" {
    json {
      source => "message"
      remove_field => ["message"]
    }
  }
}

output {
  if [type] == "performance" {
    http {
      url => "xxxxx/performance"
      headers => { "Content-Type" => "application/json" }
      http_method => "post"
      codec => "json"
      automatic_retries => 3
      id => "http_inventory"
      socket_timeout => 60
    }
  }
}
Error seen in the log:
[2019-01-13T05:01:25,347][ERROR][logstash.outputs.http ].......
Part of the JSON posted.....
....
....
:headers=>{"Content-Type"=>"application/json"}, :message=>"Broken pipe (Write failed)", :class=>"Manticore::SocketException", :backtrace=>nil, :will_retry=>true}
[2019-01-13T05:01:25,443][INFO ][logstash.outputs.http ] Retrying http request, will sleep for 56 seconds
Also, I see that the retry goes into an infinite loop, and over time it blocks other, smaller JSON documents from going through. That is another issue.
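If it helps, the workaround I am considering for the retry loop is to disable the plugin-level retry-with-backoff. This is only a sketch; it assumes the documented retry_failed option is available in plugin version 5.2.0 and behaves as described, i.e. failed requests are dropped instead of retried forever:
output {
  if [type] == "performance" {
    http {
      url => "xxxxx/performance"
      headers => { "Content-Type" => "application/json" }
      http_method => "post"
      codec => "json"
      id => "http_inventory"
      socket_timeout => 60
      # Keep the client-level retries, but turn off the plugin's own
      # retry-with-backoff loop so one failing document cannot block
      # smaller events behind it (assumes retry_failed works as documented).
      automatic_retries => 3
      retry_failed => false
    }
  }
}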
I searched the web and found that similar issues were reported earlier: #50
https://discuss.elastic.co/t/logstash-http-output-retries-forever-on-failure/141239
I have a similar issue now and would appreciate your help in getting it resolved. Please let me know if you need any more info. Thanks!
-Manju