logstash-plugins / logstash-input-s3


Taking too much time and don't ingest to ES

Kaiohenriqueps opened this issue · comments

Hello there,

I've been trying to use this input, but I don't know what I'm doing wrong. I have some files in a subfolder of an S3 bucket that I want to ingest into Elasticsearch. My conf file looks like this:

```
input {
  s3 {
    bucket => "<my_bucket>"
    access_key_id => "<my_access_key_id>"
    secret_access_key => "<my_secret_access_key>"
    prefix => "path/to/my/file/"
  }
}

filter {
  csv {
    separator => "|"
    columns => [ "my_columns" ]
  }
  mutate { remove_field => ["host", "message", "path"] }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "teste_s3"
    document_type => "doc"
  }
  stdout { codec => rubydebug }
}
```
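For context, a minimal sketch of two optional settings of the s3 input that often matter in "no events, no errors" situations (both are documented options of logstash-input-s3; the values below are illustrative assumptions, not taken from the report above):

```
input {
  s3 {
    bucket => "<my_bucket>"
    prefix => "path/to/my/file/"
    # The plugin polls the bucket rather than streaming; by default it
    # checks for new objects every 60 seconds, so events can lag.
    interval => 60
    # If the bucket is not in the client's default region, set it explicitly.
    region => "us-east-1"
  }
}
```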

I've been using the file input for a while with the same output section, and it has been working just fine. But with this s3 input, nothing happens, and Logstash doesn't show any error message. I don't know what to do anymore. Can somebody please help me?

Here is the Logstash output:
[screenshot: output_logstash]

Can you post logstash-plain.log after setting the log level to debug in logstash.yml?
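For reference, a minimal sketch of that change (assuming a standard Logstash install; the exact config and log paths vary by packaging):

```yaml
# logstash.yml — raise the log level so the S3 input's polling activity
# (bucket listing, object downloads, sincedb updates) appears in the log.
log.level: debug
```

The same effect can be had per-run with `bin/logstash --log.level debug -f <pipeline.conf>`; the resulting logstash-plain.log is written under the directory configured by `path.logs`.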