d***@gmail.com
2015-05-28 10:34:13 UTC
Dear all:
I have a customer who is using Kafka and Logstash 1.5.
Logs are sent to Kafka using Flume, and Flume writes logs like the
following to the Kafka topic:
https://www.codatlas.com/github.com/apache/flume/HEAD/flume-ng-sinks/flume-ng-morphline-solr-sink/src/test/resources/test-documents/multiline-stacktrace.log?line=30
The logs are multi-line, and everything works fine when we use the kafka
input plugin with the multiline codec:
input {
  kafka {
    zk_connect => "localhost:2182"
    group_id => "input_topic_T2"
    topic_id => "input_topic_T2"
    reset_beginning => false  # boolean (optional), default: false
    consumer_threads => 8     # number (optional), default: 1
    decorate_events => true   # boolean (optional), default: false
    codec => multiline {
      pattern => "^\s"
      what => "previous"
    }
  }
}
filter {
  grok {
    match => { "message" => "%{IPORHOST} %{PATH:source} %{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  stdout { codec => rubydebug }
}
The configuration above works fine, but now the problem is that the
input has changed: a [hostname][sourcetype] prefix is added to each line
of the logs before they reach the Kafka topic, and we need to remove
that prefix before the multiline codec runs. Can we handle this in the
input plugin?
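One workaround I have been considering (just an untested sketch; the
bracket-matching regex is only my assumption about how the prefix looks)
is to make the multiline pattern tolerate the prefix, and then strip the
prefixes with mutate/gsub after the codec has joined the lines, since
filters only run after the codec:

input {
  kafka {
    # ... same kafka options as above ...
    codec => multiline {
      # Continuation lines used to start with whitespace ("^\s");
      # now they start with the prefix followed by whitespace.
      pattern => "^\[[^\]]+\]\[[^\]]+\]\s"
      what => "previous"
    }
  }
}
filter {
  mutate {
    # Remove the "[hostname][sourcetype]" prefix from the start of
    # every line of the joined multi-line message.
    gsub => [ "message", "^\[[^\]]+\]\[[^\]]+\]", "" ]
  }
}

But I am not sure this is the right approach, which is why I am asking
whether the input plugin itself can strip the prefix first.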
Regards
David
--
Remember: if a new user has a bad time, it's a bug in logstash.
---
You received this message because you are subscribed to the Google Groups "logstash-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to logstash-users+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.