
Using Kafka as alternative to Filebeats and Logstash

I'm new to the ELK stack, and I was wondering whether it is possible to ship our log files to Elasticsearch using Kafka. I also need the job of Logstash (parsing logs using filters like grok) to be done in Kafka. Is this possible? Basically, what I'm trying to do is replace the combination of Filebeats and Logstash with Kafka, and I want to know whether that is feasible.

Thank you :)

Note: What I am trying to do is to ship + parse the logs in Kafka. I know that shipping logs to Elasticsearch is possible using the Elasticsearch connector; what I'm asking is whether parsing the data (Logstash's job) is possible with Kafka.

asked Sep 05 '25 by Dasun Pubudumal

1 Answer

I'll break your question down into two parts:

1. Can events streamed via Kafka be indexed in Elasticsearch?

Yes, if you consider Confluent's Kafka Connect as part of Kafka. It's not Kafka itself that does the indexing, but a Kafka Connect sink connector configured to consume from your Kafka topics and index the events in Elasticsearch.

You can find more information here: https://docs.confluent.io/current/connect/kafka-connect-elasticsearch/index.html
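As a sketch, a minimal sink configuration posted to the Kafka Connect REST API might look like the following (the connector name, topic name and Elasticsearch URL are placeholders for illustration; the property keys are from the Confluent Elasticsearch sink connector):

```json
{
  "name": "es-log-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "parsed-logs",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

With this in place, every record written to the `parsed-logs` topic is indexed into Elasticsearch without any custom code.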

2. Can I achieve the same sort of parsing, transformation and flow-control features of Logstash directly in Kafka?

The only Kafka ecosystem features I'm aware of that can help you do something like that are Kafka Streams (but you have to know how to develop against the Kafka Streams API) and another piece of Confluent software called KSQL, which lets you do SQL stream processing on top of Kafka topics and is more oriented towards analytics (i.e. data filtering, transformations, aggregations, joins, windowing and sessionization).

You can find more information on KStreams here: https://kafka.apache.org/documentation/streams/
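To give a rough idea of what grok-style parsing looks like without Logstash, here is a hypothetical regex-based parser for an Apache-style access log line, written in plain Java. The class name, log format and topic names are made up for illustration; in a Kafka Streams topology this parser would typically be applied inside a `mapValues()` step, as sketched in the comment at the bottom:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogParser {

    // Hypothetical grok-like pattern for an Apache-style access log line,
    // using named groups instead of grok's %{PATTERN:field} syntax.
    private static final Pattern ACCESS_LOG = Pattern.compile(
        "^(?<ip>\\S+) \\S+ \\S+ \\[(?<timestamp>[^\\]]+)\\] " +
        "\"(?<method>\\S+) (?<path>\\S+) \\S+\" (?<status>\\d{3})");

    // Parse one raw line into a field map, mimicking the structured event
    // a Logstash grok filter would emit.
    public static Map<String, String> parse(String line) {
        Matcher m = ACCESS_LOG.matcher(line);
        if (!m.find()) {
            // Unparseable line: return an empty map (a real pipeline
            // might instead tag it, like Logstash's _grokparsefailure).
            return Map.of();
        }
        return Map.of(
            "ip", m.group("ip"),
            "timestamp", m.group("timestamp"),
            "method", m.group("method"),
            "path", m.group("path"),
            "status", m.group("status"));
    }

    public static void main(String[] args) {
        Map<String, String> fields = parse(
            "127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] " +
            "\"GET /index.html HTTP/1.0\" 200 2326");
        System.out.println(fields.get("status")); // prints "200"

        // In a Kafka Streams topology (topic names hypothetical), the same
        // function would sit between the raw and parsed topics, e.g.:
        // builder.stream("raw-logs").mapValues(LogParser::parse).to("parsed-logs");
    }
}
```

The parsed topic could then be picked up by the Elasticsearch sink connector, which is exactly the division of labour Logstash performs in one process.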

And you can find more information on KSQL here: https://docs.confluent.io/current/ksql/docs/index.html
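For simpler filtering and reshaping, KSQL may be enough on its own. As a sketch (stream, topic and column names are hypothetical, and this assumes the events already arrive as JSON), you could declare a stream over a raw topic and derive a filtered stream from it:

```sql
-- Declare a stream over an existing topic of JSON-encoded log events
CREATE STREAM raw_logs (ip VARCHAR, path VARCHAR, status INT)
  WITH (KAFKA_TOPIC='raw-logs', VALUE_FORMAT='JSON');

-- Continuously materialize only the server errors into a new topic
CREATE STREAM error_logs AS
  SELECT ip, path, status
  FROM raw_logs
  WHERE status >= 500;
```

Note that KSQL operates on already-structured events; turning free-form log text into fields (grok's job) would still fall to something like Kafka Streams upstream.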

Conclusion

In my opinion, you won't be able to achieve all of the parsing and transformation capabilities of Logstash / NiFi without programming against the Kafka Streams API, but you can definitely use Kafka Connect to get data into or out of Kafka for a wide array of technologies, just like Logstash does.

A nice illustration of such a setup (taken from Confluent): [architecture diagram]

answered Sep 07 '25 by Alexandre Juma