 

How to tail multiple files in fluentd

I have set up the fluentd logger and I am able to monitor a file using the fluentd tail input plugin. All the data received by fluentd is later published to an elasticsearch cluster. Below is the configuration file for fluentd:

<source>
  @type tail
  path /home/user/Documents/log_data.json
  format json
  tag myfile
</source>

<match *myfile*>
  @type elasticsearch
  hosts 192.168.48.118:9200
  user <username>
  password <password>
  index_name fluentd
  type_name fluentd
</match>

As you can see, I am monitoring the log_data.json file using tail. I also have a file in the same directory, log_user.json, which I want to monitor as well and publish its logs to elasticsearch. To do this, I thought of creating another <source> & <match> with a different tag, but it started showing errors; roughly what I tried is shown below.
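
For reference, this is approximately the second pair I added (the tag name is my own):

<source>
  @type tail
  path /home/user/Documents/log_user.json
  format json
  tag userfile
</source>

<match *userfile*>
  @type elasticsearch
  hosts 192.168.48.118:9200
  user <username>
  password <password>
  index_name fluentd
  type_name fluentd
</match>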

How can I monitor multiple files in fluentd and publish them to elasticsearch? I see that when fluentd starts, a worker is started. Is it possible to start multiple workers so that each of them monitors a different file, or is there another way of doing it? Can anyone point me to some good links/tutorials?

Thanks.

S Andrew asked Nov 19 '25

1 Answer

You can use multiple <source> + <match> sections. The @label directive helps you bind a source to the right output.

Here is an example:

<source>
  @label @mainstream
  @type tail
  path /home/user/Documents/log_data.json
  format json
  tag myfile
</source>


<label @mainstream>
  <match **>
    @type copy

    <store>
      @type               elasticsearch
      host                elasticsearch
      port                9200
      logstash_format     true
      logstash_prefix     fluentd
      logstash_dateformat %Y%m%d
      include_tag_key     true
      type_name           access_log
      tag_key             @log_name
      <buffer>
        flush_mode            interval
        flush_interval        1s
        retry_type            exponential_backoff
        flush_thread_count    2
        retry_forever         true
        retry_max_interval    30
        chunk_limit_size      2M
        queue_limit_length    8
        overflow_action       block
      </buffer>
    </store>

  </match>
</label>
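
To also tail the second file from the question (log_user.json), you can add another <source> with the same label but a different tag; both streams are then routed through the same <label> block above (the tag name here is just an example):

<source>
  @label @mainstream
  @type tail
  path /home/user/Documents/log_user.json
  format json
  tag userfile
</source>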
Nicola Ben answered Nov 22 '25

