Short Example of Logstash Multiple Pipelines

Logstash Elasticsearch Kibana Filebeat Elastic Snippet

Posted on Aug 25


I tried out Logstash Multiple Pipelines, just for practice.

The following summary assumes that the PATH contains the Logstash and Filebeat executables and that both run locally on localhost.

Logstash config

pipelines.yml

This file refers to two pipeline configs pipeline1.config and pipeline2.config.

- pipeline.id: pipeline_1
  path.config: "pipeline1.config"
- pipeline.id: pipeline_2
  path.config: "pipeline2.config"

Unlike logstash.yml, pipelines.yml does not support environment variable substitution. See Issue #8452.

Each path.config here specifies only a file name, so Logstash has to be launched from the directory where the following config files reside.
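One way around this restriction is to generate pipelines.yml with absolute paths, so that Logstash can be started from any directory. A minimal sketch, assuming the two pipeline config files sit in $THIS_GIST_DIR:

```shell
# Sketch: write pipelines.yml with absolute path.config entries,
# assuming pipeline1.config and pipeline2.config live in $THIS_GIST_DIR.
THIS_GIST_DIR="${THIS_GIST_DIR:-$(pwd)}"
cat > "$THIS_GIST_DIR/pipelines.yml" <<EOF
- pipeline.id: pipeline_1
  path.config: "$THIS_GIST_DIR/pipeline1.config"
- pipeline.id: pipeline_2
  path.config: "$THIS_GIST_DIR/pipeline2.config"
EOF
```

With absolute paths in place, the working-directory caveat above no longer applies.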

pipeline1.config

Deals with syslog line input and listens to port 5044.

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => "%{SYSLOGLINE}"}
    }
}
output {
    stdout { codec => rubydebug }
}

pipeline2.config

Deals with Apache log input and listens to port 5045.

input {
    beats {
        port => "5045"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}"}
    }
    geoip {
        source => "clientip"
    }
}
output {
    stdout { codec => rubydebug }
}

These configs are almost the same as the examples found in Parsing Logs with Logstash.

Run Logstash

Here we set --path.settings so that Logstash looks for its settings files, including pipelines.yml, in $THIS_GIST_DIR. Make sure not to pass the -f option, because Logstash reads pipelines.yml only when no config file is specified on the command line.

pushd $THIS_GIST_DIR
mkdir -p data logs
logstash --path.settings $THIS_GIST_DIR --path.logs $THIS_GIST_DIR/logs --path.data $THIS_GIST_DIR/data
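Once Logstash is up, its monitoring API (enabled by default on port 9600) can confirm that both pipelines were loaded. The fallback message below is only for the case where Logstash is not listening:

```shell
# Ask Logstash's monitoring API which pipelines it has loaded.
# Falls back to a notice if nothing is listening on port 9600.
resp="$(curl -s http://localhost:9600/_node/pipelines || true)"
echo "${resp:-Logstash is not running on localhost:9600}"
```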

First Filebeat

The first Filebeat config is filebeat1.yml, which specifies only syslog in the paths setting.

filebeat.inputs:
- type: log
  paths:
    - syslog
output.logstash:
  hosts: ["localhost:5044"]
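Before starting Filebeat, the config can be sanity-checked with Filebeat's built-in test subcommands. The sketch below is guarded so it skips quietly when the filebeat binary is not on the PATH:

```shell
# Validate the Filebeat config (and, if Logstash is up, the output
# connection). Skips if the filebeat binary is not on the PATH.
if command -v filebeat >/dev/null 2>&1; then
  filebeat test config -c filebeat1.yml && status=ok || status=failed
  filebeat test output -c filebeat1.yml || true
else
  status=skipped
  echo "filebeat not found on PATH; skipping config check"
fi
echo "config check: $status"
```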

Here we use a copy of syslog as an input.

# Works only if syslog is written in /var/log...
cp /var/log/syslog $THIS_GIST_DIR/

If a registry file remains from a previous Filebeat run, we remove it.

rm -i $THIS_GIST_DIR/data/registry

Now we can run Filebeat and watch Logstash emit the filtered output on stdout.

pushd $THIS_GIST_DIR
mkdir -p data logs
filebeat --path.home $THIS_GIST_DIR -c filebeat1.yml

After all the log lines are printed, we can shut down Filebeat, e.g. with CTRL-C.

Second Filebeat

For the second pipeline, we download the gzipped sample Apache log file and unzip it to obtain logstash-tutorial.log. The link to this file can be found in Parsing Logs with Logstash.

The second Filebeat config is filebeat2.yml, which specifies only logstash-tutorial.log in the paths setting and points to port 5045.

filebeat.inputs:
- type: log
  paths:
    - logstash-tutorial.log
output.logstash:
  hosts: ["localhost:5045"]
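The download step can be sketched as follows. The URL is the one Elastic's tutorial linked at the time of writing, so treat it as an assumption and adjust if it has moved:

```shell
# Fetch the gzipped sample Apache log used in Elastic's tutorial and
# unzip it. The URL is an assumption; adjust if the file has moved.
url="https://download.elastic.co/demos/logstash/gettingstarted/logstash-tutorial.log.gz"
if curl -fsSO "$url"; then
  gunzip -f logstash-tutorial.log.gz
else
  echo "download failed; fetch logstash-tutorial.log manually"
fi
```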

This time, Logstash emits parsed Apache log onto stdout.

pushd $THIS_GIST_DIR
mkdir -p data logs
filebeat --path.home $THIS_GIST_DIR -c filebeat2.yml

Output to Elasticsearch

By replacing the output section in pipeline{1,2}.config with the following, we can direct the filtered logs to Elasticsearch.

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}
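After re-running the pipelines with this output, one can verify that Logstash created logstash-* indices via the cat indices API. This assumes Elasticsearch is on its default port 9200:

```shell
# List the logstash-* indices created by the elasticsearch output.
# Assumes Elasticsearch on its default port 9200.
indices="$(curl -s 'http://localhost:9200/_cat/indices/logstash-*?v' || true)"
echo "${indices:-Elasticsearch is not running on localhost:9200}"
```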

Then we can issue a search query like the one below from Kibana's Dev Tools Console.

GET /logstash-*/_search
{
  "query": {
    "match": {
      "program.keyword": "cron"
    }
  }
}
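The same query can also be issued without Kibana, directly against Elasticsearch's _search API. This sketch assumes the default port 9200 and the program field produced by the SYSLOGLINE grok pattern:

```shell
# Run the same match query against Elasticsearch's _search endpoint.
# Assumes Elasticsearch on its default port 9200.
result="$(curl -s -H 'Content-Type: application/json' \
  'http://localhost:9200/logstash-*/_search' \
  -d '{"query": {"match": {"program.keyword": "cron"}}}' || true)"
echo "${result:-Elasticsearch is not running on localhost:9200}"
```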
