- We have over 40 EC2 instances we'd like to monitor and collect logs from with ELK. Some of them run Linux, others Windows, with various applications on them. How can we configure ELK to collect logs from all 40+ instances for analysis?
In this case, you will need to install a log shipper such as Filebeat on each instance.
We don't currently bundle one in any of our solutions, but we recommend Filebeat, since it is the replacement for the deprecated logstash-forwarder.
Filebeat is easy to install, as you can read in its documentation. See some sample instructions below:
Installing and configuring Filebeat
We'll assume you have already launched a Bitnami ELK instance and have the following information: the IP address/hostname, user and password for accessing ELK.
On each of those 40 instances, you should do something like the following. (The commands below are for Debian/Ubuntu; on the Windows instances, download the Windows package from the same Filebeat downloads page and install it as a service instead.)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.3.1-amd64.deb
sudo dpkg -i filebeat-6.3.1-amd64.deb
- Next, open /etc/filebeat/filebeat.yml and ensure it has a valid configuration, for instance:
- type: log
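A fuller sketch of that configuration might look like this — the log paths, host and credentials below are placeholders, so adjust them to your own environment:

```
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log              # logs to ship; adjust per application

output.elasticsearch:
  hosts: ["ELK_SERVER_IP:9200"]   # IP/host of your Bitnami ELK instance
  username: "user"                # the ELK credentials you noted earlier
  password: "password"

setup.kibana:
  host: "ELK_SERVER_IP:5601"      # needed for the dashboard import below
```

Note that `filebeat.inputs` replaced the older `filebeat.prospectors` section as of Filebeat 6.3.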
- After that, verify the configuration and the connection to the output, then restart Filebeat:
sudo filebeat test config
sudo filebeat test output
sudo systemctl restart filebeat
- Finally, import the Kibana dashboards:
sudo filebeat setup --dashboards
- Now you should see logs from all of your instances arriving in your ELK dashboard.
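If the dashboard stays empty, you can check from the command line that events are actually reaching Elasticsearch (ELK_SERVER_IP and the credentials are placeholders for your instance):

```
# List Filebeat indices and count the documents they contain
curl -s -u user:password "http://ELK_SERVER_IP:9200/_cat/indices/filebeat-*?v"
curl -s -u user:password "http://ELK_SERVER_IP:9200/filebeat-*/_count?pretty"
```

If the index exists and the count is growing, ingestion is working and any remaining issue is on the Kibana side.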
I hope this is useful for you!
- In production, we may need Elasticsearch on one node initially, Logstash on another node and Kibana on a third node... how should we set this up?
Regarding this question, I believe Kevin Franklin is already discussing the possibilities of using Stacksmith to create a multi-node deployment of the ELK stack.