Set up and configure ELK on AWS to monitor multiple EC2 instances

Keywords: ELK - Amazon Web Services - How to - Other
Description:
We are going to deploy Bitnami's ELK stack on AWS. I have a few questions:

  1. We have over 40 EC2 instances we’d like to monitor and collect logs from with ELK; some run Linux and others Windows, with various applications on them. How can we configure ELK to collect logs from all 40+ instances for analysis?
  2. In production, we may initially need Elasticsearch on one node, Logstash on another, and Kibana on a third… how should we set this up?

Could anyone point us to documentation on how to set these up?
Thank you very much in advance

Hi @lcui,

Thank you for your interest in our ELK solution. Our team is evaluating the requirements you mentioned and will either provide support in this community forum or redirect you to the appropriate support platform where you can get more information about configuring the solution properly.

Hi @lcui,

  1. We have over 40 EC2 instances we’d like to monitor and collect logs from with ELK; some run Linux and others Windows, with various applications on them. How can we configure ELK to collect logs from all 40+ instances for analysis?

In this case, you will need to install logstash-forwarder or Filebeat.

We don’t currently provide either of those in our solutions, but we would recommend installing Filebeat, since it is the replacement for logstash-forwarder.

Filebeat is easy to install, as you can see in its documentation. Here are some sample instructions:

Installing and configuring Filebeat

We’ll assume you have already launched a Bitnami ELK instance and have the following information at hand: the IP address/hostname, username, and password for accessing ELK.

On each of those 40+ instances, you should do something like this:

  • First, install Filebeat (the commands below are for Debian/Ubuntu; Elastic also publishes RPM and Windows packages):
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.3.1-amd64.deb
sudo dpkg -i filebeat-6.3.1-amd64.deb
  • Next, open /etc/filebeat/filebeat.yml and ensure it has a valid configuration, for instance:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
setup.kibana:
  host: "YOUR_ELK_IP:80"
  protocol: "http"
  path: "/elk"
  username: "YOUR_ELK_USER"
  password: "YOUR_ELK_PASSWORD"
output.elasticsearch:
  hosts: ["YOUR_ELK_IP:80"]
  protocol: "http"
  path: "/elasticsearch"
  username: "YOUR_ELK_USER"
  password: "YOUR_ELK_PASSWORD"
  • After that, restart Filebeat and check both the configuration and the connection to Elasticsearch:
sudo service filebeat restart
sudo filebeat test config
sudo filebeat test output
  • Finally, import the Kibana dashboards:
sudo filebeat setup --dashboards
  • Now, in your Kibana dashboard, you should start receiving all the logs.
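
You can also verify ingestion from the command line before opening Kibana. A quick check (a sketch, assuming the same port 80, /elasticsearch path, and credentials used in the configuration above) asks Elasticsearch to list its Filebeat indices:

curl -u YOUR_ELK_USER:YOUR_ELK_PASSWORD "http://YOUR_ELK_IP/elasticsearch/_cat/indices/filebeat-*?v"

If Filebeat indices show up with a growing docs.count, the shippers are delivering data.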

I hope this is useful for you!

  2. In production, we may initially need Elasticsearch on one node, Logstash on another, and Kibana on a third… how should we set this up?

Regarding this question, I believe Kevin Franklin is already discussing the possibility of using Stacksmith to create a multi-node deployment of the ELK stack.

Best regards

Marcos,

Thank you very much… we were looking into Filebeat as well. Our initial thought was that the Logstash in ELK would collect the logs itself, but it looks like it doesn’t.
And our 40+ instances are a mix of Linux and Windows…

Thank you very much.

Li

Hi @lcui,

I’m glad it was helpful to you.

Since you are dealing with 40+ instances, you could automate the Filebeat installation with shell/batch scripts so you don’t spend too much time setting up the environment.

Don’t hesitate to let us know if you have any other questions.

Best regards

Marcos,

Thank you… it would be great if you could share any templates for installing Filebeat on multiple instances (Linux and Windows).

Li

Hi @lcui,

We recommend creating shell scripts to automate the Filebeat installation. This way, you won’t have to repeat the manual steps on every node.

Please find below an example for Linux instances:

setup_filebeat.sh

#!/bin/bash

set -ex

ELK_IP=$1
ELK_USERNAME=$2
ELK_PASSWORD=$3

if [ "$(id -u)" != 0 ]; then
    echo "You must run this script with sudo or as root!"
    exit 1
fi

if [ "$ELK_IP" = "" ] || [ "$ELK_USERNAME" = "" ] || [ "$ELK_PASSWORD" = "" ]; then
    echo "Usage: $0 ip_address elk_username elk_password"
    exit 1
fi

echo "Installing Filebeat"
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.3.1-amd64.deb
dpkg -i filebeat-6.3.1-amd64.deb

echo "Configuring Filebeat"
cat >/etc/filebeat/filebeat.yml <<EOF
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
setup.kibana:
  host: "$ELK_IP:80"
  protocol: "http"
  path: "/elk"
  username: "$ELK_USERNAME"
  password: "$ELK_PASSWORD"
output.elasticsearch:
  hosts: ["$ELK_IP:80"]
  protocol: "http"
  path: "/elasticsearch"
  username: "$ELK_USERNAME"
  password: "$ELK_PASSWORD"
EOF
service filebeat restart

echo "Testing Filebeat configuration"
filebeat test config

echo "Testing Filebeat connection to Elasticsearch"
filebeat test output

echo "Installing Filebeat dashboard into Kibana"
filebeat setup --dashboards

To run it, do the following on each of your Linux nodes:

sudo chmod a+x setup_filebeat.sh
sudo ./setup_filebeat.sh YOUR_ELK_IP YOUR_ELK_USERNAME YOUR_ELK_PASSWORD
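
If you’d rather not log in to each node by hand, you could drive the script over SSH from a single machine. Below is a minimal sketch, assuming key-based SSH access, passwordless sudo on the nodes, and a hypothetical instances.txt file listing one user@host per line:

# Copy the setup script to every node listed in instances.txt and run it there
while read -r host; do
    scp setup_filebeat.sh "$host:/tmp/setup_filebeat.sh"
    # -n stops ssh from consuming the rest of instances.txt on stdin
    ssh -n "$host" "sudo bash /tmp/setup_filebeat.sh YOUR_ELK_IP YOUR_ELK_USERNAME YOUR_ELK_PASSWORD"
done < instances.txt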

We recommend taking a look at the Windows setup instructions to automate the installation on your Windows instances; the script should be executed on each node and follow similar steps:

  • Download and install Filebeat
  • Update the configuration file
  • Restart Filebeat
  • Optionally, test that the configuration is OK
  • Set up the Kibana dashboards

The Filebeat Windows package also ships with an install-service-filebeat.ps1 helper script that can serve as a starting point.

Marcos,

We’d like to try Bitnami ELK on one node initially, use Filebeat and/or Metricbeat, etc. to ship the logs/events from our separate AWS instances/servers to Logstash, and then view the data in Kibana.
Is it possible to set up a web session to help us with this?
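
For reference, we understand the Filebeat side of that would just swap the output section of filebeat.yml for something like this sketch (assuming a Beats input listening on Logstash’s default port 5044, and keeping in mind that Filebeat allows only one output to be enabled at a time):

output.logstash:
  hosts: ["YOUR_LOGSTASH_IP:5044"]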

Thanks

Li

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.