Sometimes, software just goes together. Linux, the Apache Web server, MySQL, and PHP, the four ingredients of the LAMP stack, which revolutionized data centers and made open source a big deal two decades ago, are probably the most famous example. But there are lots of others.

Here's another open source software stack you should know about in our present age of cloud and big data: the Elastic Stack, or ELK. Based on Elasticsearch, Logstash and Kibana, ELK is a fully open source solution for searching, analyzing and visualizing data in any format, at any scale.

Since ELK has multiple parts, and some of them have additional dependencies, setting it up is not as simple as installing stacks that need only a single one-line yum command. But fear not. ELK is still easy enough to install if you follow the proper steps.

Below, we'll walk through configuring a Red Hat Enterprise Linux (RHEL) server for ELK, installing each of the requisite components and configuring them to work with one another. (RHEL is now free for development use --- download it here.)


First, let's go over some prerequisites. This guide assumes:

  • Your server runs RHEL 6 or RHEL 7. Some parts of the ELK stack support RHEL 5, but not all, so don't try using it. (By the way, if you're still using RHEL 5, you should probably be upgrading soon, since its EOL date is fast approaching.)
  • You want to install ELK using the official RPM packages from the developers. (Yes, you could pull the source through git and compile it yourself, but that's outside the scope of this post.)
  • You want to install the latest version of ELK. You could install older versions by telling yum to download specific packages (but I assume you want to be as up-to-date as possible).
  • You're happy using Oracle Java, which is what we'll install to meet the Java dependency of ELK. If you want to use a different version of Java, check the Elasticsearch support matrix to make sure it will work. (Yes, OpenJDK is supported.)
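A quick way to confirm the first prerequisite is to check the release file. This is a minimal sketch, assuming the standard /etc/redhat-release location that RHEL uses:

```shell
# Print the distribution release string; on RHEL it looks something like
# "Red Hat Enterprise Linux Server release 7.2 (Maipo)"
if [ -r /etc/redhat-release ]; then
  cat /etc/redhat-release
else
  echo "Not a Red Hat-style system: /etc/redhat-release is missing"
fi
```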

Step 1: Install Java

Let's dive into the installation process. Our first task is to install Java, since Elasticsearch and Logstash require Java to run. As noted above, we'll install Oracle Java, although you could use OpenJDK if you like.

To do this, we'll use curl to download the latest Java RPM (Java 8 update 77) from Oracle's site, then install it via yum. Download it with these two commands (paste the RPM download URL from Oracle's Java download page between the empty quotation marks):

cd /tmp
curl --insecure --header "Cookie: oraclelicense=accept-securebackup-cookie" -L "" > jdk-8u77-linux-x64.rpm

Lest I start a war between curl and wget fans, I suppose I should tell you how to download Java using wget, too. The commands are:

cd /tmp
wget --no-cookies --no-check-certificate --header "Cookie: oraclelicense=accept-securebackup-cookie" ""

(As with curl, the empty quotation marks are where the RPM download URL from Oracle's site goes.)

Whether you use curl or wget to download Java, you should end up with a file in /tmp called jdk-8u77-linux-x64.rpm. The next step is to install this file by running yum (as root) like so:

yum -y localinstall jdk-8u77-linux-x64.rpm

All set. You just installed Java.
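Before moving on, it's worth verifying that the JDK is actually usable. A minimal check, assuming the RPM put java on your PATH (which the Oracle package does by default):

```shell
# Report the installed Java version; Oracle's JDK 8 prints a line like
# java version "1.8.0_77" (the version string goes to stderr, not stdout)
if command -v java >/dev/null 2>&1; then
  java -version
else
  echo "java is not on the PATH; the JDK installation may have failed"
fi
```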

Step 2: Install Elasticsearch

The next step, installing Elasticsearch, is easier, since we can do it all using yum. (If only Java installation were equally simple...)

To do this, first import the Elasticsearch GPG key (hosted on Elastic's package server) with:

rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch

Next, open a blank text file in your favorite text editor (be sure to run the editor as root, so that you have the necessary save permissions) and fill it with these lines, which match the repository definition Elastic publishes for the 2.x packages:

[elasticsearch-2.x]
name=Elasticsearch repository for 2.x packages
baseurl=https://packages.elastic.co/elasticsearch/2.x/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1

Save the file as /etc/yum.repos.d/elasticsearch.repo

Next, install the Elasticsearch package with:

sudo yum -y install elasticsearch

Step 3: Configure and Start Elasticsearch

Before installing the rest of the ELK stack, we should tweak Elasticsearch a little and start it up.

Specifically, we want to prevent outside connections to the Elasticsearch HTTP API. To do this, open the file /etc/elasticsearch/elasticsearch.yml in an editor (again, run the editor as root), find the commented-out line that sets network.host, uncomment it and change its value so that the line reads network.host: localhost. Then save the file and exit.

You can now manage Elasticsearch through the systemctl interface: systemctl start elasticsearch starts it, systemctl stop elasticsearch stops it, and systemctl enable elasticsearch configures it to start automatically at boot. (On RHEL 6, which lacks systemd, use service elasticsearch start and chkconfig elasticsearch on instead.)
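Once Elasticsearch is running, you can sanity-check it over its HTTP API, which listens on port 9200 by default. A healthy node answers with a small JSON banner that includes its name and version:

```shell
# Ask the local Elasticsearch node to identify itself; -s silences
# curl's progress meter so only the JSON (or our message) is printed
curl -s http://localhost:9200/ || echo "Elasticsearch is not answering on port 9200 yet"
```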

Step 4: Install Kibana

We're going to follow a similar process to install Kibana. First, create a yum repository file for it by opening a blank text file and adding these lines:

[kibana-4.4]
name=Kibana repository for 4.4.x packages
baseurl=https://packages.elastic.co/kibana/4.4/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1

(These lines match the repository definition Elastic publishes for the 4.4.x packages.) Save the file as /etc/yum.repos.d/kibana.repo

Then download and install the Kibana package with:

yum -y install kibana

We also need to make a quick configuration tweak for Kibana. Open the file /opt/kibana/config/kibana.yml, find the line that sets server.host and change its value so that the line reads server.host: "localhost". Then save the file.

You can now start Kibana with systemctl start kibana and stop it with systemctl stop kibana. And to configure Kibana to start automatically at boot, run chkconfig kibana on (the Kibana 4.4 package ships a SysV init script, so chkconfig works here even on RHEL 7).
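As with Elasticsearch, you can confirm that Kibana came up by poking its port (5601 by default) from the server itself:

```shell
# Request just the HTTP response headers from Kibana; any HTTP status
# line coming back means the service is listening on port 5601
curl -s -I http://localhost:5601/ || echo "Kibana is not answering on port 5601 yet"
```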

Step 5: Install Logstash

Our final step is to install the last piece of the ELK stack, Logstash. Here again, we need to create a yum repository file by adding the following lines to a blank text file:

[logstash-2.2]
name=Logstash repository for 2.2 packages
baseurl=https://packages.elastic.co/logstash/2.2/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1

(These lines match the repository definition Elastic publishes for the 2.2 packages.)

Save the file as /etc/yum.repos.d/logstash.repo. Then install Logstash with:

yum -y install logstash

Congrats! Logstash is installed.
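A quick way to prove the installation works is Logstash's classic stdin-to-stdout smoke test. This sketch assumes the 2.2 RPM's default install location of /opt/logstash:

```shell
# Pipe one line through a trivial Logstash pipeline; Logstash should
# echo the event back on stdout with a timestamp. Press Ctrl-C when done.
LS_BIN=/opt/logstash/bin/logstash
if [ -x "$LS_BIN" ]; then
  echo "hello ELK" | "$LS_BIN" -e 'input { stdin { } } output { stdout { } }'
else
  echo "Logstash binary not found at $LS_BIN"
fi
```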

Using ELK

You now have a basic ELK stack installed. Depending on exactly what you want to do with it, however, you may want to take some additional steps. Consider the following tweaks and additions:

  • By default, Kibana (the Web interface that you use to search and analyze your Elasticsearch data) listens on localhost at port 5601. That means you can connect to it from your RHEL server at localhost:5601. If you want to be able to access Kibana from other servers, too, you'll need to do some additional configuration. Refer here for details.
  • Above, we did a basic installation of Logstash. Your exact configuration of the tool will depend on which plugins you want to use and how you want to work with data. Basic configuration instructions are available here, and more complex examples are here.
  • By default, your ELK stack will only let you collect and analyze logs from your local server. But you can add remote logs to the mix by using Filebeat, which collects logs from other hosts. Here are instructions for installing and setting up Filebeat to work with your ELK stack.
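To give a flavor of the Logstash configuration work mentioned above, here's a sketch of a minimal pipeline definition. The file name and log path are illustrative, not prescriptive; with the 2.2 RPM, Logstash reads such files from /etc/logstash/conf.d/:

```
# /etc/logstash/conf.d/10-syslog.conf (illustrative name)
input {
  # Tail the local syslog file; "beginning" re-reads it on first run
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}
output {
  # Ship events to the local Elasticsearch node we configured above
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```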

About Hemant Jain

Hemant Jain is the founder and owner of Rapidera Technologies, a full service software development shop. He and his team focus a lot on modern software delivery techniques and tools. Prior to Rapidera he managed large scale enterprise development projects at Autodesk and Deloitte.

Last updated: January 19, 2023