02 August 2017 Engineering

Tutorial: Getting Started on Elastic Cloud with a Sample Dataset for hosted Elasticsearch and the Elastic Stack 5.x

By Chad Pryor and Jamie Smith

This article contains updated instructions for working with the Elastic Stack version 5.x.  A previous version covered a sample dataset for Elasticsearch 2.x and Kibana 4.x.

Getting a hosted Elasticsearch environment up and running has never been easier. With Elastic Cloud, you can launch your cluster and start ingesting data in literally minutes. See how in this 3-minute video. This step-by-step set of instructions will walk you through setting up an Elastic Cloud account, creating and securing an Elasticsearch cluster, importing data, and visualizing it in Kibana. So, let's get started.

Log into Elastic Cloud 

  1. Go to https://www.elastic.co/cloud/as-a-service and sign up for a free trial
  2. You will receive a confirmation email. Open it and follow the instructions to start your trial
  3. Log into your Elastic Cloud account

Create your first hosted Elasticsearch cluster

  1. Once you have signed in, go to the Cluster tab to create your cluster
  2. Select cluster size. I will select a cluster with 4GB memory and 96GB storage 
  3. Choose a region that is close to your location. I am choosing US East
  4. Select replication. I am choosing only one Datacenter location
  5. Click on Create to provision your cluster.  Provisioning will start and you will receive a notification once it's complete
  6. Save the password that pops up for the elastic user; we will need it later


Enable Kibana

Next, let's enable Kibana so we can configure cluster access and security:

  1. To enable Kibana, go to the Configuration link and scroll down to the Kibana section
  2. Wait until the cluster provisioning has completed and the Enable button appears in the Kibana section
  3. Click on the Enable button
  4. Click on the link to open Kibana in a new tab.  Note that the Kibana enablement step may take a minute, during which you may see an "unavailable" response.  Wait a few seconds and then try reloading
  5. Log into Kibana with the username elastic & the password you copied from the initial provisioning step


Secure your Cluster

Next, let's configure cluster access and security, using the X-Pack security module, included with all Elastic Cloud accounts:

  1. Make sure that you are logged in to the Kibana interface with the username elastic & the password you copied from the initial provisioning step
  2. Create additional users and/or configure existing usernames and passwords. I will create a new user, sa_admin
  3. If you added new users, configure their roles. I will configure sa_admin user with the superuser role 
  4. Specify the password for the user. I am using something super-secret
  5. Make sure that you remember the username and password that you added: save your credentials in a secure vault. Or a Post-it note 😉.


You can update your passwords or add additional users using the same process. You may also use the new security API that is included in 5.5 by following the instructions in our documentation.
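As a sketch of what an API call might look like, X-Pack 5.5 exposes the user endpoint under _xpack/security. The endpoint URL, username, password, and full name below are placeholders — substitute your own cluster endpoint and the elastic user's password:

```
curl --user elastic:YOUR_PASSWORD -XPOST "https://YOUR_CLUSTER_ENDPOINT:9243/_xpack/security/user/sa_admin" -d '{
  "password" : "s00p3rS3cr3t",
  "roles"    : [ "superuser" ],
  "full_name": "Stack Admin"
}'
```

This creates (or updates) the sa_admin user with the superuser role, matching what we did through the Kibana UI above.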

Elasticsearch Endpoint

Once you are logged into Kibana, you will land on the Discover tab. However, there is no data to visualize yet, so next we will work on ingesting data into Elasticsearch. First, let's gather the connection information we will need.

  1. Go to the Overview link in the Elastic Cloud console. The listed endpoints are used for API access for your cluster. 
  2. Click on the https link; this will show the cluster. Copy the https URL; it will be used in the following steps.
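To confirm the endpoint is reachable before moving on, you can request the cluster root with curl (the endpoint and password here are placeholders — use your own). A healthy cluster responds with a small JSON document containing the cluster name and Elasticsearch version:

```
curl --user elastic:YOUR_PASSWORD https://YOUR_CLUSTER_ENDPOINT:9243/
```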


Import Data

Now let’s get some data into our Elasticsearch cluster to see the Elastic Stack in action. If you don’t have a sample dataset handy, use one from the various data samples in our GitHub Examples Repo. I will be using the Apache logs sample and Logstash (download your system version). To ingest the logs into our hosted Elasticsearch cluster, we will need to modify the elasticsearch output section of the Logstash config included in the example.

1. Download the repository and change to the directory that contains the apache_logstash.conf file. Be sure to replace the hosts endpoint in the config with your own cluster endpoint (copied in the previous step)

2. Modify the username and password to a user account with write access, as configured in the Secure your Cluster section. I will be using the sa_admin user which we added earlier:

elasticsearch {
   hosts => "https://e66e6e11692c749cc8e09f25e1af4efa.us-west-1.aws.found.io:9243/"
   user => "sa_admin"
   password => "s00p3rS3cr3t"
   index => "apache_elastic_example"
   template => "./apache_template.json"
   template_name => "apache_elastic_example"
   template_overwrite => true
}

3. Run the following command to index the data into Elasticsearch via Logstash:

cat ../apache_logs | <Logstash_Install_Dir>/bin/logstash -f apache_logstash.conf

4. I find it useful to store the Elasticsearch endpoint URL in an environment variable, ES_ENDPOINT, which the commands below assume:

export ES_ENDPOINT=https://somelongid.us-east-1.aws.found.io:9243

5. You can verify that your data exists in Elasticsearch by going to ES_ENDPOINT/apache_elastic_example/_count, where ES_ENDPOINT is the Elasticsearch endpoint URL. You should see the count as 10000.  Assuming that you set the environment variable above, you can do this using the curl command (make sure that you specify the user parameter for basic authentication for our https endpoint)

%> curl --user sa_admin:s00p3rS3cr3t ${ES_ENDPOINT}/apache_elastic_example/_count

6. You can verify the health of your cluster by going to ES_ENDPOINT/_cat/indices. You should see your apache_elastic_example index listed along with its statistics:

%> curl --user sa_admin:s00p3rS3cr3t ${ES_ENDPOINT}/_cat/indices
health status index                  pri rep docs.count docs.deleted store.size pri.store.size
yellow open   .kibana                  1   1          2            0     19.1kb         19.1kb
yellow open   apache_elastic_example   1   1      10000            0      7.3mb          7.3mb
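The yellow status typically just means the indices' replica shards have nowhere to be assigned on a single-zone cluster like the one we provisioned; it is not a problem for this tutorial. If you want to dig in, the cluster health API shows the unassigned shard count (again, substitute your own credentials and endpoint):

```
curl --user sa_admin:s00p3rS3cr3t "${ES_ENDPOINT}/_cluster/health?pretty"
```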

Visualize Data

Now let's access your Kibana instance and continue with the example instructions to visualize our data:

  1. Go to Index Pattern options under the Management tab and add the apache_elastic_example index to Kibana, then click Create
  2. Look at your data by selecting the apache_elastic_example index on the Discover tab
  3. Import the example dashboard by clicking on Management > Saved Objects > Import and choosing the apache_kibana.json file
  4. Check out the new dashboard you imported: navigate to the Dashboard tab on the left and select the Sample Dashboard for Apache Logs



We have successfully set up a cluster, imported sample data, and looked at our first saved dashboard!  Now you have some sample Apache log data in Elasticsearch, and you can begin to get insight and, more importantly, value from your logs. You can continue exploring with other sample datasets from the Examples repo and the Kibana getting started guide, or start sending your own data using Logstash or Beats.

Here are some other useful links to help you on your journey of using the Elastic Stack on Cloud.

Or, you can continue your training with some official classes taught by world-class Education Engineers: https://www.elastic.co/training