Store and Visualize your Data with Elasticsearch and Kibana
This post requires an Elastic Stack consisting of one Elasticsearch database and one Kibana instance connected to it. If you have just read my article on setting up your own Elastic Stack, you're good to go. If not, I recommend reading it first: it shows you how to set up your own Elastic Stack on your local machine so that your data stays completely private, since it is under your control only.
In today's post, however, I want to show you how to save data in your Elasticsearch database and how to visualize it in Kibana.
Prerequisites
As mentioned in the introduction, we need an Elastic Stack for this tutorial, that is, access to an Elasticsearch database and the corresponding Kibana instance. If you do not have an Elastic Stack yet, follow my tutorial to set one up.
About Elasticsearch
Before we get to the fun part of coding, I want to explain how Elasticsearch handles data. Since Elasticsearch is a NoSQL database, it stores its data as so-called documents. These documents are JSON objects containing the data, and they are stored in so-called indices. You can think of indices and documents as books and pages: an index is the book, and each book has multiple pages, the documents. To query a document you need to know the index in which it is stored. You can also query an index for all of its documents.
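To make this concrete, here is a tiny sketch of storing and querying a document with the official Python client. The URL, credentials, and sample document are placeholders, and the calls assume the elasticsearch-py 8.x API:

```python
from elasticsearch import Elasticsearch

# Placeholder connection details -- use your own instance and credentials.
es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))

# Store a JSON document (a "page") in the index "weather" (the "book").
doc = {"city": "Berlin", "temperature": 21.3, "timestamp": "2022-06-01T12:00:00"}
response = es.index(index="weather", document=doc)
print(response["result"])  # e.g. "created"

# Query the index for all of its documents.
hits = es.search(index="weather", query={"match_all": {}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"])
```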
OpenWeatherMap
Before we can visualize data in Kibana, we first need to collect some. To do that, we will write some Python code that queries the OpenWeatherMap API and stores the weather information in our database.
For this we need to sign up at OpenWeatherMap. After that we can log in and generate a new API key here.
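As a quick sanity check of your new key, you can call OpenWeatherMap's current-weather endpoint directly. This is a minimal sketch using the requests package; the city name and units are arbitrary choices:

```python
import requests

API_KEY = "<your-api-key>"  # placeholder -- paste your own key here
url = "https://api.openweathermap.org/data/2.5/weather"
params = {"q": "Berlin", "appid": API_KEY, "units": "metric"}

response = requests.get(url, params=params, timeout=10)
response.raise_for_status()

data = response.json()
print(data["name"], data["main"]["temp"], data["weather"][0]["description"])
```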
Implementing the Crawler
Now, before we implement our crawler, let us think about what we need. First, we need a connector that is able to save data to our Elasticsearch database; we will call it ElasticsearchConnector. Second, we need a connector that is able to query OpenWeatherMap for weather data; following the same scheme, we name it OpenWeatherMapConnector. And last but not least, we need the crawler itself, which we will simply name Crawler.
ElasticsearchConnector
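The full listing for elastic.py belongs here. As a stand-in, the sketch below shows the general shape of such a connector, assuming the elasticsearch-py 8.x client; everything beyond the class name is an assumption, not the original code:

```python
# elastic.py -- a minimal sketch; method names are assumptions.
from elasticsearch import Elasticsearch


class ElasticsearchConnector:
    def __init__(self, es_url, es_user, es_pass):
        # Connect to Elasticsearch with basic authentication.
        self.es = Elasticsearch(es_url, basic_auth=(es_user, es_pass))

    def save(self, index, document):
        # Store one JSON document in the given index and return
        # the result string, e.g. "created".
        response = self.es.index(index=index, document=document)
        return response["result"]
```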
OpenWeatherMapConnector
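Likewise for openweathermap.py: a hedged sketch built on OpenWeatherMap's current-weather endpoint and the requests package. The method name is an assumption:

```python
# openweathermap.py -- a minimal sketch; method names are assumptions.
import requests


class OpenWeatherMapConnector:
    BASE_URL = "https://api.openweathermap.org/data/2.5/weather"

    def __init__(self, api_key):
        self.api_key = api_key

    def current_weather(self, city):
        # Fetch the current weather for one city as a JSON dict.
        params = {"q": city, "appid": self.api_key, "units": "metric"}
        response = requests.get(self.BASE_URL, params=params, timeout=10)
        response.raise_for_status()
        return response.json()
```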
Crawler
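And finally a sketch of crawler.py, tying the two connectors together. The four variables in __init__() match those listed below; the city list and the run() method are assumptions:

```python
# crawler.py -- a minimal sketch; the city list is an assumption.
from elastic import ElasticsearchConnector
from openweathermap import OpenWeatherMapConnector


class Crawler:
    def __init__(self):
        # Update these four values for your own setup (see below).
        self.es_url = "https://localhost:9200"
        self.es_user = "elastic"
        self.es_pass = "changeme"
        self.api_key = "<your-openweathermap-api-key>"
        self.es = ElasticsearchConnector(self.es_url, self.es_user, self.es_pass)
        self.owm = OpenWeatherMapConnector(self.api_key)

    def run(self, cities):
        # Fetch the weather for each city and store it in the "weather" index.
        for city in cities:
            weather = self.owm.current_weather(city)
            print(self.es.save("weather", weather))


if __name__ == "__main__":
    Crawler().run(["Berlin", "Hamburg", "Munich", "Cologne"])
```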
The first two files, elastic.py and openweathermap.py, can just be copied from above. However, when we copy crawler.py, we must make sure to update the following variables in the crawler's __init__() method:
- es_url, which must be the URL of your Elasticsearch instance. Note that it must include the protocol (e.g. https) and the port (e.g. 9200)
- es_user, which is the username for your Elasticsearch instance
- es_pass, which is the corresponding password for your Elasticsearch username
- api_key, which is your OpenWeatherMap API key
After updating the file, let's run the code.
```sh
# Make sure we have all files copied correctly
ls
> elastic.py openweathermap.py crawler.py

# Install the required packages
pip3 install elasticsearch

# Run the code
python3 crawler.py
```
If the output looks something like this, we know everything worked out well.
```
created
created
created
created
```
Great! We now have stored our collected weather data in our Elasticsearch database. Let's take a look at it using Kibana.
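If you prefer a quick check from Python before opening Kibana, counting the documents in the index works too (same placeholder connection details as above):

```python
from elasticsearch import Elasticsearch

# Placeholder credentials -- use the same values as in crawler.py.
es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))
print(es.count(index="weather")["count"])  # e.g. 4
```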
Data visualization with Kibana
Since our crawler writes to a new index, which we named weather, the first step is to create a new data view so we can see our data. To do so, we navigate to Analytics and click on Create data view.
On the right we see a list of all our indices, and on the left we enter the required information. Then we click on Save data view to Kibana.
Now we navigate to Discover, and after we set the time range to Today, we will see our crawled data like below.
That's it, congratulations! From now on, I suggest playing around with Kibana a bit; it's fun and you learn a lot. In particular, I recommend getting your hands dirty with dashboards, a way to visualize your data with great-looking charts, tables, and more.