Kibana Tutorial
Kibana - Introduction to ELK Stack
Kibana is an open source visualization tool mainly used to analyze large volumes of logs in the form of line graphs, bar graphs, pie charts, heat maps, etc. Kibana works in sync with Elasticsearch and Logstash, which together form the so-called ELK stack.
ELK stands for Elasticsearch, Logstash, and Kibana. ELK is one of the most popular log management platforms used worldwide for log analysis.
In the ELK stack −
- Logstash extracts the logging data or other events from different input sources. It processes the events and later stores them in Elasticsearch.
- Elasticsearch acts as the storage and search engine of the stack, indexing the incoming events so they can be queried.
- Kibana is a visualization tool, which accesses the logs from Elasticsearch and is able to display them to the user in the form of line graphs, bar graphs, pie charts, etc.
In this tutorial, we will work closely with Kibana and Elasticsearch and visualize the data in different forms.
In this chapter, let us understand how to work with the ELK stack. Besides, you will also see how to −
- Load CSV data from Logstash to Elasticsearch.
- Use indices from Elasticsearch in Kibana.
Load CSV data from Logstash to Elasticsearch
We are going to upload CSV data to Elasticsearch using Logstash. To work on data analysis, we can get data from the kaggle.com website, which hosts all types of datasets that users can work with for data analysis.
We have taken the countries.csv data from here: https://www.kaggle.com/fernandol/countries-of-the-world. You can download the csv file and use it.
The csv file which we are going to use has the following details −
File name − countriesdata.csv
Columns − "Country","Region","Population","Area"
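For reference, the header row of the file matches these columns; a data row would look like the following (the values shown here are purely illustrative − the real values come from the downloaded dataset) −

Country,Region,Population,Area
Albania,EASTERN EUROPE,3581655,28748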
You can also create a dummy csv file and use it. We will be using Logstash to dump this data from countriesdata.csv to Elasticsearch.
Start Elasticsearch and Kibana in your terminal and keep them running. We have to create the config file for Logstash, which will have details about the columns of the CSV file and also other details, as shown in the logstash-config file given below −
input {
   file {
      path => "C:/kibanaproject/countriesdata.csv"
      start_position => "beginning"
      sincedb_path => "NUL"
   }
}
filter {
   csv {
      separator => ","
      columns => ["Country","Region","Population","Area"]
   }
   mutate {convert => ["Population", "integer"]}
   mutate {convert => ["Area", "integer"]}
}
output {
   elasticsearch {
      hosts => ["localhost:9200"]
      index => "countriesdata-%{+dd.MM.YYYY}"
   }
   stdout {codec => json_lines }
}
In the config file, we have created 3 components −
Input
We need to specify the path of the input file, which in our case is a csv file. The path where the csv file is stored is given to the path field. start_position => "beginning" makes Logstash read the file from the start, and sincedb_path is set to "NUL" (the Windows null device) so that no read position is remembered and the file is processed from the beginning on every run.
Filter
This will have the csv component with the separator used, which in our case is a comma, and also the columns available in our csv file. As Logstash considers all incoming data as strings, if we want any column to be used as an integer or float, the same has to be specified using mutate, as shown above.
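To illustrate what the filter produces, a single CSV row parsed by this configuration would come out as an event whose fields look roughly like the following (the values are illustrative; note that Population and Area are now integers thanks to mutate) −

{
   "Country" => "Albania",
   "Region" => "EASTERN EUROPE",
   "Population" => 3581655,
   "Area" => 28748
}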
Output
For output, we need to specify where we need to put the data. Here, in our case, we are using Elasticsearch. The data required to be given to elasticsearch is the hosts where it is running; we have mentioned it as localhost. The next field is index, which we have given the name countriesdata-%{+dd.MM.YYYY}; the date pattern resolves to the current date, so the index is effectively countriesdata-currentdate. We have to use the same index in Kibana once the data is updated in Elasticsearch.
Save the above config file as logstash_countries.conf. Note that we need to give the path of this config to the logstash command in the next step.
To load the data from the csv file to Elasticsearch, we need to start the Elasticsearch server −
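For example, on Windows (assuming Elasticsearch is installed at C:\elasticsearch − adjust the path to your own installation), the server can be started from its bin directory −

> cd C:\elasticsearch\bin
> elasticsearch.bat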

Now, open http://localhost:9200 in the browser to confirm that Elasticsearch is running successfully.
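The same check can be done from the command line; the exact name, cluster and version values will differ on your machine −

> curl http://localhost:9200
{
  "name" : "node-1",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "6.5.4",
    ...
  },
  "tagline" : "You Know, for Search"
}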

We have Elasticsearch running. Now go to the path where Logstash is installed and run the following command to upload the data to Elasticsearch.
> logstash -f logstash_countries.conf


The above screen shows data loading from the CSV file to Elasticsearch. To know if the index has been created in Elasticsearch, we can check as follows −
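One simple way is the cat indices API, which lists every index along with its health, document count and size −

> curl "http://localhost:9200/_cat/indices?v"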
We can see the countriesdata-28.12.2018 index created as shown above.

The details of the index countriesdata-28.12.2018 are as follows −
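The index details, including its mapping, can also be fetched directly from Elasticsearch −

> curl "http://localhost:9200/countriesdata-28.12.2018/_mapping?pretty"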

Note that the mapping details with properties are created when data is uploaded from Logstash to Elasticsearch.
Use Data from Elasticsearch in Kibana
Currently, we have Kibana running on localhost, port 5601 − http://localhost:5601. The UI of Kibana is shown here −

Note that we already have Kibana connected to Elasticsearch, and we should be able to see the index countriesdata-28.12.2018 inside Kibana.
In the Kibana UI, click on the Management menu option on the left side −

Now, click Index Management −

The indices present in Elasticsearch are displayed in Index Management. The index we are going to use in Kibana is countriesdata-28.12.2018.
Thus, as we already have the Elasticsearch index in Kibana, we will next understand how to use the index in Kibana to visualize data in the form of pie charts, bar graphs, line charts, etc.