ELK - Elasticsearch, Logstash, Kibana
Purpose of each component:
Elasticsearch: A distributed NoSQL search and analytics engine built on Apache Lucene; it stores and indexes the logs.
Logstash: A log pipeline tool that accepts logs from various sources and ships them to various targets.
Kibana: A visualization UI that displays the logs.
Log processing workflow:
1 Starting Elasticsearch
Download and install Elasticsearch: https://www.elastic.co/downloads/elasticsearch
Unzip the file and go to the Elasticsearch folder:
To start Elasticsearch with security enabled:
a. On Mac: Run bin/elasticsearch in the unzipped folder (make sure no other Elasticsearch instance is running at the same time; see below for how to change the password).
b. On Windows: Run bin/elasticsearch.bat in the unzipped folder.
2 Starting Kibana
Download and install Kibana: https://www.elastic.co/downloads/kibana
Unzip the file and go to the Kibana folder:
Go to the config folder and open the kibana.yml file in any text editor.
Scroll down to line 43: #elasticsearch.hosts: ["http://localhost:9200"]
Uncomment this line and save the file.
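After uncommenting, the relevant line in kibana.yml should read as follows (this is the default address of a local Elasticsearch instance):

```yaml
elasticsearch.hosts: ["http://localhost:9200"]
```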
Go back to the root folder, then into bin, and start Kibana:
a. On Mac: Run bin/kibana
b. On Windows: Run bin/kibana.bat
When Kibana is started for the first time, you may need to configure its connection to Elasticsearch. If you see this message in your Kibana terminal, go to the website:
You will need to copy and paste the enrollment token here.
To get the enrollment token, open another terminal and cd to /elasticsearch:
Type in: sudo bin/elasticsearch-create-enrollment-token --scope kibana
The token should be generated below. Copy and paste this token into the page from before.
After clicking on “Configure Elastic” you should be prompted to enter your username and password:
The default username and password should be “elastic” and “changeme”, but if that doesn’t work, open another terminal and cd to the elasticsearch folder and reset the password.
Type in: sudo bin/elasticsearch-reset-password --username elastic
Then you should be able to reset the password for the default user “elastic”, and the newly generated password will be displayed.
After logging in, you are on the landing page of Elasticsearch:
Once you are able to log into this web page, go back to the Kibana terminal; Kibana should also be running, on port 5601.
3 Starting Logstash
Create a Spring Boot application that simply contains an endpoint and generates some log messages:
src/main/java/com/example/elkdemo/controller/ElkLogController.java:
package com.example.elkdemo.controller;

import com.example.elkdemo.domain.User;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.util.Arrays;
import java.util.List;

/**
 * @Author: eve
 * @Date: 2025-03-21 11:30
 */
@RestController
public class ElkLogController {

    private static final Logger logger = LoggerFactory.getLogger(ElkLogController.class);

    private final List<User> userList = Arrays.asList(
            new User(1, "someone"),
            new User(2, "someone else"),
            new User(3, "another person")
    );

    @GetMapping("/user/{id}")
    public User getUserById(@PathVariable("id") int id) {
        User user = userList.stream()
                .filter(u -> u.getId() == id)
                .findAny()
                .orElse(null);
        if (user != null) {
            logger.info("User found: {}", user);
        } else {
            // Log the error with a stack trace so it shows up clearly in the log file
            logger.error("User not found: id=" + id, new Exception("User not found"));
        }
        return user;
    }
}
src/main/java/com/example/elkdemo/domain/User.java:
package com.example.elkdemo.domain;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

/**
 * @Author: eve
 * @Date: 2025-03-21 11:30
 */
@Data
@AllArgsConstructor
@NoArgsConstructor
public class User {
    private int id;
    private String name;
}
Now minimize IntelliJ and create a folder on your machine called logs:
Copy the logs folder’s path into your application’s properties file, and append a file name such as xxx.log to the end of the path. The application will then generate that log file and write its logs there once it starts running.
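For example, with a standard Spring Boot application.properties, the file location can be set like this (the path below is a placeholder for the logs folder you just created):

```properties
# Write application logs to a file inside the logs folder created above
logging.file.name=/path/to/logs/xxx.log
```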
Note: if the build fails with the following Lombok compilation error:

java: java.lang.NoSuchFieldError: Class com.sun.tools.javac.tree.JCTree$JCImport does not have member field 'com.sun.tools.javac.tree.JCTree qualid'

Solution: update Lombok to a version compatible with your JDK:

<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <version>1.18.32</version>
    <scope>compile</scope>
</dependency>
Note: if port 8080 is already taken, the application will fail to start with:

***************************
APPLICATION FAILED TO START
***************************

Description:

Web server failed to start. Port 8080 was already in use.

Action:

Identify and stop the process that's listening on port 8080 or configure this application to listen on another port.

Process finished with exit code 1
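If you hit the port conflict above, one option is to move the application to another port in application.properties (8081 here is an arbitrary choice):

```properties
server.port=8081
```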
Run your application and try to retrieve a user by ID to see whether the log file is generated:
After you look up the user with id 1, the log should appear in the log file.
Now, to have this log automatically picked up by Elasticsearch, we need to install Logstash: https://www.elastic.co/downloads/logstash
First you need to configure the logstash.conf file. You can find an example of it under the logstash folder -> config -> logstash.conf. Open it, replace the input path with the path to the xxx.log file that was generated earlier, and provide the configurations below.
Save the logstash.conf file and copy it to the Logstash/bin folder.
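As a sketch, a minimal logstash.conf for this setup might look like the following (the log path, password, and index name are placeholders you must replace with your own values):

```conf
input {
  file {
    # Path to the log file written by the Spring Boot application (placeholder)
    path => "/path/to/logs/xxx.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "elastic"
    password => "your-password"          # the password generated/reset earlier
    index => "elk-demo-%{+YYYY.MM.dd}"   # hypothetical index name
  }
}
```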
Go to the config folder again, open the pipelines.yml file, uncomment these lines, and replace path.config with the path to your logstash.conf:
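The uncommented pipelines.yml entries should look roughly like this (the path is a placeholder for wherever you copied your logstash.conf):

```yaml
- pipeline.id: main
  path.config: "/path/to/logstash/bin/logstash.conf"
```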
Now run the Logstash executable. With Logstash and your application both running, test your endpoint again; the log should automatically be displayed in the Kibana console as shown below:
Note: if Logstash cannot verify Elasticsearch’s TLS certificate, you will see repeated errors like the following:

[2025-03-21T12:46:47,735][WARN ][logstash.outputs.elasticsearch][another_test] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
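This PKIX error means Logstash does not trust the self-signed certificate that Elasticsearch generates for HTTPS. One common fix (a sketch, assuming the default certificate location of a local 8.x install) is to point the elasticsearch output at Elasticsearch’s generated CA certificate:

```conf
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "elastic"
    password => "your-password"
    # CA cert generated by Elasticsearch on first start (adjust to your install path)
    cacert => "/path/to/elasticsearch/config/certs/http_ca.crt"
  }
}
```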
Go to your browser and type in https://localhost:9200/_cat/indices, and you should see the index created by Logstash listed.
Copy the highlighted text; this will be the index pattern.
Now go to the Kibana web page on port 5601 (http://localhost:5601/app/home#/) and follow the guide there to add the Kibana Logs integration if you don’t already have one.
Left menu -> Management -> Stack Management -> Kibana -> Data Views -> Create Data View
Name: you can use the same name as the index pattern.
Now, if you go to the Discover page, your logs should be displayed: