Introduction:

A large web site generally needs an ELK log-analysis service: while the site is running, it collects every normal and abnormal log entry the servers report, aggregates the data, and lets you analyze the site's logs. With that in mind, I decided to set up an ELK log-processing server alongside my project.

Integration steps:
1. Add the following dependencies to the Spring Boot project's pom.xml:

```xml
<!-- ELK log-analysis dependencies -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <version>1.1.11</version>
</dependency>
<dependency>
    <groupId>com.github.danielwegener</groupId>
    <artifactId>logback-kafka-appender</artifactId>
    <version>0.1.0</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <!-- keep in sync with logback-core -->
    <version>1.1.11</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.1</version>
</dependency>
```
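Since logback-core and logback-classic must stay on the same version, it can help to pin the version once in a Maven property and reference it from both dependencies (a sketch; the property name `logback.version` is my own choice, not something the project requires):

```xml
<properties>
    <logback.version>1.1.11</logback.version>
</properties>

<!-- then, in each of the two logback dependencies: -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <version>${logback.version}</version>
</dependency>
```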
2. Add the following to the Spring Boot project's logback configuration:

```xml
<appender name="KafkaAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <encoder class="com.github.danielwegener.logback.kafka.encoding.LayoutKafkaMessageEncoder">
        <layout class="net.logstash.logback.layout.LogstashLayout">
            <includeContext>true</includeContext>
            <includeCallerData>true</includeCallerData>
            <customFields>{"system":"test"}</customFields>
            <fieldNames class="net.logstash.logback.fieldnames.ShortenedFieldNames"/>
        </layout>
        <charset>UTF-8</charset>
    </encoder>
    <!-- The Kafka topic must match the topic in your configuration file, or Kafka will silently judge you -->
    <topic>mcloud-log</topic>
    <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.HostNameKeyingStrategy"/>
    <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
    <producerConfig>bootstrap.servers=127.0.0.1:9092</producerConfig>
</appender>

<!-- You may also need to add this bit -->
<logger name="Application_ERROR">
    <appender-ref ref="KafkaAppender"/>
</logger>

<!-- ...and this bit -->
<root>
    <level value="INFO"/>
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="KafkaAppender"/>
</root>
```
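With this configuration, every log event is serialized as a JSON document before being published to the `mcloud-log` topic. The sketch below (Python, purely illustrative) approximates what a single Kafka message looks like, including the merged `customFields` entry; the exact field names depend on the encoder version, and `ShortenedFieldNames` abbreviates some of them, so inspect a real message before relying on specific keys:

```python
import json
from datetime import datetime, timezone

def build_log_event(level, logger, message, custom_fields):
    """Approximate the JSON document LogstashLayout publishes to Kafka.

    Field names here are illustrative assumptions, not the encoder's
    guaranteed output.
    """
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": message,
        "logger": logger,
        "level": level,
    }
    # customFields from the logback config are merged into the event
    event.update(custom_fields)
    return json.dumps(event)

msg = build_log_event("INFO", "com.example.Demo", "user login ok", {"system": "test"})
print(msg)
```

Because the payload is JSON rather than plain text, make sure the Logstash side parses it as JSON instead of storing it as one opaque string.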
3. Deploy and configure Logstash:

1. Download Logstash from the official site: https://www.elastic.co/cn/downloads/logstash (use the version that matches your ElasticSearch version).
2. Add a logstash.conf startup configuration file.
Unzip the downloaded Logstash onto the E: drive, go to the E:\logstash-5.6.1\config directory, and create a logstash.conf file with the following content:

```conf
input {
    # Kafka connection settings
    kafka {
        bootstrap_servers => "127.0.0.1:9092"
        topics => ["mcloud-log"]
    }
}
output {
    # ElasticSearch connection settings
    elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "logstash-%{type}-%{+YYYY.MM.dd}"
        flush_size => 20000
        idle_flush_time => 10
        template_overwrite => true
    }
}
```

Note: use spaces, not tabs, in the yml and conf files, or startup will fail.

Start Logstash:

```shell
# In a DOS prompt, go to the E:\logstash-5.6.1\bin directory and run the following to start the Logstash server
logstash -f logstash.conf
```
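Since logstash-logback-encoder publishes JSON, it is worth telling the Kafka input to decode it; otherwise each event lands in ElasticSearch as a single raw `message` string. A hedged variant of the input block above (the `codec` option is standard on Logstash inputs; adjust to your pipeline):

```conf
input {
    kafka {
        bootstrap_servers => "127.0.0.1:9092"
        topics => ["mcloud-log"]
        # decode each record as JSON so the logback fields become ES fields
        codec => json
    }
}
```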
4. Deploy and configure Kibana:

1. Download Kibana from the official site: https://www.elastic.co/cn/downloads/kibana -> "Not the version you're looking for? View past releases." (use the version that matches your ElasticSearch version).
2. Edit the kibana.yml configuration file.
Unzip the downloaded Kibana onto the E: drive, go to the E:\kibana-5.6.1-windows-x86\config directory, and change the following settings in kibana.yml:

```yml
server.port: 5601
server.name: "kibana"
server.host: "127.0.0.1"
elasticsearch.url: "http://127.0.0.1:9200"
```

Start the service:

```shell
# In a DOS prompt, go to the E:\kibana-5.6.1-windows-x86 directory and run the following to start the Kibana server
bin\kibana.bat
```
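Once everything is up, a quick smoke test (assuming the default local ports used above, with services running) is to ask ElasticSearch whether a logstash-* index is being written, then open Kibana in a browser:

```shell
# List indices; a logstash-...-YYYY.MM.dd index should appear once log events flow
curl http://127.0.0.1:9200/_cat/indices?v

# Then browse to the Kibana UI at http://127.0.0.1:5601 and
# configure an index pattern of logstash-* on first visit
```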
Tip: ElasticSearch and Kafka deployment is covered in my other blog posts, so I won't repeat that configuration here.
Reference blog:
https://blog.csdn.net/yy756127197/article/details/78873310