
ELK 5.1 for MySQL slow query logs

Recently I was handed a task: clean up and report on MySQL's slow query logs. Some time ago I had written a script that did multiline processing of the MySQL log and mailed the results out every day. But because the log formats vary, some entries parsed fine while others were parsed incorrectly.

I had long heard how powerful ELK is, and I had tested it once before, back when I wanted to build a log processing center. That project needed RBA (role-based authorization) to separate user permissions, but the test went very badly: for one, I didn't understand ELK; for another, RBA is a paid feature. So the project was shelved.

This time I'm back for another run, going straight for the target. There is only one goal: parse the slow query log.

elasticsearch

Download

Configuration

# config/elasticsearch.yml

node.name: t17
path.data: ./data
path.logs: ./logs
network.host: 

http.cors.enabled: true
http.cors.allow-origin: "*"
           

Plugin (head)

git clone git://github.com/mobz/elasticsearch-head.git
cd elasticsearch-head
npm install -f
grunt server
open http://localhost:9100/
           

Start up

# Start it directly; no init script needed.
./elasticsearch 
           

logstash

input {
  #stdin {
  file {
    type => "mysql-slow"
    path => "/opt/elk/logstash-5.1.1/bin/master-slow.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^# User@Host:"
      negate => true
      what => previous
    }
  }
}

filter {
  # drop sleep events
  grok {
    match => { "message" => "SELECT SLEEP" }
    add_tag => [ "sleep_drop" ]
    tag_on_failure => [] # prevent default _grokparsefailure tag on real records
  }
  if "sleep_drop" in [tags] {
    drop {}
  }
  grok {
    #match => [ "message", "(?m)^# User@Host: %{USER:user}\[[^\]]+\] @ (?:(?<clienthost>\S*) )?\[(?:%{IP:clientip})?\]\s*# Query_time: %{NUMBER:query_time:float}\s+Lock_time: %{NUMBER:lock_time:float}\s+Rows_sent: %{NUMBER:rows_sent:int}\s+Rows_examined: %{NUMBER:rows_examined:int}\s*(?:use %{DATA:database};\s*)?SET timestamp=%{NUMBER:timestamp};\s*(?<query>(?<action>\w+)\s+.*)\n# Time:.*$" ]
    #match => [ "message", "# User@Host: %{WORD:test};" ]
    match => [ "message", "# User@Host:\s+%{WORD:user1}\[%{WORD:user2}\]\s+@\s+\[(?:%{IP:clientip})?\]\s+#\s+Thread_id:\s+%{NUMBER:thread_id:int}\s+Schema:\s+%{WORD:schema}\s+QC_hit:\s+%{WORD:qc_hit}\s+#\s+Query_time:\s+%{NUMBER:query_time:float}\s+Lock_time:\s+%{NUMBER:lock_time:float}\s+Rows_sent:\s+%{NUMBER:rows_sent:int}\s+Rows_examined:\s+%{NUMBER:rows_examined:int}\s+#\s+Rows_affected:\s+%{NUMBER:rows_affected:int}\s+SET\s+timestamp=%{NUMBER:timestamp};\s+(?<query>(?<action>\w+)\s+.*);"]
  }
  date {
    match => [ "timestamp", "UNIX" ]
    remove_field => [ "timestamp" ]
  }
}

output {
  elasticsearch {
    hosts => ["192.168.126.17:9200"]
    index => "mysql-slow-log-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
           
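The multiline codec above (negate true, what previous) means: a line that does NOT start with `# User@Host:` belongs to the previous event, so each multi-line slow-log entry becomes a single logstash message. A minimal Python sketch of that grouping rule, with made-up sample lines:

```python
import re

# Same rule as the multiline codec: a matching line STARTS a new event,
# everything else is appended to the previous one.
HEADER = re.compile(r"^# User@Host:")

def group_events(lines):
    """Group raw slow-log lines into one string per query entry."""
    events, current = [], []
    for line in lines:
        if HEADER.match(line) and current:
            events.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        events.append("\n".join(current))
    return events

# Hypothetical sample: two slow-log entries, four lines each.
sample = [
    "# User@Host: root[root] @ [192.168.126.1]",
    "# Query_time: 2.000000  Lock_time: 0.000000 Rows_sent: 1  Rows_examined: 0",
    "SET timestamp=1482913000;",
    "SELECT SLEEP(2);",
    "# User@Host: app[app] @ [192.168.126.2]",
    "# Query_time: 1.500000  Lock_time: 0.000100 Rows_sent: 10  Rows_examined: 5000",
    "SET timestamp=1482913060;",
    "SELECT * FROM orders;",
]
print(len(group_events(sample)))  # → 2
```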
  • The most annoying part is the filter. Our logs come from MariaDB 10.1.16, and I couldn't find a ready-made filter for this format online, so I wrote one by referring to existing examples.
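As a rough sanity check of that filter, here is a plain-Python approximation of the grok expression, run against a hand-written MariaDB 10.1-style entry (all values are invented, and the core grok patterns like %{WORD} and %{NUMBER} are only loosely mirrored with raw regex):

```python
import re

# Loose Python equivalent of the MariaDB grok pattern from the filter above.
SLOW = re.compile(
    r"# User@Host:\s+(?P<user1>\w+)\[(?P<user2>\w+)\]\s+@\s+\[(?P<clientip>[\d.]*)\]\s+"
    r"# Thread_id:\s+(?P<thread_id>\d+)\s+Schema:\s+(?P<schema>\w+)\s+QC_hit:\s+(?P<qc_hit>\w+)\s+"
    r"# Query_time:\s+(?P<query_time>[\d.]+)\s+Lock_time:\s+(?P<lock_time>[\d.]+)\s+"
    r"Rows_sent:\s+(?P<rows_sent>\d+)\s+Rows_examined:\s+(?P<rows_examined>\d+)\s+"
    r"# Rows_affected:\s+(?P<rows_affected>\d+)\s+"
    r"SET\s+timestamp=(?P<timestamp>\d+);\s+(?P<query>(?P<action>\w+)\s+.*);",
    re.DOTALL,
)

# Hypothetical entry in the MariaDB 10.1 slow-log layout the filter targets.
entry = """# User@Host: repl[repl] @ [192.168.126.18]
# Thread_id: 42  Schema: shop  QC_hit: No
# Query_time: 3.123456  Lock_time: 0.000123  Rows_sent: 10  Rows_examined: 500000
# Rows_affected: 0
SET timestamp=1482913000;
SELECT * FROM orders WHERE status = 'open';
"""

m = SLOW.search(entry)
print(m.group("schema"), m.group("query_time"), m.group("action"))
# → shop 3.123456 SELECT
```

If the regex fails on your real log lines, diff the entry against the pattern field by field; in my experience the spacing around Thread_id/Schema/QC_hit is where the formats diverge between versions.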

kibana (not using it for now)

  • Regular expression syntax http://www.runoob.com/regexp/regexp-syntax.html
  • grok regex captures http://udn.yyuap.com/doc/logstash-best-practice-cn/filter/grok.html
  • MySQL slow query log capture with logstash http://kibana.logstash.es/content/logstash/examples/mysql-slow.html
  • Dedicated MySQL slow-log handling http://soft.dog/2016/01/30/logstash-mysql-slow-log/
  • An excellent article http://www.fblinux.com/?p=40