
Elasticsearch Word Segmentation

> For an inverted index, one of the most important steps is tokenizing the text: tokenization is what yields information such as sentiment, part of speech, and term frequency.

#### How Analysis Works in Elasticsearch

In Elasticsearch, text to be tokenized passes through the three components of an analyzer: character filters first replace or remove characters in the text, the tokenizer then splits it into terms, and finally token filters drop tokens that carry little meaning, such as stop words and particles.
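As a minimal sketch of those three stages (a hypothetical custom analyzer defined here purely for illustration, not part of the original setup), the following combines the html_strip character filter, the standard tokenizer, and the lowercase and stop token filters:

# Define a custom analyzer from a char filter, a tokenizer, and token filters
PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "char_filter": [ "html_strip" ],
          "tokenizer": "standard",
          "filter": [ "lowercase", "stop" ]
        }
      }
    }
  }
}

# Exercise the custom analyzer
GET my_index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "<p>The programmer's holiday is 1024!</p>"
}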


#### English Tokenization

Elasticsearch ships with a number of built-in analyzers; the five below behave differently and are useful in different scenarios.

##### standard (strips punctuation)

GET /_analyze
{
  "analyzer": "standard",
  "text": "The programmer's holiday is 1024!"
}      

##### simple (strips digits and punctuation)

GET /_analyze
{
  "analyzer": "simple",
  "text": "The programmer's holiday is 1024!"
}      

##### whitespace (no filtering, just splits on whitespace)

GET /_analyze
{
  "analyzer": "whitespace",
  "text": "The programmer's holiday is 1024!"
}      

##### stop (removes stop words such as "is" and "are", plus punctuation)

GET /_analyze
{
  "analyzer": "stop",
  "text": "The programmer's holiday is 1024!"
}      

##### keyword (treats the whole text as a single token, no processing)

GET /_analyze
{
  "analyzer": "keyword",
  "text": "The programmer's holiday is 1024!"
}      
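For reference, based on how these analyzers are documented to behave (the exact output may differ slightly between versions), the sample sentence should be tokenized roughly as follows:

# standard   : [the, programmer's, holiday, is, 1024]
# simple     : [the, programmer, s, holiday, is]
# whitespace : [The, programmer's, holiday, is, 1024!]
# stop       : [programmer, s, holiday]
# keyword    : [The programmer's holiday is 1024!]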

#### Chinese Tokenization

Elasticsearch's default analyzer can only split Chinese text into individual characters and cannot take the meaning of words into account, so we use the analysis-icu plugin in place of the default analyzer.
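To see the character-by-character behavior (a quick check added here for illustration), run the default standard analyzer on the sample sentence; it should return one token per character:

# Default analyzer: one token per Chinese character
GET /_analyze
{
  "analyzer": "standard",
  "text": "南京市長江大橋"
}

The same request with icu_analyzer, which groups the characters into words, looks like this: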

GET /_analyze
{
  "analyzer": "icu_analyzer",
  "text": "南京市長江大橋"
}      
The icu_analyzer is provided by the analysis-icu plugin, so if the plugin has not been installed yet, the request above fails with an unknown-analyzer error. Install it with ``./bin/elasticsearch-plugin install analysis-icu``, restart the node, and run the same request again:
GET /_analyze
{
  "analyzer": "icu_analyzer",
  "text": "南京市長江大橋"
}      

##### Other Chinese Tokenizers

elasticsearch-thulac-plugin: supports Chinese word segmentation and part-of-speech tagging

https://github.com/microbun/elasticsearch-thulac-plugin

elasticsearch-analysis-ik: supports hot updates to the segmentation dictionary and custom lexicons

https://github.com/medcl/elasticsearch-analysis-ik

To install ik, download the 7.9.2 release and unzip it into Elasticsearch's plugins directory:

wget https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.9.2/elasticsearch-analysis-ik-7.9.2.zip
mkdir analysis-ik
unzip -d analysis-ik elasticsearch-analysis-ik-7.9.2.zip
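After restarting Elasticsearch, you can optionally list the installed plugins to confirm that analysis-icu and analysis-ik were picked up (this check is not part of the original steps):

# List plugins loaded on each node
GET _cat/plugins?v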

##### Tokenization Experiment

# Delete any existing indices first so they do not affect the experiment
DELETE icu
DELETE ik
# Create the ICU index
PUT icu
{
  "settings" : {
    "number_of_shards" : 1,
    "number_of_replicas": 1
  },
  "mappings" : {
    "properties" : {
      "description" : { 
        "type" : "text",
        "analyzer": "icu_analyzer",
        "search_analyzer": "icu_analyzer"
      }
    }
  }
}      
# Create the IK index
PUT ik
{
  "settings" : {
    "number_of_shards" : 1,
    "number_of_replicas": 1
  },
  "mappings" : {
    "properties" : {
      "description" : { 
        "type" : "text",
        "analyzer": "ik_max_word",
        "search_analyzer": "ik_smart"
      }
    }
  }
}      
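The ik mapping above indexes with ik_max_word, which produces an exhaustive set of overlapping terms for better recall, and searches with ik_smart, which produces a coarser segmentation. The difference can be inspected with _analyze once the plugin is installed (the exact tokens depend on the ik dictionary in use):

# Fine-grained segmentation used at index time
GET /_analyze
{
  "analyzer": "ik_max_word",
  "text": "南京市長江大橋"
}

# Coarser segmentation used at search time
GET /_analyze
{
  "analyzer": "ik_smart",
  "text": "南京市長江大橋"
}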
# Submit the test data through the bulk API
POST _bulk
{ "index": {"_index":"ik" }}
{"description":"CHINKIANG VINEGAR·GOLD PLUM··12PC·金梅鎮江醋"}
{ "index": {"_index":"ik" }}
{"description":"5YEAR MATUREVINEGAR·SHUITA··24PC·山西五年老陳醋"}
{ "index": {"_index":"ik" }}
{"description":"RED VINEGAR·KOON CHUN··12PC·冠珍大紅浙醋"}
{ "index": {"_index":"ik" }}
{"description":"VINEGAR SEASONING·FUJI··5.28GAL·富士白醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·FOUR IN ONE··4PC·四合醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·HEAVENLY CHEF··4PC·天廚白醋"}
{ "index": {"_index":"ik" }}
{"description":"CHINKIANG VINEGAR··'24414·24PC·恒順鎮江醋"}
{ "index": {"_index":"ik" }}
{"description":"3YEAR MATUREVINEGAR·SHUITA··24PC·山西三年老陳醋"}
{ "index": {"_index":"ik" }}
{"description":"RICE VINEGAR·KONG YEN·'23709·4PC·工研白醋"}
{ "index": {"_index":"ik" }}
{"description":"CHINKIANG VINEGAR··'24421·24PC·金山鎮江醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·CHAMPION··4PC·醋"}
{ "index": {"_index":"ik" }}
{"description":"BLACK VINEGAR·KONG YEN·'23707·4PC·工研烏醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·GOLDEN STATE·<50GR>·4PC·醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·ACCLAIM··4PC·醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·GOLDEN STATE·<100GR>·4PC·醋"}
{ "index": {"_index":"ik" }}
{"description":"RICE VINEGAR·KONG YEN··24PCX10OZ·工研米醋"}
{ "index": {"_index":"ik" }}
{"description":"WHITE VINEGAR·ACCLAIM··12PCX32OZ·醋"}      
POST icu/_search
{
  "query" : { 
    "match" : { 
      "description" : "老陳醋" 
    }
  }
}      
POST ik/_search
{
  "query" : { 
    "match" : { 
      "description" : "老陳醋" 
    }
  }
}

GET /_analyze
{
  "analyzer": "icu_analyzer",
  "text": "老陳醋"
}      
GET /_analyze
{
  "analyzer": "ik_smart",
  "text": "老陳醋"
}      
