Because of the large data volume, logging everything was impractical; only by placing log statements at a few carefully chosen points, trial-and-error style, did we finally track this bug down.
Library used: sarama
addrs := strings.Split(config.Kafka.RealBroker, ",")
kafkaConfig := sarama.NewConfig()
kafkaConfig.Producer.Timeout = 3 * time.Second // Producer.Timeout is a time.Duration; a bare 3 means 3 nanoseconds
kafkaConfig.Producer.Return.Errors = false // set to false so that unread error results cannot pile up and block the producer; the stall is intermittent and extremely hard to diagnose
kafkaConfig.Producer.Return.Successes = false
kafkaConfig.Producer.MaxMessageBytes = 50 * 1024 * 1024 // 50 MB; the broker's message.max.bytes must allow this as well
producer, err := sarama.NewAsyncProducer(addrs, kafkaConfig)
if err != nil {
log.Println("new async producer err: ", err)
_, file, line, _ := runtime.Caller(0)
dingding.PushMessage(fmt.Sprintf("new async producer err:%v file:%s line:%d", err, file, line))
return err
}
conn.KafkaProducer = producer
If kafkaConfig.Producer.Return.Errors or kafkaConfig.Producer.Return.Successes is set to true, there must be code that continuously drains the corresponding Errors() / Successes() channels; otherwise those internal channels fill up and sends into the producer block.