Easily load data from Kafka to ClickHouse with high performance



clickhouse_sinker is a sinker program that consumes Kafka messages and imports them into ClickHouse.


  • Easy to use and deploy: no code to write, just edit the configuration file.
  • Custom parser support.
  • Supports multiple sinker tasks, each running in parallel.
  • Supports multiple Kafka and ClickHouse clusters.
  • Bulk insert (configured via bufferSize and flushInterval).
  • Loop write: when a node crashes, the data is retried against another healthy node.
  • Uses the native ClickHouse client-server TCP protocol, which offers higher performance than HTTP.

Install && Run

By binary (recommended)

Download the binary from the release page, choose the executable for your environment, modify the conf files, then run `./clickhouse_sinker -conf conf`.

By source

  • Install Golang

  • Get the source:

```bash
go get -u github.com/housepower/clickhouse_sinker/...

cd $GOPATH/src/github.com/housepower/clickhouse_sinker

go get -u github.com/kardianos/govendor

# may take a while
govendor sync
```

  • Build && Run:

```bash
go build -o clickhouse_sinker bin/main.go

# modify the config files, then run it
./clickhouse_sinker -conf conf
```

Supported parsers

  • JSON
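For illustration, decoding one JSON Kafka message into values for the typed columns listed in the next section can be sketched with the standard library (the project itself uses gjson, per its commit history; the `parseMessage` helper and the sample message below are invented for this sketch):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// parseMessage decodes one JSON Kafka message into a generic map;
// a sinker task then picks out each configured column by name.
func parseMessage(bs []byte) (map[string]interface{}, error) {
	var m map[string]interface{}
	if err := json.Unmarshal(bs, &m); err != nil {
		return nil, err
	}
	return m, nil
}

func main() {
	msg := []byte(`{"host":"web-1","status":200,"latency":0.35}`)
	m, err := parseMessage(msg)
	if err != nil {
		panic(err)
	}
	fmt.Println(m["host"], m["status"], m["latency"]) // prints: web-1 200 0.35
}
```

Note that encoding/json decodes every JSON number as float64; a real parser converts each value to the column's declared type (UInt32, Float64, String, and so on).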

Supported data types

  • UInt8, UInt16, UInt32, UInt64, Int8, Int16, Int32, Int64
  • Float32, Float64
  • String
  • FixedString
  • DateTime (not supported; a UInt32 timestamp is used instead, which is sufficient)


See the config example.

Custom metric parser

  • You just need to implement the Parser interface on your own:

```go
type Parser interface {
	Parse(bs []byte) model.Metric
}
```

See the json parser for an example.
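As a self-contained sketch, a custom parser satisfying such an interface might look like the following. Everything here is hypothetical: `Metric` is a simplified stand-in for the project's `model.Metric` type, and `CsvParser` is an invented example that splits comma-separated messages into named columns:

```go
package main

import (
	"fmt"
	"strings"
)

// Metric is a simplified stand-in for the project's model.Metric type,
// used only to keep this sketch self-contained.
type Metric struct {
	Fields map[string]string
}

// Parser mirrors the interface shown above, with the stand-in Metric type.
type Parser interface {
	Parse(bs []byte) Metric
}

// CsvParser is a hypothetical custom parser: it splits a comma-separated
// message into the columns named in ColumnNames.
type CsvParser struct {
	ColumnNames []string
}

func (p *CsvParser) Parse(bs []byte) Metric {
	values := strings.Split(string(bs), ",")
	fields := make(map[string]string, len(p.ColumnNames))
	for i, name := range p.ColumnNames {
		if i < len(values) {
			fields[name] = strings.TrimSpace(values[i])
		}
	}
	return Metric{Fields: fields}
}

func main() {
	p := &CsvParser{ColumnNames: []string{"host", "status", "latency"}}
	m := p.Parse([]byte("web-1, 200, 35"))
	fmt.Println(m.Fields["host"], m.Fields["status"], m.Fields["latency"]) // prints: web-1 200 35
}
```

A real implementation would return the project's actual `model.Metric` and be wired into the sinker's parser registry; the json parser in the repository is the reference to follow.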