Scan


Scan provides the ability to scan database/sql rows directly into structs, slices, and primitive types. For the most comprehensive and up-to-date docs, see the godoc.

Examples
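
The examples below query a persons table and scan into a Person struct. The README does not show its definition, so here is a plausible one; scan matches result columns to exported field names unless a db tag says otherwise:

```go
package main

import "fmt"

// Person maps to the persons table used in the examples. Exported
// fields are matched to columns by name; add a db tag to override.
type Person struct {
	ID   int
	Name string
}

func main() {
	fmt.Printf("%#v\n", Person{ID: 1, Name: "brett"})
}
```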

Multiple Rows

db, _ := sql.Open("sqlite3", "database.sqlite")
rows, _ := db.Query("SELECT * FROM persons")

var persons []Person
err := scan.Rows(&persons, rows)

fmt.Printf("%#v", persons)
// []Person{
//    {ID: 1, Name: "brett"},
//    {ID: 2, Name: "fred"},
//    {ID: 3, Name: "stacy"},
// }

Multiple rows of primitive type

rows, _ := db.Query("SELECT name FROM persons")
var names []string
err := scan.Rows(&names, rows)

fmt.Printf("%#v", names)
// []string{
//    "brett",
//    "fred",
//    "stacy",
// }

Single row

rows, _ := db.Query("SELECT * FROM persons where name = 'brett' LIMIT 1")
var person Person
err := scan.Row(&person, rows)

fmt.Printf("%#v", person)
// Person{ ID: 1, Name: "brett" }

Scalar value

rows, _ := db.Query("SELECT age FROM persons where name = 'brett' LIMIT 1")
var age int8
err := scan.Row(&age, rows)

fmt.Printf("%d", age)
// 100

Strict Scanning

Both Rows and Row have strict alternatives, RowsStrict and RowStrict, which scan to structs based solely on the db tag rather than field names. Any field without a db tag is ignored, even if a result column matches the field name.
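
Conceptually, strict mode keeps only fields that declare a db tag. A minimal sketch of that filter using the standard reflect package (this is an illustration, not the library's actual implementation):

```go
package main

import (
	"fmt"
	"reflect"
)

// Person mirrors the structs scanned elsewhere in this README.
// Only Name carries a db tag, so strict scanning maps only "name".
type Person struct {
	ID   int
	Name string `db:"name"`
}

// strictColumns returns the column names strict scanning would
// consider: fields with a db tag, skipping the "-" opt-out.
func strictColumns(v interface{}) []string {
	t := reflect.TypeOf(v)
	if t.Kind() == reflect.Ptr {
		t = t.Elem()
	}
	var cols []string
	for i := 0; i < t.NumField(); i++ {
		tag, ok := t.Field(i).Tag.Lookup("db")
		if !ok || tag == "-" {
			continue // untagged fields are ignored in strict mode
		}
		cols = append(cols, tag)
	}
	return cols
}

func main() {
	fmt.Println(strictColumns(&Person{})) // [name]
}
```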

Columns

Columns scans a struct and returns a string slice of assumed column names: the db tag when present, otherwise the struct field name. To avoid assumptions, use ColumnsStrict, which returns only the fields tagged with db. Both Columns and ColumnsStrict are variadic: they accept column names to exclude from the list. It is recommended that you cache the resulting slice.

package main

type User struct {
        ID        int64
        Name      string
        Age       int
        BirthDate string `db:"bday"`
        Zipcode   string `db:"-"`
        Store     struct {
                ID int
                // ...
        }
}

var nobody = new(User)
var userInsertCols = scan.Columns(nobody, "ID")
// []string{ "Name", "Age", "bday" }

var userSelectCols = scan.Columns(nobody)
// []string{ "ID", "Name", "Age", "bday" }

Values

Values scans a struct and returns the values associated with the provided columns. Values uses a sync.Map to cache struct fields and greatly improve scanning performance: the first time a struct type is scanned, the locations of its exported fields are cached, so later value retrievals from the same type are much faster. See Benchmarks below.

user := &User{
        ID: 1,
        Name: "Brett",
        Age: 100,
}

vals := scan.Values([]string{"ID", "Name"}, user)
// []interface{}{ 1, "Brett" }

Both Values and Columns are most useful in combination with a query builder such as sq.

sq.Insert("users").
        Columns(userCols...).
        Values(scan.Values(userCols, user)...)

Configuration

AutoClose: Automatically call rows.Close() after scan completes (default true)
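
If you want to manage the rows' lifecycle yourself, the flag can be flipped before scanning. This sketch assumes AutoClose is the exported package-level variable described above; check the godoc for the exact name in your version:

```go
// Disable automatic rows.Close() so the caller controls cleanup.
scan.AutoClose = false

rows, _ := db.Query("SELECT * FROM persons")
defer rows.Close() // now required: scan will not close rows for us

var persons []Person
err := scan.Rows(&persons, rows)
```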

Why

While many other projects support similar features (e.g. sqlx), scan lets you use any database library, such as the stdlib or squirrel, to write fluent SQL statements and then pass the resulting rows to scan.

Benchmarks

λ go test -bench=. -benchtime=10s ./...
goos: linux
goarch: amd64
pkg: github.com/blockloop/scan
BenchmarkColumnsLargeStruct-8           50000000               272 ns/op
BenchmarkValuesLargeStruct-8             2000000              8611 ns/op
BenchmarkScanRowOneField-8               2000000              8528 ns/op
BenchmarkScanRowFiveFields-8             1000000             12234 ns/op
BenchmarkScanTenRowsOneField-8           1000000             16802 ns/op
BenchmarkScanTenRowsTenFields-8           100000            104587 ns/op
PASS
ok      github.com/blockloop/scan       116.055s