
Table schema tooling in Go.

Getting started


This package uses semantic versioning 2.0.0.

Go version >= 1.11

Please use Go modules if you're using a Go version that supports them. To check which version of Go you're running:

$ go version
go version go1.14 linux/amd64

If you're running go1.13+, you're good to go!

If you cannot upgrade right now, you need to make sure your environment is using Go modules by setting the GO111MODULE environment variable. In bash, that can be done with the following command:

$ export GO111MODULE=on

Go version >= 1.8 && < 1.11

$ dep init
$ dep ensure -add github.com/frictionlessdata/tableschema-go@>=0.1

Main Features

Tabular Data Load

Have tabular data stored in local files? Remote files? Packages like csv will help you load the data you need and make it ready for processing.

package main

import "github.com/frictionlessdata/tableschema-go/csv"

func main() {
   tab, err := csv.NewTable(csv.Remote("myremotetable"), csv.LoadHeaders())
   // Error handling.
   _ = tab
   _ = err
}

Supported physical representations:

  • CSV

Would you like to use tableschema-go but the physical representation you use is not listed here? No problem! Please create an issue before you start contributing. We will be happy to help you along the way.

Schema Inference and Configuration

Got a new dataset and want to start getting your hands dirty ASAP? No problem, let the schema package try to infer the data types based on the table data.

package main

import (
   "fmt"

   "github.com/frictionlessdata/tableschema-go/csv"
   "github.com/frictionlessdata/tableschema-go/schema"
)

func main() {
   tab, _ := csv.NewTable(csv.Remote("myremotetable"), csv.LoadHeaders())
   sch, _ := schema.Infer(tab)
   fmt.Printf("%+v", sch)
}
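The idea behind inference can be illustrated with a tiny stdlib-only sketch. The `inferType` function below is hypothetical, not the library's actual algorithm: it guesses a Table Schema type for a single cell by trying progressively stricter parsers.

```go
package main

import (
	"fmt"
	"strconv"
	"time"
)

// inferType guesses a Table Schema type name for one cell value by
// trying parsers from most to least specific -- a toy version of
// what an inference pass does per column.
func inferType(value string) string {
	if _, err := strconv.ParseInt(value, 10, 64); err == nil {
		return "integer"
	}
	if _, err := strconv.ParseFloat(value, 64); err == nil {
		return "number"
	}
	if _, err := strconv.ParseBool(value); err == nil {
		return "boolean"
	}
	if _, err := time.Parse("2006-01-02", value); err == nil {
		return "date"
	}
	return "string"
}

func main() {
	for _, v := range []string{"42", "3.14", "true", "2015-05-30", "hello"} {
		fmt.Println(v, "->", inferType(v))
	}
}
```

A real inference pass would apply this kind of check across all rows of a column and keep the narrowest type that fits every value.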

Want to go faster? Please give InferImplicitCasting a try and let us know how it goes.

There might be cases in which the inferred schema is not correct. One of those cases is when your data uses strings like "N/A" to represent missing cells. That would usually make our inference algorithm think the field is a string.

When that happens, you can manually perform those last-minute tweaks to the Schema:

   sch.MissingValues = []string{"N/A"}
   sch.GetField("ID").Type = schema.IntegerType

After all that, you could persist your schema to disk:

sch.SaveToFile("users_schema.json")
And use the local schema later:

sch, _ := schema.LoadFromFile("users_schema.json")

Finally, if your schema is saved remotely, you can also use it:

sch, _ := schema.LoadRemote("http://myfoobar/users/schema.json")

Processing Tabular Data

Once you have the data, you would like to process it using Go data types. schema.CastTable and schema.CastRow are your friends on this journey.

package main

import (
   "github.com/frictionlessdata/tableschema-go/csv"
   "github.com/frictionlessdata/tableschema-go/schema"
)

type user struct {
   ID   int
   Age  int
   Name string
}

func main() {
   tab, _ := csv.NewTable(csv.FromFile("users.csv"), csv.LoadHeaders())
   sch, _ := schema.Infer(tab)
   var users []user
   sch.CastTable(tab, &users)
   // The users slice now contains the table contents properly cast
   // into Go types. Each row becomes a new user appended to the slice.
}

If you have a lot of data and cannot load everything in memory, you can easily iterate through it:

   iter, _ := tab.Iter()
   for iter.Next() {
      var u user
      sch.CastRow(iter.Row(), &u)
      // Variable u is now filled with row contents properly cast
      // into Go types.
   }

If you store data in a GZIP file, you can load it compressed using the same csv.FromFile:

   tab, _ := csv.NewTable(csv.FromFile("users.csv.gz"), csv.LoadHeaders())

Even better if you could do it regardless of the physical representation! The table package declares some interfaces that will help you achieve this goal.

Field

A Field represents a single field in the table schema.

For example, data values can be cast to native Go types. Casting a value will check that the value is of the expected type, is in the correct format, and complies with any constraints imposed by the schema.

{
    "name": "birthday",
    "type": "date",
    "format": "default",
    "constraints": {
        "required": true,
        "minimum": "2015-05-30"
    }
}
The following example will return an error because the passed-in date is earlier than the minimum allowed by the field's constraints. Errors will also be returned when the user tries to cast values which are not well-formatted dates.

date, err := field.Cast("2014-05-29")
// uh oh, something went wrong

Values that can't be cast will return an error. Casting a value that doesn't meet the constraints will also return an error.

Available types, formats and resultant value of the cast:

Type       Formats                        Casting result
any        default                        interface{}
object     default                        interface{}
array      default                        []interface{}
boolean    default                        bool
duration   default                        time.Duration
geopoint   default, array, object         [float64, float64]
integer    default                        int64
number     default                        float64
string     default, uri, email, binary    string
date       default, any, <PATTERN>        time.Time
datetime   default, any, <PATTERN>        time.Time
time       default, any, <PATTERN>        time.Time
year       default                        time.Time
yearmonth  default                        time.Time

Saving Tabular Data

Once you're done processing the data, it is time to persist the results. As an example, let us assume we have a remote table schema called summary, which contains two fields: a date and an average age.

package main

import (
   "encoding/csv"
   "os"
   "time"

   "github.com/frictionlessdata/tableschema-go/schema"
)

type summaryEntry struct {
    Date       time.Time
    AverageAge float64
}

func WriteSummary(summary []summaryEntry, path string) {
   sch, _ := schema.LoadRemote("http://myfoobar/users/summary/schema.json")

   f, _ := os.Create(path)
   defer f.Close()

   w := csv.NewWriter(f)
   defer w.Flush()

   w.Write([]string{"Date", "AverageAge"})
   for _, summ := range summary {
       row, _ := sch.UncastRow(summ)
       w.Write(row)
   }
}

API Reference and More Examples

More detailed documentation about the API methods and plenty of examples are available on GoDoc.


Found a problem and would like to fix it? Have a great idea and would love to see it in the repository?

Please open an issue before you start working

That could save a lot of time for everyone and we are super happy to answer questions and help you along the way. Furthermore, feel free to join the frictionlessdata Gitter chat room and ask questions.

This project follows the Open Knowledge International coding standards.

  • Before you start coding:

    • Fork and pull the latest version of the master branch
    • Make sure you have Go 1.8+ installed and you're using it
    • Make sure you have dep installed
  • Before sending the PR:

$ cd $GOPATH/src/
$ dep ensure
$ go test ./...

And make sure all tests pass.