
database/sql: allow drivers to support additional Value types #13567

Closed
theory opened this issue Dec 10, 2015 · 5 comments

@theory theory commented Dec 10, 2015

It'd be useful if drivers could augment the list of data types supported by database/sql/convert.go. The ability to implement driver.Valuer and sql.Scanner for one's own structs is great, but there are core types I'd like to be able to use as well. For example, PostgreSQL supports array types, and I'd love to be able to convert between slices and SQL arrays. It makes sense that database/sql itself would not support this, but an interface that let the pg or pgx drivers add it for core types would be super useful. Perhaps Driver could have methods for this kind of conversion? Maybe something like:

if scanned, ok := driver.Scan(src); ok {
    return scanned
}

if val, ok := driver.Value(src); ok {
    return val
}

Then the drivers could be updated to, say, convert between []string and TEXT[].
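As background for the request above, the existing sql.Scanner and driver.Valuer interfaces already cover wrapper types you define yourself; what follows is a minimal sketch of that pattern for a hypothetical StringArray type. The parsing is deliberately naive (no quoting or escaping of array elements), just to show the round trip with a Postgres-style array literal:

```go
package main

import (
	"database/sql/driver"
	"fmt"
	"strings"
)

// StringArray is a hypothetical wrapper type; real drivers and helper
// libraries handle quoting, escaping, and NULL elements, which this
// sketch deliberately skips.
type StringArray []string

// Scan implements sql.Scanner: convert a driver value such as
// []byte("{a,b}") back into a []string.
func (a *StringArray) Scan(src interface{}) error {
	b, ok := src.([]byte)
	if !ok {
		return fmt.Errorf("StringArray: cannot scan %T", src)
	}
	s := strings.Trim(string(b), "{}")
	if s == "" {
		*a = nil
		return nil
	}
	*a = strings.Split(s, ",")
	return nil
}

// Value implements driver.Valuer: render the slice as "{a,b}".
func (a StringArray) Value() (driver.Value, error) {
	return "{" + strings.Join(a, ",") + "}", nil
}

func main() {
	var a StringArray
	if err := a.Scan([]byte("{red,green,blue}")); err != nil {
		panic(err)
	}
	fmt.Println(a) // [red green blue]
	v, _ := a.Value()
	fmt.Println(v) // {red,green,blue}
}
```

The limitation the issue is about is that this only works for types you own; a driver cannot hook the conversion for core types like []string itself.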

@rsc rsc changed the title database/sql: Add Interface for Drivers to Support Additional data types database/sql: allow drivers to support additional Value types Dec 28, 2015
@rsc rsc added this to the Unplanned milestone Dec 28, 2015

@rsc rsc commented Dec 28, 2015

/cc @bradfitz


@theory theory commented Jan 4, 2016

I'm thinking we could create a new interface that drivers may implement. If a driver implements it, use it; if not, fall back to the current behavior. That should ensure backward compatibility.


@kardianos kardianos commented Oct 10, 2016

Databases can support non-standard result types. Common examples include arrays, JSON, XML, table-valued types, IP addresses, and GIS types.

For scanning custom result types, I propose the following changes:

package driver

type ScanAssigner interface {
    Rows
    Assign(dest, src interface{}) (ok bool, err error)
}

package sql

// Existing struct
type Rows struct {
    // New field, assigned when Rows is created.
    assigner driver.ScanAssigner
}

// Modify the existing function: pass the assigner from Rows into convertAssign.
func convertAssign(assigner driver.ScanAssigner, dest, src interface{}) error {
    ok, err := assigner.Assign(dest, src) // Try the driver first, before any existing conversions.
    if err != nil {
        return err
    }
    if ok {
        return nil
    }
    // Existing function
}

I think the above should allow returning any database-supported type. Accepting arbitrary types as input parameters will be a separate issue.

@rsc rsc modified the milestones: Go1.9, Go1.8Maybe Oct 20, 2016
@derekperkins derekperkins commented Nov 28, 2016

This would also allow drivers to truly support uint64 values. Discussion about that was shut down in #9373, but the MySQL implementation requires a hack that returns a uint64 value as a string. I'm really hoping that this makes it into Go 1.9.

@gopherbot gopherbot commented Mar 23, 2017

CL https://golang.org/cl/38533 mentions this issue.

@gopherbot gopherbot closed this in a9bf3b2 May 18, 2017
@golang golang locked and limited conversation to collaborators May 18, 2018