make it less pg dependant #85
With an update to Go 1.13, the ties to pg will be even smaller, but right now, pq.Array is too convenient to substitute.
I did not realize pq.Array was doing so much... this needs some deeper inspection. There are two examples of differences I can think of.
The solution for arrays indeed works, but it covers a specific case that was already addressed by a previous issue (refer to #77). I think a more general and idiomatic approach is to make full use of the sql.Scanner and driver.Valuer interfaces. I think this piece of code could be useful:

```go
package models

import (
	"database/sql/driver"
	"encoding/csv"
	"errors"
	"fmt"
	"regexp"
	"strings"
)

// StringSlice is a slice of strings.
type StringSlice []string

// quoteEscapeRegex matches escaped quote characters in a string.
var quoteEscapeRegex = regexp.MustCompile(`([^\\]([\\]{2})*)\\"`)

// Scan satisfies the sql.Scanner interface for StringSlice.
func (ss *StringSlice) Scan(src interface{}) error {
	buf, ok := src.([]byte)
	if !ok {
		return errors.New("invalid StringSlice")
	}
	// change quote escapes for the csv parser
	str := quoteEscapeRegex.ReplaceAllString(string(buf), `$1""`)
	str = strings.Replace(str, `\\`, `\`, -1)
	// remove the surrounding braces
	str = str[1 : len(str)-1]
	// bail if empty
	if len(str) == 0 {
		*ss = StringSlice([]string{})
		return nil
	}
	// parse with a csv reader
	cr := csv.NewReader(strings.NewReader(str))
	slice, err := cr.Read()
	if err != nil {
		return fmt.Errorf("parsing StringSlice: %v", err)
	}
	*ss = StringSlice(slice)
	return nil
}

// Value satisfies the driver.Valuer interface for StringSlice.
func (ss StringSlice) Value() (driver.Value, error) {
	v := make([]string, len(ss))
	for i, s := range ss {
		v[i] = `"` + strings.Replace(strings.Replace(s, `\`, `\\\`, -1), `"`, `\"`, -1) + `"`
	}
	return "{" + strings.Join(v, ",") + "}", nil
}
```

Note: this is a straight copy of some of the code generated by the xo package, but it works and can be adapted for any of the basic Go types. It can be used both as a field in the generated structs and as a parameter in the db.Query* methods. If we include scanner types for the ints, the floats, string, bool, Time, and interface{} in the generated db.go or models.go file, and use them to replace each instance of a Postgres array, then the dependence on the pq package would be eliminated from Go 1.13 onwards. Any version below that cannot avoid depending on pq.
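To make the pattern above concrete, here is a minimal, self-contained sketch of the same Scanner/Valuer idea applied to a Postgres `int[]` column. The `IntSlice` name and the simplified (no quoting needed for integers) parsing are my own illustration, not code generated by xo or sqlc:

```go
package main

import (
	"database/sql/driver"
	"errors"
	"fmt"
	"strconv"
	"strings"
)

// IntSlice is a hypothetical Scanner/Valuer for a Postgres int[] column,
// following the same pattern as the StringSlice example above.
type IntSlice []int

// Scan parses a Postgres array literal such as {1,2,3}.
func (is *IntSlice) Scan(src interface{}) error {
	buf, ok := src.([]byte)
	if !ok {
		return errors.New("invalid IntSlice")
	}
	str := strings.Trim(string(buf), "{}")
	if str == "" {
		*is = IntSlice{}
		return nil
	}
	parts := strings.Split(str, ",")
	out := make([]int, len(parts))
	for i, p := range parts {
		n, err := strconv.Atoi(strings.TrimSpace(p))
		if err != nil {
			return err
		}
		out[i] = n
	}
	*is = IntSlice(out)
	return nil
}

// Value renders the slice back into a Postgres array literal.
func (is IntSlice) Value() (driver.Value, error) {
	parts := make([]string, len(is))
	for i, n := range is {
		parts[i] = strconv.Itoa(n)
	}
	return "{" + strings.Join(parts, ",") + "}", nil
}

func main() {
	var is IntSlice
	if err := is.Scan([]byte("{1,2,3}")); err != nil {
		panic(err)
	}
	fmt.Println(is) // [1 2 3]
	v, _ := is.Value()
	fmt.Println(v) // {1,2,3}
}
```

Because both methods only use the standard library, such a type can be passed straight to db.Query* and row.Scan without importing pq.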
@mh-cohen I understand the appeal of having sqlc work with multiple databases. However, the current implementation is built on top of the PostgreSQL query parser (https://github.com/lfittl/pg_query_go). If there is a Go package containing an SQL parser for your database of choice, I can add it to sqlc.
Hi,
While I really like the project, it is tied to the pgsql engine. I would like the package to be as agnostic as possible, with as few ties to the underlying engine as possible.
In my current understanding of the package, it parses the SQL CREATE and ALTER statements to perform some type resolution. That type metadata is then used to produce compatible function signatures, as far as I have understood so far.
As a consequence, the package uses a full-blown parser of pgsql statements.
Have you considered extracting that type metadata from the Go structs instead?
Rather than building the catalog from the CREATE/ALTER statements, a catalog could be built from the query statements; using name resolution, their types could then be retrieved from the Go metadata obtained earlier.
The SQL query parser could be as agnostic as possible: understanding just enough to extract (table|alias).column names, but blind enough not to choke when it encounters a function name (or anything similarly specific to a db engine) it does not know about.
The code generator has some ties to pg too, but fixing that does not look like a big rethink.