
CSVProducer returns error 'data type must be byte array' #263

Closed
ahamtat opened this issue May 4, 2023 · 0 comments · Fixed by #280
ahamtat commented May 4, 2023

Hello,

I want to use the CSV producer in a response to return data in CSV format. My swagger specification is the following:

swagger: '2.0'
info:
  version: 1.0.0
  description: Provides an API for response in CSV format
  title: Foo example
  license:
    name: Apache 2.0
    url: http://www.apache.org/licenses/LICENSE-2.0.html

host: "localhost:8080"

basePath: "/"

paths:
  /:
    get:
      summary: Produces a response with data in CSV format
      description: Produces a response with data in CSV format
      tags:
        - Query
      produces:
        - "text/csv"
      consumes:
        - "application/json"
      operationId: getCSVResponse
      responses:
        200:
          description: "OK"
          schema:
            type: string
            $ref: "#/definitions/GetCSVFileResponse"
          headers:
            Content-Disposition:
              type: string
              description: "Attachment; filename=example.csv"

definitions:
  GetCSVFileResponse:
    description: CSV file response
    type: string
    format: binary
    x-go-mimetype: text/csv
    x-go-fileName: example.csv
    x-nullable: false

Then I generate code with the swagger v0.30.4 utility:

swagger \
	generate server \
	--target=internal/app/rest \
	-f api/app.swagger.yml

The generated GetCSVResponseOK struct has a Payload field of type models.GetCSVFileResponse, which is an io.ReadCloser:

/*
GetCSVResponseOK OK

swagger:response getCSVResponseOK
*/
type GetCSVResponseOK struct {
	/*Attachment; filename=example.csv

	 */
	ContentDisposition string `json:"Content-Disposition"`

	/*
	  In: Body
	*/
	Payload models.GetCSVFileResponse `json:"body,omitempty"`
}

When I run my code, the CSV producer fails to cast this type to a byte slice and returns the error 'data type must be byte array' at https://github.com/go-openapi/runtime/blob/master/csv.go#L60
I've also tried using only type 'string' in the swagger specification:

  /:
    get:
      summary: Produces a response with data in CSV format
      description: Produces a response with data in CSV format
      tags:
        - Query
      produces:
        - "text/csv"
      operationId: getCSVResponse
      responses:
        200:
          description: "OK"
          schema:
            type: string
          headers:
            Content-Disposition:
              type: string
              description: A header specifying the file name

In this case the generated GetCSVResponseOK struct has a Payload of type string, but the type cast in the CSV producer fails in the same way.

How should I write the swagger specification to get a byte slice as the response payload, or convert other types into a byte slice before calling the CSV producer?

fredbi added a commit to fredbi/runtime that referenced this issue Dec 13, 2023
This PR allows the CSV consumer and producer to be used in a more versatile way.
There is no breaking change to the interface.

* fixes go-openapi#263 (types built with an io.Reader should be able to produce
  CSV)

* csv/consumer can now consume CSV into *csv.Writer, io.Writer, io.ReaderFrom,
  encoding.BinaryUnmarshaler
* also supports the new CSVWriter interface, i.e. anything that can
  Write([]string) error like *csv.Writer
* also supports pointers with underlying type *[][]string, *[]byte and *string, not just
  *[]byte

* csv/producer can now produce CSV from *csv.Reader, io.Reader,
  io.WriterTo, encoding.BinaryMarshaler
* also supports the new CSVReader interface, i.e. anything that can
  Read() ([]string, error) like *csv.Reader
* also supports underlying types [][]string, []byte and string, not just
  []byte

* CSVConsumer and CSVProducer now stream CSV records whenever possible
* like ByteStreamConsumer and ByteStreamProducer, added the CSVCloseStream()
  option

* added support to (optionally) configure the CSV format with CSVOpts,
  using the options made available by the standard library

* doc: documented the above in the exported func signatures
* test: added full unit test of the CSVConsumer and Producer

Signed-off-by: Frederic BIDON <fredbi@yahoo.com>
fredbi added a commit that referenced this issue Dec 22, 2023