
[BUG][Ruby] Required properties with validation requirements generate invalid tests #16629

Closed · 6 tasks done
ivgiuliani opened this issue Sep 20, 2023 · 5 comments · Fixed by #16702

@ivgiuliani
Contributor

Bug Report Checklist

  • Have you provided a full/minimal spec to reproduce the issue?
  • Have you validated the input using an OpenAPI validator (example)?
  • Have you tested with the latest master to confirm the issue still exists?
  • Have you searched for related issues/PRs?
  • What's the actual output vs expected output?
  • [Optional] Sponsorship to speed up the bug fix or feature request (example)
Description

Definitions that:

  1. have a required (non-null) attribute...
  2. ...with some validation requirements

generate a model that validates, every time such an attribute is assigned, that the value satisfies those requirements.

This definition, for example:

SampleModel:
  properties:
    key:
      minLength: 15
      type: string
  required:
    - key

will require that SampleModel.key is non-null and at least 15 characters long. If I followed the trail correctly, this is enforced by the partial_model_generic template (specifically, these lines that add attribute validations).
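For reference, the generated setter ends up looking roughly like this (a paraphrased sketch reconstructed from the failure trace below, not the exact generator output):

# Rough sketch of the generated setter in
# lib/openapi_client/models/sample_model.rb (paraphrased; exact code and
# error messages may differ from the real template output).
def key=(key)
  if key.nil?
    fail ArgumentError, 'key cannot be nil'
  end

  if key.to_s.length < 15
    fail ArgumentError, 'invalid value for "key", the character length must be >= 15'
  end

  @key = key
end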

Unfortunately this means that the generated tests fail: nothing in the test generates a valid value for these properties (and I'm not sure how it could), but the default constructor attempts to assign a default value (nil in this case) that the validator rejects.

I'm happy to put together a PR, but I'm not quite sure what the fix is here: tests can't make up values that satisfy the requirements, and since the validation is somewhat arbitrary I'm not sure how to go about this.

openapi-generator version
  • master
  • 7.0.1
OpenAPI declaration file content or url
swagger: "2.0"
info:
  title: Example API
  version: '1'
consumes:
  - application/json
produces:
  - application/json
basePath: /
host: example.com
definitions:
  SampleModel:
    properties:
      key:
        minLength: 15
        type: string
    required:
      - key
    type: object
    x-access:
      - Public
paths:
  /test:
    post:
      consumes:
        - application/vnd.api+json
        - application/json
      operationId: CreateTest
      parameters:
      - name: uri
        in: body
        description: Test
        required: true
        schema:
          $ref: "#/definitions/SampleModel"
      responses:
        200:
          description: OK
schemes:
- https
Steps to reproduce

Using the definition file above (I'm using Bundler here; the steps to run RSpec might differ depending on the local setup):

$ java -jar modules/openapi-generator-cli/target/openapi-generator-cli.jar generate -g ruby -i minimal-repro.yaml -o repro
...
$ cd repro
$ echo '3.2.2' > .ruby-version
$ bundle install
...
$ bundle exec rspec
............................F.

Failures:

  1) OpenapiClient::SampleModel test an instance of SampleModel should create an instance of SampleModel
     Failure/Error: fail ArgumentError, 'key cannot be nil'

     ArgumentError:
       key cannot be nil
     # ./lib/openapi_client/models/sample_model.rb:96:in `key='
     # ./lib/openapi_client/models/sample_model.rb:63:in `initialize'
     # ./spec/models/sample_model_spec.rb:21:in `new'
     # ./spec/models/sample_model_spec.rb:21:in `block (2 levels) in <top (required)>'
     # ./spec/models/sample_model_spec.rb:25:in `block (3 levels) in <top (required)>'

Finished in 0.00787 seconds (files took 0.13927 seconds to load)
30 examples, 1 failure

Failed examples:

rspec ./spec/models/sample_model_spec.rb:24 # OpenapiClient::SampleModel test an instance of SampleModel should create an instance of SampleModel
Related issues/PRs
Suggest a fix

I can't think of a good solution and will need to have a look at what other languages do. The least-worst option I came up with is to avoid the explicit nil assignment in partial_model_generic.mustache when an attribute is required (see the sketch below). In practice, we could avoid that assignment entirely, as the attribute will be nil by default anyway, which, from a quick look, is what (at least) the Java and Python generators do. But I suspect that nil assignment is there for a reason?
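For context, the relevant part of the generated constructor looks roughly like this (a paraphrased, simplified sketch of the partial_model_generic.mustache output, not the verbatim template); the else branch is the explicit nil assignment discussed above:

# Simplified sketch of the generated initialize. For a required attribute with
# no value supplied, the else branch funnels nil through the validating setter,
# which raises ArgumentError.
def initialize(attributes = {})
  if attributes.key?(:'key')
    self.key = attributes[:'key']
  else
    self.key = nil # required attribute missing: the setter rejects nil
  end
end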

@wing328
Member

wing328 commented Sep 28, 2023

but I suspect that nil assignment is there for a reason?

I can't recall the reason.

What about submitting a PR that removes the nil assignment, to start with?

@ivgiuliani
Contributor Author

I wanted to go backwards through the git history to work out why that nil assignment was added, as the conditions around the assignment seem intentional in the template, but I haven't had time to follow through. I've set some time aside for this tomorrow to see if I can come to a conclusion.

@wing328
Member

wing328 commented Sep 28, 2023

👌 up to you

Unfortunately this means that the generated tests fail,

The auto-generated tests are for demo purposes only. We occasionally run into issues like the one you reported in other generators, and we simply comment out those tests (users can uncomment and edit them if needed).

The goal of the auto-generated tests is to make it easier for users to write tests, since they can simply edit the auto-generated test code, but based on our experience it won't work perfectly for all use cases.
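For this spec, such an edit could look something like the following (a hypothetical change to the generated spec/models/sample_model_spec.rb, not the generated default): supply a value that satisfies minLength: 15 so the validating setter accepts it.

# Hypothetical edit to the auto-generated spec so the example test passes.
describe OpenapiClient::SampleModel do
  describe 'test an instance of SampleModel' do
    it 'should create an instance of SampleModel' do
      instance = OpenapiClient::SampleModel.new(key: 'a' * 15) # 15 chars, meets minLength
      expect(instance).to be_instance_of(OpenapiClient::SampleModel)
    end
  end
end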

@ivgiuliani
Contributor Author

I spent some time looking at this today, and fundamentally there isn't a good fix. The explicit nil assignment needs to stay, as otherwise it becomes possible to initialize invalid models by not passing a required attribute (illustrated below). I can't really think of anything else that would fix this other than commenting out most, if not all, of the actual test code (model_test.mustache), but that feels a bit heavy-handed.
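A sketch of the concern, under the assumption that the else branch were simply dropped from the template: omitting the required attribute would never reach the validating setter, so the model would be constructed silently in an invalid state.

# Hypothetical behaviour if the explicit nil assignment were removed:
model = OpenapiClient::SampleModel.new # :key not passed, setter never invoked
model.key # => nil, even though key is required with minLength: 15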

Taking a step back, the reason I was trying to get this sorted is that we ended up accidentally shipping a broken client to internal consumers off the back of #5582, so I was looking for something that would at least give some reassurance that the output code is valid. I've since solved that problem in a different/better way, but it would have been nice to have the tests also pass by default. However, as you said, they are not exactly useful right now and are largely there for demo purposes, so the return on investment in resolving this issue is limited.

With that in mind, I'd be happy for this to be closed, unless there are better ideas...

@wing328
Member

wing328 commented Oct 2, 2023

@ivgiuliani thanks again for your time looking into this issue.

I've filed #16702 to improve the model tests, and it now works right out of the box for the spec you provided.
