Allow "anyOf" in param["schema"]["type"] in openapi31 #143

Merged
merged 5 commits into from
Oct 24, 2023

Conversation

glatterf42
Contributor

@glatterf42 glatterf42 commented Oct 13, 2023

As hinted at in #112, our project is currently migrating from pydantic v1 to v2. As far as I can see, type hints like <type> | None have become necessary because pydantic v2 moved away from implicit default values: for a field to be optional, a default value (of the corresponding type, or None) now has to be provided, even if the field's type is Optional[<type>]. Adapting to pydantic v2 therefore introduced a lot of default None values, which our endpoints' expected input and response models need to accommodate. As a result, our docs/openapi-v1.json entries now look similar to this:

{"name": "table", "in": "query", "required": false, "schema": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": false, "title": "Table"}}

Since we are using version 3.1.0 of openapi, this leads to the following error:

Exception occurred:
  File "/home/fridolin/ixmp4/.venv/lib/python3.10/site-packages/sphinxcontrib/openapi/openapi31.py", line 312, in _httpresource
    type=param["schema"]["type"], name=param["name"]
KeyError: 'type'

This PR aims to remedy the issue by using type if it is present in schema, and the set of types listed under anyOf otherwise. The fix is taken pretty directly from #113.
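The fallback described above can be sketched roughly as follows (a minimal illustration, not the actual sphinxcontrib-openapi code; get_schema_type and the "string" fallback are made-up names for this sketch):

```python
# Sketch of the fix: prefer a top-level "type", otherwise fall back to the
# types listed under "anyOf" (common in OpenAPI 3.1 output for Optional
# pydantic v2 fields).
def get_schema_type(param):
    schema = param.get("schema", {})
    if "type" in schema:
        return schema["type"]
    if "anyOf" in schema:
        # Collect each alternative's type, e.g. ["boolean", "null"].
        types = [alt["type"] for alt in schema["anyOf"] if "type" in alt]
        return " or ".join(types)
    return "string"  # arbitrary fallback, only for this sketch


param = {
    "name": "table",
    "in": "query",
    "required": False,
    "schema": {
        "anyOf": [{"type": "boolean"}, {"type": "null"}],
        "default": False,
        "title": "Table",
    },
}
print(get_schema_type(param))  # boolean or null
```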

@stephenfin
Member

This looks pretty good. If you can add a unit test for this functionality, I'll be happy to merge it and cut a release.

@glatterf42
Contributor Author

glatterf42 commented Oct 19, 2023

Thanks for enabling the tests :)
I tried to put a unit test in the right location, but I'm not entirely sure I found it: it now lives in tests/renderers/httpdomain/test_render_parameter.py as one of the test_render_parameter_query tests.
Some further notes:

  • This test should probably be marked as only working for the openapi31 renderer, but I couldn't quickly find out how to do that.
  • I'm not entirely sure the expected test result is formatted correctly in terms of the expected type; you can now see here, e.g. for join_parameters, how it looks in our docs.
  • I made these changes using the vscode web editor, for which the usual black-formatter extension doesn't work. Sorry if this causes any trouble.
  • The small if "format" in t.keys() condition was copied from type = param['schema']['type'] breaks in newer spec #113. None of our parameters has a "format", so I don't actually know whether appending the "format" key as part of the type is desired behaviour. In test_render_parameter_query_type_with_format, the "format" key seems to hold additional information, not the type itself. Maybe the if condition needs adapting.
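The "format"-aware variant mentioned in the last bullet might look like the sketch below (an assumption based on #113; type_with_format is an illustrative name, and the "type:format" output shape is a guess at the renderer's convention, not confirmed here):

```python
# Sketch: combine "type" with an optional "format" qualifier, so that
# {"type": "string", "format": "uuid4"} renders as "string:uuid4".
def type_with_format(schema_alt):
    if "format" in schema_alt:
        return f'{schema_alt["type"]}:{schema_alt["format"]}'
    return schema_alt["type"]


print(type_with_format({"type": "string", "format": "uuid4"}))  # string:uuid4
print(type_with_format({"type": "boolean"}))                    # boolean
```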

@stephenfin
Member

I pushed a test for you. It's currently failing, which, unless I've done something obviously wrong, would suggest you've a bit more work to do here?

@glatterf42
Contributor Author

Thanks a lot :)
To me, the error message suggests that the code I'm trying to push isn't being called yet:

testrenderer = <sphinxcontrib.openapi.renderers._httpdomain.HttpdomainRenderer object at 0x7fa948dd5ff0>
testspec = {'info': {'title': 'Reproducer for issue #112', 'version': '2.0.0'}, 'openapi': '3.1.0', 'paths': {'/users': {'get': {...ath', 'name': 'userID', 'schema': {...}}], 'responses': {'200': {'content': {...}}}, 'summary': 'Get a user by ID.'}}}}
rendered = PosixPath('/home/runner/work/openapi/openapi/tests/renderers/httpdomain/rendered')

    def test_render(testrenderer, testspec, rendered):
        testspec_name, testspec = testspec
>       rendered_markup = "\n".join(testrenderer.render_restructuredtext_markup(testspec))

tests/renderers/httpdomain/test_render.py:23: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
sphinxcontrib/openapi/renderers/_httpdomain.py:239: in render_restructuredtext_markup
    yield from self.render_paths(spec.get("paths", {}))
sphinxcontrib/openapi/renderers/_httpdomain.py:268: in render_paths
    yield from self.render_operation(endpoint, method, operation)
sphinxcontrib/openapi/renderers/_httpdomain.py:295: in render_operation
    yield from indented(self.render_responses(operation["responses"]))
sphinxcontrib/openapi/renderers/_httpdomain.py:28: in indented
    for item in generator:
sphinxcontrib/openapi/renderers/_httpdomain.py:399: in render_responses
    yield from self.render_response(str(status_code), response)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sphinxcontrib.openapi.renderers._httpdomain.HttpdomainRenderer object at 0x7fa948dd5ff0>
status_code = '200'
response = {'content': {'application/json': {'schema': {'items': {'properties': {'deleted': {...}, 'id': {...}, 'username': {...}}, 'type': 'object'}, 'type': 'array'}}}}

    def render_response(self, status_code, response):
        """Render OAS operation's response."""
    
        yield f":statuscode {status_code}:"
        yield from indented(
>           self._convert_markup(response["description"]).strip().splitlines()
        )
E       KeyError: 'description'

sphinxcontrib/openapi/renderers/_httpdomain.py:406: KeyError

It seems like render_response() is expecting each response dict to have the key description, so I added one to both responses in your test.
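That expectation matches the OpenAPI specification, where description is the only required field of a Response Object. A minimal response entry that satisfies render_response() might look like this (the field values are illustrative):

```python
# Minimal OpenAPI Response Object: "description" is required by the spec,
# which is why the renderer indexes it unconditionally.
response = {
    "200": {
        "description": "A list of users.",
        "content": {
            "application/json": {
                "schema": {"type": "array", "items": {"type": "object"}},
            }
        },
    }
}
print("description" in response["200"])  # True
```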

@glatterf42
Contributor Author

The latest test failure was caused by the function _get_markers_from_object() in openapi/renderers/_httpdomain.py:112. This function appended schema_type to an empty list called markers; for the bio property in the new test schema, schema_type was itself a list, so markers ended up containing a list inside a list. Line 625 then calls join(), which expects an iterable of strings, not a list of lists, so it failed.
My fix now checks whether schema_type is a list and, if so, uses it as the base markers instead. There was one further issue: the directory tests/renderers/httpdomain/rendered/v3.1/ didn't exist, so I added it (with the file rendered by my local pytest run). Judging from my local run, the tests should pass now :)
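The list handling described above can be sketched like this (a simplified illustration; build_markers is a made-up name, and the real _get_markers_from_object() handles more cases than shown here):

```python
# Sketch: in OpenAPI 3.1, "type" may be a list such as ["string", "null"].
# Start from a copy of that list instead of appending the list itself into
# markers, so that join() later receives only strings.
def build_markers(schema_type):
    if isinstance(schema_type, list):
        markers = list(schema_type)   # each entry becomes its own marker
    elif schema_type:
        markers = [schema_type]
    else:
        markers = []
    return ", ".join(markers)         # join() needs strings, not lists


print(build_markers(["string", "null"]))  # string, null
print(build_markers("integer"))           # integer
```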

One more note: on my local system I use ruff for linting and code-style checks, and it was applied to openapi/renderers/_httpdomain.py here, too. It sorted the imports alphabetically and removed some f-string prefixes (ruff doesn't like f-strings without actual {} expressions in them). Feel free to revert any or all of these changes, but I can highly recommend ruff: it is much faster than e.g. flake8, which you seem to be using here, and can also handle things like isort or even black-style formatting (in alpha testing).

@glatterf42
Contributor Author

Thanks for running again! Could you please fix the end of the file tests/renderers/httpdomain/rendered/v3.1/issue-112.yaml.rst as per the pre-commit failure? This is the only failure now and you probably have the correct working environment already set up. Otherwise, I will do this tomorrow.

Signed-off-by: Stephen Finucane <stephen@that.guru>
@stephenfin stephenfin merged commit 803da3f into sphinx-contrib:master Oct 24, 2023
13 checks passed
@stephenfin
Member

This should be part of 0.8.2. Thanks for the quick turnaround on my review feedback 👍

@glatterf42
Contributor Author

Thank you for your time here, looking forward to the new release :)

@glatterf42 glatterf42 deleted the enh/openapi31-anyOf branch October 24, 2023 16:21
@glatterf42
Contributor Author

You can probably close #53, #54, #96, #112, and #113 now, too :)
