
[#3748] feat: add New IODescriptor "TypedJSON" #3749

Closed
38 changes: 37 additions & 1 deletion src/bentoml/_internal/io_descriptors/json.py
@@ -1,10 +1,16 @@
 from __future__ import annotations
 
+import sys
 import json
 import typing as t
 import logging
 import dataclasses
 
+if sys.version_info >= (3, 9, 2):
+    from typing import TypedDict
+else:
+    from typing_extensions import TypedDict
Comment on lines +9 to +12
Contributor:
Since 3.7 will reach EOL in 2 months, it is probably best not to do this and to just use t.TypedDict once BentoML drops 3.7 support.

Contributor:
Any reason this is checking for 3.9.2?

Contributor:
Sorry, I read the above comment. Looks like this will have to stay in even if we drop 3.7 support? Definitely not a reason to delay this PR, I think.

Contributor:
We'll probably also need to keep typing_extensions as a dependency in pyproject.toml.
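For context on the 3.9.2 question in this thread: pydantic itself only supports `typing.TypedDict` on Python >= 3.9.2, because earlier versions do not reliably record which keys are required vs. optional at runtime, and pydantic needs that metadata to build a model. `typing_extensions.TypedDict` backports the fixed behavior, which is why the dependency has to stay. A small sketch of the runtime metadata involved (illustrative class names, not from the PR):

```python
# Sketch: the runtime introspection data pydantic relies on. With the
# fixed TypedDict (typing on >= 3.9.2, or typing_extensions earlier),
# required and optional keys are tracked correctly across inheritance.
import sys

if sys.version_info >= (3, 9, 2):
    from typing import TypedDict
else:
    from typing_extensions import TypedDict


class Base(TypedDict, total=False):
    nickname: str  # optional, because total=False


class User(Base):
    id: int  # required: this class defaults to total=True


# Both key sets must be correct for a validator to know what to enforce.
print(User.__required_keys__)   # frozenset({'id'})
print(User.__optional_keys__)   # frozenset({'nickname'})
```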


 import attr
 from starlette.requests import Request
 from starlette.responses import Response
@@ -180,6 +186,8 @@ def __init__(
         self,
         *,
         pydantic_model: t.Type[pydantic.BaseModel] | None = None,
+        typed_dict: t.Type[TypedDict] | None = None,
+        sample: JSONType | None = None,
Contributor:
Let's not put sample here, since there is no need to.

Suggested change
-        sample: JSONType | None = None,
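To illustrate the suggestion above, here is a sketch of the general `from_sample` pattern the reviewer is pointing at (hypothetical names, not BentoML's actual implementation): the alternate constructor builds the descriptor normally and attaches the sample afterwards, keeping `__init__` free of sample-only parameters.

```python
# Illustrative sketch only; class and attribute names are hypothetical,
# not BentoML's API.
import typing as t


class JSONDescriptor:
    def __init__(self, *, validate_json: t.Optional[bool] = None):
        self._sample: t.Any = None
        self._validate_json = validate_json

    @classmethod
    def from_sample(cls, sample: t.Any, **kwargs: t.Any) -> "JSONDescriptor":
        # Construct normally, then attach the sample, so __init__ does
        # not need a 'sample' argument at all.
        inst = cls(**kwargs)
        inst._sample = sample
        return inst


spec = JSONDescriptor.from_sample({"text": "hello"})
```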

         validate_json: bool | None = None,
         json_encoder: t.Type[json.JSONEncoder] = DefaultJsonEncoder,
     ):
@@ -192,6 +200,24 @@ def __init__(
             pydantic_model, pydantic.BaseModel
         ), "'pydantic_model' must be a subclass of 'pydantic.BaseModel'."
 
+        if not typed_dict and sample:
+            raise BadInput(
+                "JSON argument 'sample' is only supported with 'typed_dict'; "
+                "to set a sample, use 'JSON.from_sample()'."
+            )
+        if (
+            typed_dict and typed_dict is not dict
+        ):  # there is no way to check isinstance(x, TypedDict) at runtime
+            assert issubclass(
+                typed_dict, dict
+            ), "'typed_dict' must be a subclass of 'typing.TypedDict' or 'dict'."
+
+        if typed_dict:
+            from pydantic import create_model_from_typeddict
+
+            self._typed_dict_model = create_model_from_typeddict(typed_dict)
Comment on lines +215 to +218
Contributor:
Does this mean typed_dict depends on pydantic? It seems kinda strange that in order to use TypedDict you need to have pydantic installed.

Contributor:
If this is the case, then typed_dict and pydantic_model should be mutually exclusive.

Might also need an import check, or just lazy-load it.
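Pulling these review points together, here is one way the constructor logic could address them (a sketch; the function and error names are illustrative, not BentoML's API): make the two model sources mutually exclusive, use `is_typeddict` (in `typing` since 3.10, backported in `typing_extensions`) for a genuine runtime check instead of the `issubclass(typed_dict, dict)` workaround, and import pydantic lazily with an actionable error.

```python
# Sketch of the reviewers' suggestions; names are illustrative, not
# BentoML's actual API.
try:
    from typing import is_typeddict  # Python 3.10+
except ImportError:  # older Pythons: use the backport
    from typing_extensions import is_typeddict


def resolve_model(pydantic_model=None, typed_dict=None):
    # The two sources of a schema are mutually exclusive.
    if pydantic_model is not None and typed_dict is not None:
        raise ValueError(
            "'pydantic_model' and 'typed_dict' are mutually exclusive."
        )
    if typed_dict is not None:
        # A genuine runtime check, instead of issubclass(typed_dict, dict).
        if not is_typeddict(typed_dict):
            raise TypeError("'typed_dict' must be a TypedDict subclass.")
        # Lazy import: pydantic is only needed when 'typed_dict' is used.
        try:
            from pydantic import create_model_from_typeddict  # pydantic v1
        except ImportError:
            raise ImportError(
                "'typed_dict' support requires pydantic; "
                "install it with 'pip install pydantic'."
            ) from None
        return create_model_from_typeddict(typed_dict)
    return pydantic_model
```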

+        else:
+            self._typed_dict_model = None
         self._pydantic_model = pydantic_model
         self._json_encoder = json_encoder

@@ -300,8 +326,18 @@ def input_type(self) -> UnionType:
         return JSONType
 
     def openapi_schema(self) -> Schema:
-        if not self._pydantic_model:
+        if not self._pydantic_model and not self._typed_dict_model:
             return Schema(type="object")
+        elif self._typed_dict_model:
+            return Schema(
+                **schema.model_process_schema(
+                    self._typed_dict_model,
+                    model_name_map=schema.get_model_name_map(
+                        schema.get_flat_models_from_model(self._typed_dict_model)
+                    ),
+                    ref_prefix=REF_PREFIX,
+                )[0]
+            )
 
         # returns schemas from pydantic_model.
         return Schema(