
"Autosize" union type accepts string formatted number #30

Closed
MingboPeng opened this issue Jan 30, 2020 · 16 comments
@MingboPeng
Member

MingboPeng commented Jan 30, 2020

@chriswmackey
Currently, the autosizable parameters accept either a number or a string, but they should only accept "autosize"/"autocalculate" when the input is a string. In the current schema design, a string-formatted number is also accepted, which could potentially cause a problem.
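For context, here is a minimal sketch of the kind of field being discussed (this is not the actual honeybee-schema code; the model and field names are only illustrative, and Pydantic 1.x is assumed):

```python
# A hypothetical field that mirrors the pattern under discussion: a Union of
# float and str.  The generated JSON schema is an anyOf of "number" and
# "string", so a payload carrying a string-formatted number like "5.0" still
# validates against the schema even though only the "autosize" keyword is
# intended.
from typing import Union

from pydantic import BaseModel


class IdealAirSketch(BaseModel):
    heating_limit: Union[float, str] = 'autosize'


print(IdealAirSketch.schema_json(indent=2))
# ... "heating_limit": {"default": "autosize",
#                       "anyOf": [{"type": "number"}, {"type": "string"}]} ...
```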

@mostaphaRoudsari
Member

In the current schema design, a string-formatted number is also accepted, which could potentially cause a problem.

Whatever decision we end up making, let's not do this one. I can see this being a major problem in the long run.

@chriswmackey
Member

A string-formatted number is not acceptable in the current schema because of this validator:
https://github.com/ladybug-tools/honeybee-schema/blob/master/honeybee_schema/energy/hvac.py#L75-L78
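For reference, this is roughly the shape of such a validator (a hedged sketch, not a verbatim copy of the linked hvac.py lines; the model and field names are illustrative):

```python
# A sketch of a validator that rejects any string other than the "autosize"
# keyword after Pydantic has parsed the union value.
from typing import Union

from pydantic import BaseModel, validator


class IdealAirSketch(BaseModel):
    heating_limit: Union[float, str] = 'autosize'

    @validator('heating_limit')
    def check_autosize_keyword(cls, v):
        """If the parsed value is a string, it must be the autosize keyword."""
        if isinstance(v, str):
            assert v == 'autosize', \
                'heating_limit string must be "autosize". Got "{}".'.format(v)
        return v
```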

I realize that validators don't translate into the OpenAPI schema file, though, so that's probably why you thought this. We'll have to come up with some system for you to be aware of the validators on the C# side going forward.

In any case, this one situation of floats and strings has caused so many headaches and I really just want a solution at this point. Should we just change this so that None means NoLimit and negative numbers mean autosize?

@MingboPeng
Member Author

MingboPeng commented Jan 30, 2020

@chriswmackey is None able to be translated to the OpenAPI schema file? We are discussing the same situation in ScheduleTypeLimit, where default=None won't be labeled in the schema file.

@chriswmackey
Member

For the case of the "heating_limit" and "cooling_limit", I think we would set the default to be -1 for autosize rather than None, though None should still be an acceptable input.

For ScheduleTypeLimit, you can still tell from the OpenAPI specification that a given property has a default of None: it has no default and it doesn't appear in the "required" section of the object it is part of, like so:
[screenshot]
Also note that, if any key does not appear in the "required" section, then it is perfectly fine to send a JSON that lacks the key in its entirety.
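To illustrate this point with a hedged sketch (hypothetical names, assuming Pydantic 1.x): a property whose default is None simply shows up with no "default" entry and is absent from "required".

```python
# A property with default=None is not listed in "required" and carries no
# "default" entry in the generated schema, which is how a consumer of the
# OpenAPI file can infer the None default.
from typing import Optional

from pydantic import BaseModel


class ScheduleTypeLimitSketch(BaseModel):
    name: str
    lower_limit: Optional[float] = None


schema = ScheduleTypeLimitSketch.schema()
assert 'default' not in schema['properties']['lower_limit']
assert 'lower_limit' not in schema['required']  # only 'name' is required
```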

On a side note, I just realized that the view_factor attribute of the Outdoors boundary condition has this same situation where it can accept either a number or an "autocalculate" string:
https://www.ladybug.tools/honeybee-schema/model.html#tag/outdoors_model
Let me know if we need to do something similar here where we need to accept negative inputs to get it to work.

@MingboPeng
Member Author

@chriswmackey for the "heating_limit" and "cooling_limit" case (including all other autosize/autocalculate params), setting the default value to -1 would work for accepting only numbers rather than string-formatted numbers. But this only works if all honeybee client sides (Rhino, Grasshopper, Revit, and other developers) use -1 as the default value for 'autosize'.

In terms of accepting None for these number-type parameters, any case that generates a JSON file with a null (None) value for them will still crash the C# side, because they are not Nullable types.

As you said, omitting these optional keys (heating_limit, etc.) from the JSON file entirely will work, but assigning null to heating_limit will not. They are two different cases.

From a Python/C# developer's perspective, when do we really need to assign a null value to these optional parameters? We can simply not assign anything instead of passing a null.

Here are two examples:
schedule_type_limit("a new name")
vs
schedule_type_limit("a new name", lower_limit = null)

Technically, I can, and have, changed the C# generator's template to make all non-required parameters Nullable. But I still don't think we should accept None for these optional parameters.

At the SDK level, we should take care of the default values of optional parameters when users (developers) don't assign values to them, instead of accepting nulls from users (developers).

To summarize the above comments:

  • let's change all 'autosize'/'autocalculate' types to number only, and use -1 for 'autosize'/'autocalculate' (see the sketch below).

  • let's put default values on optional parameters, instead of accepting nulls.

Please let me know if you think otherwise.
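For clarity, here is a sketch of what the first bullet would mean for the schema (this reflects the proposal at this point in the thread, not what was ultimately adopted; names are illustrative):

```python
# The union disappears entirely and a sentinel of -1 stands for autosize.
from pydantic import BaseModel, Field


class IdealAirSketch(BaseModel):
    heating_limit: float = Field(
        -1,
        description='Heating limit as a number. A value of -1 means autosize.'
    )
```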

@chriswmackey
Member

@MingboPeng ,
Can we just set up a call between you, Mostapha, and me so that we can understand why it is difficult to support some of these things?

In the first case of string-formatted numbers, I am unclear on why something akin to the validator we have in Pydantic couldn't be implemented on your side.

Second, you'll have to clarify what you mean by "not accepting None", since this isn't realistic for most cases; rather, trying to get rid of it is likely going to involve implementing more cases of (string, float) unions, like accepting lower_limit = "NoLimit". If you are just saying that our JSON-writing functions should usually not write keys when they are None, I would agree; that is generally what I have done in most of my to_dict methods. But not allowing our API users to input None for a key when they want to be explicit about it doesn't seem realistic, or even enforceable, within OpenAPI.
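As an aside, here is a minimal sketch of that JSON-writing convention (hedged; this is not the actual honeybee to_dict implementation):

```python
# Keys whose value is None are simply left out of the serialized dictionary,
# so the resulting JSON never carries explicit nulls for optional fields.
def to_dict_sketch(obj):
    """Serialize an object's public attributes, omitting any that are None."""
    return {
        key: value for key, value in vars(obj).items()
        if not key.startswith('_') and value is not None
    }
```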

Let's just set up a call. We've had these issues hanging over our heads for a while and changing them is going to involve a fair amount of work and PRs across several repos that I really don't want to have to revert if we're wrong. So let's just set up a call and get a final decision between you, @mostaphaRoudsari, and me.

@chriswmackey
Member

I just wanted to document what we decided as a final solution to these issues this morning:
We will add three new objects to the schema to deal with these cases: AutoCalculate, AutoSize and NoLimit. These will be used in place of strings with the same name and null values such that all of these cases will essentially be unions of floats with these 3 objects. We like this solution above all others because:

  1. It's explicit about what the assumption is (using negative numbers was implicit and tough to interpret by looking at schema samples).
  2. It avoids illegal combinations of values. For example, in the solution Antoine had originally proposed for these cases, he had separate keys for 'config' and 'value', which made it possible for someone to put in a "heating_limit" of "AutoSize" for 'config' but a number for 'value'. The currently proposed solution prevents us from having to add additional validators to check illegal cases like this.
  3. These 3 objects are easy to check and allow @MingboPeng to deal with all situations of autosize and no limit across the schema through a single object.

I'll work on implementing this solution today.
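For reference, a rough sketch of the shape of these objects and how they combine with floats (hedged; see the honeybee-schema commits referenced below for the adopted implementation, as class and field names may differ in detail):

```python
# Each marker object carries only a constrained "type" string, so the union is
# explicit in the schema about whether a number or an autosize/no-limit
# assumption is being supplied.
from typing import Union

from pydantic import BaseModel, Field, constr


class Autosize(BaseModel):
    type: constr(regex='^Autosize$') = 'Autosize'


class NoLimit(BaseModel):
    type: constr(regex='^NoLimit$') = 'NoLimit'


class IdealAirSketch(BaseModel):
    heating_limit: Union[Autosize, NoLimit, float] = Field(
        default=Autosize(),
        description='Heating limit, either a number or one of the marker objects.'
    )
```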

chriswmackey added a commit to chriswmackey/honeybee-schema that referenced this issue Feb 3, 2020
chriswmackey added a commit to chriswmackey/honeybee-schema that referenced this issue Feb 3, 2020
chriswmackey added a commit that referenced this issue Feb 4, 2020
@chriswmackey
Member

Took a while to address but it's finally implemented.

@MingboPeng
Member Author

Hi @chriswmackey, for some reason, the new "NoLimit" is not exported to OpenAPI correctly.

[screenshot]

MingboPeng reopened this Feb 4, 2020
@chriswmackey
Member

Unfortunately, nothing I've tried seems to get rid of the extra "allOf" keys in the OpenAPI specification:
[screenshot]
I may have to resort to hacking Pydantic, which may produce undesirable results. Is the extra hierarchy going to be a dealbreaker for you, @MingboPeng?

@chriswmackey
Member

I have noticed that I can get this extra "allOf" hierarchy to go away only when:

  1. I make the key required (so there is no default value)
  2. I don't use the Field class to add other properties like the description

This all seems too odd to me, since the lack of the "allOf" keys is pretty clearly the desired OpenAPI output and there's no clear reason why the Field class should change this. I've mentioned to @mostaphaRoudsari that this really feels like a bug in Pydantic to me, but I know he's still not convinced yet.

Let's get Mostapha's thoughts and, if he doesn't know of any clean way to prevent the extra "allOf" keys, I will just overwrite the Pydantic-produced schema using the schema_extra capability.
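For what it's worth, a sketch of the schema_extra approach being referred to (hedged; this illustrates the mechanism in Pydantic 1.x, not the exact workaround that was merged):

```python
# Config.schema_extra (as a callable in Pydantic 1.x) lets us post-process the
# generated schema and replace the allOf-wrapped property with a plain anyOf.
from typing import Union

from pydantic import BaseModel, Field


class Autosize(BaseModel):
    type: str = 'Autosize'


class IdealAirSketch(BaseModel):
    heating_limit: Union[Autosize, float] = Field(
        default=None, description='Heating limit or Autosize.'
    )

    class Config:
        @staticmethod
        def schema_extra(schema, model):
            # strip the unwanted "allOf" wrapper and write the intended anyOf
            prop = schema['properties']['heating_limit']
            prop.pop('allOf', None)
            prop['anyOf'] = [
                {'$ref': '#/definitions/Autosize'},
                {'type': 'number'},
            ]
```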

@chriswmackey
Member

I can confirm that the schema_extra capability can get the job done but it feels pretty hacky. I'll hold off for now before going any further.

@MingboPeng
Member Author

@chriswmackey I think this is what we discovered previously: Union doesn't like Field in Pydantic.

@mostaphaRoudsari
Member

mostaphaRoudsari commented Feb 5, 2020

I've mentioned to @mostaphaRoudsari that this really feels like a bug in Pydantic to me but I know he's still not convinced yet.

It looks like a bug to me. My other comment was only clarifying that the sample that you linked to was related to a different topic.

You should go ahead and report it. It looks like using the combination of Union and Field makes unnecessary nested allOf objects. It might be that we should use Field differently with Union, or it is simply a bug in Pydantic.

It is good that you have already figured out a workaround for now.

@chriswmackey
Member

Sorry for incorrectly paraphrasing, @mostaphaRoudsari. I just knew that you wanted us to investigate more before reporting it. If we agree that this smells like a bug, I'll open an issue on the Pydantic GitHub. The code to recreate the issue should be minimal, so hopefully they can tell us pretty quickly whether it's a bug. I may also just implement my workaround for now, though I can see that Pydantic is still making frequent releases (they just released 1.4), so I won't spend much time on it.
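In case it is useful, a minimal sketch of the kind of reproduction being described (hedged; the names are illustrative, and the exact report is in the Pydantic issue linked in the following comment):

```python
# Per the discussion above, combining a Union that contains a model with
# Field(...) metadata and a default was producing the extra "allOf" nesting on
# Pydantic 1.x, instead of the flat anyOf/$ref that is wanted.
from typing import Union

from pydantic import BaseModel, Field


class NoLimit(BaseModel):
    type: str = 'NoLimit'


class ScheduleTypeLimitSketch(BaseModel):
    lower_limit: Union[NoLimit, float] = Field(
        default=None, description='Lower limit for the schedule type.'
    )


print(ScheduleTypeLimitSketch.schema_json(indent=2))
# Inspect the "lower_limit" property for the unexpected "allOf" wrapper.
```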

@chriswmackey
Member

I'm just noting that I put in the workaround as part of this PR:
#45
... and it seems to be working in the generated docs so I'll close out this issue for now.

I also reported this issue on the Pydantic GitHub:
pydantic/pydantic#1209
After we get a response from them, either a bug fix or a note that we're doing it wrong, we can hopefully remove my workaround.
