null values and addConcreteSettersForOptional #107
Comments
Yeah - I think that might be a mistake. I see no reason why it shouldn't be `Optional#ofNullable`.
@lpandzic it is intentional, but feel free to change it. My reasoning is that if you have nulls you shouldn't be calling the setter at all; you should know whether your value is nullable or not. Another way of wording it: if you have a code base full of Optionals you should already be passing parameters as Optionals, so you should be using the setter for the Optional type. That being said, there is no harm in just tolerating it in the library code like you suggest.
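To make the two setter shapes under discussion concrete, here is a minimal hand-written sketch; `PersonBuilder` and `nickname` are hypothetical names, not the library's actual generated code:

```java
import java.util.Optional;

// Sketch only: a stand-in for what a generated builder might look like.
class PersonBuilder {
    private Optional<String> nickname = Optional.empty();

    // Setter for the declared Optional type: callers already hold an Optional.
    PersonBuilder nickname(Optional<String> value) {
        this.nickname = value;
        return this;
    }

    // "Concrete" setter for the raw value. With Optional.of, passing null
    // throws NullPointerException immediately; Optional.ofNullable would
    // instead map null to Optional.empty().
    PersonBuilder nickname(String value) {
        this.nickname = Optional.of(value);
        return this;
    }

    Optional<String> nickname() {
        return nickname;
    }
}
```

The whole issue is about which of the two `Optional` factory methods the concrete setter should delegate to.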
So here is my reasoning for why this is problematic:
Now of course there's the third case, as you've mentioned, where you have a non-Optional parameter whose nullability you do know, and for that case the current API makes sense. So to finish off - I see it being slightly more sensible to have the null-tolerant `Optional#ofNullable` behavior.
Just a clarification for @lpandzic and @mads-b: you really, really should not use the javax.validation annotations for null analysis. javax.validation is designed for validation; the values are expected to be possibly null. Before the object is validated the values could be null! Null analysis, on the other hand, is an invariant and is type based. That's why you should use either JSpecify, Checker, or Eclipse's annotations, as they are TYPE_USE annotations. That is why #106 is important to fix.

@Randgalt it doesn't even make sense to have methods with javax.validation.constraints (your gist you shared on reddit showing the full builder) unless you immediately call the validator. In fact the requireNonNull on justAString is wrong, as are the previous lines; it should throw a constraint validation exception. The other two parameters are

To be honest a record isn't the ideal use case for validation. It's actually the builder. A web request or whatever fills the builder, and then when you build the record the builder is validated, making sure the record has all its invariants, like non-null fields.
I'm not using it - I'm using the fact that IDEA lacks inference for bean validation. On the topic of whether bean validation
I disagree. Invariants should be both documented and validated on records, not builders. Records are the single source of truth for fields and the properties of those fields. Whether a builder validates anything is, and should be, a configurable option, not a requirement. Any larger codebase with a larger API surface cannot guarantee that builders are always used (e.g. by external developers). If an external developer creates a record directly, what happens then? Is validation done twice? Anyway, I believe this side story is a topic for another issue. I'd like to focus on changing the Optional handling in this issue.
Maybe you can clarify that some more since I'm working on the other issue. I don't want to impact it. What inference are we talking about that isn't null analysis? Some completion? I haven't used IntelliJ in a year.
Well, that's the thing: they are not really invariants like types. I don't really care what people's interpretation of the javax annotations is, so long as you're not using them for null analysis (you should not), as they are not TYPE_USE annotations. I have seen libraries do this and it is really annoying.

Java validation is mostly used for this workflow: you have an HTML form presented on a GET request. Your model is an object with validation annotations that is either completely blank or partially filled. The object that gets put in the model (e.g. via data binding) is often largely not valid. Then a POST of the form happens. Then you run validation in your controller. If you don't run validation first, you could get a NullPointerException instead of a constraint violation exception. Regardless, you need the object in an invalid state at times. That isn't the same as the static-analysis invariance I'm talking about with null analysis, which is what you would want with records.

I meant earlier that builders are a better fit for model objects to be validated because they are inherently mutable and can very easily be in the wrong state (and some data binding requires mutable POJOs). It's fine if the records have the validation annotations, but they are not invariants. If you do magical AspectJ or whatever bytecode-weaving magic where you validate on construction, I guess I could see it being an invariant, but I largely disagree with using them as "invariant"-like types. Besides, that totally breaks the data-binding workflow that validation is normally used for.
You can certainly use records with validation annotations, but the expectation is you will be creating invalid objects all the time... and usually not failing immediately like a NullPointerException. This is completely separate from null analysis.
Assuming Valhalla does come in the next 5 years (we will be lucky if Loom comes in that time frame), we still have 20 or more years of code bases that can easily be annotated without breaking compilation, including the JDK (e.g. Map.get isn't going to be changed to return Optional). Loom largely will not break existing libraries, but trying to retrofit everyone to use Optional will. Besides, null will still exist. The only reason I make a big deal out of the validation annotations is to get folks to at least use proper annotations, and this does not include javax.validation or the findbugs annotations. If people did, we might have JSpecify or something like it widespread, but we can't if you use validation annotations for null analysis or just assume the JDK will fix it. Anyway it's probably all moot because I assumed you were doing null analysis, but if IntelliJ does something else then you can clearly ignore me.
IMO the fundamental problem is record-builder is interpreting what TYPE_USE
Obviously I can configure it so it ignores it, but the library should be designed around how it interprets a TYPE_USE annotation. And the reason that is important right now is your idea of creating the concrete setter, which I'm not against, but it should be annotated. Frankly I think the whole business of trying to do validation and interpret the validation annotations is out of scope. As for validation being bloat for a record, you could trivially add a mixin interface in your own codebase to do validation, perhaps with a method called `validate()`:

```java
public interface ValidateMixin {
    default void validate() {
        SomeStatic.getValidator().validate(this);
    }
}
```

I think even adding the concrete setter for Optional is bloat. It's not hard to write
Wrong is a loaded term here. As I recall we were trying to emulate what the Immutables library does and, again as I recall, it treats any non-null-like annotation as implying a null check. The goal of the library is utility, so while it may not conform to a particular spec it does conform to the needs of those who wish to migrate from the Immutables library. If there's a strong desire to conform to a different spec that can be done via another option, but I'd like to see a community desire for this.
Can you provide a link to that behavior? Because the behavior I'm proposing of focusing on

Now let us go look at your gist again:

```java
@Generated("io.soabase.recordbuilder.core.RecordBuilder")
public static FullRecord FullRecord(@NotNull List<Number> numbers,
        @NotNull Map<Number, FullRecord> fullRecords, @NotNull String justAString) {
    numbers = __list(numbers);
    fullRecords = __map(fullRecords);
    Objects.requireNonNull(justAString, "justAString is required");
    return new FullRecord(numbers, fullRecords, justAString);
}

public static FullRecord FullRecord(@Nullable List<Number> numbers,
        @Nullable Map<Number, FullRecord> fullRecords, @NotNull String justAString) {
```

Then you arbitrarily decide that Strings, on the other hand, cannot be coerced - that null cannot be coerced to an empty string. I'll go check, but I doubt Immutables does the above. It just seems like the
Speaking of community: to rope this all back to the original issue (#107) and why I have gone on this crusade (I'm sorry, I am an asshole, but for good reason) - it is because I was putting in work to add

So when @lpandzic proposes to change a generated method from implied

And that's the thing: this change is largely caused by a previous PR being added, which was
@agentgt like I said in one of my previous replies - I'm not proposing to change anything about nullness inference, only that the implementation for the concrete setter on an Optional record field uses `Optional#ofNullable`.
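In other words, the proposal amounts to a one-line difference in the generated concrete setter. A minimal sketch with hypothetical names (not the library's actual generated code):

```java
import java.util.Optional;

// Hypothetical builder illustrating the proposed ofNullable behavior:
// a null argument silently becomes Optional.empty().
class StuffBuilder {
    private Optional<String> someString = Optional.empty();

    StuffBuilder someString(String value) {
        this.someString = Optional.ofNullable(value); // null -> Optional.empty()
        return this;
    }

    Optional<String> someString() {
        return someString;
    }
}
```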
Let us say I have a record that has an Optional and, as an extra (not a requirement), let's say I am doing null analysis where the default is non-null (that is, only nullable is marked; I stress analysis, not inference like you are saying). In my code I expect this to fail fast with a null pointer exception:
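The elided snippet was presumably along these lines; a sketch with hypothetical names, showing the current `Optional.of` behavior:

```java
import java.util.Optional;

// Hypothetical builder using the current Optional.of semantics.
class FailFastBuilder {
    private Optional<String> someString = Optional.empty();

    FailFastBuilder someString(String value) {
        // Optional.of rejects null, so a bug upstream surfaces here,
        // at the call site, rather than later when the Optional is read.
        this.someString = Optional.of(value);
        return this;
    }
}
```

Calling `new FailFastBuilder().someString(possiblyNull)` with a null argument throws `NullPointerException` at that line, which is the fail-fast behavior described here.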
I might even have a unit test to expect that behavior. It certainly will get flagged if mutation testing is done, so hopefully you only break people's unit tests with your change. Now for null analysis, without it being marked this method appears to be:
And indeed that was the original contract. Thus your change will never be seen, will possibly be misunderstood, and debatably breaks the contract. Furthermore it's confusing. If you are following the

Maybe if the setter gets prefixed with

Speaking of which

Finally, let's not forget why this whole issue was brought up in the first place. You assumed that
@lpandzic Immutables, which apparently is what this library is supposed to emulate, does not do what you want:

```java
import org.immutables.value.Value;

@Value.Immutable
public interface Stuff {
    Optional<String> someString();
}
```

```java
@Generated(from = "Stuff", generator = "Immutables")
public static final class Builder {
    private String someString;
    //snip

    /**
     * Initializes the optional value {@link Stuff#someString() someString} to someString.
     * @param someString The value for someString
     * @return {@code this} builder for chained invocation
     */
    public final Builder someString(String someString) {
        this.someString = Objects.requireNonNull(someString, "someString");
        return this;
    }
```

See how this "reasonable" PR can lead to many more issues than you might think.
Can we pause this discussion for a bit, please? When I get some more time I'll look into this more deeply. I feel confident there's a good solution here that can meet everyone's needs.
Yes, that will probably be good for all. However I can't keep working on #106 or the other nullable annotation bugs till some decisions are made, which is why I have been so back and forth on this: there are some serious previous design decisions that are impacting me. So I'm fine with holding off indefinitely till some decisions are made (like, let's copy what Immutables does). And I'm sorry for coming off rough and flooding the inbox, but I have already sunk some time into this (a lot more than just changing some call from `Optional#of` to `Optional#ofNullable`).

I'll stop work and commenting till the dust clears.
I find it quite surprising that addConcreteSettersForOptional uses `Optional#of` instead of `Optional#ofNullable`, especially since IntelliJ IDEA by default doesn't warn against passing nulls to `javax.validation.constraints.NotNull`-annotated parameters.

What is the reasoning behind this decision?
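For reference, the behavioral difference between the two factory methods in question:

```java
import java.util.Optional;

class OptionalFactoryDemo {
    public static void main(String[] args) {
        // Optional.ofNullable tolerates null and yields an empty Optional.
        Optional<String> empty = Optional.ofNullable(null);
        System.out.println(empty.isPresent()); // false

        // Optional.of rejects null with a NullPointerException.
        try {
            Optional.of(null);
        } catch (NullPointerException e) {
            System.out.println("Optional.of(null) threw NPE");
        }
    }
}
```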