Interfaces and example validator #2
Conversation
I like it! Feels nice and sleek. I'm not up to speed on TypeScript yet, so I might be missing the point here, but I'm wondering a bit about how we should handle types. The scenario I'm thinking of is something like your example above:

```js
var validationSchema = {
  username: { min: 6 }
};
```

Let's say we have an ordinary input for this. If the view adaptor is Angular and we told it to be a text input, we get a string to validate against, but if it's a […]. This can be fine; we effectively move the type transformation, and therefore the validation of type, to the view adaptor and what […]. The other way to do it, I guess, would be to make type in the validation schema a non-optional value (or optional with a default) and let form-js be able to type coerce and check if needed. This, on the other hand, could end up doubling type coercion/transformation in some cases. (I do this in schema form, and to properly support all type validations I had to validate the […])
Hey David! Thanks for the feedback. FWIW I'm working on an example validator function (for […]). As for your question... I'm still pretty on the fence about type validations, to be honest. (Coercing and) validating primitive types is easy; objects are a lot more difficult. I'm half-tempted to say we consider only supporting primitive type validation. Either way, I had envisioned type validations being optional. (That's what the […] is for.) I think one of our goals should be to keep all of the validation logic inside of the vanilla Forms JS library.
Cool, looking forward to it. Yeah, I think basic types only is a good idea. I don't think it's necessary to support objects, arrays, etc.; they are more suited to custom validation functions. But I do think we need to support integers and step values on numbers and dates. I'm not sure type can be optional, but I think that will clear itself up in the future.
Okay. I'll update the interface definitions shortly! And by "optional" I mean "defaults to […]".
nice!
I may be misunderstanding what you mean when you say "validating objects", but if you're saying what I think you're saying, then this would be an awesome place to bring in apiCheck:

```js
apiCheck.shape({
  name: apiCheck.shape({
    first: apiCheck.string,
    last: apiCheck.string
  }),
  age: apiCheck.number,
  isOld: apiCheck.bool,
  walk: apiCheck.func,
  childrenNames: apiCheck.arrayOf(apiCheck.string)
})({
  name: {
    first: 'Matt',
    last: 'Meese'
  },
  age: 27,
  isOld: false,
  walk: function() {},
  childrenNames: []
}); // <-- pass

apiCheck.shape({
  mint: apiCheck.bool,
  chocolate: apiCheck.bool
})({mint: true}); // <-- fail
```

There's a lot more to the API that makes it very, very flexible.
@davidlgj I've updated the interfaces and added an example […].

@kentcdodds Looks like using apiCheck for type validation could potentially work. Some small concerns, though:

I'm not initially convinced that apiCheck buys us all that much if we have to validate individual fields already. It seems like a nice utility to shorthand-validate types on an object, but if we have to examine individual fields (as we will have to), how much is it saving us?
```ts
/**
 * Use a Promise to perform asynchronous validation or to return a custom failure message.
 */
export interface ValidatorFunction {
  (value:any, formData:any):boolean|Promise<string>;
}
```
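As a point of reference, here is a hedged sketch of a function conforming to the proposed `ValidatorFunction` shape; the validator itself (`isEven`) is a made-up example, not part of the proposal:

```typescript
// Illustrative validator matching the proposed interface: return a boolean
// for a synchronous pass/fail result, or a rejected Promise<string> to
// carry a custom failure message (e.g. for type problems or async checks).
const isEven = (value: any, formData: any): boolean | Promise<string> => {
  if (typeof value !== "number") {
    return Promise.reject("Value must be a number.");
  }
  return value % 2 === 0;
};
```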
Shouldn't `string` be a valid return type, and also, Promise is probably resolvable to `boolean` too:

```ts
(value:any, formData:any):boolean|string|Promise<string|boolean>;
```
> Shouldn't string be a valid return type

Potentially. What would it mean for a custom validator to return a string? Does that indicate a valid or invalid value? I assume a string would have to be treated as an error message, but I worry this confuses our truthy/falsy return scenario. Do we really need it? If you want to return a synchronous failure message, you could always return `Promise.reject("Your message")`.

> Promise is probably resolvable to boolean too

I think this would be redundant. A resolved promise means a valid value; a rejected promise means an invalid value. The only reason a `string` type is specified is to support the case of an overriding, custom validation failure message.
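To illustrate the resolved-valid/rejected-message convention, here is one way a consumer (e.g. a view adaptor) might normalize both return shapes into a single result; the helper name and result shape are hypothetical, not part of the proposal:

```typescript
// Normalize a ValidatorFunction result: a boolean is a synchronous verdict;
// a resolved promise means valid; a rejection carries the failure message.
type ValidatorResult = boolean | Promise<string>;

function resolveValidation(
  result: ValidatorResult
): Promise<{ valid: boolean; message?: string }> {
  if (typeof result === "boolean") {
    return Promise.resolve({ valid: result });
  }
  return result.then(
    () => ({ valid: true }),
    (message: string) => ({ valid: false, message })
  );
}
```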
Using the rejected/resolved state of a promise for determining validation makes sense. I take back my suggestion for resolving to boolean. But if a validator that returns a promise can provide a custom error message, shouldn't a validator that is not returning a promise be able to do the same?

Something like:

```js
function validate(val, formData) {
  if (val % 5 !== 0) return "value should be mod 5";
  if (val % 3 !== 0) return "value should be mod 3";
}
```
I'm on the fence. What do others think?

On the one hand, this...

```js
function validate(val, formData) {
  if (val % 5 !== 0) return "value should be mod 5";
  if (val % 3 !== 0) return "value should be mod 3";
}
```

...could already be achieved with this...

```js
function validate(val, formData) {
  if (val % 5 !== 0) return Promise.reject("value should be mod 5");
  if (val % 3 !== 0) return Promise.reject("value should be mod 3");
}
```

So it's really just a question of: is the syntactic sugar worth the added complexity?
I like this architecture! I think we're on the right track. One minor reminder: the […]
Thanks @mohsen1! Actually, a couple of us chatted last Thursday and I believe (@davidlgj, @mchapman, correct me if I'm wrong) we agreed that the benefits of making […]

This changeset just illustrates the validation schema. What isn't (yet) represented here is what we started calling the view or layout schema, which controls the ordering of fields. I'm not sure if we settled on a proposal for how that ordering should be specified. Seems like something we should all chat about?
Good concerns @bvaughn. First, I don't know what to say about the future of any library. One thing I can tell you is that we are using this library heavily at work, and angular-formly is also using this library, so I'm pretty committed to its success. It won't do anything for us with respect to coercion. I'm not sure I understand what you mean when you say:

> […]

Could you explain what you mean?
Sure thing @kentcdodds. Please correct me if I make any wildly inaccurate statements or assumptions here. :) Forms JS supports multiple validation rules; data type is only one of these conditions. In order to support other, non-type validations (min/max length/size, pattern, required, etc.) we need to look at individual field values. (One way we could do this is shown in this pull request.) I believe apiCheck is concerned solely with data type, so while it may be useful for validating the structure of an object, using it would not prevent us from having to iterate over individual fields for the other validation criteria. If we're already doing lower-level validations (e.g. min value), then it's easy to also check the type while we're working at the level of an individual field. We also need to return validation errors at the field level (not form level) so we can show inline validation errors. I think apiCheck deals more at the form-data level than the field level, so we'd have to merge the two sources of potential failure.
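The field-by-field approach described above could be sketched roughly like this; the names (`validateFields`, `FieldErrors`) and the two rules shown are simplified assumptions for illustration, not the actual Forms JS code:

```typescript
// Field-level validation sketch: examine each field's rules individually so
// failures can be reported per field, which supports inline error display.
interface FieldErrors {
  [fieldName: string]: string[];
}

function validateFields(formData: any, schema: any): FieldErrors {
  const errors: FieldErrors = {};
  for (const fieldName of Object.keys(schema)) {
    const rules = schema[fieldName];
    const value = formData[fieldName];
    const failures: string[] = [];
    // Type-agnostic "required" check.
    if (rules.required && (value === undefined || value === null || value === "")) {
      failures.push("Required field");
    }
    // A min-length rule; checking the value's type here is cheap because
    // we are already working at the level of the individual field.
    if (typeof rules.min === "number" && typeof value === "string" && value.length < rules.min) {
      failures.push(`Must be at least ${rules.min} characters long.`);
    }
    if (failures.length > 0) {
      errors[fieldName] = failures;
    }
  }
  return errors;
}
```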
@bvaughn so you're planning not to use JSON Schema, and to use a secondary object to describe the schema. Those are the main reasons I moved away from angular-schema-form. For a dynamic app like Swagger Editor, this was a problem, since I had to construct a "view" schema for each schema programmatically. My suggestion is: if we want to use a JavaScript object for the schema, we should use JSON Schema semantics and not invent a new format.
I see what you're saying. Yes, built-in stuff with […]. In fact, I should probably add a […]. Required is also handled, because when using the […]. So, I hadn't thought about this, but users could actually specify an apiCheck object (call it whatever you want) which we could use as the validation API. It could be used to validate the entire model (not just a single field). I think the API is clean and battle-tested (it's very similar to React's PropTypes, used by thousands of React developers all over the world for years now). Whether we do this with […]
@mohsen1, I think you're illustrating the very reason that accomplishing this […]
@mohsen1 Ah, I thought we had already discussed this in a previous issue and more or less reached agreement. In short: Forms JS validations are neither a subset nor a superset of JSON Schema. (JSON Schema simultaneously does too much and not enough in terms of the validation use-cases we know we'll need to support.) There is some overlap, but supporting the full JSON Schema would be too much work and, likewise, extending JSON Schema would (in my opinion) be just as much work because of the need to merge validation results from a JSON Schema validator and Forms JS custom validations. I feel strongly that the right approach here is for us to write a library that consumes JSON Schema and turns it into Forms JS validation. We can video chat about this more, though, if there's still concern? I think it's important that we reach a consensus here.

@kentcdodds Cool, apiCheck sounds pretty powerful. Maybe even something we should run on an incoming Forms JS validation schema? (I think you've proposed this in the past, and I'm totally game, provided it doesn't negatively impact performance.) I personally think it would be nice if our validation approach was consistent (principle of least surprise) rather than half apiCheck and half something else. So, for that reason primarily, I'd initially be slightly opposed to using apiCheck to validate field type.
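The "library that consumes JSON Schema" idea could be sketched as a thin adapter; everything here (the `fromJsonSchema` name, the `FormsJsRules` shape, and the handful of keywords mapped) is a hypothetical illustration under the flattened rule format discussed in this thread, not an agreed design:

```typescript
// Map a small subset of JSON Schema validation keywords onto a
// Forms-JS-style, per-field rules object.
interface FormsJsRules {
  required?: boolean;
  min?: number;
  max?: number;
  pattern?: RegExp;
}

function fromJsonSchema(jsonSchema: any): { [field: string]: FormsJsRules } {
  const out: { [field: string]: FormsJsRules } = {};
  const required: string[] = jsonSchema.required || [];
  for (const name of Object.keys(jsonSchema.properties || {})) {
    const prop = jsonSchema.properties[name];
    const rules: FormsJsRules = {};
    if (required.indexOf(name) >= 0) rules.required = true;
    if (typeof prop.minLength === "number") rules.min = prop.minLength;
    if (typeof prop.maxLength === "number") rules.max = prop.maxLength;
    if (typeof prop.pattern === "string") rules.pattern = new RegExp(prop.pattern);
    out[name] = rules;
  }
  return out;
}
```

The point of the sketch is that validation results never need to be merged from two systems: JSON Schema is translated up front, and Forms JS remains the single validator.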
@bvaughn, totally agree with the principle of least surprise. As far as performance goes, it definitely impacts performance, though I've never benchmarked it. However, just like with React's PropTypes, you generally want to disable it in prod (which is quite easy).
@kentcdodds Ah, right. I recall you suggesting as much before. In that case, I'm totally on board. I'm envisioning that we'll build a debug version of Forms JS and a production-ready, minified/uglified version. In the debug version, using apiCheck to do run-time analysis of incoming form and layout schema objects sounds pretty useful. Let's do it. :)
With the power of webpack and uglify, we could even have it totally remove all the apiCheck code and save on some bytes. :-) This is similar to what React does (except they don't use webpack, but some of their own home-grown stuff).
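The stripping idea above usually works by gating the debug-only checks on an environment flag that the bundler inlines (e.g. webpack's DefinePlugin replacing `process.env.NODE_ENV`), so the minifier can eliminate the dead branch. A minimal sketch, with illustrative names rather than real Forms JS API:

```typescript
// When a bundler inlines nodeEnv as the literal "production", the guarded
// branch becomes dead code that a minifier like UglifyJS can remove.
function shouldRunSchemaChecks(nodeEnv: string | undefined): boolean {
  return nodeEnv !== "production";
}

function runSchemaChecks(schema: unknown, nodeEnv: string | undefined): void {
  if (shouldRunSchemaChecks(nodeEnv)) {
    // apiCheck.shape({ ... })(schema); // debug-only run-time analysis
  }
}
```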
Here's a thought: one thing we could consider that would address my concern about the additional complexity of using TypeScript's union types, while simultaneously making our validation schema easier to validate using apiCheck.js... Instead of a validation schema that looked like this:

```js
var validationSchema = {
  fields: {
    simpleRequired: {
      required: true
    },
    advancedRequired: {
      required: {
        rule: true,
        failureMessage: "This is a custom required failure message"
      }
    }
  }
};
```

Maybe we could approach it like this:

```js
var validationSchema = {
  fields: {
    aRequiredField: {
      required: true,
      requiredFailureMessage: "This is an optional, custom required failure message"
    }
  }
};
```

Then we could validate our schema with apiCheck more or less like this:

```js
apiCheck.shape({
  required: apiCheck.boolean.optional,
  requiredFailureMessage: apiCheck.string.optional,
  // Other validation types here...
})(validationRules);
```

This would simplify our interface declarations too, replacing […]
Just updated the pull request with an example of what I'm talking about above. (Can always revert if you guys dislike it!)
```ts
  failureMessage = validatableAttribute.minFailureMessage || "Must be at least ${min}.";
} else if (typeof value === 'number' && value < validatableAttribute.min) {
  failureMessage = validatableAttribute.minFailureMessage || "Must be at least ${min} characters long.";
}
```
Did you get these default messages the wrong way round?
Haha, yup. Sure did. This is the kind of thing that, should we approve this approach, I will add unit tests for before proceeding.
In light of our Hangout and email decisions to move forward with some prototypes, I'm going to merge this branch. If we decide to throw it out in favor of a different type of validation in the future, that's fine; at least it makes a starting point. When our wrapper libraries (Angular, web component, etc.) detect a change in field values, they can validate with […]
Interfaces and example validator
Here's an example of how we could (potentially) represent the validation schema outlined in the shared Google doc by way of TypeScript interfaces.
Open questions for discussion:
- Allowing a rule to be either a primitive (`required: true`) or an object (`required: { value: true, failureMessage: "This is a custom error" }`) is certainly convenient from a user perspective, but it also complicates our interface definitions. I assume this is okay, but it seems worth confirming.
- […] the `constraints` field because it felt unnecessary/redundant. Is this okay?

The below interfaces describe a validation object like this: