Currently, validation of data usually happens in the browser, checking additions to the db before they're written to the appropriate data files - for example, ensuring that a title is not blank. But a person can very easily get around this by directly modifying the files.
So... what if we moved the validation from before the data files are modified to after the updates are received by the other peers? ZeroNet already does something like this for filenames and ids via the content.json file (permissions).
So, I propose that we add a new data validation system where every update to a data file is checked on all clients, and if the data fails validation, the update is not accepted.
An example of how this could be implemented for data files that go into the database is something like this, in the dbschema file:
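Something along these lines, where the third element of each column array is the rule string (the table and column names here are just for illustration, and the exact rule names are only a sketch):

```json
{
  "db_name": "Example",
  "db_file": "data/example.db",
  "tables": {
    "post": {
      "cols": [
        ["post_id", "INTEGER", "required|integer"],
        ["title", "TEXT", "required|string|max:256"],
        ["body", "TEXT", "string"],
        ["attachment", "TEXT", "file"]
      ],
      "indexes": ["CREATE UNIQUE INDEX post_id ON post(post_id)"],
      "schema_changed": 1
    }
  }
}
```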
As you can see, each column array accepts a third argument: a string of validation rules delimited by |. I based this on how Laravel's system works: https://laravel.com/docs/6.x/validation
You can see the full list of validation rules that Laravel supports here: https://laravel.com/docs/6.x/validation#available-validation-rules
I've used some rules that Laravel doesn't have, like "file" and "directory". Obviously we don't need to follow everything Laravel has, but it offers a pretty good set to base our system on.
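To make the mechanism concrete, here is a rough sketch of how a client could parse a |-delimited rule string and check an incoming value against it. The function names and the handful of rules implemented here are assumptions for illustration, not an existing ZeroNet API:

```python
# Hedged sketch: parse Laravel-style rule strings like "required|string|max:256"
# and validate a single value. Only a few illustrative rules are implemented.

def parse_rules(rule_string):
    """Split a rule string into (name, argument) pairs.

    "required|max:256" -> [("required", None), ("max", "256")]
    """
    rules = []
    for part in rule_string.split("|"):
        name, _, arg = part.partition(":")
        rules.append((name, arg or None))
    return rules

def validate_value(value, rule_string):
    """Return True if the value satisfies every rule in the rule string."""
    for name, arg in parse_rules(rule_string):
        if name == "required" and (value is None or value == ""):
            return False
        if name == "string" and not isinstance(value, str):
            return False
        if name == "integer" and not isinstance(value, int):
            return False
        if name == "max" and isinstance(value, str) and len(value) > int(arg):
            return False
    return True

# Example: a receiving peer rejects a blank title before accepting the update.
print(validate_value("Hello", "required|string|max:256"))  # True
print(validate_value("", "required|string|max:256"))       # False
```

A real implementation would loop over every row in the received data file and reject the whole update if any value fails its column's rules.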
The other problem is that if we only implement this in the dbschema file, then we can only validate data files that are connected to the db - but we might want to validate other JSON files, or other files in general.
I got this idea from a person who commented on issue #2204, and from PeerMessage, which actually has something exactly like this - it can filter messages based on the content of the message.