
Adding Regularization to the PINN Loss function #102

Closed
yorkiva opened this issue May 19, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@yorkiva

yorkiva commented May 19, 2023

Is your feature request related to a problem? Please describe.
The condition module is the only place where physics-informed loss functions can be defined. However, there seems to be no way to add extra regularizers, such as an L1 or L2 penalty on the model's weights. Such regularization can be very useful for mitigating overfitting, especially when training with noisy initial conditions.

Describe the solution you'd like
It would be good to have an extension of the condition module (or something similar) to allow regularization.

Describe alternatives you've considered

Additional context
openjournals/joss-reviews#5352
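For illustration, the requested feature could look something like the sketch below: a helper that adds optional L1/L2 weight penalties on top of the usual physics-informed loss. This is only a sketch of the idea, not PINA's actual API; `regularized_loss` and its parameters are hypothetical names.

```python
import torch

def regularized_loss(physics_loss, model, l1=0.0, l2=0.0):
    """Add optional L1/L2 weight penalties to a physics-informed loss.

    `physics_loss` is the usual composite loss (PDE residual plus
    boundary/initial-condition terms); `l1` and `l2` are the penalty
    coefficients.
    """
    reg = torch.tensor(0.0)
    for p in model.parameters():
        if l1:
            reg = reg + l1 * p.abs().sum()   # L1: sum of |w|
        if l2:
            reg = reg + l2 * p.pow(2).sum()  # L2: sum of w^2
    return physics_loss + reg
```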

@yorkiva yorkiva added the enhancement New feature or request label May 19, 2023
@dario-coscia
Collaborator

dario-coscia commented May 25, 2023

👋🏻 @yorkiva Thank you for your comment. The Condition class is used to define the condition to apply (e.g. a function) and where to apply it (e.g. a location). An L2 regularizer on the weights can be applied by setting the regularizer parameter in the PINN class.

That said, I agree with you that a Loss class should be implemented to give users more flexibility. We will soon release a beta version of the software where we plan to introduce these features, but for maintainability we cannot merge them into the current version.
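As a side note on the L2 case: in PyTorch-based frameworks, an L2-style weight penalty is commonly obtained through the optimizer's `weight_decay` argument rather than through the loss function. A minimal sketch (the `model` here is just a stand-in, not PINA's PINN class):

```python
import torch

# Hypothetical small network standing in for the PINN's underlying model.
model = torch.nn.Linear(2, 1)

# weight_decay applies an L2-style penalty on the parameters at each
# optimizer step, without modifying the loss function itself.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```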

@dario-coscia
Collaborator

Hello @yorkiva :)

Just wanted to let you know that in the beta version we will soon release the possibility to add a custom loss for the PINN (#105), along with other very cool features such as gradient clipping and batch gradient accumulation, since we will use the Lightning Trainer module in the backend to train the PINN.

Thank you for the very useful feedback 😄

@danielskatz
Contributor

Has this been resolved?

@yorkiva
Author

yorkiva commented Jun 13, 2023

The regularizer in the PINN class only implements L2 regularization; it should be extended to support L1 regularization as well. In my experience comparing the two, L1 regularization can be quite useful for some problems when the training data (i.e. boundary/initial conditions) is noisy.
Since the authors mention that general regularization functionality will be added in the upcoming release, this issue can be closed for now.
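The difference between the two penalties can be seen directly from their gradients: the L1 gradient is `sign(w)`, which pulls every weight toward zero with constant force and tends to produce sparse, noise-robust solutions, while the L2 gradient `2*w` shrinks large weights much more strongly than small ones. A small PyTorch sketch of this (illustrative only, independent of PINA):

```python
import torch

w = torch.tensor([0.1, -2.0], requires_grad=True)

# L1 penalty: gradient is sign(w), a constant-magnitude pull toward zero,
# so small weights get driven to exactly zero (sparsity).
w.abs().sum().backward()
l1_grad = w.grad.clone()

w.grad = None
# L2 penalty: gradient is 2*w, proportional to the weight, so it shrinks
# large weights strongly but barely affects small ones.
w.pow(2).sum().backward()
l2_grad = w.grad.clone()
```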

@yorkiva yorkiva closed this as completed Jun 13, 2023