
Using custom arguments in boundary condition #250

Closed
karthikncsuiisc opened this issue Feb 23, 2024 · 2 comments
Labels
help wanted Extra attention is needed

Comments

@karthikncsuiisc

I want to use custom arguments in my boundary conditions. I tried the code below, and it gives an error. This is a simple example; in the next step, I would like to use custom parameters along with the input_ parameter in my boundary conditions.

Is there a tutorial for implementing such a type?

class Eqn1D(TimeDependentProblem, SpatialProblem):
    def __init__(self,params):
        super(Eqn1D, self).__init__()

        self.params = params
        self.lbc = params.lbc

    # @classmethod
    # def get_params(cls):
    #     return cls.params

    # define the heat equation
    def ht_equation(input_, output_):
        du = grad(output_, input_)
        ddu = grad(du, input_, components=['dudx'])

        coef1 = params.alpha

        res = du.extract(['dudt'])-coef1*ddu.extract(['ddudxdx'])
        return res

    # define initial condition
    def initial_condition(input_, output_):
        u_expected = 0
        return output_.extract(['u']) - u_expected

    # assign output/ spatial and temporal variables
    output_variables = ['u']
    spatial_domain = CartesianDomain({'x': [0, 1]})
    temporal_domain = CartesianDomain({'t': [0, 1]})

    # problem condition statement
    conditions = {
        'gamma1': Condition(location=CartesianDomain({'x': 0, 't': [0, 1]}), equation=FixedGradient(-self.lbc,d='x')),
        'gamma2': Condition(location=CartesianDomain({'x': 1, 't': [0, 1]}), equation=FixedGradient(0,d='x')),
        't0': Condition(location=CartesianDomain({'x': [0, 1], 't': 0}), equation=Equation(initial_condition)),
        'D': Condition(location=CartesianDomain({'x': [0, 1], 't': [0, 1]}), equation=Equation(ht_equation)),
    }
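This looks like a class-body evaluation problem: the `conditions` dictionary is executed while the class is being created, before any instance (and hence `self`) exists. A PINA-independent reproduction of that failure mode (the class and attribute names here are illustrative only):

```python
# Minimal reproduction: a class body is executed at definition time,
# so referencing `self` there raises a NameError.
err = None
try:
    class Broken:
        def __init__(self, lbc):
            self.lbc = lbc

        # evaluated while the class is being created -> `self` is undefined
        conditions = {'gamma1': -self.lbc}
except NameError as exc:
    err = exc

print(err)  # name 'self' is not defined
```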
@karthikncsuiisc karthikncsuiisc added the help wanted Extra attention is needed label Feb 23, 2024
@dario-coscia
Collaborator

Hello 👋🏻 Please take a look at issue #225; specifically, this answer could help.
I would avoid defining an __init__ and use static class variables instead. The value of these variables can always be changed during training by:

# update
from pytorch_lightning.callbacks import Callback
class Update(Callback):
    def on_train_epoch_end(self, trainer, __):
        # replace STATIC_VARIABLE_NAME with the attribute you defined
        trainer.solver.problem.__class__.STATIC_VARIABLE_NAME = ...

For more on callbacks and where to put them, have a look at the Lightning documentation; a callback is just an extra argument of the Trainer!

Let me know how it goes and if you like the package leave us a star ⭐️ which helps us grow the community!😄
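The suggested pattern can be sketched in plain Python (PINA and Lightning are omitted so the sketch stays self-contained; the class name `Eqn1D`, the attribute `alpha`, and the `Update` hook shape mirror the thread but are otherwise assumptions):

```python
# Sketch: static class variables instead of __init__ arguments, plus a
# callback-style object that mutates them during training.

class Eqn1D:
    # static (class-level) parameters; no __init__ needed
    alpha = 0.1
    lbc = 1.0

    @staticmethod
    def ht_equation(du_dt, ddu_dxdx):
        # the equation reads the class attribute, so it always sees
        # the current value, even after a callback updates it
        return du_dt - Eqn1D.alpha * ddu_dxdx


class Update:
    """Stand-in for a Lightning Callback: update the static variable
    on the problem's class at the end of each epoch."""
    def on_train_epoch_end(self, problem_cls, new_alpha):
        problem_cls.alpha = new_alpha


residual_before = Eqn1D.ht_equation(1.0, 2.0)  # 1.0 - 0.1 * 2.0
Update().on_train_epoch_end(Eqn1D, 0.5)
residual_after = Eqn1D.ht_equation(1.0, 2.0)   # 1.0 - 0.5 * 2.0
print(residual_before, residual_after)
```

In the real setup, `Update` would subclass `pytorch_lightning.callbacks.Callback` and be passed to the Trainer's `callbacks` argument, as in the comment above.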

@karthikncsuiisc
Author

Thank you for the help; this worked.
