Parametric Burgers + hard Dirichlet BC doubt #379
Please find attached the code in which I tried to use a 2D geometry (x, mu) to construct the neural network for the parametric Burgers problem. I am getting a ValueError when compiling the model at line 116.
To include the parameter mu, you can simply treat mu as another space coordinate. In DeepXDE, there is no difference between x and mu. So, if we change the notation from mu to y, then it is u(x, y, t), where u is defined in a 2D space. For the hard constraints, you can also check this paper https://arxiv.org/abs/2102.04626 for more details.
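[Editor's illustration] The "mu is just another coordinate" idea can be sketched by sampling collocation points directly in the (x, mu, t) box; this is a minimal numpy sketch (the function name `sample_collocation` is hypothetical, not DeepXDE API), assuming the domain x in [0, 1], mu in [1, 3], t in [0, 1] from the original post:

```python
import numpy as np

def sample_collocation(n, x_range=(0.0, 1.0), mu_range=(1.0, 3.0),
                       t_range=(0.0, 1.0), seed=0):
    """Uniformly sample n training points in the (x, mu, t) box.

    mu is treated exactly like a space coordinate: the network
    input simply gains one more column.
    """
    rng = np.random.default_rng(seed)
    lo = np.array([x_range[0], mu_range[0], t_range[0]])
    hi = np.array([x_range[1], mu_range[1], t_range[1]])
    return lo + (hi - lo) * rng.random((n, 3))

pts = sample_collocation(1000)
print(pts.shape)  # (1000, 3): columns are x, mu, t
```

A network trained on such points takes three inputs and predicts u(x, mu, t) for any mu in the sampled range, not just one fixed value.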
@GuillemBarroso did you manage to solve the parametric problem?
My apologies for not following up on this issue.
Yes! I did manage to get the parametric problem running, @rodrigogdourado. As pointed out by @lululxvi, it was just a matter of treating mu as an extra space coordinate, so you would be solving a 3D (x, mu, t) problem. That was a year ago, so I will not be able to give more detail; I just remember that I had a bug in my code (see the attached file in the original comment). Good luck!
Thank you very much @GuillemBarroso! It worked very well.
Hi @GuillemBarroso. Since you are ahead of me, could you explain the i and j in the operator dde.grad.jacobian(y, x, i=0, j=2)? I'm still confused by this numbering system. Which numbers correspond to the different derivatives, like dy_dx, dy_dt, dy_dxx, etc.?
@rodrigofarias-MECH, are you from Brazil? If your problem involves 2-dimensional coordinates (x1, x2), is time-dependent, and requires solving for only one variable (such as a 2D heat conduction problem), then the inputs are (x1, x2, t) and there is a single output, so:

dy_dx1 -> i=0, j=0
dy_dx2 -> i=0, j=1
dy_dt -> i=0, j=2

Now consider a 2D Navier-Stokes equation. In this case you are solving the problem for u, v, p, so your y matrix is composed of 3 columns:

u = y[:, 0:1]
v = y[:, 1:2]
p = y[:, 2:3]

and, for example, du_dx1 -> i=0, j=0, while dv_dx1 -> i=1, j=0.

So, j is related to the input and i is related to the output components.
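[Editor's illustration] The (i, j) convention above can be demonstrated with a plain finite-difference stand-in: i selects the output column, j the input column. This mimics the indexing only and is not the DeepXDE implementation; the toy "network" below is hypothetical:

```python
import numpy as np

def jacobian_fd(f, x, i, j, eps=1e-6):
    """Finite-difference sketch of the (i, j) convention:
    d(output column i) / d(input column j), one value per row of x."""
    xp = x.copy(); xm = x.copy()
    xp[:, j] += eps
    xm[:, j] -= eps
    return (f(xp)[:, i] - f(xm)[:, i]) / (2 * eps)

# Toy "network" with inputs (x1, x2, t) and outputs (u, v, p):
def net(x):
    x1, x2, t = x[:, 0], x[:, 1], x[:, 2]
    u = x1 * x2    # du/dx1 = x2    -> i=0, j=0
    v = t * x1     # dv/dt  = x1    -> i=1, j=2
    p = x2 ** 2    # dp/dx2 = 2*x2  -> i=2, j=1
    return np.stack([u, v, p], axis=1)

x = np.array([[1.0, 2.0, 3.0]])
print(jacobian_fd(net, x, i=0, j=0))  # ~ [2.0]  (du/dx1 = x2)
print(jacobian_fd(net, x, i=1, j=2))  # ~ [1.0]  (dv/dt = x1)
```

In DeepXDE itself the derivatives are computed by automatic differentiation, but the meaning of i and j is the same.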
Yes, I'm from RJ, Brazil! Thank you, now I understand the indices! That is a good example for Navier-Stokes.
I work at ITA. You are the first Brazilian I know who is interested in PINNs. If you would like to share your work with me, maybe we can work together on something. You can text me on ResearchGate: https://www.researchgate.net/profile/Rodrigo-Dourado-2?ev=hdr_xprf
I work at LNTSOLD-COPPE/UFRJ. I have sent you a message on ResearchGate and LinkedIn.
Dear @lululxvi,
First, thank you for your dedicated work. I have two questions regarding DeepXDE.
I am interested in parametric forward problems and I have started with the 1D Burgers example available in the examples section. As a first test to introduce a parameter dependency on this test problem, I have modified the initial condition of the problem as
u(x, 0) = sin(mu * pi * x), with mu \in [1, 3]
so that I can use different values of mu to modify the problem's behaviour. I can successfully train several NNs that take x and t as inputs and predict u with good accuracy for different values of mu. Note that the boundary conditions u(0, t) = u(1, t) = 0 are only satisfied for integer values of mu; that is why I have modified the right BC to also change with time (see the modified code attached).
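[Editor's illustration] Why the right boundary value moves with mu can be seen by evaluating the modified initial condition directly; a plain numpy sketch:

```python
import numpy as np

def u0(x, mu):
    """Modified initial condition u(x, 0) = sin(mu * pi * x)."""
    return np.sin(mu * np.pi * x)

# At x = 1 the IC equals sin(mu * pi), which vanishes only for integer mu:
print(u0(1.0, 2.0))   # ~0.0 : u(1, 0) = 0 holds
print(u0(1.0, 1.5))   # -1.0 : u(1, 0) = 0 is violated for non-integer mu
```

This is exactly why a fixed homogeneous Dirichlet BC at x = 1 is inconsistent with non-integer mu and the right BC had to be modified.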
Now, in the next stage, I would like to include mu as a parametric direction in the problem. That is, a NN that takes mu, x and t and predicts u(mu, x, t). And here is where I am not too sure how to proceed. I have seen 2D and 3D geometries in time-dependent problems in the examples provided, but it is not clear to me how to add a "parametric direction". Again, my goal is to define a NN with 3 inputs (mu, x and t), where the training data stored in "x" should contain all the points' locations in mu, x and t (as opposed to now, where I only have x and t).
I hope I made myself clear. Please let me know otherwise.
As discussed in the issues and as stated in https://arxiv.org/abs/1907.04502, DeepXDE enforces the BCs softly. For the parametric Burgers problem mentioned in question 1), I believe that strongly enforcing the Dirichlet BCs would ease the training (that is the idea I got from reading the issues). However, I do not fully understand the methodology followed when using net.apply_output_transform. In the paper you mention that for u(0) = u(1) = 0 with \Omega = [0, 1], one could simply choose the surrogate model u_new(x) = x(x-1)N(x). My question is: is that transformation (using net.apply_output_transform) going to be applied to all outputs regardless of the value of x? Of course, for x = 0 and x = 1, u_new(x) is going to be set equal to 0. But for x = 0.5 it would be u_new(x) = -0.25*N(x), which would modify the solution in the interior of the domain. Am I missing something? Also, looking at how I define the right BC (at x = 1) in the attached code, do you think it is possible to use a similar method to strongly impose these Dirichlet BCs?

Thank you very much for your time. I greatly appreciate your help.
Best regards,
Guillem
paramBurguers_sin.py.zip
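[Editor's illustration] On the interior-values question: the transform is indeed applied everywhere, but what the optimizer trains is N(x), not u_new(x) directly, so in the interior N(x) is free to absorb the x(x-1) factor; only the boundary values are pinned. A small numpy sketch (the "network" N below is a hypothetical stand-in for a trained net):

```python
import numpy as np

def hard_bc_transform(x, n_out):
    """Surrogate u_new(x) = x * (x - 1) * N(x).

    u_new(0) = u_new(1) = 0 by construction; interior values remain
    fully trainable because the optimizer adjusts N(x) itself."""
    return x * (x - 1.0) * n_out

# Hypothetical stand-in for a trained network output N(x):
N = lambda x: np.cos(3.0 * x) + 2.0

x = np.array([0.0, 0.5, 1.0])
u = hard_bc_transform(x, N(x))
print(u)  # boundaries are exactly 0; at x = 0.5, u = -0.25 * N(0.5)
```

So the interior solution is not distorted: during training, N(x) simply learns whatever values make x(x-1)N(x) satisfy the PDE there.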