Very complicated boundary #264
Hi Riccardo, if your BC is pure Dirichlet, then maybe try `deepxde.boundary_conditions.PointSetBC`.
Thank you @smao-astro. Is it possible to do the same for the initial conditions, i.e. to provide the array of the appropriate set of points along with the corresponding values? I clarify that I'm already able to anchor all the training points that I want in the domain/timedomain.
I guess the answer is yes, though I haven't tried that. `PointSetBC` is mainly for cases where the constraint NN(x) = f(x) cannot be written as an analytical expression.
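A minimal sketch of that idea, assuming a 1D space-time problem (the array names, sizes, and the sine IC are illustrative, not from the thread): build the set of initial-time points and their target values as NumPy arrays, then hand them to `PointSetBC` exactly as you would for a boundary.

```python
import numpy as np

# Hypothetical initial condition u(x, 0) = sin(pi * x) on x in [0, 1].
x = np.linspace(0, 1, 51).reshape(-1, 1)
t0 = np.zeros_like(x)            # every point sits at t = 0
X_ic = np.hstack((x, t0))        # (N, 2) array of (x, t) points
Y_ic = np.sin(np.pi * x)         # (N, 1) array of target values

# With DeepXDE installed, this point set is enforced like any other BC:
# ic = dde.boundary_conditions.PointSetBC(X_ic, Y_ic)
# and then passed in the condition list of dde.data.TimePDE.
print(X_ic.shape, Y_ic.shape)
```

The only difference from a boundary point set is that all the `t` coordinates are zero.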
Excuse me, I have a problem with how to anchor training points. Could you please share your code for anchoring them?
I found out how to do it here: #64 My case is kind of complicated and my code is less than newbie level but here is what it's about:
then:

```python
import numpy as np

# L is the boolean matrix of the domain (nrows x ncols), defined earlier.
time = 10
TP = np.array([-1, -1, -1])    # dummy first row, deleted at the end
TP1 = np.array([-1, -1, -1])
i = 0
j = 0
k = 0
while k < time + 1:
    while i < nrows:
        while j < ncols:
            if L[i, j] == 1:
                TP1 = np.array([15 * j, 15 * i, k])
                TP = np.vstack((TP, TP1))
            j = j + 20  # 20 is the pace along x
        i = i + 20      # 20 is the pace along y
        j = 0
    k = k + 1           # 1 is the pace along time
    i = 0
TP = np.delete(TP, 0, 0)  # drop the dummy first row
```

Then, when it's time to define the model, I'm going to write something like this:

```python
data = dde.data.TimePDE(
    geomtime, pde, BC,
    num_domain=0,
    num_boundary=100,
    num_initial=0,
    anchors=TP)
```
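The triple `while` loop above can be written more compactly with NumPy indexing. A sketch under the same assumptions (a 2D mask `L`, a stride of 20 cells in each spatial direction, a grid spacing of 15, and `time + 1` time levels; the helper name and the toy mask are illustrative):

```python
import numpy as np

def build_anchors(L, time, stride=20, scale=15):
    """Return an (N, 3) array of anchor points (x, y, t) for masked cells."""
    # Keep every `stride`-th row/column where the mask equals 1.
    ii, jj = np.nonzero(L[::stride, ::stride] == 1)
    xy = np.column_stack((scale * stride * jj, scale * stride * ii))
    # Repeat the spatial points once per time level 0..time.
    ts = np.repeat(np.arange(time + 1), len(xy))
    return np.column_stack((np.tile(xy, (time + 1, 1)), ts))

# Tiny illustrative mask: a 40x40 grid whose left half is "inside".
L = np.zeros((40, 40), dtype=int)
L[:, :20] = 1
TP = build_anchors(L, time=2)
```

Besides being shorter, this avoids growing the array with `np.vstack` inside a loop, which is quadratic in the number of anchors.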
@Ryszard2 thanks a lot
I used it; if I understand it correctly, it works like this:

I'd have two questions:

Thank you,
If you have multiple outputs (y1, y2, etc.), then for every y specify a `PointSetBC` with the matching `component`:

```python
dde.boundary_conditions.PointSetBC(X, Y0, component=0)
dde.boundary_conditions.PointSetBC(X, Y1, component=1)
```
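A sketch of the array shapes this expects (the sizes and random values are illustrative; with DeepXDE installed, the two commented lines would create the per-component conditions): `X` holds the points, and each `Y` slice holds one output component as a column vector.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 2))    # 100 points in 2D
Y = rng.random((100, 2))    # two outputs per point: y1, y2
Y0 = Y[:, 0:1]              # (100, 1) column for component 0
Y1 = Y[:, 1:2]              # (100, 1) column for component 1

# bc0 = dde.boundary_conditions.PointSetBC(X, Y0, component=0)
# bc1 = dde.boundary_conditions.PointSetBC(X, Y1, component=1)
```

Note the `0:1` slicing, which keeps each target a 2D `(N, 1)` array rather than a flat vector.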
Thank you @smao-astro. But even though I applied all the BC conditions rigorously, the outline that I mean to be the boundary of the domain isn't perceived as a boundary by the NN, since the solutions are kind of crazy. Here is my doubt:

I never placed a single training point (or IC point, or BC point) in all that major portion of the rectangle that I meant to be outside the domain. My intention of defining a partial domain seems to be hardly understood by the NN. Thank you,
I may be wrong, but brooding over this problem I came to think that the focal point is not the boundary; it's the whole rectangular domain instead. I'm getting pretty sure that my attention on the boundary is the wrong way to the solution of this problem. This is a 2D shallow water equations problem. It takes place in a rectangular domain whose depth changes from point to point, as shown in the figures.

That means that the white portion is an area where the IC for the water height is zero. Inside the valley, instead, there is a proper IC distribution for the water height. I do not have an analytical function for this depth; I have a matrix whose elements provide the value of the depth at every single point of the whole rectangular domain. This is the real problem.
It didn't work out; I always got the Resource Exhausted error. What can be done? Thank you,
@Ryszard2 I guess your question is that you only need to enforce the PDE inside the colored area. To do this, you ask DeepXDE not to sample any point in the rectangle by setting:

```python
data = dde.data.TimePDE(
    geomtime, pde, BC,
    num_domain=0,
    num_boundary=0,
    num_initial=0,
    anchors=TP)  # the points you sampled
```

You may test this idea using a simple example, e.g., a disk inside a square for a Poisson equation. Also, see #161
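One way to set up that test, sketched here with plain NumPy (the grid size, disk radius, and names are illustrative): sample a regular grid over the square and keep as anchors only the points inside the disk, so the PDE residual is enforced nowhere else.

```python
import numpy as np

# Unit square [0, 1]^2 with a disk of radius 0.4 centered at (0.5, 0.5).
n = 50
xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
pts = np.column_stack((xs.ravel(), ys.ravel()))

inside = np.sum((pts - 0.5) ** 2, axis=1) < 0.4 ** 2
TP = pts[inside]    # anchors: only the points strictly inside the disk

# With DeepXDE, these would then be passed as
# data = dde.data.PDE(geom, pde, bc, num_domain=0, num_boundary=0, anchors=TP)
```

If the trained solution looks reasonable on the disk and arbitrary outside it, the anchoring mechanism is doing what you expect.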
Thank you @lululxvi
The problems are:
The NN learns the IC well, yet it fails to learn what happens from second one onwards. The questions I can formulate from my point of view are:

Thank you
Solving large-scale PDEs for a very long time is always difficult. You may consider training in parallel on multiple GPUs, see #39. Also, you may consider training with mini-batches, i.e., using part of the data points in each iteration. By the way, if you are solving an inverse problem, then there is no need to consider the whole domain.
Thank you @lululxvi. So far the solution is wrong. I scaled the space by thousands, the time by thousands, and the water height by about 50: the problem is now roughly unitary in every aspect. Thank you
Usually, there is no significant difference between mini-batch and full batch, assuming that your training data points are sufficient. How about starting from a simple example, e.g., a polygon with a simple PDE like the Poisson equation, to test the code?
Thank you @lululxvi. I can't say the same for this complicated domain, so I've abandoned the approach of anchoring the domain and boundary points myself. Now I can build the polygon and let DeepXDE set the domain/boundary points, but here's the new problem: DeepXDE takes like half an hour to process the line:

```python
data = dde.data.PDE(geom, pde, bc, num_domain=1000, num_boundary=500)
```

It takes forever, and we're talking about a minimal amount of points! It seems like a very complicated polygon is not manageable. Thank you
Great to know that it works for a simpler geometry!
Gosh, it sounds good!
I'm not sure whether the domain points or the boundary points were the culprit. I honestly can't check, since I can only run the up-to-date version of DeepXDE. I thank you so much, @lululxvi
Hello @lululxvi
is it somehow possible to define a boundary like this one?
I do have all the boolean arrays that identify the situation over the whole area (domain/no_domain, boundary/no_boundary); indeed, I can anchor the training points in the domain:
That makes me confident on the domain (and Initial Condition) part, but I can't work out the definition of the boundary.
I thought of using `scipy.interpolate.interp2d` and `tf.convert_to_tensor` to convert those boolean arrays into functions or tensors, but I still can't take advantage of all these ingredients. Thank you,
Riccardo
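One possible direction for the boolean-mask idea, sketched below (the mask values, grid spacing, and names are assumptions; note also that `interp2d` is deprecated in recent SciPy, so `RegularGridInterpolator` is used instead): turn the matrix into a callable indicator that can be evaluated at arbitrary (y, x) points, e.g. to decide which sampled points lie inside the domain.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical 4x5 boolean mask: 1 = inside the domain, 0 = outside.
mask = np.array([[0, 0, 1, 1, 1],
                 [0, 1, 1, 1, 1],
                 [0, 1, 1, 1, 0],
                 [0, 0, 1, 0, 0]], dtype=float)

ny, nx = mask.shape
y = np.arange(ny) * 15.0   # assuming a 15-unit grid spacing, as in the thread
x = np.arange(nx) * 15.0

# Nearest-neighbour interpolation keeps the indicator exactly 0 or 1.
inside = RegularGridInterpolator((y, x), mask, method="nearest")

pts = np.array([[0.0, 30.0],    # row 0, col 2 -> inside  (1)
                [0.0,  0.0]])   # row 0, col 0 -> outside (0)
vals = inside(pts)
```

The same construction works for the depth matrix itself (with `method="linear"`), giving a depth function that can be evaluated at any anchor point.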