Allowing a function to be called multiple times with different inputs #627

Draft
nicholaskl97 wants to merge 49 commits into master

Conversation

nicholaskl97

Currently, something like u(x) - u(0) ~ sin(x) gets parsed as u(x) - u(x) ~ sin(x) when generating the loss function, because all instances of u(anything) are assumed to have the same input. I've made changes to _transform_expression and dot that generate the correct loss function, and I made a minor edit to one 1D test case, changing u(x) ~ x*cos(...) to u(x) - u(0) ~ x*cos(...), to test these changes and to mimic someone trying to train a network that they knew passed through (0,0) and so would represent by g(x) = u(x) - u(0).
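For concreteness, here is a minimal sketch of the kind of setup this affects (the names and the exact right-hand side are illustrative, not the test's literal code):

using ModelingToolkit
@parameters x
@variables u(..)
# An equation that references the dependent variable at two different inputs:
eq = u(x) - u(0) ~ x * cos(x)
# Before this change, the generated loss treated u(0) as if it were u(x),
# so the condition was effectively trained as 0 ~ x*cos(x).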

@nicholaskl97
Author

nicholaskl97 commented Jan 1, 2023

Prior to my edits, the loss function would implement u(x,y) - u(0,0) ~ 0 as something like
coord1 = vcat(x,y)
loss = u(coord1) - u(coord1)
Now it generates
coord1 = vcat(x,y)
loss = u(x,y) - u(0,0)

I'm not sure when it would come up, but the changes should also allow for things like u(cos(x)) ~ func(x), etc., which I don't think would have been parsed correctly before these changes. I think it will handle something like Dx(u(cos(x))) as du/dx evaluated at cos(x), not as -sin(x)*du/dx, which might not be ideal, but I'm not sure of the best way to add in the chain rule.
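That is, under these changes Dx(u(cos(x))) corresponds to $u'(\cos x)$, whereas the full chain-rule derivative would be $\frac{d}{dx}\left[ u(\cos x) \right] = -\sin(x)\, u'(\cos x)$.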

nicholaskl97 marked this pull request as draft January 1, 2023 02:32
@ChrisRackauckas
Member

Test failures look real.

@nicholaskl97
Author

Yeah, I think it was having trouble with multiple input dimensions, which makes sense since I had mostly been working in 1D so far for simplicity. I'll give it a look soon, but I'll only have so much time for it in the next week or two since I'm visiting the in-laws at the moment. I do believe we've gotten closer, though.

@nicholaskl97
Author

The test (NNPDE1, 1) appears to be just barely failing, but the test (NNPDE1, 1.6) succeeds. I'm guessing the 1 versus 1.6 is the Julia version, but I'm not sure. Since it passes on 1.6 and on my computer, I'm not sure how to fix it other than upping the number of sample points and iterations, since the values the test prints are only barely off from the values it's hoping for.

Any suggestions?
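For reference, the knobs I'd be turning look roughly like this (a sketch that assumes the test uses a grid-style training strategy and an Optimization.jl solve, with chain and pde_system standing for the network and PDESystem the test already defines; the actual test setup may differ):

using NeuralPDE, Optimization, OptimizationOptimisers
# Finer grid spacing means more sample points per dimension:
discretization = PhysicsInformedNN(chain, GridTraining(0.05))
prob = discretize(pde_system, discretization)
# More optimizer iterations for the training loop:
res = Optimization.solve(prob, OptimizationOptimisers.Adam(0.01); maxiters = 2000)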

@nicholaskl97
Author

nicholaskl97 commented Mar 7, 2023

The other test that's failing appears to check how much of the code is covered by the test cases, and it's complaining about docstrings, as suggested in the Julia documentation. It also looks like the target code coverage is a little under 79.63%, so I'm not sure how committed we've actually been to writing comprehensive test cases anyway...

If we do care about code coverage and want to get it up to 100%, I'd volunteer to advise a UROP who took 6.005/6.031/whatever the new number is to build out better unit testing, code coverage, documentation, etc., which I think would make it much easier for someone new to developing the package (as I was recently) to see what each internal function does, how it relates to the others, what the inputs are, etc.

@nicholaskl97
Author

nicholaskl97 commented Mar 7, 2023

Everything passed on Buildkite before the last commit, which only changed some docstrings from using triple quotes to single quotes (since I thought that might be why they were counting against code coverage?) and changed the number of sample points and iterations of a test in a different file from the one that is now failing. The test that failed on Buildkite passes on my computer. So, I'm at a bit of a loss as to what stopped working and why.

Could it have to do with the package versions that are getting loaded? It says "Warning: Could not use exact versions of packages in manifest, re-resolving," which I suspect means that the packages that get loaded could have changed between runs? Is that warning an issue, and if so, how would I resolve it?
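If it is the manifest, one way to regenerate a consistent one locally would be something like the following (just a sketch; CI may instead need the updated Manifest.toml committed, or the manifest dropped from the repo entirely):

using Pkg
Pkg.activate(".")   # activate this repository's project
Pkg.resolve()       # recompute the manifest against the current Project.toml
Pkg.instantiate()   # install exactly the resolved versions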

@@ -61,7 +61,9 @@ function build_symbolic_loss_function(pinnrep::PINNRepresentation, eqs;
this_eq_pair = pair(eqs, depvars, dict_depvars, dict_depvar_input)
this_eq_indvars = unique(vcat(values(this_eq_pair)...))
else
Member

Update the docstring above. What does the code look like now?

bds = eltypeθ.(bds)
bds[1, :], bds[2, :]
else
[eltypeθ(0.0)], [eltypeθ(0.0)]
Member

what case is this handling?

Author

If I remember correctly, it was the case of something like $u(0)=0$. The parser will now just make that into something like u(_vcat(0)) .- 0 instead of u(cord1) .- 0. Since that doesn't have a variable in it, we don't bother making training data for that expression (we can evaluate it without a training set). However, if you passed empty arrays along, then it would error, so instead we're just giving it 0 as both the upper and lower bounds, which don't really have any meaning since there aren't any variables that range between the bounds.
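A minimal sketch of the variable-free case being described (illustrative, not the literal test code):

using ModelingToolkit
@variables u(..)
# A condition with no free independent variables:
bc = u(0) ~ 0
# This now lowers to an expression like u(_vcat(0)) .- 0 rather than u(cord1) .- 0,
# so there is nothing to sample for it; the [0.0], [0.0] bounds above are placeholders
# that keep downstream code from erroring on empty arrays.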

syms = Set{Symbol}()
filter(vcat(ind_args_...)) do ind_arg
"All arguments in any instance of a dependent variable"
all_ind_args = vcat((ind_args_...)...)
Member

reduce(vcat,...)
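For reference, the suggested form flattens the same way as the double splat (a minimal sketch with made-up data standing in for the per-variable argument lists):

ind_args_ = [[:x, :(cos(x))], [:x, :y]]   # illustrative stand-in data
vcat((ind_args_...)...)                   # current form: splat twice into vcat
reduce(vcat, ind_args_)                   # suggested form: fold vcat over the lists
# Both return [:x, :(cos(x)), :x, :y]; reduce avoids splatting long argument lists.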

Comment on lines 480 to 489
# Add any independent variables from expression dependent variable calls
for ind_arg in all_ind_args
if ind_arg isa Expr
for ind_var in collect(keys(dict_indvars))
if !isempty(NeuralPDE.find_thing_in_expr(ind_arg, ind_var))
push!(all_ind_args, ind_var)
end
end
end
end
Member

Ahh this looks like the key.
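A minimal illustration of what that loop catches, assuming find_thing_in_expr returns the occurrences of a variable inside an expression (as its usage in the loop suggests):

# For a term like u(cos(x)), the dependent variable's argument is an expression,
# not a bare symbol, so the independent variable has to be recovered from inside it:
ind_arg = :(cos(x))
ind_arg isa Expr   # true
# The loop then checks !isempty(NeuralPDE.find_thing_in_expr(ind_arg, :x))
# and pushes :x into all_ind_args, so u(cos(x)) still gets training points sampled in x.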

@ChrisRackauckas
Member

That Lux failure could be stochastic? Let's see.

@nicholaskl97
Author

@ChrisRackauckas, I've merged recent changes to the library into this branch and addressed all your comments in the most recent commit.

Comment on lines +20 to +24
begin
cord2 = vcat(x, y)
cord1 = vcat(x, y)
end
(+).((*).(4, derivative(phi2, u, _vcat(x, y), [[0.0, ε]], 1, θ2)), derivative(phi1, u, _vcat(x, y), [[ε, 0.0]], 1, θ1)) .- 0
Member

why are those made and then not used?

Author

They're not used in this case any more. I think they may be used in the integral case, but that might not be true either. I can look through the different cases to see if they are ever used and remove them if not.

Member

Yeah, it looks like deprecated code now, so it would be good to just remove it.

Author

So, currently, cord1 = vcat(...) is being used for integral equations only, and in my efforts to see if it's possible to remove it, I've found something I broke that wasn't being tested for the integral equations, so I can work more on fixing that next week. In particular, I'll look for a fix that removes any need for those lines.

Author

Another thing I realized as I was working on this was that Ix(u(sin(x))) will now be interpreted as $\int u(\sin x) dx$. However, Dx(u(sin(x))) is (under my current changes) being interpreted as $u'(\sin x)$, not $\frac{d}{dx}\left[ u(\sin x) \right] = u'(\sin x) \cos x$. They're interpreted this way because that's how the numeric integral and numeric derivative functions were already written. However, it feels a little inconsistent with the way the integral is interpreted; it's instead consistent with interpreting Ix(u(sin(x))) as $U(\sin x)$, where $U$ is an antiderivative of $u$.

It feels to me like Ix(u(sin(x))) should be $\int u(\sin x) dx$ and Dx(u(sin(x))) should be $\frac{d}{dx}\left[ u(\sin x) \right]$, but then I don't know how you would actually specify $U(\sin x)$ or $u'(\sin x)$, or whether you should even be allowed to. (I'm fine not letting people use $U(\sin x)$ since it's not uniquely defined, but it feels like they should be able to use $u'(\sin x)$.)

Thoughts?

Member

However, Dx(u(sin(x))) is (under my current changes) being interpreted as

That's not correct and would not play nicely. It should give the same result as what happens when basic symbolic interactions are done:

julia> using Symbolics

julia> @variables u x
2-element Vector{Num}:
 u
 x

julia> @variables u(..) x
2-element Vector{Any}:
  u
 x

julia> u(sin(x))
u(sin(x))

julia> D = Differential(x)
(::Differential) (generic function with 2 methods)

julia> D(u(sin(x)))
Differential(x)(u(sin(x)))

julia> expand_derivatives(D(u(sin(x))))
cos(x)*Differential(sin(x))(u(sin(x)))

nicholaskl97 marked this pull request as draft March 13, 2023 02:02
@ChrisRackauckas
Member

Are you talking with @xtalax about this to ensure it makes it into the rewrite?

@nicholaskl97
Author

Are you talking with @xtalax about this to ensure it makes it into the rewrite?

No, I didn't know about this rewrite. I'll message him on Slack about it.
