Allowing a function to be called multiple times with different inputs #627
Conversation
Prior to my edits, the loss function would implement u(x,y) - u(0,0) ~ 0 as something like u(x,y) - u(x,y) ~ 0, since every instance of u was assumed to take the same input. Not sure when it would come up, but the changes should also allow for things like u(cos(x)) ~ func(x), etc., which I don't think would have been parsed right before these changes. I think it will handle something like Dx(u(cos(x))) as du/dx evaluated at cos(x), not -sin(x)*du/dx, which might not be ideal, but I'm not sure of the best way to add in the chain rule.
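A quick numerical illustration of that last distinction (my own sketch with a concrete u, not the PR's code): for u(z) = z^2, "du/dz evaluated at cos(x)" and the chain-rule derivative of u(cos(x)) disagree.

u(z) = z^2
x = 0.3; ε = 1e-6
# "du/dz evaluated at cos(x)": central difference of u around the point cos(x)
eval_at = (u(cos(x) + ε) - u(cos(x) - ε)) / (2ε)   # ≈ 2cos(x)          ≈  1.911
# Chain rule: d/dx u(cos(x)) = -sin(x) * u'(cos(x))
chain = (u(cos(x + ε)) - u(cos(x - ε))) / (2ε)     # ≈ -sin(x)*2cos(x) ≈ -0.565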
Test failures look real.
Yeah, I think it was having trouble with multiple input dimensions, which makes sense since I was mostly working on 1D so far for simplicity. I'll give it a look soon, but will only have so much time for it in the next week or two since I'm visiting the in-laws at the moment. I do believe we've gotten closer, though.
The test (NNPDE1, 1) appears to be barely failing, but the test (NNPDE1, 1.6) succeeds. I'm guessing the 1 versus 1.6 is the Julia version, but I'm not sure. Since it passes on 1.6 and on my computer, I'm not sure how to fix it other than upping the number of sample points and iterations, since the values the test prints are only barely off from the values it's hoping for. Any suggestions?
The other test that's failing appears to check code coverage by the test cases and is complaining about docstrings, as suggested in the Julia documentation. It also looks like the target code coverage is a little under 79.63%, so I'm not sure how committed we've actually been to writing comprehensive test cases anyway... If we do care about code coverage and want to get it up to 100%, I'd volunteer to advise a UROP who took 6.005/6.031/whatever the new number is to build out better unit testing, code coverage, documentation, etc., which I think would make it much easier for someone new to developing the package (as I was recently) to see what each internal function does, how it relates to the others, what the inputs are, etc.
Everything passed on buildkite before the last commit, which only changed some docstrings from using three quotes to one (since I thought that might be why it was counting against code coverage?) and changed the number of sample points and iterations of a test in a different file from the one that is now failing. The test that failed on buildkite passes on my computer. So, I'm at a bit of a loss as to what stopped working and why. Could it have to do with the package versions that are getting loaded? It says "Warning: Could not use exact versions of packages in manifest, re-resolving," which I suspect means that the packages that get loaded could have changed between runs? Is that warning an issue, and if so, how would I resolve it?
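For reference, both quoting styles attach an identical docstring in Julia (a generic sketch, not the PR's code), so the switch itself shouldn't change what any coverage or documentation check sees:

"""
    f(x)

Triple-quoted docstring, the usual convention for multi-line docs.
"""
f(x) = x + 1

"Single-quoted docstring, equivalent for a short one-liner."
g(x) = x - 1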
@@ -61,7 +61,9 @@ function build_symbolic_loss_function(pinnrep::PINNRepresentation, eqs;
    this_eq_pair = pair(eqs, depvars, dict_depvars, dict_depvar_input)
    this_eq_indvars = unique(vcat(values(this_eq_pair)...))
else
Update the docstring above. What does the code look like now?
bds = eltypeθ.(bds)
bds[1, :], bds[2, :]
else
[eltypeθ(0.0)], [eltypeθ(0.0)]
what case is this handling?
If I remember correctly, it was the case of something like u(_vcat(0)) .- 0 instead of u(cord1) .- 0. Since that doesn't have a variable in it, we don't bother making training data for that expression (we can evaluate it without a training set). However, if you passed empty arrays along, then it would error, so instead we're just giving it 0 as both the upper and lower bounds, which don't really have any meaning since there aren't any variables that range between the bounds.
src/symbolic_utilities.jl (outdated)
syms = Set{Symbol}()
filter(vcat(ind_args_...)) do ind_arg
"All arguments in any instance of a dependent variable"
all_ind_args = vcat((ind_args_...)...)
reduce(vcat,...)
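What the suggestion amounts to (generic Julia, not the PR's exact code): reduce(vcat, xs) flattens one nesting level without splatting, which scales better than vcat(xs...) when the collection is long.

ind_args_ = [[:x], [:y, :(cos(x))]]
reduce(vcat, ind_args_)   # Any[:x, :y, :(cos(x))] — same result as vcat(ind_args_...)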
src/symbolic_utilities.jl (outdated)
# Add any independent variables from expression dependent variable calls
for ind_arg in all_ind_args
    if ind_arg isa Expr
        for ind_var in collect(keys(dict_indvars))
            if !isempty(NeuralPDE.find_thing_in_expr(ind_arg, ind_var))
                push!(all_ind_args, ind_var)
            end
        end
    end
end
Ahh this looks like the key.
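A self-contained stand-in for that loop (find_symbols below is a hypothetical sketch, not NeuralPDE.find_thing_in_expr): for an argument like cos(x), the Expr is walked and any known independent variable appearing inside it is recorded as well.

find_symbols(ex, s) =
    ex === s ? Any[ex] :
    ex isa Expr ? reduce(vcat, Any[find_symbols(a, s) for a in ex.args]; init = Any[]) :
    Any[]

all_ind_args = Any[:(cos(x))]
for ind_arg in copy(all_ind_args), ind_var in (:x, :y)
    if ind_arg isa Expr && !isempty(find_symbols(ind_arg, ind_var))
        push!(all_ind_args, ind_var)   # cos(x) contains x, so :x is added
    end
end
all_ind_args   # Any[:(cos(x)), :x]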
That Lux failure could be stochastic? Let's see.
@ChrisRackauckas, I've merged recent changes to the library into this branch and addressed all your comments in the most recent commit.
begin
    cord2 = vcat(x, y)
    cord1 = vcat(x, y)
end
(+).((*).(4, derivative(phi2, u, _vcat(x, y), [[0.0, ε]], 1, θ2)), derivative(phi1, u, _vcat(x, y), [[ε, 0.0]], 1, θ1)) .- 0
why are those made and then not used?
They're not used in this case any more. I think they may be used in the integral case, but that might not be true either. I can look through the different cases to see if they are ever used and remove them if not.
Yeah, it looks like deprecated code now, so it would be good to just remove it.
So, currently, cord1 = vcat(...) is being used for integral equations only, and in my efforts to see if it's possible to remove it, I've found something I broke that wasn't being tested for the integral equations, so I can work more on fixing that next week. In particular, I'll look for a fix that removes any need for those lines.
Another thing I realized as I was working on this: Ix(u(sin(x))) and Dx(u(sin(x))) are now both interpreted as operating on u evaluated at sin(x). That is, Dx(u(sin(x))) is (under my current changes) being interpreted as du/dx evaluated at sin(x), with no chain-rule factor, and Ix(u(sin(x))) gets the analogous treatment. It feels to me like there's a case for either reading of Ix(u(sin(x))) and Dx(u(sin(x))), and I'm not sure which one users would expect. Thoughts?
However, Dx(u(sin(x))) is (under my current changes) being interpreted as du/dx evaluated at sin(x)
That's not correct and would not play nicely. It should give the same result as what happens when basic symbolic interactions are done:
julia> using Symbolics
julia> @variables u x
2-element Vector{Num}:
u
x
julia> @variables u(..) x
2-element Vector{Any}:
u⋆
x
julia> u(sin(x))
u(sin(x))
julia> D = Differential(x)
(::Differential) (generic function with 2 methods)
julia> D(u(sin(x)))
Differential(x)(u(sin(x)))
julia> expand_derivatives(D(u(sin(x))))
cos(x)*Differential(sin(x))(u(sin(x)))
Are you talking with @xtalax on this to ensure it makes it into the rewrite?
No, I didn't know about this rewrite. I'll message him on Slack about it.
Currently, something like u(x) - u(0) ~ sin(x) gets parsed as u(x) - u(x) ~ sin(x) when generating the loss function, because all instances of u(anything) are assumed to have the same input. I've made changes to _transform_expression and dot that generate the correct loss function, and I made a minor edit to one 1D test case, changing u(x) ~ x*cos(...) to u(x) - u(0) ~ x*cos(...), to test these changes and mimic someone trying to train a network that they knew passed through (0,0) and so would represent it by g(x) = u(x) - u(0).
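For concreteness, here is the symbolic form of that example (just the Symbolics.jl construction; the loss generation itself is NeuralPDE internals):

using Symbolics
@variables x
@variables u(..)
eq = u(x) - u(0) ~ sin(x)
# u(x) and u(0) are distinct call sites, so the generated loss must evaluate
# the network once at x and once at 0 rather than reusing a single input.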