
build: compat bumps to latest versions #783

Merged
merged 14 commits into master from sb/compats on Jan 22, 2024

Conversation

@sathvikbhagavan (Member) commented Jan 16, 2024

IntegralsCubature was pinned at 0.2.2, which held Integrals at v3 and forced SciMLBase@1. IntegralsCubature no longer exists; its functionality is now a package extension of Integrals that loads when Cubature is present. Removing the version pins lets precompilation succeed with Julia 1.10.
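For reference, a minimal sketch of how the Cubature-backed solvers are used once the extension loads (my own example, not code from this PR; the tuple-domain constructor and the CubatureJLh name follow recent Integrals releases and may differ slightly across versions):

using Integrals   # core quadrature interface
using Cubature    # loading Cubature activates Integrals' Cubature extension

# 2D integral of sin(x1) + sin(x2) over the unit square
f(x, p) = sum(sin.(x))
prob = IntegralProblem(f, (zeros(2), ones(2)))
sol = solve(prob, CubatureJLh(); reltol = 1e-6, abstol = 1e-6)
sol.u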

@sathvikbhagavan force-pushed the sb/compats branch 5 times, most recently from 881a20c to ff368ce on January 17, 2024 at 15:32
@sathvikbhagavan force-pushed the sb/compats branch 3 times, most recently from ace150f to a1c4797 on January 18, 2024 at 18:24
@ChrisRackauckas mentioned this pull request on Jan 19, 2024
@ChrisRackauckas (Member)

How's this going?

@sathvikbhagavan (Member Author)

So, the status is that all tests are passing (GHA builds). There is some difference in the run times, though.

In the last build: https://github.com/SciML/NeuralPDE.jl/actions/runs/7587786863/job/20668895873?pr=783 & https://buildkite.com/julialang/neuralpde-dot-jl/builds/2030

Job name       this PR   master
ODEBPINN       64m       91m
PDEBPINN       50m       54m
NNPDE1         230m      194m
NNPDE2         116m      111m
AdaptiveLoss   93m       76m
Logging        223m      213m
Forward        9m        9m

Most of the tests run faster than, or in about the same time as, master, except NNPDE1 and AdaptiveLoss, which show a big difference.

The doc builds are very slow in CI for some reason, on both 1.10 and 1.9 (#772). I will build them locally to see what's happening.

I have fixed the GPU tests here and they are passing.

For compats, I have bumped SciMLBase, Integrals, QMC, and DomainSets, and removed RAT as it is not used anywhere.
So, #625, #626, #735, #746, #747, #752, #754, #755, #758, #769, #770, #774 can be closed.

I haven't done the Optimization-related compat bumps, as I saw you had assigned Vaibhav to them in SciML/Optimization.jl#669.

So the remaining compat bump is #784, which I leave to @AstitvaAggarwal as it is related to BPINNs.

@ChrisRackauckas (Member)

I think I handled the NNODE ones. It seems like it's just all basic improvements, so tests are not failing. I think we should just use the Lux conversion everywhere, though, to improve correctness and performance. This is what we did with DiffEqFlux's Lux.transform: https://github.com/SciML/DiffEqFlux.jl/blob/master/src/neural_de.jl#L42. It would cut down on a lot of code too. Let's follow up with that in a separate PR, though.
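As a hedged illustration of the conversion being referred to (the setup and layer sizes here are my own, not code from this PR or DiffEqFlux; Lux.transform is the helper used in the linked DiffEqFlux line):

using Flux, Lux, Random

fluxchain = Flux.Chain(Flux.Dense(1 => 5, Flux.σ), Flux.Dense(5 => 1))

# Convert the Flux model to an equivalent Lux model, then set up its
# parameters and states explicitly (Lux keeps them outside the model).
luxchain = Lux.transform(fluxchain)
ps, st = Lux.setup(Random.default_rng(), luxchain)

y, st = luxchain([0.5], ps, st)   # Lux layers are called as (x, ps, st)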

@sathvikbhagavan (Member Author)

When I run NNODE with autodiff = true, I get this:

julia> solve(prob, NeuralPDE.NNODE(chain, opt; autodiff = true), dt = 1 / 20.0f0,
                              verbose = true, abstol = 1.0f-10, maxiters = 200)
WARNING: both DomainSets and SciMLBase export "isconstant"; uses of it in module NeuralPDE must be qualified
WARNING: both DomainSets and SciMLBase export "islinear"; uses of it in module NeuralPDE must be qualified
WARNING: both DomainSets and SciMLBase export "issquare"; uses of it in module NeuralPDE must be qualified
WARNING: both QuasiMonteCarlo and ModelingToolkit export "Shift"; uses of it in module NeuralPDE must be qualified
WARNING: both MonteCarloMeasurements and Symbolics export ""; uses of it in module NeuralPDE must be qualified
WARNING: both MonteCarloMeasurements and Symbolics export ""; uses of it in module NeuralPDE must be qualified
┌ Warning: `ForwardDiff.jacobian(f, x)` within Zygote cannot track gradients with respect to `f`,
│ and `f` appears to be a closure, or a struct with fields (according to `issingletontype(typeof(f))`).typeof(f) = NeuralPDE.var"#163#164"{NeuralPDE.ODEPhi{Optimisers.Restructure{Chain{Tuple{Dense{typeof(σ), Matrix{Float64}, Vector{Float64}}, Dense{typeof(identity), Matrix{Float64}, Vector{Float64}}}}, @NamedTuple{layers::Tuple{@NamedTuple{weight::Int64, bias::Int64, σ::Tuple{}}, @NamedTuple{weight::Int64, bias::Int64, σ::Tuple{}}}}}, Float32, Float32, Nothing}, Vector{Float64}}
└ @ Zygote ~/.julia/packages/Zygote/WOy6z/src/lib/forward.jl:150
Current loss is: 121.37774643538611, Iteration: 1
Current loss is: 121.37774643538611, Iteration: 2
⋮   (loss is identical for every one of the 201 iterations)
Current loss is: 121.37774643538611, Iteration: 201
retcode: Success
Interpolation: Trained neural network interpolation
t: 0.0f0:0.05f0:1.0f0
u: 21-element Vector{Float64}:
  0.0
  0.006315714800835081
  0.011539374558691208
  0.015674108264008866
  0.01872510579170209
  0.02069958500724156
  0.021606740683030142
  0.021457682728718303
  0.020265362784713716
  0.018044490439805484
  0.014811432249981035
  0.010584115158917613
  0.005381909754133146
 -0.0007744793127106396
 -0.007863167441728219
 -0.01586117341049996
 -0.024744577475576335
 -0.03448864560999067
 -0.045067947328471566
 -0.056456536791448714
 -0.06862800437895486

It appears ForwardDiff and Zygote are not compatible here, which is why the loss stays constant. This is one of the tests that was supposed to error; it was changed in 5df70a8 and no longer errors. Is there a way to fix this?
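For context, the same failure mode shows up outside NeuralPDE; a minimal sketch (my own example) of Zygote dropping the gradient through a ForwardDiff.jacobian whose closure captures the parameters:

using Zygote, ForwardDiff

# The inner jacobian closes over `p`. Zygote's rule for ForwardDiff.jacobian
# cannot track gradients with respect to the closure, so it emits the warning
# above and the parameter gradient comes back as nothing (or zeros) instead
# of the true value [2.0, 4.0].
loss(p) = sum(ForwardDiff.jacobian(x -> p .* x .^ 2, [1.0, 2.0]))
Zygote.gradient(loss, [3.0, 4.0])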

@ChrisRackauckas (Member)

Optimisers.Restructure: that's a problem with Flux. One way to fix it is to just do the conversion to Lux.

@sathvikbhagavan (Member Author)

I think it's a bug with GridTraining. I get the same issue with Lux:

julia> sol = solve(prob, NeuralPDE.NNODE(luxchain, opt; autodiff = true, strategy = GridTraining(1/20.0)), verbose = true, maxiters = 200)
┌ Warning: `ForwardDiff.jacobian(f, x)` within Zygote cannot track gradients with respect to `f`,
│ and `f` appears to be a closure, or a struct with fields (according to `issingletontype(typeof(f))`).typeof(f) = NeuralPDE.var"#163#164"{NeuralPDE.ODEPhi{Lux.Chain{@NamedTuple{layer_1::Lux.Dense{true, typeof(sigmoid_fast), typeof(WeightInitializers.glorot_uniform), typeof(WeightInitializers.zeros32)}, layer_2::Lux.Dense{true, typeof(identity), typeof(WeightInitializers.glorot_uniform), typeof(WeightInitializers.zeros32)}}, Nothing}, Float64, Float64, @NamedTuple{layer_1::@NamedTuple{}, layer_2::@NamedTuple{}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}}
└ @ Zygote ~/.julia/packages/Zygote/WOy6z/src/lib/forward.jl:150
Current loss is: 133.4565371299845, Iteration: 1
Current loss is: 133.4565371299845, Iteration: 2
⋮   (loss is identical for every one of the 201 iterations)
Current loss is: 133.4565371299845, Iteration: 201
retcode: Success
Interpolation: Trained neural network interpolation
t: 0.0:0.010101010101010102:1.0
u: 100-element Vector{Float64}:
  0.0
 -0.005377352795611298
 -0.010789880802387412
 -0.016237580798480915
 -0.021720447414775464
 -0.027238473135829697
 -0.03279164830119815
 -0.03837996110712897
 -0.04400339760863766
 -0.049661941721956294
 ⋮
 -0.6192373633496234
 -0.6275331833276632
 -0.6358562744043232
 -0.6442064822886961
 -0.6525836518203324
 -0.6609876269910301
 -0.6694182509666272
 -0.6778753661088008
 -0.6863588139968584
 -0.6948684354495236

@ChrisRackauckas (Member)

Make that case appropriately error and we should merge.

@sathvikbhagavan (Member Author)

Make that case appropriately error and we should merge.

Yes, done in 8a0dccc, and tests updated in 607d4f1.

It only works for QuadratureTraining and not the other strategies. What is the correct way to fix this? Is this more on the Zygote/ForwardDiff side?
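For readers following along, a rough, hypothetical sketch of the kind of guard this implies (the actual change is in 8a0dccc; the function and argument names here are made up for illustration and are not NeuralPDE API):

using NeuralPDE: QuadratureTraining

# Hypothetical guard: reject autodiff = true for training strategies where
# the Zygote-over-ForwardDiff combination silently yields zero gradients.
function check_autodiff_strategy(autodiff::Bool, strategy)
    if autodiff && !(strategy isa QuadratureTraining)
        error("autodiff = true is only supported with QuadratureTraining; got $(nameof(typeof(strategy))).")
    end
    return nothing
end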

I had opened an issue related to this a while back (#725), although looking at the stack trace, I think that was a different issue, which now works. I will update it to cover the current issue.
