
Radau solver really slow when adaptive time stepping and collocation solver don't play nicely #14080

Open
Naikless opened this issue May 17, 2021 · 6 comments


@Naikless

Naikless commented May 17, 2021

I am using Caltech's Shock and Detonation Toolbox to solve some ODEs that describe the spatio-temporal structure of a detonation wave with detailed chemistry via Cantera.

Since this is a very stiff problem, the default ODE solver is SciPy's Radau. For some initial conditions, this works flawlessly and reasonably fast. However, in many cases I encounter behavior that increases integration time significantly and can even cause the integration to fail.

In these cases, the following pattern can be observed:

  1. the local error becomes low, so the solver increases the step size
  2. solve_collocation_system is unable to solve the implicit equation system for this increased step size
  3. the step size is decreased repeatedly, with the solver attempting to solve the system at every reduction (thus resulting in many function calls)
  4. eventually, the step size is reduced to the initial value from step 1 or even below, and the collocation system converges. However, because the step size is so small again, the local error is once again very small, causing the whole cycle to repeat from step 1

Once this pattern is established, each time step is extremely small and additionally takes many internal loops to be accepted. Playing around with tolerances can often help to avoid this unfortunate loop, but that means some trial and error for each initial condition I would like to examine.
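The pattern above can be made visible by comparing accepted steps against RHS evaluations. A minimal sketch using a classic stiff toy ODE as a stand-in for the detonation RHS (the `fun` below is illustrative, not the original model):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Count RHS evaluations so that collocation retries become visible.
calls = {"n": 0}

def fun(t, y):
    # Toy stiff ODE: fast decay toward a slowly varying attractor.
    calls["n"] += 1
    return [-1000.0 * y[0] + 3000.0 - 2000.0 * np.exp(-t)]

sol = solve_ivp(fun, [0.0, 1.0], [0.0], method="Radau")

# Accepted steps vs. total RHS calls: a large ratio indicates that many
# evaluations are spent on rejected steps and collocation iterations.
print(sol.t.size, calls["n"])
```

When the pathological cycle from the list above occurs, this ratio grows dramatically, since each tiny accepted step is preceded by many failed collocation attempts.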

I am neither a mathematician nor a computer scientist, so I am unsure whether this behavior is to some extent unavoidable when dealing with stiff ODE systems, or whether I might be able to tweak the solver to avoid this.

@rkern
Member

rkern commented May 18, 2021

I suspect we'll need a small self-contained example to make progress on this. That kind of thing seems like a bug to me.

@Naikless
Author

Naikless commented May 18, 2021

I tried to keep it as small as possible:

import cantera as ct
import numpy as np
from scipy.integrate import solve_ivp

# set up stoichiometric H2-air detonation
P0 = 1e5
T0 = 300
X = 'H2:42, O2:21, N2:79'
mech = 'gri30.cti'

# precalculated using shock and detonation toolbox
cj_speed = 1968.5642471885012
    
# Set up gas object of initial state
gas_init = ct.Solution(mech)
gas_init.TPX = T0, P0, X
r1 = gas_init.density

# frozen post shock state, precalculated using shock and detonation toolbox
P_fr = 2745072.177393151
T_fr = 1531.1425010572102
gas_fr = ct.Solution(mech)
gas_fr.TPX = T_fr, P_fr, X


# RHS of ODE, adapted from "znd.py" of shock and detonation toolbox
def fun(t,y):
    
    # helper fun
    def getThermicity(gas):
        w = gas.molecular_weights
        hs = gas.standard_enthalpies_RT*ct.gas_constant*gas.T/w
        dydt = gas.net_production_rates*w/gas.density
        thermicity = sum((gas.mean_molecular_weight/w
                          -hs/(gas.cp_mass*gas.T))*dydt) 
        return thermicity  
    
    gas_fr.DPY = y[1],y[0],y[2:]
    c = (gas_fr.cp_mass/gas_fr.cv_mass * gas_fr.T * ct.gas_constant/gas_fr.mean_molecular_weight)**0.5
    U = cj_speed*r1/gas_fr.density
    M = U/c
    eta = 1-M**2 
    
    sigmadot = getThermicity(gas_fr)
    Pdot = -gas_fr.density*U**2*sigmadot/eta
    rdot = -gas_fr.density*sigmadot/eta
    
    dYdt = gas_fr.net_production_rates*gas_fr.molecular_weights/gas_fr.density    

    return np.hstack((Pdot, rdot, dYdt))


#%% run solver on problem

y0 = np.hstack((gas_fr.P,gas_fr.density,gas_fr.Y))
tel = [0.,1e-3]

# atol=1e-8 necessary to avoid the solver stepping into negative pressure,
# which might also be due to the described issue
znd_out = solve_ivp(fun,tel,y0,method='Radau', atol=1e-8)

I have tapped into the solver to extract some data on the factor that determines the next time step for the above case:
[figure: step-size factors extracted from the solver over the course of the integration, oscillating after the error estimate drops below a threshold]
It can be seen that, triggered by the error falling below a certain threshold, the step factors begin to oscillate, so that many steps are spent on only a minuscule part of the overall integration interval. If atol is not set to 1e-8, these oscillations drive the pressure below zero, which of course is nonphysical and in turn causes Cantera to crash.
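For reference, one way to "tap into" the solver without editing SciPy is to drive a Radau instance step by step and record its step size. Note that `h_abs` is a private implementation attribute of SciPy's solver classes, so this may break across versions; the ODE here is again a stiff toy stand-in:

```python
import numpy as np
from scipy.integrate import Radau

def fun(t, y):
    # Stiff toy ODE standing in for the detonation RHS.
    return [-1000.0 * y[0] + 3000.0 - 2000.0 * np.exp(-t)]

solver = Radau(fun, 0.0, [0.0], t_bound=1.0)
steps = []
while solver.status == "running":
    solver.step()
    steps.append(solver.h_abs)  # internal (private) step size attribute

# A healthy run shows the step size growing roughly monotonically; the
# pathological pattern from this issue would show it repeatedly
# collapsing and regrowing.
ratios = np.diff(np.log(steps))
```

Plotting `steps` (or `ratios`) over the step index reproduces the kind of diagnostic shown in the figure above.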

This behavior can be reproduced for different conditions and different kinetic mechanisms, although it remains somewhat unpredictable.

@laurent90git

Would you be able to provide a standalone example that does not use Cantera? I don't know if there is a simple enough mechanism (like one global reaction) that you could code yourself and that still reproduces the issue. Are you sure you do not come near a singularity, for example M going close to 1?

@laurent90git

Hi,
I've tested your case, and all the integrators run into a situation where they cannot reach a sufficiently low error estimate without taking time steps that are much too small (1e-20) and could cause large rounding errors in the computed solution. Hence the computation is stopped with the message "Required step size is less than spacing between numbers". The other solvers (BDF, for instance) run into the same issue. This usually means your system is diverging, typically because it is encountering a singularity.

Evaluating the ODE function (time derivatives) at the last point reached by the solver, I get time derivatives on the order of 1e17 to 1e20 for the pressure and the density...
In your ODE function, the Mach number M is at 0.9999999782772773. Since you later divide by eta = 1 - M**2, you obtain a singularity.
There's nothing wrong with SciPy, only with the model, which has a singularity. Maybe your model is no longer valid when M is close to 1 and the equations should be changed, I don't know. But the solvers behave fine ;)
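The diagnosis can be reproduced without Cantera: any ODE with a pole forces the step size to shrink without bound as the solver approaches it. A minimal sketch with y' = y**2, which blows up at t = 1 (a stand-in for the eta = 1 - M**2 denominator going to zero, not the actual detonation equations):

```python
from scipy.integrate import solve_ivp

# y' = y**2 with y(0) = 1 has the exact solution y = 1/(1 - t),
# singular at t = 1. The integration cannot pass the pole.
sol = solve_ivp(lambda t, y: y**2, [0.0, 2.0], [1.0], method="Radau")

# The solver stops before t = 1 with a step-size failure rather than
# stepping through the singularity.
print(sol.status, sol.message)
```

The resulting failure message is the same "Required step size is less than spacing between numbers" behavior described above.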

@Naikless
Author

Naikless commented Jul 12, 2021

Thanks for diving into this! The singularity at M=1 is expected, and the solution should be fine up until that point. But I suppose you are right: the effect illustrated in my example can simply be attributed to approaching this singularity.

Interestingly, I never encountered the above message about the time step being too small. Are you running my exact example? Which versions of Python and SciPy?

Maybe making min_step an optional argument for Radau would be worth considering?
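Short of changing Radau's internals, one workaround (my suggestion, not from the thread) is to stop the integration cleanly with a terminal event before the singularity is reached, instead of letting the step size collapse. A sketch with a toy pole at y = 10 standing in for M -> 1; in the actual ZND problem the event function would compute 1 - M from the gas state:

```python
from scipy.integrate import solve_ivp

def rhs(t, y):
    # Toy ODE with a pole: y' = 1/(10 - y) blows up as y -> 10.
    # Exact solution: y = 10 - sqrt(100 - 2t), singular at t = 50.
    return [1.0 / (10.0 - y[0])]

def near_singularity(t, y):
    # Crosses zero just before the pole; the margin 1e-3 is arbitrary.
    return 10.0 - y[0] - 1e-3

near_singularity.terminal = True  # stop integration at the crossing

sol = solve_ivp(rhs, [0.0, 100.0], [0.0], method="Radau",
                events=near_singularity)
# sol.status == 1 indicates a clean termination at the event, and
# sol.t_events[0] holds the time at which it fired (just before t = 50).
```

This returns a valid solution up to the cutoff rather than a step-size failure, which matches the expectation that the solution is meaningful up until M = 1.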

@Naikless
Author

Naikless commented Jun 20, 2023

Just to dig this out again with some additional info:

The persistence of this issue led me to reimplement the whole problem in Julia, which gave me access to the wide variety of implicit solvers that DifferentialEquations.jl offers.
None of them, including their Radau implementation, ever behaved as described above, so I would claim that this is indeed a SciPy issue and not an inevitable consequence of this specific problem's mathematics.
