A neural rough differential equation network based solver for path-dependent partial differential equations. [arXiv]
In this paper we investigate the solution of path-dependent partial differential equations (PPDEs) of the form
where
We strongly recommend checking the following repositories on Neural-ODE (NODE) type models and previous work on solving PPDEs with neural networks:
- torchdiffeq [torchdiffeq]: Differentiable ODE solvers with full GPU support and adjoint backpropagation using $\mathcal{O}(1)$ memory.
- Neural CDE [torchcde]: Neural controlled differential equation networks that generalise NODEs and incorporate sequential input.
- Neural RDE [NRDE]: The neural rough differential equation network we use in our paper.
- Deep PPDE [Deep-PPDE]: Uses an LSTM network with signature input to approximate solutions of PPDEs.
Install the required packages by running:
pip install -r requirement.txt
pip install signatory==1.2.6.1.7.1 --no-cache-dir --force-reinstall
Signatory has to be installed after the corresponding PyTorch
version; otherwise the installation may raise an error.
The code for our NRDE solver for PPDEs is in the NRDE_Solver.py file under the Solver folder. The folder also includes the package for the general neural RDE network, which we use to build our solver.
self.sig_channels = signatory.logsignature_channels(in_channels=self.d + 1, depth=depth)
self.f = nrde.model.NeuralRDE(initial_dim=self.d + 1, logsig_dim=self.sig_channels,
                              hidden_dim=hidden, output_dim=output, num_layers=num_layers,
                              hidden_hidden_dim=ffn_hidden, solver=odesolver, odestep=odestep)
self.dfdx = nrde.model.NeuralRDE(initial_dim=self.d + 1, logsig_dim=self.sig_channels,
                                 hidden_dim=hidden, output_dim=self.d, num_layers=num_layers,
                                 hidden_hidden_dim=ffn_hidden, solver=odesolver, odestep=odestep)
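The number of log-signature channels returned by `signatory.logsignature_channels` is the dimension of the free Lie algebra over `self.d + 1` letters truncated at `depth`, which can be reproduced with Witt's formula. A stdlib-only sketch (the helper names `mobius` and `logsig_channels` are ours, for illustration):

```python
def mobius(n):
    # Möbius function via trial factorisation
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0  # squared prime factor
            result = -result
        p += 1
    if n > 1:
        result = -result
    return result

def logsig_channels(d, depth):
    # Dimension of the free Lie algebra over d letters truncated at `depth`
    # (Witt's formula); agrees with signatory.logsignature_channels(d, depth).
    total = 0
    for k in range(1, depth + 1):
        total += sum(mobius(j) * d ** (k // j)
                     for j in range(1, k + 1) if k % j == 0) // k
    return total

print(logsig_channels(5, 2))  # 15: 5 level-1 terms + 10 bracket terms
```

For example, a 4-dimensional problem with a time channel (`d + 1 = 5`) at depth 2 gives 15 channels, so the log-signature input stays far smaller than the full signature at higher depths.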
def cond_exp(self, ts: torch.Tensor, x0: torch.Tensor, option: Lookback, lag: int, drop: bool):
    # Simulate paths and their log-signatures; with dimension reduction the
    # payoff is evaluated on the full (unreduced) path copy.
    if self.d_red:
        x_copy, x, path_signature, brownian_increments = self.prepare_data(ts, x0, lag, drop)
        payoff = option.payoff(x_copy)
    else:
        x, path_signature, brownian_increments = self.prepare_data(ts, x0, lag, drop)
        payoff = option.payoff(x)
    device = x.device
    batch_size = x.shape[0]
    t = ts[::lag]
    # Subsample the path and prepend a time channel
    x1 = x[:, ::lag, :]
    t0 = torch.zeros(batch_size, len(t), 1, device=device)
    x1 = torch.cat([t0, x1], 2)
    # Evaluate the NRDE on (initial value, log-signature stream)
    if not self.withx:
        Y = self.f((x1[:, 0, :], path_signature))
    else:
        Y = self.f((x1[:, 0, :], path_signature, x[:, ::lag, :]))
    # Regress Y at every grid point against the discounted terminal payoff
    loss_fn = nn.MSELoss()
    loss = 0
    for idx, idt in enumerate(ts[::lag]):
        if self.ncdrift:
            # time-dependent drift mu*s integrated over [idt, T]
            discount_factor = torch.exp(-self.mu * 0.5 * (t[-1] ** 2 - idt ** 2))
        else:
            discount_factor = torch.exp(-self.mu * (t[-1] - idt))
        target = discount_factor * payoff
        pred = Y[:, idx, :]
        loss += loss_fn(pred, target)
    return loss, Y, payoff
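The two discounting branches above can be checked with a quick scalar example in plain Python (no torch; `mu = 0.05` and `T = 1.0` are illustrative values): a constant drift discounts by $e^{-\mu(T-t)}$, while the time-dependent drift $\mu s$ integrates to $\frac{\mu}{2}(T^2 - t^2)$ over $[t, T]$.

```python
import math

mu, T = 0.05, 1.0  # illustrative drift and terminal time

def discount(t, ncdrift=False):
    # Mirrors the branch in cond_exp: time-dependent drift mu*s integrates
    # over [t, T] to mu/2 * (T^2 - t^2); constant drift gives mu * (T - t).
    if ncdrift:
        return math.exp(-mu * 0.5 * (T ** 2 - t ** 2))
    return math.exp(-mu * (T - t))

print(discount(0.0))        # exp(-0.05), full discounting from t = 0
print(discount(1.0))        # 1.0, no discounting at terminal time
print(discount(0.0, True))  # exp(-0.025), lighter discount under mu*s drift
```

At `t = T` both branches return 1, so the network is trained to match the raw payoff at terminal time and its discounted conditional expectation earlier on.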
The Experiment folder contains the three experiments we conduct. For example, to run the heat equation experiment with dimension `d = 4` and without dimension reduction:
python Heat_nrde.py --d 4 --d_red False
After training, the model parameters are stored in the numerical_results folder.
result = {"state": ppde.state_dict(),
          "loss": losses}
torch.save(result, os.path.join(base_dir, "model_{}.tar".format(d)))
This folder contains all the models we obtain for each experiment. Each .tar file contains the model parameters and can be loaded with the torch.load and model.load_state_dict functions once the hyperparameters of the model are specified. We include a Jupyter notebook Report in the main directory that illustrates this procedure and reproduces the plots and results from the paper.
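A minimal round-trip sketch of this checkpoint format, assuming only the dictionary layout shown above (a dummy state dict and a temporary directory stand in for a trained model and numerical_results; with a real model one would re-build `ppde` with the same hyperparameters and then call `ppde.load_state_dict(checkpoint["state"])`):

```python
import os
import tempfile

import torch

base_dir = tempfile.mkdtemp()  # stand-in for the numerical_results folder
d = 4

# Save a dummy checkpoint in the same {"state": ..., "loss": ...} format
result = {"state": {"w": torch.zeros(2)}, "loss": [1.0, 0.5]}
path = os.path.join(base_dir, "model_{}.tar".format(d))
torch.save(result, path)

# Load it back: the training losses and parameters are both recovered
checkpoint = torch.load(path)
print(checkpoint["loss"])  # [1.0, 0.5]
```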
@misc{fang2023neural,
title={A Neural RDE-based model for solving path-dependent PDEs},
author={Bowen Fang and Hao Ni and Yue Wu},
year={2023},
eprint={2306.01123},
archivePrefix={arXiv},
primaryClass={cs.LG}
}