Draft
Commits
188 commits
437eeab
Added gnn.py
Mar 21, 2025
82eb033
Added gnn.py
Mar 22, 2025
24e87b8
Added Graph class
Mar 22, 2025
a160fc3
Graph class loader from pyLOM mesh
Mar 23, 2025
33a5245
Completed Graph constructor method
Mar 26, 2025
de89763
Finished graph filter function
Mar 27, 2025
dbcb0c9
Added fit function
Mar 30, 2025
f836a7e
Integrated train function into fit function
Apr 2, 2025
2f4abf7
finished fit, _train and _test
Apr 2, 2025
9aaab98
adjusting predict function
Apr 3, 2025
0f0ce45
Deleting inner loop in training
Apr 5, 2025
f50a787
Created graph setter
Apr 6, 2025
7a8768a
Finished optuna func
Apr 6, 2025
79dfd0f
bug fixes in gnn.py
Apr 6, 2025
4543dd2
Changed names to gns and pyLOMGraph
Apr 6, 2025
bdd6184
added example_GNS_DLR_airfoil.py
Apr 6, 2025
a04450f
improvement in graph filtering func
Apr 6, 2025
dcbb099
Bug fix in gns.py
Apr 6, 2025
6c81c0a
Bug fix in gns.py
Apr 6, 2025
5a502f6
created datasets for example gns
Apr 6, 2025
2081ac7
Bug fixes in gns.py
Apr 7, 2025
4cce1bd
Finished example_GNS_DLR_airfoil.py
Apr 7, 2025
7dd3f8b
Bug fix in gns
Apr 7, 2025
1baea71
bug fix in gns
Apr 7, 2025
fbf5050
Bug fixes in gns.py
Apr 8, 2025
9ced6d4
Bug fixes in example gns
Apr 8, 2025
36291a3
improvements in gns
Apr 8, 2025
3c24252
Improvements in gns
Apr 8, 2025
31e7ce3
Improvements in example gns
Apr 10, 2025
12ab452
Bug fix in gns
Apr 10, 2025
3661f98
Comparison with DLR results
Apr 10, 2025
984db4d
bug fixes in gns
Apr 10, 2025
2dafedc
bug fix in gns example
Apr 11, 2025
9109cf5
printing losses
Apr 11, 2025
10172fb
updated example mesh
May 6, 2025
107b729
Reestructured graph building logics
May 7, 2025
9cffb08
Fixed docstrings in gns.py
May 8, 2025
03fe22e
Added snapshot plotting
May 10, 2025
de61df8
Changed __call__ inputs of GNS class
May 10, 2025
ab8e6f0
Improved forward method
May 10, 2025
2298a9f
fixed bug in gns.py
May 10, 2025
aabf583
Improvements and prints in gns
May 19, 2025
f00a5a1
Changed number of epochs in example gns
May 19, 2025
440bbeb
suggested changes pr
May 21, 2025
04804dd
Changed graph.node_attr to graph.x
May 23, 2025
1f53a37
Graph load/save functions
May 23, 2025
a733934
Adding new converter
Jun 4, 2025
3c2cf71
save graph like dataset
Jun 23, 2025
474f9ae
changed init and save/load
Jun 23, 2025
09eebfd
finished Graph
Jun 23, 2025
b7384ab
finished graph io
Jun 23, 2025
3917597
moved Graph to utils
Jun 24, 2025
00b501c
refactored NN/utils
Jun 24, 2025
234144f
Refactor GNS.forward: split graph/tensor paths, add vectorized inference
Jun 24, 2025
f1a253e
added batchers.py
Jun 24, 2025
3522daf
added list batcher
Jun 24, 2025
50b1489
Moved scaler protocol to scalers
Jun 24, 2025
ff107b1
bug fix in batchers
Jun 24, 2025
ae907d6
bug fix in batchers
Jun 24, 2025
963c3a5
New batcher in gns
Jun 24, 2025
dbb8f13
Aesthetic improvements
Jun 24, 2025
47b83dc
aesthetic improvements gns
Jun 25, 2025
8e295cb
Renamed DRL2h5.py to DLR2h5.py
Jun 25, 2025
e61101d
Finished DLR2h5.py
Jun 25, 2025
95ad540
Fixed bugs in Graph i/o
Jun 25, 2025
e73dc5f
Deleted h5_append_graph_serial
Jun 25, 2025
d55f0f0
import fix in graph.py
Jun 25, 2025
b6a72ca
Fixed naming in Graph
Jun 25, 2025
b5dd887
renamed to node_features and edge_features
Jun 26, 2025
64c71d0
Created GraphPreparer class
Jun 26, 2025
d17058c
vectorized loop in batchers
Jun 30, 2025
51602f4
Graph ready for production
Jun 30, 2025
d550132
batchers ready for production
Jun 30, 2025
a178fa8
gns ready for production
Jun 30, 2025
15cfb9c
added cr support for gns, graph, and batchers
Jun 30, 2025
eb74c64
refactored DLR2h5.py converter
Jun 30, 2025
524cf48
bugfix in DLR2h5.py
Jun 30, 2025
580a7eb
renamed to nodefeatr and edgefeatr in io_h5.py
Jun 30, 2025
bb99261
Bugfix in gns.py
Jul 1, 2025
572d7a9
Added NN/dataset support for global dataset
Jul 1, 2025
cb252d6
num_nodes and num_edges validation in graph.load
Jul 1, 2025
f52cbcb
finished example_GNS_DLR_airfoil.py
Jul 1, 2025
78f6705
Bugfix example_GNS_DLR_airfoil.py
Jul 1, 2025
38aff48
Working on dataset wrapper
Jul 1, 2025
98bd934
Documentation in batchers.py
Jul 3, 2025
9265590
Renamed batchers.py to gns_batchers.py. Refactored batcher classes
Jul 7, 2025
c8c9001
Bugfix in pyLOM/NN/__init__.py
Jul 7, 2025
4ff3028
Added Examples/NN/example_NN_Dataset.py
Jul 7, 2025
a1447ee
Bugfix in Converters/DLR2h5.py
Jul 7, 2025
d1e075c
Changes in example_GNS_DLR_airfoil.py
Jul 7, 2025
3c2e79c
added documentation for NN module
Jul 7, 2025
af25eaf
moved gns utils to NN/gns/
Jul 7, 2025
f23c10a
Added method _init_subgraph_loader in GNS
Jul 7, 2025
3ff16c3
fixed __init__ files in NN
Jul 7, 2025
eb22373
Minor improvemets in gns
Jul 7, 2025
342f4e6
renamed node_features, edge_features, node_labels to x, edge_attr, y
Jul 7, 2025
10f21fe
Bugfix
Jul 7, 2025
8564d32
Fixed import errors
Jul 7, 2025
8ede77b
Refactorization of GNS
Jul 8, 2025
638c228
Bugfixes GNS
Jul 8, 2025
646de94
Bugfix in example GNS
Jul 9, 2025
e2b911e
Moved utils from init to optuna_utils
Jul 9, 2025
cf15dd0
Small refactorization of GNS
Jul 9, 2025
4979d6c
aesthetic changes in GNS
Jul 9, 2025
d2742bf
Added yaml configs
Jul 15, 2025
65c34d9
working on configs
Jul 16, 2025
45c20b7
Added GNS documentation
Jul 16, 2025
4928a33
deleted from_config_yaml
Jul 16, 2025
533e044
Merge branch 'develop' of https://github.com/ArnauMiro/pyLowOrder int…
Jul 23, 2025
cc31050
working on api
Jul 24, 2025
982a9fd
update in gns_config.yaml
Jul 24, 2025
8a0727c
API 2.0
Jul 24, 2025
6edb6c8
fixing example_mesh.py
Jul 25, 2025
d87a96a
Working on pr changes
Jul 25, 2025
6b06ba9
working on pr
Jul 27, 2025
30c3a2d
deleted example_GNS_DLR_airfoil_old.py
Jul 27, 2025
5153829
added optuna to example
Jul 27, 2025
d8e3e39
moved gns_config.yml 1 level up
Jul 27, 2025
d9c56f7
bugfix
Jul 27, 2025
53a68d0
created config_loader_factory.py
Jul 28, 2025
5d7ebae
finished config_loader_factory
Jul 29, 2025
5cd95d3
modified gns_config.yaml
Jul 29, 2025
410674d
deleted experiments.py
Jul 29, 2025
8acee04
bugfix
Jul 29, 2025
29c6721
updated config_loader_factory
Jul 29, 2025
5ac749e
improvements in examle GNS
Jul 29, 2025
fb3badc
bugfix
Jul 29, 2025
3f76ec7
bugfix
Jul 29, 2025
216e195
moved config_loader_factory to NN
Jul 29, 2025
c338b26
bugfixes
Jul 29, 2025
219fa95
bugfix
Jul 29, 2025
1603450
Added optuna reproducibility
Jul 31, 2025
04e2631
finished API
Jul 31, 2025
60632bf
Reproducibility
Aug 1, 2025
5c7026a
minor fixes
Aug 1, 2025
876e3e7
improvements in api
Aug 2, 2025
0bd6014
minor fixes
Aug 2, 2025
59adfba
created config_serialization.py
Aug 3, 2025
da420a6
added generators and worker_init_fn
Aug 3, 2025
3e9233e
added git commit metadata
Aug 3, 2025
a292d49
minor fixes
Aug 3, 2025
87cd061
improved format
Aug 3, 2025
a2bade1
deleted config_manager.py
Aug 3, 2025
6994a64
bugfix
Aug 3, 2025
418fe57
added load_yaml
Aug 3, 2025
f7660ac
bugfix
Aug 3, 2025
f351f24
improved wrapper
Aug 3, 2025
00279f2
bugfixes
Aug 3, 2025
e95f5fa
restored init_subgraph_loader
Aug 3, 2025
7491785
bugfixes
Aug 3, 2025
c8953c0
created graph dataloader
Aug 4, 2025
8e15c14
bugfix
Aug 4, 2025
ce6dfad
bugfix
Aug 4, 2025
498929c
deleted test files
Aug 4, 2025
5e51b33
Debugging
Aug 4, 2025
6dfb017
created experiment.py
Aug 4, 2025
bc90927
Integrated ManualNeighborDataset
Aug 4, 2025
cf489bd
Working on configs
Aug 4, 2025
8129625
Deleted ManualNeighborDataset
Aug 5, 2025
b4e86b0
improving contracts
Aug 5, 2025
121eaa2
added AppConfig
Aug 7, 2025
fa93e3c
refactored config_loader.py
Aug 7, 2025
c191e0f
bugfix
Aug 7, 2025
a7d0fe2
debugging
Aug 8, 2025
593b86c
created auto-instantiate method
Aug 10, 2025
f0b7dad
Added fingerprint to graph
Aug 10, 2025
0b9c862
improved sae/load
Aug 10, 2025
3727fd5
fixes in optuna function
Aug 10, 2025
6ae76b6
bugfixes in gns
Aug 10, 2025
46b0f83
improvements in gns
Aug 10, 2025
9d3c143
Added instantiate_from_config to resolvers
Aug 10, 2025
222f095
bugfix
Aug 10, 2025
64f19da
minor fixes
Aug 10, 2025
9c49ed7
minor fixes
Aug 10, 2025
3eb7097
deleted legacy files
Aug 10, 2025
84c1416
bugfixes
Aug 10, 2025
4980f87
bugfixes
Aug 10, 2025
d7c442c
minor improvements
Aug 11, 2025
72f6d1e
bugfix
Aug 11, 2025
1e7bb6e
minor fixes
Aug 11, 2025
7c648f1
minor fixes
Aug 11, 2025
5871df7
delete TargetShapeWrapper
Aug 11, 2025
c946edb
improved get_git_commit func
Aug 11, 2025
0827701
added test_gns_repro.py
Aug 25, 2025
ccdf407
debugging scalers
Sep 5, 2025
d035a51
print debugging
Sep 5, 2025
fd07c77
Added blocks to MinMaxScaler
Sep 5, 2025
5d50e4b
improvements in scalers and dataset
Sep 9, 2025
148 changes: 148 additions & 0 deletions Converters/DLR2h5.py
@@ -0,0 +1,148 @@
#!/bin/env python
#
# Conversion from NLR7301 dataset to
# pyLOM v3.0 format
#
# 27/09/2024
import os
import glob
import numpy as np
import netCDF4 as NC4
import matplotlib.pyplot as plt
import torch
from tqdm import tqdm

import pyLOM
from pyLOM.NN import Graph


def process_edge_vectors(edge_index: np.ndarray, xyz: np.ndarray) -> np.ndarray:
"""
Process edge vectors from Cartesian to polar coordinates.
"""
edge_vecs = np.zeros((edge_index.shape[1], 2), dtype=np.float64)
for i, edge in enumerate(edge_index.T):
c_i = xyz[edge[0]]
c_j = xyz[edge[1]]
d_ij = c_j - c_i
edge_vecs[i, :] = [np.linalg.norm(d_ij), np.arctan2(d_ij[1], d_ij[0])]
return edge_vecs


def convert_dataset(dset: str, datapath: str, npoints: int, ptable: pyLOM.PartitionTable) -> pyLOM.Dataset:
    """Convert one NLR7301 split into a pyLOM Dataset, save it as HDF5 and return it."""
print(f"[{dset.upper()}] Processing dataset")
folder = os.path.join(datapath, dset)
filelist = glob.glob(os.path.join(folder, '*'))

Mvec = [float(f.split('_')[-2][1:]) for f in filelist]
AoAvec = [float(f.split('_')[-1][3:]) for f in filelist]
case = [int(f.split('_')[-3][4:]) for f in filelist]

xyz = np.zeros((npoints, 2), np.double)
X = np.zeros((npoints, len(Mvec)), np.double)

for ii, (c, M, AoA) in enumerate(tqdm(zip(case, Mvec, AoAvec), total=len(Mvec), desc=f"[{dset}]")):
fname = os.path.join(folder, f"Snap_Case{c:04d}_M{M:.5f}_AoA{AoA:.5f}")
with NC4.Dataset(fname) as ncfile:
xyz[:, 0] = ncfile.variables['x'][:npoints]
xyz[:, 1] = ncfile.variables['z'][:npoints]
X[:, ii] = ncfile.variables['cp'][:npoints]

d = pyLOM.Dataset(
xyz=xyz,
ptable=ptable,
order=np.arange(npoints),
point=True,
vars={
'Mach': {'idim': 0, 'value': np.array(Mvec)},
'AoA': {'idim': 0, 'value': np.array(AoAvec)},
},
CP={'ndim': 1, 'value': X},
)
out_path = os.path.join(datapath, f"{dset.upper()}_converter.h5")
d.save(out_path, append=False)
print(f"[{dset.upper()}] Saved to {out_path}\n")
return d


def create_graph(datapath: str, npoints: int) -> Graph:
    """Build the NLR7301 surface graph (coordinates, normals, polar edge vectors) from a sample snapshot."""
sample_file = os.path.join(datapath, 'train', 'Snap_Case0000_M0.52500_AoA-0.33333')
with NC4.Dataset(sample_file) as ncfile:
xyz = np.zeros((npoints, 2), np.double)
xyz[:, 0] = ncfile.variables['x'][:npoints]
xyz[:, 1] = ncfile.variables['z'][:npoints]

normals = np.load(os.path.join(datapath, "normals.npz"))['normals']
wall_normals = np.load(os.path.join(datapath, "faceNormals.npz"))['faceNormals']
edge_index = torch.tensor(np.load(os.path.join(datapath, "edgesCOO.npz"))['edgesCOO'], dtype=torch.long)

x_dict = {
'xyz': torch.tensor(xyz, dtype=torch.float),
'normals': torch.tensor(normals, dtype=torch.float),
}
edge_vecs = process_edge_vectors(edge_index.numpy(), xyz)
edge_attr_dict = {
'edge_vecs': torch.tensor(edge_vecs, dtype=torch.float),
'wall_normals': torch.tensor(wall_normals, dtype=torch.float),
}

g = Graph(edge_index=edge_index, x_dict=x_dict, edge_attr_dict=edge_attr_dict)
print("Graph created.")
return g

def plot_graph_cp(x: np.ndarray, z: np.ndarray, normals: np.ndarray, cp: np.ndarray, mach: float, aoa: float, savepath: str) -> None:
    """ Plot the NLR7301 airfoil, along with surface normals and the CP distribution for a given Mach and AoA.
    Args:
        x (np.ndarray): X-coordinates of the airfoil.
        z (np.ndarray): Z-coordinates of the airfoil.
        normals (np.ndarray): Surface normals at each point.
        cp (np.ndarray): CP distribution at each point.
        mach (float): Mach number for the case.
        aoa (float): Angle of attack for the case.
        savepath (str): Directory where the figure is saved.
    """
plt.style.use('seaborn-v0_8-darkgrid')
plt.rcParams.update({'font.size': 14})
plt.figure(figsize=(10, 6))
plt.scatter(x, z, s=10)
plt.plot(x, cp, label='CP', color='blue')
plt.quiver(x, z, normals[:, 0], normals[:, 1], color='red', scale=10, label='Normals')
plt.title(f'CP Distribution (Mach: {mach}, AoA: {aoa})')
plt.xlabel('X')
plt.ylabel('Z')
plt.legend()
plt.grid()
plt.savefig(os.path.join(savepath,f"cp_plot_mach{mach}_aoa{aoa}.png"), dpi=300, bbox_inches='tight')
plt.show(block=True)


def main():
datapath = "/home/p.yeste/CETACEO_DATA/nlr7301/"
datasets = ['test', 'train', 'val']
npoints = 597
ptable = pyLOM.PartitionTable.new(1, npoints, npoints)

for dset in datasets:
saved_dset = convert_dataset(dset, datapath, npoints, ptable)

g = create_graph(datapath, npoints)
for dset in datasets:
path = os.path.join(datapath, f"{dset.upper()}_converter.h5")
print(f"Appending graph to {path}")
g.save(path, mode='a')

    # Select a snapshot from the last converted split ('val') and plot the airfoil with its CP distribution
i_case = 0 # First case for demonstration
xyz = g.xyz
x = xyz[:, 0]
z = xyz[:, 1]
normals = g.normals
cp = saved_dset.fields['CP']['value'][:, i_case].flatten()
mach = saved_dset.get_variable('Mach')[i_case]
aoa = saved_dset.get_variable('AoA')[i_case]

plot_graph_cp(x, z, normals, cp, mach, aoa, savepath=datapath)



if __name__ == "__main__":
main()
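
Note on the edge features: process_edge_vectors above loops over every edge in Python, which is fine for the 597-point NLR7301 surface but scales poorly on larger meshes. The same polar (length, angle) features can be computed for all edges at once with NumPy. The sketch below is a minimal, self-contained illustration of that vectorized variant; the name process_edge_vectors_vectorized is not part of the converter, which keeps the explicit loop.

import numpy as np

def process_edge_vectors_vectorized(edge_index: np.ndarray, xyz: np.ndarray) -> np.ndarray:
    """Vectorized equivalent of process_edge_vectors: returns a (num_edges, 2) array of [length, angle]."""
    # edge_index is in COO layout: row 0 holds source nodes, row 1 holds target nodes
    d = xyz[edge_index[1]] - xyz[edge_index[0]]   # displacement c_j - c_i for every edge at once
    length = np.linalg.norm(d, axis=1)            # Euclidean edge length
    angle = np.arctan2(d[:, 1], d[:, 0])          # angle of the edge w.r.t. the x-axis
    return np.stack([length, angle], axis=1)

if __name__ == "__main__":
    # Tiny 3-node check: edges 0 -> 1 and 1 -> 2
    xyz = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
    edge_index = np.array([[0, 1], [1, 2]])
    print(process_edge_vectors_vectorized(edge_index, xyz))  # [[1, 0], [1, pi/2]]

For the same edge_index and xyz arrays this yields the same values as the loop, up to floating-point dtype.
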
118 changes: 118 additions & 0 deletions Examples/NN/configs/gns_config.yaml
@@ -0,0 +1,118 @@
# config.yaml — pyLOM experiment configuration (DTO-friendly)

experiment:
name: gns_nlr7301_baseline
description: "GNS model on NLR7301 airfoil dataset as described by ..."
version: 1.0
tags: [GNS, airfoil, baseline]
results_path: "../CETACEO_RESULTS/nlr7301/"
mode: optuna

datasets:
train_ds: "../CETACEO_DATA/nlr7301/TRAIN_converter.h5"
val_ds: "../CETACEO_DATA/nlr7301/VAL_converter.h5"
test_ds: "../CETACEO_DATA/nlr7301/TEST_converter.h5"

model:
# Kept outside the DTO on purpose (provenance, not a hyperparameter)
graph_path: "../CETACEO_DATA/nlr7301/TRAIN_converter.h5"
config:
# --- GNSModelConfig (pure DTO) ---
input_dim: 2
output_dim: 1
latent_dim: 16
hidden_size: 256
num_msg_passing_layers: 1
encoder_hidden_layers: 6
decoder_hidden_layers: 1
message_hidden_layers: 2
update_hidden_layers: 2
groupnorm_groups: 2
activation: "torch.nn.ELU"
p_dropout: 0.0
seed: 42
device: "cuda"

training:
# --- GNSTrainingConfig (pure DTO) ---
epochs: 1000
lr: 6.5e-4
lr_gamma: 0.9954
lr_scheduler_step: 1
optimizer: "torch.optim.Adam"
scheduler: "torch.optim.lr_scheduler.StepLR"
loss_fn: "torch.nn.MSELoss"
print_every: 25

dataloader:
batch_size: 15
shuffle: true
num_workers: 4
pin_memory: true

subgraph_loader:
batch_size: 32
shuffle: true
input_nodes: null

optuna:
study:
n_trials: 100
direction: minimize
pruner:
type: optuna.pruners.MedianPruner
n_startup_trials: 15
n_warmup_steps: 100
sampler:
type: optuna.samplers.TPESampler
multivariate: true
seed: 42

# Search space (NOT DTOs) — these stay as distributions for sampling.
optimization_params:
model:
graph_path: "../CETACEO_DATA/nlr7301/TRAIN_converter.h5"
config:
# Everything here is a search space for sampling -> returns a dict that
# we feed to dacite.from_dict(GNSModelConfig, ...)
input_dim: 2
output_dim: 1
latent_dim: { type: int, low: 8, high: 64, step: 2 }
hidden_size: { type: int, low: 64, high: 512 }
num_msg_passing_layers: { type: int, low: 1, high: 5 }
encoder_hidden_layers: { type: int, low: 1, high: 4 }
decoder_hidden_layers: { type: int, low: 1, high: 4 }
message_hidden_layers: { type: int, low: 1, high: 4 }
update_hidden_layers: { type: int, low: 1, high: 4 }
groupnorm_groups: 2
activation:
type: categorical
choices:
- "torch.nn.ELU"
- "torch.nn.ReLU"
- "torch.nn.LeakyReLU"
p_dropout: { type: float, low: 0.0, high: 0.5 }
seed: 42
device: "cuda"

training:
# Search space for GNSTrainingConfig (NOT a DTO here)
loss_fn: "torch.nn.MSELoss"
optimizer: "torch.optim.Adam"
scheduler: "torch.optim.lr_scheduler.StepLR"
epochs: 1000
lr: { type: float, low: 1.0e-5, high: 1.0e-3, log: true }
lr_gamma: { type: float, low: 0.9, high: 0.999 }
lr_scheduler_step: { type: int, low: 1, high: 10 }
print_every: null

dataloader:
batch_size: { type: int, low: 8, high: 64 }
shuffle: true
num_workers: 4
pin_memory: true

subgraph_loader:
batch_size: { type: int, low: 1, high: 597 }
shuffle: true
input_nodes: null
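
The comments in optimization_params state that the sampled search space is returned as a plain dict and then fed to dacite.from_dict(GNSModelConfig, ...). As a rough illustration of that flow, and not the actual pyLOM config loader, the sketch below resolves a handful of the model.config entries with Optuna and builds a dataclass from the result. GNSModelConfigSketch, suggest_value and the dummy objective are hypothetical names, and only a subset of the real GNSModelConfig fields is shown.

from dataclasses import dataclass
import dacite
import optuna

@dataclass
class GNSModelConfigSketch:
    # Illustrative subset of the fields listed under model.config in gns_config.yaml
    input_dim: int
    output_dim: int
    latent_dim: int
    hidden_size: int
    p_dropout: float
    activation: str
    seed: int
    device: str

def suggest_value(trial: optuna.Trial, name: str, spec):
    """Resolve one YAML search-space entry; plain values pass through unchanged."""
    if not isinstance(spec, dict) or "type" not in spec:
        return spec
    if spec["type"] == "int":
        return trial.suggest_int(name, spec["low"], spec["high"], step=spec.get("step", 1))
    if spec["type"] == "float":
        return trial.suggest_float(name, spec["low"], spec["high"], log=spec.get("log", False))
    if spec["type"] == "categorical":
        return trial.suggest_categorical(name, spec["choices"])
    raise ValueError(f"Unknown search-space type: {spec['type']}")

def objective(trial: optuna.Trial) -> float:
    # Search space mirroring a few entries of optimization_params.model.config
    space = {
        "input_dim": 2,
        "output_dim": 1,
        "latent_dim": {"type": "int", "low": 8, "high": 64, "step": 2},
        "hidden_size": {"type": "int", "low": 64, "high": 512},
        "p_dropout": {"type": "float", "low": 0.0, "high": 0.5},
        "activation": {"type": "categorical",
                       "choices": ["torch.nn.ELU", "torch.nn.ReLU", "torch.nn.LeakyReLU"]},
        "seed": 42,
        "device": "cuda",
    }
    sampled = {k: suggest_value(trial, k, v) for k, v in space.items()}
    cfg = dacite.from_dict(data_class=GNSModelConfigSketch, data=sampled)
    # A real objective would build and train the GNS model from cfg and return the
    # validation loss; the dummy score below only makes the sketch run end to end.
    return float(cfg.latent_dim) / cfg.hidden_size

study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(multivariate=True, seed=42))
study.optimize(objective, n_trials=5)

The study settings above mirror the optuna.study and sampler blocks of the config (direction, TPESampler with multivariate=True and a fixed seed), scaled down to a few trials.
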