
Commit 636a6b5

Authored by jlsmith-hep, James, and vvolkl
Baler 2025 GSoC Project (#1684)
* Baler 2025 proposal
* Make compatible with other manchester merge
* Update with Leonid's additions
* Fix orgs

Co-authored-by: James <jamse.smith-7@manchester.ac.uk>
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
1 parent 3f1b538 commit 636a6b5

File tree: 4 files changed, +83 −0 lines changed

.github/config/mdcheck.json

Lines changed: 3 additions & 0 deletions

@@ -1,5 +1,8 @@
 {
   "ignorePatterns": [
+    {
+      "pattern": "https://www.sciencedirect.com/science/article/abs/pii*"
+    },
     {
       "pattern": "/assets/CWP-Charge-HSF.pdf"
     },

_gsocorgs/2025/baler.md

Lines changed: 11 additions & 0 deletions (new file)

---
title: "BALER"
author: "James Smith"
layout: default
organization: baler
logo: baler-logo.png
description: |
  BALER is a compression tool under development at the particle physics division of the University of Manchester. BALER uses autoencoders and other neural networks to perform lossy, machine-learning-based compression of multi-dimensional data and to evaluate the accuracy of the dataset after compression. BALER is led by a collaboration of early-career scientists and welcomes all contributions.
---

{% include gsoc_proposal.ext %}

_gsocproposals/2025/proposal_BALER.md

Lines changed: 69 additions & 0 deletions (new file)

---
title: The rise of the machine (learning) in data compression for high energy physics and beyond
layout: gsoc_proposal
project: BALER
year: 2025
organization:
  - baler
  - UManchester
  - CERN
difficulty: medium
duration: 350
mentor_avail: June-August
---
## Short description of the project

The Large Hadron Collider (LHC) hosts multiple large-scale experiments such as ATLAS, ALICE, LHCb, and CMS. Together these experiments produce roughly 1 Petabyte of data per second, but bandwidth and storage limitations force them to keep only the most interesting data and discard the rest; the final data stored on disk amounts to roughly 1 Petabyte per day [[1](https://home.cern/news/news/computing/cern-data-centre-passes-200-petabyte-milestone)]. Despite such steep data reduction, the upgraded High Luminosity LHC will produce 10 times more particle collisions from 2029 onwards. This greatly improves the potential to discover new physics, but poses a challenge for both data processing and data storage, as the resources needed are expected to exceed the projected available resources by factors of roughly 3 and 5, respectively [[2](https://cerncourier.com/a/time-to-adapt-for-big-data/)][[3](https://doi.org/10.1051/epjconf/202024504035)].
Data compression would be the go-to solution to this issue, but the general-purpose formats used for big data, and the ROOT format used at the LHC, are already highly compressed, meaning the data shrinks little under standard lossless methods such as zip [[4](https://www.sciencedirect.com/science/article/abs/pii/S016890029700048X)]. However, since the observables in these experiments benefit from more events and higher statistics, lossy compression is a good alternative: some accuracy is lost, but the extra capacity allows more data to be stored, which increases the statistical precision of the final analysis.
BALER is a compression tool under development at the particle physics division of the University of Manchester. BALER uses autoencoders and other neural networks to perform lossy, machine-learning-based compression of multi-dimensional data and to evaluate the accuracy of the dataset after compression.
Since data storage is a problem in many fields of science and industry, BALER aims to be an open-source tool that can support data formats from vastly different fields of science, for example catalog data in astronomy and time-series data in computational fluid dynamics.
This project aims to improve the machine learning models in BALER, optimizing their performance for LHC data and evaluating that performance in real LHC analyses.
## Task ideas

This internship can cover a range of work packages, and the project can be tailored to the intern. Possible projects include:
* New autoencoder models could be developed that better identify correlations between data objects in a given particle physics dataset entry (an event, typically containing thousands of objects and around 1 MB each). New models could also improve performance on live / unseen data. These could include transformer, GNN, probabilistic, and other types of networks.
* Existing models could be deployed on an FPGA, potentially reducing latency and power consumption significantly and opening the possibility of live compression before data is transmitted over a network.
* BALER could also be integrated into standard research data storage formats and programs used by hundreds of thousands of physics researchers (ROOT).
* Finally, the compression could be applied to particle physics datasets, and its effect on the physics discovery sensitivity of an analysis could be assessed and compared against the sensitivity gained from the additional data bandwidth.
Ideas from the intern are also welcome.
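As a sketch of the last task idea, one way to quantify how compression distorts a physics observable is to histogram it before and after a lossy round-trip and compute a chi-square-like distance. Everything here is assumed for illustration: the Gaussian "Z peak" is synthetic, and `float16` truncation stands in for a trained BALER model:

```python
import numpy as np

rng = np.random.default_rng(0)
mass = rng.normal(91.2, 2.5, size=100_000)  # toy "Z peak" observable, in GeV

# Crude lossy round-trip: halve the precision, then restore the dtype.
compressed = mass.astype(np.float16)
decompressed = compressed.astype(np.float64)

bins = np.linspace(80, 100, 41)
h_before, _ = np.histogram(mass, bins=bins)
h_after, _ = np.histogram(decompressed, bins=bins)

# Chi-square-like distance between the two histograms: small values mean
# the observable's distribution is well preserved by the compression.
nonzero = h_before > 0
chi2 = np.sum((h_after[nonzero] - h_before[nonzero]) ** 2 / h_before[nonzero])
print(f"chi2 over {nonzero.sum()} bins: {chi2:.1f}")
```

A real study would replace the truncation with the trained model and propagate the distorted observable through an actual analysis selection to measure the change in sensitivity.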
## Expected results

Improved compression performance, with documentation and figures of merit that may include:

* Plots made in matplotlib that compare the performance of the new models against the old ones
* Documentation of the design choices made for the improved models
* A documented evaluation of a physics analysis on data before and after compression
## Requirements

The candidate should have experience with the Python language and a Linux environment, plus familiarity with AI fundamentals and PyTorch.

Desirable skills include familiarity with transformers and/or graph neural networks, particle physics theory and experiments, and FPGA programming and/or simulation.
## Mentors

* ***[James Smith](mailto:james.smith-7@manchester.ac.uk)***
* [Caterina Doglioni](mailto:caterina.doglioni@cern.ch) as backup mentor
* [Leonid Didukh](mailto:ledidukh@gmail.com)
## Links

* [BALER GitHub](https://github.com/baler-collaboration/baler)
* [BALER Paper](https://arxiv.org/abs/2305.02283)
* Previous work:
  * [Thesis by Eric Wulff, Lund University](https://lup.lub.lu.se/student-papers/search/publication/9004751)
  * [Thesis by Erik Wallin, Lund University](https://lup.lub.lu.se/student-papers/search/publication/9012882)
  * [GSoC 2020 project: Medium post by Honey Gupta](https://medium.com/@hn.gpt1/deep-compression-for-high-energy-physics-data-google-summer-of-code20-3dea5acc7bcf)
  * [GSoC 2021 project: Zenodo entry by George Dialektakis](https://zenodo.org/record/5482611#.Y-I28S2l3fa)
* [ROOT](https://root.cern/)
* [Jupyter](http://jupyter.org)
* [PyTorch](http://pytorch.org)

images/baler-logo.png (475 KB, binary image added)