Merge branch 'master' into pyup-update-dill-0.2.7.1-to-0.2.8.2
Showing 49 changed files with 2,203 additions and 453 deletions.
@@ -1,19 +1,32 @@
## Contribution guidelines

You're welcome to contribute to strax!
Currently it might be a little difficult since not even the core features are implemented
and there is no documentation to speak of. For the coming weeks we're probably not even following our own advice.

- Please work in a fork, then submit pull requests.
- Unless you have a good reason (and access) to make a branch.
- Do not commit large (> 100 kB) files.
- For example, do not commit jupyter notebooks with high-resolution plots (clear the output first), or long configuration files, or binary test data.
- We'd like to keep the repository no more than a few MB.
While it's possible to rewrite history to remove large files, this is a bit of work and messes with the repository's consistency.
Once data has gone to master it's especially difficult; there's a risk of others merging the files back in later unless they cooperate in the history-rewriting.
- This is one reason to prefer forks over branches; if you commit a huge file by mistake it's just in your fork.
- Of course, please write nice and clean code :-)
- PEP8-compatibility is great (you can test with flake8) but not as important as other good coding habits such as avoiding duplication. See e.g. the [famous beyond PEP8 talk](https://www.youtube.com/watch?v=wf-BqAjZb8M).
- In particular, don't go into code someone else is maintaining to "PEP8-ify" it (or worse, use some automatic styling tool).
- Other style guidelines (docstrings etc.) are yet to be determined.
- When accepting pull requests, prefer merge or squash depending on how the commit history looks.
- If it's dozens of 'oops' and 'test' commits, best to squash.
- If it's a few commits that mostly outline discrete steps of an implementation, it's worth keeping, so best to merge.

Currently many features are still in significant flux, and the documentation is still very basic. Until more people start getting involved in development, we're probably not even following our own advice below...

### Please fork
Please work in a fork, then submit pull requests.
Only maintainers sometimes work in branches, if there is a good reason for it.

### No large files
Avoid committing large (> 100 kB) files. We'd like to keep the repository no more than a few MB.

For example, do not commit jupyter notebooks with high-resolution plots (clear the output first), or long configuration files, or binary test data.

While it's possible to rewrite history to remove large files, this is a bit of work and messes with the repository's consistency. Once data has gone to master it's especially difficult; there's a risk of others merging the files back in later unless they cooperate in the history-rewriting.

This is one reason to prefer forks over branches; if you commit a huge file by mistake it's just in your fork.

### Code style
Of course, please write nice and clean code :-)

PEP8-compatibility is great (you can test with flake8) but not as important as other good coding habits such as avoiding duplication. See e.g. the [famous beyond PEP8 talk](https://www.youtube.com/watch?v=wf-BqAjZb8M).

In particular, don't go into code someone else is maintaining to "PEP8-ify" it (or worse, use some automatic styling tool).

Other style guidelines (docstrings etc.) are yet to be determined.

### Pull requests
When accepting pull requests, prefer merge or squash depending on how the commit history looks.
- If it's dozens of 'oops' and 'test' commits, best to squash.
- If it's a few commits that mostly outline discrete steps of an implementation, it's worth keeping, so best to merge.
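The squash-versus-merge choice can be tried out in a throwaway repository. This sketch (branch, file, and commit names are made up for illustration) shows how a squash merge collapses a messy 'oops'/'test' history into a single commit on the base branch:

```shell
# Build a throwaway repo with a messy feature branch (all names illustrative).
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email dev@example.com
git config user.name dev
base=$(git symbolic-ref --short HEAD)   # 'master' or 'main', depending on git version

echo a > file.txt && git add file.txt && git commit -qm "initial"
git checkout -qb feature
echo b >> file.txt && git commit -qam "oops"
echo c >> file.txt && git commit -qam "test"
git checkout -q "$base"

# Squash merge: stages the combined change, then one commit records it,
# so the 'oops' and 'test' commits never enter the base branch's history.
git merge -q --squash feature
git commit -qm "Add feature (squashed)"
git rev-list --count HEAD   # 2 commits total: initial + the squashed one
```

A regular `git merge feature` at the same point would instead keep all three commits reachable from the base branch, which is the right choice when each commit documents a discrete implementation step.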
@@ -1,13 +1,22 @@
0.2.0
------
- Start documentation
- `ParallelSourcePlugin` to better distribute low-level processing over multiple cores
- `OverlapWindowPlugin` to simplify algorithms that look back and ahead in the data
- Run-dependent config defaults
- XENON: Position reconstruction (tensorflow NN) and corrections

0.1.2 / 2018-05-09
------------------
- Failed to make last patch release.

0.1.1 / 2018-05-09
------------------
- Bug fix of not shipping all subpackages, thereby numba could not find code #19
- Autodeploy from Travis to PyPI works
- Badges
- `#19`: list subpackages in setup.py, so numba can find cached code
- Autodeploy from Travis to PyPI
- README badges

0.1.0 / 2018-05-06
------------------
- initial release
- Initial release
@@ -1,60 +1,14 @@
# strax
Streaming analysis for Xenon experiments
Streaming analysis for xenon experiments

[![Build Status](https://travis-ci.org/AxFoundation/strax.svg?branch=master)](https://travis-ci.org/AxFoundation/strax)
[![Readthedocs Badge](https://readthedocs.org/projects/strax/badge/?version=latest)](https://strax.readthedocs.io/en/latest/?badge=latest)
[![Coverage Status](https://coveralls.io/repos/github/AxFoundation/strax/badge.svg?branch=master)](https://coveralls.io/github/AxFoundation/strax?branch=master)
[![PyPI version shields.io](https://img.shields.io/pypi/v/strax.svg)](https://pypi.python.org/pypi/strax/)
[![Join the chat at https://gitter.im/AxFoundation/strax](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/AxFoundation/strax?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/cc159474f2764d43b445d562a24ca245)](https://www.codacy.com/app/tunnell/strax?utm_source=github.com&utm_medium=referral&utm_content=AxFoundation/strax&utm_campaign=Badge_Grade)

Strax is an analysis framework for pulse-only digitization data,
specialized for live data reduction at speeds of 50-100 MB(raw) / core / sec.

For comparison, this is more than 100x faster than the XENON1T processor [pax](http://github.com/XENON1T/pax),
and does not require a preprocessing stage ('eventbuilder').
It achieves this by using [numpy](https://docs.scipy.org/doc/numpy/) [structured arrays](https://docs.scipy.org/doc/numpy/user/basics.rec.html) internally,
which are supported by the amazing just-in-time compiler [numba](http://numba.pydata.org/).
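As a minimal illustration of why structured arrays enable this (the dtype below is invented for the example, not strax's actual record format): each pulse is a fixed-size record, so every column is a plain contiguous numpy array that compiled code such as numba loops can process without per-row Python objects.

```python
import numpy as np

# One row per pulse; field names and types are illustrative only.
pulse_dtype = np.dtype([
    ('channel', np.int16),   # digitizer channel number
    ('time', np.int64),      # pulse start time in ns
    ('length', np.int32),    # number of samples in the pulse
    ('area', np.float32),    # integrated pulse area
])

pulses = np.zeros(3, dtype=pulse_dtype)
pulses['channel'] = [0, 1, 1]
pulses['area'] = [10.5, 2.0, 7.25]

# Column access is a vectorized numpy operation; a numba @njit loop over
# such an array compiles to machine code with no Python-object overhead.
total_area = pulses['area'].sum()
print(total_area)  # 19.75
```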

Features:
* Start from unordered streams of pulses (like pax's [trigger](https://xe1t-wiki.lngs.infn.it/doku.php?id=xenon:xenon1t:aalbers:trigger_upgrade))
* Output to files or MongoDB
* Plugin system for extensibility
* Each plugin produces a dataframe
* Dependencies and configuration tracked explicitly
* Limited "Event class" emulation for code that needs it
* Processing algorithms: hitfinding, sum waveform, clustering, classification, event building

Strax is initially developed for the XENONnT experiment. However, the configuration
and specific algorithms for XENONnT will ultimately be hosted in a separate repository.

### Documentation

Documentation is under construction. For the moment, you might find these useful:
* [Tutorial notebook](https://www.github.com/AxFoundation/strax/blob/master/notebooks/Strax%20demo.ipynb)
* [Introductory talk](https://docs.google.com/presentation/d/1qZmbAKJmzn7iTbBbkzhTvHmiBqdbYyxhgheRRrDhTeY) (aimed at XENON1T analysis/DAQ experts)
* Function reference (TODO readthedocs)

### Installation
To install the latest stable version (from pypi), run `pip install strax`.
Dependencies should install automatically:
numpy, pandas, numba, two compression libraries (blosc and zstd),
and a few miscellaneous pure-python packages.

You can also clone the repository, then set up a developer installation with `python setup.py develop`.

If you experience problems during installation, try installing
exactly the same version of the dependencies as used on the Travis build test server.
Clone the repository, then do `pip install -r requirements.txt`.

#### Test data

The provided demonstration notebooks require test data that is not included in the repository.
Eventually we will provide simulated data for this purpose.
For now, XENON collaboration members can find test data at:
* [Processed only](https://xe1t-wiki.lngs.infn.it/lib/exe/fetch.php?media=xenon:xenon1t:aalbers:processed.zip) (for strax demo notebook)
* Raw (for fake_daq.py and eb.py) at midway: `/scratch/midway2/aalbers/test_input_data.zip`

To use these, unzip them in the same directory as the notebooks. The 'processed' zipfile will just make one directory with a single zipfile inside.

Strax is an analysis framework for pulse-only digitization data, specialized for live data reduction at speeds of 50-100 MB(raw) / core / sec.
Its primary aim is to support noble liquid TPC dark matter searches, such as XENONnT.

For more information, please see the [strax documentation](https://strax.readthedocs.io).
@@ -0,0 +1,6 @@
#!/usr/bin/env bash
# Rebuild the Sphinx documentation from scratch.
make clean
# Regenerate the API reference pages from the strax package.
rm -r source/reference
sphinx-apidoc -o source/reference ../strax
# Drop the auto-generated modules index, which we don't use.
rm source/reference/modules.rst
make html