AEStream sends event-based data from A to B. It is both a command-line tool and a C++/Python library with built-in GPU acceleration for use with PyTorch and Jax. We support reading and writing from files, event cameras, network protocols, and visualization tools.
Read more about the inner workings of the library in the AEStream publication.
Read more in our installation guide
The fastest way to install AEStream is by using pip:

```bash
pip install aestream
```
| Source | Installation | Description |
|---|---|---|
| pip | `pip install aestream`<br/>`pip install aestream --no-binary aestream` | Standard installation<br/>Support for event cameras and CUDA kernels (more info) |
| nix | `nix run github:aestream/aestream`<br/>`nix develop github:aestream/aestream` | Command-line interface<br/>Python environment |
| docker | See Installation documentation | |
Contributions to support AEStream on additional platforms are always welcome.
Read more in our Python usage guide
AEStream can process `.csv`, `.dat`, `.evt3`, and `.aedat4` files like so.
You can either load the file directly into memory

```python
from aestream import FileInput

FileInput("file.aedat4", (640, 480)).load()
```
or stream the file in real-time to PyTorch, Jax, or Numpy

```python
with FileInput("file.aedat4", (640, 480)) as stream:
    while True:
        frame = stream.read("torch")  # Or "jax" or "numpy"
        ...
```
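As a toy illustration of what to do with the streamed frames, per-pixel event counts can be accumulated across reads. The sketch below stands in random sparse Numpy arrays for the frames that `stream.read("numpy")` would return, so it runs without a recording; the accumulation logic itself is plain Numpy and independent of AEStream.

```python
import numpy as np

# Stand-in for frames returned by stream.read("numpy"); each frame
# marks the pixels that received events since the last read.
rng = np.random.default_rng(0)
frames = [(rng.random((640, 480)) > 0.99).astype(np.float32) for _ in range(10)]

# Accumulate per-pixel event counts across all frames.
counts = np.zeros((640, 480), dtype=np.float32)
for frame in frames:
    counts += frame
```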
Streaming data is particularly useful in real-time scenarios. We currently support Inivation, Prophesee, and SynSense devices over USB, as well as the SPIF protocol over UDP. Note that this requires a local installation of drivers and/or SDKs (see the installation guide).
```python
# Stream events from a DVS camera over USB
with USBInput((640, 480)) as stream:
    while True:
        frame = stream.read()  # A (640, 480) Numpy tensor
        ...
```
```python
# Stream events from UDP port 3333 (default)
with UDPInput((640, 480), port=3333) as stream:
    while True:
        frame = stream.read("torch")  # A (640, 480) PyTorch tensor
        ...
```
More examples can be found in our example folder.
Please note that the examples may require additional dependencies (such as Norse for spiking networks or PySDL for rendering). To install all the requirements, simply run `pip install -r example/requirements.txt` from the aestream root directory.
We stream events from a camera connected via USB and process them on a GPU in real-time using the spiking neural network library Norse, in fewer than 50 lines of Python.
The left panel in the video shows the raw signal, while the middle and right panels show horizontal and vertical edge detection respectively.
The full example can be found in `example/usb_edgedetection.py`.
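The gist of the edge-detection step can be sketched without a camera or Norse: convolve an event frame with horizontal and vertical Sobel-style kernels. Note this is an illustrative assumption, not the code from `usb_edgedetection.py` — the real example uses Norse's spiking layers on the GPU, and the kernel values and naive convolution below are our own.

```python
import numpy as np

def conv2d(x, k):
    """Naive valid-mode 2D cross-correlation."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow), dtype=x.dtype)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * x[i : i + oh, j : j + ow]
    return out

# Sobel-style kernels responding to horizontal and vertical edges.
k_h = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=np.float32)
k_v = k_h.T

# Stand-in event frame (in the real example: stream.read() from USBInput).
frame = (np.random.default_rng(0).random((640, 480)) > 0.95).astype(np.float32)

horizontal_edges = conv2d(frame, k_h)
vertical_edges = conv2d(frame, k_v)
```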
Read more in our CLI usage documentation page
Installing AEStream also gives access to the command-line interface (CLI) `aestream`. To use `aestream`, simply provide an input source and an optional output sink (defaulting to STDOUT):

```bash
aestream input <input source> [output <output sink>]
```
| Input | Description | Example usage |
|---|---|---|
| DAVIS, DVXplorer | Inivation DVS camera over USB | `input inivation` |
| EVK Cameras | Prophesee DVS camera over USB | `input prophesee` |
| File | Reads `.aedat`, `.aedat4`, `.csv`, `.dat`, or `.raw` files | `input file x.aedat4` |
| SynSense Speck | Stream events via ZMQ | `input speck` |
| UDP network | Receives stream of events via the SPIF protocol | `input udp` |
| Output | Description | Example usage |
|---|---|---|
| STDOUT | Standard output (default output) | `output stdout` |
| Ethernet over UDP | Outputs to a given IP and port using the SPIF protocol | `output udp 10.0.0.1 1234` |
| File: `.aedat4` | Output to `.aedat4` format | `output file my_file.aedat4` |
| File: `.csv` | Output to comma-separated value (CSV) file format | `output file my_file.csv` |
| Viewer | View live event stream | `output view` |
| Example | Syntax |
|---|---|
| View live stream of Inivation camera (requires Inivation drivers) | `aestream input inivation output view` |
| Stream Prophesee camera over the network to 10.0.0.1 (requires Metavision SDK) | `aestream input prophesee output udp 10.0.0.1` |
| Convert `.dat` file to `.aedat4` | `aestream input file example/sample.dat output file converted.aedat4` |
AEStream is developed by (in alphabetical order):
- Cameron Barker (@GitHub cameron-git)
- Juan Pablo Romero Bermudez (@GitHub jpromerob)
- Alexander Hadjivanov (@GitHub cantordust)
- Emil Jansson (@GitHub emijan-kth)
- Jens E. Pedersen (@GitHub jegp)
- Christian Pehle (@GitHub cpehle)
The work has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).
Thanks to Philipp Mondorf for interfacing with Metavision SDK and preliminary network code.
Please cite `aestream` if you use it in your work:
```bibtex
@inproceedings{10.1145/3584954.3584997,
  author = {Pedersen, Jens Egholm and Conradt, Jorg},
  title = {AEStream: Accelerated event-based processing with coroutines},
  year = {2023},
  isbn = {9781450399470},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3584954.3584997},
  doi = {10.1145/3584954.3584997},
  booktitle = {Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference},
  pages = {86--91},
  numpages = {6},
  keywords = {coroutines, event-based vision, graphical processing unit, neuromorphic computing},
  location = {San Antonio, TX, USA},
  series = {NICE '23}
}
```