Self-Service Data Analytics for the (Industrial) IoT
StreamPipes is a self-service (Industrial) IoT toolbox that enables non-technical users to connect, analyze, and explore IoT data streams.
Table of contents
- About Apache StreamPipes
- Use Cases
- Pipeline Elements
- Extending StreamPipes
- Get help
About Apache StreamPipes
Apache StreamPipes (incubating) enables flexible modeling of stream processing pipelines by providing a graphical modeling editor on top of existing stream processing frameworks.
It enables non-technical users to quickly define and execute processing pipelines based on an easily extensible toolbox of data sources, data processors, and data sinks. StreamPipes has an exchangeable runtime execution layer and executes pipelines using one of the provided wrappers, e.g., for Apache Flink or Apache Kafka Streams.
Pipeline elements in StreamPipes can be installed at runtime, and the built-in SDK makes it easy to implement new pipeline elements according to your needs. Pipeline elements are standalone microservices that can run anywhere: centrally on your server, in a large-scale cluster, or at the edge.
StreamPipes allows you to connect IoT data sources using the SDK or the built-in graphical tool StreamPipes Connect.
Use Cases
The extensible toolbox of data processors and sinks supports use cases such as:
- Continuously store IoT data streams in third-party systems (e.g., databases)
- Filter measurements on streams (e.g., based on thresholds or value ranges)
- Harmonize data by using data processors for transformations (e.g., by converting measurement units and data types or by aggregating measurements)
- Detect situations that should be avoided (e.g., patterns based on time windows)
- Wrap Machine Learning models into data processors to perform classifications or predictions on sensor and image data
- Visualize real-time data from sensors and machines using the built-in Live Dashboard
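As a language-neutral illustration of the filtering use case above (this is plain Java, not the StreamPipes SDK; class and method names are hypothetical), a threshold filter over a stream of measurements could look like this:

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch of the kind of logic a StreamPipes data
// processor wraps: keep only measurements above a threshold.
// Names here are hypothetical, not part of the StreamPipes SDK.
public class ThresholdFilter {

    // Return only the measurements that exceed the configured threshold.
    public static List<Double> filter(List<Double> measurements, double threshold) {
        return measurements.stream()
                .filter(value -> value > threshold)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Double> readings = List.of(18.5, 22.1, 30.7, 19.9);
        // With a threshold of 20.0, only 22.1 and 30.7 pass.
        System.out.println(filter(readings, 20.0)); // [22.1, 30.7]
    }
}
```

In a real pipeline, such logic would be packaged as a data processor and applied continuously to each incoming event rather than to a finite list.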
The quickest way to run StreamPipes is the Docker-based installer script, available for Linux, macOS, and Windows 10.
It's easy to get started:
- Make sure you have Docker and Docker Compose installed.
- Clone or download the installer script from https://www.github.com/apache/incubator-streampipes-installer
- Enter the hostname and choose the version you'd like to run (the Lite version runs with less memory assigned to Docker (< 6 GB); use the full version if you have more memory available)
- Open your browser, navigate to http://YOUR_HOSTNAME_HERE, and follow the installation instructions.
- Once finished, switch to the pipeline editor and start the interactive tour or check the online tour to learn how to create your first pipeline!
For a more in-depth manual, read the installation guide at https://streampipes.apache.org/docs/docs/user-guide-installation/!
Pipeline Elements
StreamPipes includes a repository of ready-to-use pipeline elements. A description of the standard elements can be found in the GitHub repository streampipes-extensions.
Extending StreamPipes
You can easily add your own data streams, processors, or sinks. A Java-based SDK and several runtime wrappers for popular streaming frameworks such as Apache Flink, Apache Spark, and Apache Kafka Streams (as well as plain Java programs) can be used to integrate your existing processing logic into StreamPipes. Pipeline elements are packaged as Docker images and can be installed at runtime whenever your requirements change.
Check our developer guide at https://streampipes.apache.org/docs/docs/dev-guide-introduction.
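Conceptually, "existing processing logic" is a per-event transformation that a runtime wrapper invokes for every incoming event. The sketch below shows that shape in plain Java (the class, method, and field names are hypothetical illustrations, not the actual StreamPipes SDK API; consult the developer guide for the real interfaces):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not the StreamPipes SDK): existing processing
// logic expressed as a per-event transformation, the shape a runtime
// wrapper (e.g., for Flink or Kafka Streams) calls for each event.
public class CelsiusToFahrenheit {

    // Convert the "temperature" field from Celsius to Fahrenheit,
    // leaving all other event fields untouched.
    public static Map<String, Object> onEvent(Map<String, Object> event) {
        Map<String, Object> out = new HashMap<>(event);
        double celsius = ((Number) event.get("temperature")).doubleValue();
        out.put("temperature", celsius * 9.0 / 5.0 + 32.0);
        return out;
    }
}
```

Wrapping logic of this shape in a pipeline element lets the same transformation run unchanged on any of the supported execution layers.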
Get help
If you run into problems during the installation or have questions about StreamPipes, you can get help through one of our community channels.
And don't forget to follow us on Twitter!
We welcome contributions to StreamPipes. If you are interested in contributing, let us know!
We'd love to hear your feedback! Subscribe to firstname.lastname@example.org