Load Terabytes of Data From Postgres Into BigQuery
Updated Jun 4, 2019 - Go
Utility enabling flexible ETL scenarios; supports Go plug-ins alongside built-in consumer, transformer, and producer options.
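The consumer/transformer/producer split described above can be sketched as a channel-based pipeline. This is a minimal illustration of the pattern, not the repository's actual API; all names here are hypothetical.

```go
package main

import (
	"fmt"
	"strings"
)

// consume is a toy consumer stage: it emits source records onto a channel.
func consume(lines []string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for _, l := range lines {
			out <- l
		}
	}()
	return out
}

// transform is a toy transformer stage: it rewrites each record in flight.
func transform(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for l := range in {
			out <- strings.ToUpper(l)
		}
	}()
	return out
}

func main() {
	// The final range loop plays the producer role, writing to stdout.
	for l := range transform(consume([]string{"extract", "load"})) {
		fmt.Println(l)
	}
}
```

A plug-in system would typically swap in different `consume`/`transform` implementations behind a shared interface while keeping the channel wiring unchanged.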
Proof-of-concept application for medium- and large-scale data acquisition.
csvplus extends the standard Go encoding/csv package with a fluent interface, lazy stream operations, indices, and joins.
This is an example of a churro extension that you can write to extend churro's transformation logic to meet your own needs.
A flexible, lightweight, easy-to-use automation framework for typical data manipulation with terminal commands.
Industrial monitoring system for power plants; streams data from acoustic-based culvert-rupture telltale aggregation boxes.
OpenAI processor for Benthos
Callback-based iterators for the Go language.
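The callback-iterator style means the iterator pushes elements into a `yield` function instead of exposing Next/Value methods. A minimal generic sketch of the pattern (illustrative only; not the repository's actual API):

```go
package main

import "fmt"

// Iterator pushes each element into yield; returning false from yield
// stops the iteration early.
type Iterator[T any] func(yield func(T) bool)

// FromSlice adapts a slice to the callback-iterator shape.
func FromSlice[T any](items []T) Iterator[T] {
	return func(yield func(T) bool) {
		for _, v := range items {
			if !yield(v) {
				return
			}
		}
	}
}

// Map lazily transforms every element produced by it; no work happens
// until the resulting iterator is driven.
func Map[T, U any](it Iterator[T], f func(T) U) Iterator[U] {
	return func(yield func(U) bool) {
		it(func(v T) bool { return yield(f(v)) })
	}
}

func main() {
	doubled := Map(FromSlice([]int{1, 2, 3}), func(n int) int { return n * 2 })
	doubled(func(n int) bool {
		fmt.Println(n)
		return true
	})
}
```

This is the same shape Go 1.23 later standardized as `iter.Seq`; early-exit support comes for free from the boolean return.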
Command-line tool to upload CSV files to Postgres, written in Go.
Open-source data platform for building event-driven systems. It's like Debezium for Go :)
Stellar ETL will enable real-time analytics on the Stellar network
Service for bulk-loading data to databases with automatic schema management (Redshift, Snowflake, BigQuery, ClickHouse, Postgres, MySQL)
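"Automatic schema management" usually starts with inferring column types from sample data and emitting DDL. A minimal sketch of that step, under assumed naming (not this service's actual code), with types chosen from the Postgres dialect:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// inferType guesses a SQL type for a column's sample values: BIGINT if
// every value parses as an integer, DOUBLE PRECISION if all are numeric,
// otherwise TEXT. Real loaders also handle NULLs, dates, and widening.
func inferType(samples []string) string {
	t := "BIGINT"
	for _, s := range samples {
		if _, err := strconv.ParseInt(s, 10, 64); err == nil {
			continue
		}
		if _, err := strconv.ParseFloat(s, 64); err == nil {
			t = "DOUBLE PRECISION"
			continue
		}
		return "TEXT"
	}
	return t
}

// createTableDDL builds a CREATE TABLE statement from a header row and
// per-column sample values.
func createTableDDL(table string, header []string, columns [][]string) string {
	defs := make([]string, len(header))
	for i, name := range header {
		defs[i] = fmt.Sprintf("%s %s", name, inferType(columns[i]))
	}
	return fmt.Sprintf("CREATE TABLE %s (%s);", table, strings.Join(defs, ", "))
}

func main() {
	header := []string{"id", "price", "label"}
	cols := [][]string{{"1", "2"}, {"9.5", "3.25"}, {"a", "b"}}
	fmt.Println(createTableDDL("items", header, cols))
}
```

Each target warehouse (Redshift, Snowflake, BigQuery, ClickHouse, MySQL) would need its own type mapping behind the same inference step.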