Implement DAG builder for parsing Ethereum logs #17
Issue Status: 1. Open 2. Started 3. Submitted 4. Done This issue now has a funding of 500.0 DAI (500.0 USD @ $1.0/DAI) attached to it.
Work has been started. igetgames has applied to start work and claims they can complete it within 2 weeks: "I will implement the DAG builder according to ethereum-etl-airflow/issues#17 and the README. Create an Airflow DAG builder which can later be used to parse Ethereum logs given an ABI, for example 0x transactions and ENS events."
Should we also add a field? E.g.
@askeluv There is an example here for converting an event signature, e.g. Transfer(address,address,uint256), to its hash: https://github.com/blockchain-etl/ethereum-etl/blob/develop/ethereumetl/cli/get_keccak_hash.py. It uses the keccak function from the eth_utils library. I've also found these two functions in eth_utils/abi.py:
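To illustrate the idea, here is a minimal, dependency-free sketch of building the canonical event signature from an event ABI; the helper name `event_abi_to_signature` is hypothetical (the real eth_utils functions in abi.py do this and the hashing in one step), and hashing the signature with eth_utils requires that library:

```python
import json

def event_abi_to_signature(event_abi):
    """Build the canonical event signature, e.g. Transfer(address,address,uint256).

    Note: this simple version does not expand tuple (struct) input types,
    which the canonical form requires; it is enough for flat event ABIs.
    """
    types = ",".join(inp["type"] for inp in event_abi["inputs"])
    return f"{event_abi['name']}({types})"

# A flat ERC-20 Transfer event ABI, used here as the running example.
transfer_abi = json.loads("""
{
  "name": "Transfer",
  "type": "event",
  "inputs": [
    {"name": "from", "type": "address", "indexed": true},
    {"name": "to", "type": "address", "indexed": true},
    {"name": "value", "type": "uint256", "indexed": false}
  ]
}
""")

sig = event_abi_to_signature(transfer_abi)
# Hashing this string with eth_utils (not stdlib) gives the log topic:
#   from eth_utils import keccak
#   topic = "0x" + keccak(text=sig).hex()
print(sig)  # Transfer(address,address,uint256)
```

The topic produced this way is what a log-parsing query filters on in the `topics[0]` position of each log.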
Work for 500.0 DAI (500.0 USD @ $1.0/DAI) has been submitted. @ceresstation, please take a look at the submitted work:
The funding of 500.0 DAI (500.0 USD @ $1.0/DAI) attached to this issue has been approved & issued to @askeluv.
The DAG builder is simply a Python function that generates an Airflow DAG given a list of parameters. An example of a DAG builder can be found here.
The input for the DAG builder for parsing Ethereum logs is a list of JSON files, each of which is a table definition file. An example table definition file is given below:
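The referenced example file is not reproduced in this thread; a hypothetical table definition file might look like the following (the field names and values here are illustrative assumptions, not the project's exact schema):

```json
{
  "parser": {
    "type": "log",
    "contract_address": "0x...",
    "abi": {
      "name": "Transfer",
      "type": "event",
      "inputs": [
        {"name": "from", "type": "address", "indexed": true},
        {"name": "to", "type": "address", "indexed": true},
        {"name": "value", "type": "uint256", "indexed": false}
      ]
    }
  },
  "table": {
    "dataset_name": "ethereum_0x",
    "table_name": "Transfer",
    "table_description": "Parsed Transfer events"
  }
}
```

The `parser` block tells the builder which logs to select and how to decode them; the `table` block tells it where the decoded rows should land.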
The output is an Airflow DAG:

- A PythonOperator task should be created for each table definition. The task executes a BigQuery query job with the table as its destination (an example can be found here).
- The query is generated from the table and parser definitions using a Jinja template (an example Jinja template can be found here; an example log parsing query can be found here).