Write Settlements to DB from Action (#185)
Introduces transaction log parsing via the ethers library with the appropriate contract artifacts. Specifically, we add a dependency on @cowprotocol/contracts, which contains the ABI files for GPv2Settlement and IERC20 and suffices for decoding the relevant event data from settlement transactions. For us these are Transfer, Settlement and Trade.

Event types are constructed for each of these, keeping only the fields that matter for our purposes (namely, TradeEvent has owner as its only relevant field).
bh2smith committed Apr 3, 2023
1 parent a146af8 commit 546f885
Showing 16 changed files with 3,232 additions and 632 deletions.
22 changes: 22 additions & 0 deletions .github/workflows/pull-request.yaml
@@ -32,10 +32,32 @@ jobs:
tenderly-ci:
name: Tenderly Lint & Tests
runs-on: ubuntu-latest
services:
postgres:
image: postgres
env:
POSTGRES_PASSWORD: postgres
# Set health checks to wait until postgres has started
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- name: Check out Git repository
uses: actions/checkout@v2

- name: DB Migration
uses: joshuaavalon/flyway-action@v3.0.0
with:
url: jdbc:postgresql://postgres:5432/postgres
user: postgres
password: postgres
env:
FLYWAY_LOCATIONS: filesystem:./internal_transfers/database/sql

- name: Set up Node.js
uses: actions/setup-node@v1
with:
30 changes: 22 additions & 8 deletions internal_transfers/README.md
@@ -1,30 +1,32 @@
# Internal Transfers


## Motivation & Summary

Evaluating slippage for Internal Settlements has been a challenge, since some of the information
required for the computation never winds up on chain.
Specifically, when the driver decides to internalize an interaction provided by a solver,
the interaction is excluded from the settlement call data.
To recover this data, we must make the token transfers (or imbalances) from
internalized interactions transparently available for consumption.

This project replaces the subquery
[buffer_trades](https://github.com/cowprotocol/solver-rewards/blob/c7e9c85706decb1a1be28d639ee34e35646bca50/queries/dune_v2/period_slippage.sql#L239-L309)
(an approximation for internal interactions implemented purely within Dune Analytics) with the actual internalized data.

In brief, the project consists of a Data Pipeline implementing the following flow;

1. WebHook/Event Listener for CoW Protocol Settlement Events emitted
   by [CoW Protocol: GPv2Settlement](https://etherscan.io/address/0x9008d19f58aabd9ed0d60971565aa8510560ab41)
2. Settlement Events trigger an ETL Pipeline that
   - Fetches full/unoptimized call data provided by the solver for the winning settlement from
     the [Orderbook API](https://api.cow.fi/docs/#)
   - Simulates the full call data extracting and classifying transfers from event logs
   - Evaluates the `InternalizedLedger` as the difference `FullLedger - ActualLedger`
3. Finally, the `InternalizedLedger` from step 2 is written to a [Database](./database/README.md) and later synced into
   Dune community sources.
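The difference computed in step 2 can be sketched as a per-token map subtraction. This is a minimal illustration of the idea only; the `Ledger` type and `internalizedLedger` function are hypothetical names, not taken from the repository:

```typescript
// Ledger: token address -> net transfer amount (signed) for one settlement.
type Ledger = Map<string, bigint>;

// InternalizedLedger = FullLedger - ActualLedger, computed per token.
// Tokens whose amounts cancel exactly are omitted from the result.
function internalizedLedger(full: Ledger, actual: Ledger): Ledger {
  const result: Ledger = new Map();
  full.forEach((amount, token) => {
    const diff = amount - (actual.get(token) ?? 0n);
    if (diff !== 0n) result.set(token, diff);
  });
  // Tokens that moved only in the actual settlement contribute negatively.
  actual.forEach((amount, token) => {
    if (!full.has(token) && amount !== 0n) result.set(token, -amount);
  });
  return result;
}
```

Under this sketch, a transfer that appears in the full simulation but not on chain shows up with a positive amount, i.e. as an internalized imbalance.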

For more details on each of the components outlined above, please visit the respective READMEs:


## [Webhook] Tenderly Actions

Documentation: https://tenderly.co/web3-actions
@@ -35,4 +37,16 @@ actions directory was scaffolded and deployed as follows:
```shell
tenderly actions init --language typescript
tenderly actions deploy
```

## Generate Database Schema

Following this article on [Postgres with Typescript](https://www.atdatabases.org/docs/pg-guide-typescript), we can
generate the schema. From within `actions/`:

```shell
source
export DB_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
npx @databases/pg-schema-cli --database $DB_URL --directory src/__generated__
```
12 changes: 12 additions & 0 deletions internal_transfers/actions/index.ts
@@ -1,5 +1,6 @@
import { ActionFn, Context, Event, TransactionEvent } from "@tenderly/actions";
import { partitionEventLogs } from "./src/parse";
import { getDB, insertSettlementEvent } from "./src/database";

export const triggerInternalTransfersPipeline: ActionFn = async (
context: Context,
@@ -20,4 +21,15 @@ export const triggerInternalTransfersPipeline: ActionFn = async (
}
console.log(`Parsed ${transfers.length} (relevant) transfer events`);
console.log(`Parsed ${trades.length} trade events`);

const dbUrl = await context.secrets.get("DATABASE_URL");
await Promise.all(
settlements.map(async (settlement) => {
await insertSettlementEvent(
getDB(dbUrl),
{ txHash: txHash, blockNumber: transactionEvent.blockNumber },
settlement
);
})
);
};
1 change: 0 additions & 1 deletion internal_transfers/actions/jest.config.js
@@ -3,5 +3,4 @@ module.exports = {
testEnvironment: "node",
testRegex: "/tests/.*\\.(test|spec)?\\.(ts|tsx)$",
moduleFileExtensions: ["ts", "tsx", "js", "jsx", "json", "node"],
resolveJsonModule: true,
};