feat: add production like example for apibara cli #124
Conversation
jaipaljadeja commented on Dec 4, 2024
- fixed infinite type inference errors in the apibara config
- upgraded the drizzle-orm dependency everywhere
- fixed VCR errors
- added a full production-like example using the CLI
📝 Walkthrough

The changes in this pull request introduce several new JSON configuration files.
Force-pushed from 461f830 to 5cf923e.
Caution
Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.
Actionable comments posted: 8
🧹 Outside diff range and nitpick comments (15)
packages/indexer/src/sinks/drizzle/utils.ts (1)
Line range hint 82-98: Consider enhancing cursor error handling and documentation

The cursor implementation using `bigint` ranges is robust but could benefit from:
- More descriptive error messages that explain the required format
- Documentation about the range bounds and their implications
- Validation for upper bound being greater than lower bound when both are provided
Consider enhancing the error handling:
```diff
 if (lower === undefined) {
-  throw new Error("Lower bound cursor is required");
+  throw new Error("Lower bound cursor is required. Expected a bigint value or [lower, upper] range.");
 }
+if (upper !== undefined && Number(upper) <= Number(lower)) {
+  throw new Error("Upper bound must be greater than lower bound when both are provided.");
+}
```

examples/cli/indexers/2-starknet.indexer.ts (1)
45-45: Consider externalizing the contract address to a configuration parameter

The hardcoded address for the USDC contract on StarkNet could be extracted to a configuration file or environment variable to enhance maintainability and flexibility.
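A minimal sketch of one way to do this, reading the address from an environment variable (the variable name is hypothetical, not part of this PR):

```typescript
// Hypothetical: read the USDC contract address from the environment
// instead of hardcoding it in the indexer.
const usdcAddress = process.env.STARKNET_USDC_ADDRESS;
if (!usdcAddress) {
  throw new Error("STARKNET_USDC_ADDRESS environment variable is not set");
}
```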
examples/cli/indexers/1-evm.indexer.ts (2)
46-46: Consider externalizing the contract address to a configuration parameter

The hardcoded address for the USDC contract on Ethereum could likewise be extracted to a configuration file or environment variable (see the sketch above) to enhance maintainability and flexibility.
70-75: Optimize database inserts by batching operations

Currently, database inserts are happening inside a loop for each log entry, which may result in performance issues due to multiple database transactions. Consider batching the inserts to improve performance.
Apply this diff to batch the inserts:
```diff
 const logger = useLogger();
 const { db } = useSink({ context });
 const { logs } = block;
 logger.info("Transforming block ", endCursor?.orderKey);
-for (const log of logs) {
-  await db.insert(ethereumUsdcTransfers).values({
-    number: Number(endCursor?.orderKey),
-    hash: log.transactionHash,
-  });
-}
+const rows = logs.map((log) => ({
+  number: endCursor?.orderKey,
+  hash: log.transactionHash,
+}));
+await db.insert(ethereumUsdcTransfers).values(rows);
```

examples/cli/lib/schema.ts (1)
4-12: Consider using a generic transfer table

Both tables have identical structure. Consider creating a generic transfer table with a `network` column to distinguish between networks.

Example implementation:
```typescript
export const usdcTransfers = pgIndexerTable("usdc_transfers", {
  id: bigint("id", { mode: "number" }).primaryKey(),
  network: text("network").notNull(), // 'ethereum' or 'starknet'
  number: bigint("number", { mode: "number" }),
  hash: text("hash").notNull(),
  indexed_at: timestamp("indexed_at").defaultNow().notNull(),
});
```

examples/cli/drizzle.config.ts (2)
3-5: Add validation for database connection string

While the environment variable fallback is good, consider adding validation to ensure the connection string is properly formatted.
```typescript
function validateConnectionString(connString: string): string {
  const url = new URL(connString);
  if (url.protocol !== 'postgres:') {
    throw new Error('Invalid database protocol');
  }
  return connString;
}

const connectionString = validateConnectionString(
  process.env.POSTGRES_CONNECTION_STRING ??
    "postgres://postgres:postgres@localhost:5432/postgres"
);
```
7-14: Add JSDoc comments for configuration options

The configuration would benefit from documentation explaining each option's purpose.
```diff
+/**
+ * Drizzle ORM configuration
+ * @property {string} schema - Path to the schema definition file
+ * @property {string} out - Output directory for generated migrations
+ * @property {string} dialect - Database dialect (postgresql)
+ * @property {Object} dbCredentials - Database connection credentials
+ */
 export default {
   schema: "./lib/schema.ts",
   out: "./drizzle",
   dialect: "postgresql",
   dbCredentials: {
     url: connectionString,
   },
 } satisfies Config;
```

packages/indexer/src/vcr/record.ts (1)
Line range hint 31-43: Add error handling for file operations.

The file operations could benefit from proper error handling and cleanup in case of failures.
Consider wrapping the file operations in try-catch:
```diff
+try {
   await fs.mkdir(vcrConfig.cassetteDir, { recursive: true });
   const filePath = path.join(
     vcrConfig.cassetteDir,
     `${cassetteOptions.name}.json`,
   );
   await fs.writeFile(filePath, serialize(output), { flag: "w" });
+} catch (error) {
+  console.error(`Failed to write cassette file: ${error.message}`);
+  throw error;
+}
```

examples/cli/test/ethereum.test.ts (2)
23-31: Enhance test coverage

The current test only verifies the happy path. Consider adding tests for:
- Error handling (invalid blocks, network issues)
- Edge cases (empty blocks, multiple transfers in same block)
- Data validation (verify transfer amounts, addresses)
it("should handle empty blocks", async () => { const indexer = createIndexer({ database }); await vcr.run("ethereum-usdc-transfers-empty", indexer, { fromBlock: 10_000_006n, toBlock: 10_000_007n, }); const rows = await database.select().from(ethereumUsdcTransfers); expect(rows).toHaveLength(0); });
26-29: Consider parameterizing block ranges

The hardcoded block range makes the test less flexible. Consider using test parameters or environment variables.
```diff
-await vcr.run("ethereum-usdc-transfers", indexer, {
-  fromBlock: 10_000_000n,
-  toBlock: 10_000_005n,
+const fromBlock = process.env.TEST_FROM_BLOCK ?? 10_000_000n;
+const toBlock = process.env.TEST_TO_BLOCK ?? 10_000_005n;
+await vcr.run("ethereum-usdc-transfers", indexer, {
+  fromBlock,
+  toBlock,
```

examples/cli/test/starknet.test.ts (2)
11-16: Consider adding database cleanup in afterAll hook.

While the in-memory database setup looks good, it's a best practice to clean up database resources after tests complete.

Add an afterAll hook:

```diff
+afterAll(async () => {
+  await database.execute(sql`DROP SCHEMA IF EXISTS public CASCADE`);
+});
```
32-178: Consider breaking down the snapshot.

The large inline snapshot makes the test harder to maintain. Consider breaking it down into smaller, more focused assertions:
- Verify block numbers are sequential
- Verify transaction status
- Verify data integrity
Example refactor:
```typescript
const rows = await database.select().from(starknetUsdcTransfers);
expect(rows).toHaveLength(13);
expect(rows.every(row => row.transactionStatus === 'succeeded')).toBe(true);
expect(rows.map(row => row.number)).toEqual(
  expect.arrayContaining([800001, 800002, 800003, 800004, 800005])
);
// Then use smaller snapshots for specific data points
```

examples/cli/cassettes/ethereum-usdc-transfers.json (1)
15-309: Consider adding data validation checks.

While the event data structure is correct, consider adding runtime validation to ensure data integrity:
- Block numbers are sequential
- Parent block hashes match
- Transaction status is always present
Consider implementing a JSON schema validator for the cassette files to catch potential data issues early.
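A minimal validation sketch, assuming the message shape used in these cassettes (`messages` entries tagged `"data"` carrying an `endCursor`); everything beyond that shape is an assumption:

```typescript
import fs from "node:fs";

// Hypothetical cassette check: every data message must carry an endCursor,
// and cursors must advance monotonically across messages.
type Cursor = { orderKey: string | number };
type CassetteMessage = { _tag: string; data?: { endCursor?: Cursor } };

export function validateCassette(path: string): void {
  const cassette = JSON.parse(fs.readFileSync(path, "utf8"));
  const messages: CassetteMessage[] = cassette.messages ?? [];
  let previous: bigint | undefined;
  for (const message of messages) {
    if (message._tag !== "data") continue;
    const endCursor = message.data?.endCursor;
    if (!endCursor) throw new Error("Data message is missing endCursor");
    const orderKey = BigInt(endCursor.orderKey);
    if (previous !== undefined && orderKey <= previous) {
      throw new Error(`Cursor ${orderKey} does not advance past ${previous}`);
    }
    previous = orderKey;
  }
}
```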
🧰 Tools
🪛 Gitleaks (8.21.2)
21-21, 25-25, 91-91, 95-95, 161-161, 165-165, 259-259, 263-263: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
examples/cli/cassettes/starknet-usdc-transfers.json (2)
33-44: Consider documenting L1 data availability mode.

The `l1DataAvailabilityMode: "blob"` setting is important for production deployments. Consider adding documentation about its implications and requirements.

Add a comment in the README or documentation explaining the L1 data availability modes and their trade-offs.
45-705: Consider adding event decoding information.

The event data is stored in raw format. Consider adding:
- Event ABI information
- Decoded parameter names
- Human-readable value transformations
This would make the cassette files more maintainable and easier to debug.
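As a rough sketch of what decoded metadata could look like next to the raw felts; the key/data layout of the Transfer event varies by contract and is an assumption here:

```typescript
// Hypothetical decoder: assumes keys = [selector, from, to] and
// data = [amount.low, amount.high]; real contracts may differ.
interface DecodedTransfer {
  eventName: "Transfer";
  from: string;
  to: string;
  amount: bigint;
}

function decodeTransfer(keys: string[], data: string[]): DecodedTransfer {
  const [, from, to] = keys;
  // u256 amounts are split into 128-bit low and high felts.
  const amount = BigInt(data[0]) + (BigInt(data[1]) << 128n);
  return { eventName: "Transfer", from, to, amount };
}
```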
🧰 Tools
🪛 Gitleaks (8.21.2)
79-79, 83-83, 497-497, 501-501, 558-558, 562-562, 653-653, 657-657: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
🛑 Comments failed to post (8)
examples/cli/indexers/2-starknet.indexer.ts (1)
67-67: ⚠️ Potential issue: Potential loss of precision when converting BigInt to Number

The `orderKey` is a `BigInt`. Converting it to a `Number` may lead to loss of precision for large values. Consider using `endCursor?.orderKey` directly or ensure that the values stay within the safe integer limits of `Number`.

Apply this diff to fix the potential issue:

```diff
- number: Number(endCursor?.orderKey),
+ number: endCursor?.orderKey,
```

Ensure that the `number` field in the `starknetUsdcTransfers` table supports `BigInt` values.

Committable suggestion skipped: line range outside the PR's diff.
packages/cli/src/config/index.ts (1)
4-7: ⚠️ Potential issue: Potential breaking change in `defineConfig` type parameters

The type constraint for `T` in `defineConfig` has been narrowed to `DeepPartial<Pick<ApibaraConfig<T, R>, "runtimeConfig">>`. This change may break existing usage of `defineConfig` that relies on other properties of `ApibaraConfig`, such as `presets` or `rollupConfig`.

Consider evaluating the impact on existing users and whether this change requires a major version bump or providing migration guidance.
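For reference, a hypothetical sketch of the usage shape the narrowed constraint still accepts (the import path and option names are assumptions, not confirmed by this PR):

```typescript
// apibara.config.ts — only `runtimeConfig` is constrained by the
// narrowed type parameter.
import { defineConfig } from "apibara/config";

export default defineConfig({
  runtimeConfig: {
    connectionString:
      process.env.POSTGRES_CONNECTION_STRING ??
      "postgres://postgres:postgres@localhost:5432/postgres",
  },
});
```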
examples/cli/lib/schema.ts (1)
4-7: 🛠️ Refactor suggestion: Add primary keys and indexes to the transfer tables

The tables are missing primary keys, which could lead to duplicate entries. Consider:
- Adding a primary key column (e.g., `id`)
- Adding an index on the `hash` column for faster lookups
- Adding timestamp columns for tracking when transfers were indexed

Here's the suggested schema improvement:

```diff
 export const starknetUsdcTransfers = pgIndexerTable("starknet_usdc_transfers", {
+  id: bigint("id", { mode: "number" }).primaryKey(),
   number: bigint("number", { mode: "number" }),
-  hash: text("hash"),
+  hash: text("hash").notNull(),
+  indexed_at: timestamp("indexed_at").defaultNow().notNull(),
 });
```

Apply similar changes to `ethereumUsdcTransfers`.

Also applies to: 9-12
examples/cli/lib/db.ts (2)
5-5: ⚠️ Potential issue: Use environment variables for database connection

The connection string is hardcoded, which is not secure for production environments. Use environment variables consistently across the codebase.

```diff
-const connectionString = "postgres://postgres:postgres@localhost:5432/postgres";
+const connectionString = process.env.POSTGRES_CONNECTION_STRING ??
+  "postgres://postgres:postgres@localhost:5432/postgres";
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```typescript
const connectionString = process.env.POSTGRES_CONNECTION_STRING ??
  "postgres://postgres:postgres@localhost:5432/postgres";
```
7-9: 🛠️ Refactor suggestion: Add pool configuration and error handling

The database pool lacks configuration options and error handling, which are essential for production environments.

```diff
 const pool = new pg.Pool({
   connectionString,
+  max: 20,
+  idleTimeoutMillis: 30000,
+  connectionTimeoutMillis: 2000,
 });
+
+pool.on('error', (err) => {
+  console.error('Unexpected error on idle client', err);
+  process.exit(-1);
+});
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```typescript
const pool = new pg.Pool({
  connectionString,
  max: 20,
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
});

pool.on('error', (err) => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});
```
28-41: ⚠️ Potential issue: Remove @ts-ignore and improve type safety.

The `@ts-ignore` comment suggests a potential type safety issue with the cursor update SQL operation.

Consider implementing proper type checking:

```diff
-      // @ts-ignore
-      _cursor: sql`int8range(lower(_cursor), ${Number(this.endCursor?.orderKey!)}, '[)')`,
+      _cursor: this.endCursor?.orderKey
+        ? sql`int8range(lower(_cursor), ${Number(this.endCursor.orderKey)}, '[)')`
+        : undefined,
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```typescript
where(
  where: SQL,
): PgUpdateWithout<PgUpdateBase<TTable, TQueryResult>, false, "where"> {
  return this.db
    .update(this.table)
    .set({
      _cursor: this.endCursor?.orderKey
        ? sql`int8range(lower(_cursor), ${Number(this.endCursor.orderKey)}, '[)')`
        : undefined,
    })
    .where(where) as PgUpdateWithout<
      PgUpdateBase<TTable, TQueryResult>,
      false,
      "where"
    >;
}
```
42-53: 🛠️ Refactor suggestion
Consider adding indexes and NOT NULL constraints
The
ethereum_usdc_transferstable has nullablenumberandhashcolumns, which could impact data integrity. Additionally, consider adding indexes for better query performance:
- Add NOT NULL constraints
- Add an index on
numberfor block-based queries- Add an index on
hashfor transaction lookups"number": { "name": "number", "type": "bigint", "primaryKey": false, - "notNull": false + "notNull": true }, "hash": { "name": "hash", "type": "text", "primaryKey": false, - "notNull": false + "notNull": true },Committable suggestion skipped: line range outside the PR's diff.
115-126: 🛠️ Refactor suggestion: Apply same constraints to starknet_usdc_transfers

The same data integrity concerns apply to the `starknet_usdc_transfers` table.

```diff
 "number": {
   "name": "number",
   "type": "bigint",
   "primaryKey": false,
-  "notNull": false
+  "notNull": true
 },
 "hash": {
   "name": "hash",
   "type": "text",
   "primaryKey": false,
-  "notNull": false
+  "notNull": true
 },
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```json
"number": {
  "name": "number",
  "type": "bigint",
  "primaryKey": false,
  "notNull": true
},
"hash": {
  "name": "hash",
  "type": "text",
  "primaryKey": false,
  "notNull": true
},
```
Actionable comments posted: 4
🧹 Outside diff range and nitpick comments (5)
examples/cli/cassettes/ethereum-usdc-transfers.json (2)
15-27: Consider adding pagination metadata.

While the cursor-based pagination is implemented correctly, consider adding metadata about the total number of records or remaining pages to help with progress tracking and error handling.

```diff
 {
   "filter": { ... },
   "messages": [
     {
       "_tag": "data",
       "data": {
+        "metadata": {
+          "totalBlocks": 6,
+          "remainingBlocks": 5
+        },
         "cursor": { ... },
         "endCursor": { ... },
```

🧰 Tools
🪛 Gitleaks (8.21.2)
21-21, 25-25: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
1-311: Consider adding documentation for test data.

While the test data is well-structured, adding a comment section at the beginning of the file would help developers understand:
- The purpose of this test fixture
- The time range of the captured blocks
- The specific USDC transfer scenarios being tested
{ + "/*": { + "description": "Test fixture for USDC transfers on Ethereum mainnet", + "blocks": "10000000-10000005", + "scenarios": [ + "Standard transfers between EOA accounts", + "Transfers with varying amounts" + ] + }, "filter": {🧰 Tools
🪛 Gitleaks (8.21.2)
21-21, 25-25, 91-91, 95-95, 161-161, 165-165, 259-259, 263-263: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
examples/cli/test/starknet.test.ts (3)
11-16: Consider adding database cleanup

The in-memory database setup looks good, but consider adding cleanup logic in an `afterAll` hook to ensure proper test isolation.

```diff
+afterAll(async () => {
+  await database.connection.close();
+});
```

23-23: Test description could be more specific

The test description "should work" is too vague. Consider a more descriptive name that indicates what specifically is being tested.

```diff
-it("should work", async () => {
+it("should correctly index USDC transfers for blocks 800000-800005", async () => {
```

26-29: Consider extracting block range constants

The block range values could be extracted as constants at the top of the file for better maintainability and reusability.

```diff
+const TEST_FROM_BLOCK = 800_000n;
+const TEST_TO_BLOCK = 800_005n;
+
 await vcr.run("starknet-usdc-transfers", indexer, {
-  fromBlock: 800_000n,
-  toBlock: 800_005n,
+  fromBlock: TEST_FROM_BLOCK,
+  toBlock: TEST_TO_BLOCK,
 });
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (10)
- change/@apibara-indexer-604c3e20-41ce-41de-bd63-4e3dd5828add.json (1 hunks)
- change/apibara-b7f05b76-9746-4610-880e-19cd603487eb.json (1 hunks)
- examples/cli/cassettes/ethereum-usdc-transfers.json (1 hunks)
- examples/cli/cassettes/starknet-usdc-transfers.json (1 hunks)
- examples/cli/test/ethereum.test.ts (1 hunks)
- examples/cli/test/helper.ts (1 hunks)
- examples/cli/test/starknet.test.ts (1 hunks)
- examples/indexer/package.json (1 hunks)
- examples/starknet-indexer/package.json (1 hunks)
- packages/indexer/src/sinks/drizzle/delete.ts (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
- change/apibara-b7f05b76-9746-4610-880e-19cd603487eb.json
- examples/cli/test/helper.ts
- examples/indexer/package.json
- examples/starknet-indexer/package.json
- change/@apibara-indexer-604c3e20-41ce-41de-bd63-4e3dd5828add.json
- examples/cli/test/ethereum.test.ts
🧰 Additional context used
📓 Learnings (1)
packages/indexer/src/sinks/drizzle/delete.ts (1)
Learnt from: fracek
PR: apibara/typescript-sdk#109
File: packages/indexer/src/sinks/drizzle/drizzle.test.ts:16-18
Timestamp: 2024-11-12T11:46:58.595Z
Learning: In `packages/indexer/src/sinks/drizzle`, the import references are correct and up-to-date according to the new code structure.
🪛 Gitleaks (8.21.2)
examples/cli/cassettes/starknet-usdc-transfers.json
18-18, 22-22, 79-79, 83-83, 497-497, 501-501, 558-558, 562-562, 653-653, 657-657: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
examples/cli/cassettes/ethereum-usdc-transfers.json
21-21, 25-25, 91-91, 95-95, 161-161, 165-165, 259-259, 263-263: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
🔇 Additional comments (8)
examples/cli/cassettes/starknet-usdc-transfers.json (3)
2-11: LGTM: Filter configuration is well-structured.
The filter configuration correctly specifies the USDC contract address and event key for monitoring transfer events on StarkNet.
12-710: Test fixture data is comprehensive and well-structured.
The cassette contains a complete sequence of StarkNet blocks (800001-800005) with realistic data, including:
- Block headers with all required fields
- Gas price information in both FRI and WEI
- Multiple transfer events with transaction details
- Proper cursor handling for pagination
This provides a good test fixture for the apibara CLI.
🧰 Tools
🪛 Gitleaks (8.21.2)
18-18, 22-22, 79-79, 83-83, 497-497, 501-501, 558-558, 562-562, 653-653, 657-657: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
18-18: Static analysis findings are false positives.
The Gitleaks tool has flagged several blockchain hashes as potential API keys. These are false positives as:
- The flagged values are StarkNet block and transaction hashes
- These hashes are public information on the blockchain
- They don't represent security credentials or sensitive data
Also applies to: 22-22, 79-79, 83-83, 497-497, 501-501, 558-558, 562-562, 653-653, 657-657
🧰 Tools
🪛 Gitleaks (8.21.2)
18-18: Detected a Generic API Key, potentially exposing access to various services and sensitive operations. (generic-api-key)
packages/indexer/src/sinks/drizzle/delete.ts (2)
13-13: LGTM: Import addition aligns with type safety improvements
The addition of PgUpdateWithout type import is consistent with the drizzle-orm upgrade and enhances type safety in the where() method implementation.
Line range hint 1-42: Verify drizzle-orm version compatibility
Since this file is part of the drizzle-orm upgrade mentioned in the PR objectives, let's ensure all related files are using compatible versions.
✅ Verification successful
All drizzle-orm dependencies are consistently using version ^0.37.0
The verification shows that all packages using drizzle-orm (packages/indexer, examples/cli, examples/indexer, and examples/starknet-indexer) are consistently using the same version ^0.37.0. This version alignment ensures compatibility across the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check for consistent drizzle-orm versions across the project
echo "Checking package.json files for drizzle-orm versions:"
find . -name "package.json" -type f -exec sh -c '
echo "\nChecking {}:"
jq -r ".dependencies[\"drizzle-orm\"] // .devDependencies[\"drizzle-orm\"]" "{}"
' \;
```
Length of output: 974
examples/cli/cassettes/ethereum-usdc-transfers.json (2)
2-14: LGTM: Well-structured filter configuration for USDC transfers.
The filter configuration correctly targets the USDC contract address and transfer event signature. The strict flag ensures exact matching of the specified topics.
52-80: LGTM: Transfer logs structure is comprehensive.
The transfer logs capture all essential information:
- Log indices and transaction indices for ordering
- Filter IDs for tracking matched filters
- Complete event data including topics and data field
- Transaction status for verification
Also applies to: 122-150, 192-248, 290-304
examples/cli/test/starknet.test.ts (1)
19-21: LGTM!
Good practice using beforeAll for database migration setup.
```diff
 where(
   where: SQL,
 ): PgUpdateWithout<PgUpdateBase<TTable, TQueryResult>, false, "where"> {
   return this.db
     .update(this.table)
     .set({
       // @ts-ignore
       _cursor: sql`int8range(lower(_cursor), ${Number(this.endCursor?.orderKey!)}, '[)')`,
     })
-    .where(where);
+    .where(where) as PgUpdateWithout<
+      PgUpdateBase<TTable, TQueryResult>,
+      false,
+      "where"
+    >;
```
🛠️ Refactor suggestion
Consider improving type safety and error handling

While the type updates improve safety, there are several areas that could be enhanced:
- The non-null assertion on `this.endCursor?.orderKey!` could lead to runtime errors if the cursor is undefined
- The `@ts-ignore` comment suggests underlying type issues that should be addressed explicitly
Consider applying these improvements:
```diff
 where(
   where: SQL,
 ): PgUpdateWithout<PgUpdateBase<TTable, TQueryResult>, false, "where"> {
+  if (this.endCursor?.orderKey === undefined) {
+    throw new Error('End cursor order key is required');
+  }
   return this.db
     .update(this.table)
     .set({
-      // @ts-ignore
-      _cursor: sql`int8range(lower(_cursor), ${Number(this.endCursor?.orderKey!)}, '[)')`,
+      _cursor: sql`int8range(lower(_cursor), ${Number(this.endCursor.orderKey)}, '[)')`,
     })
     .where(where) as PgUpdateWithout<
       PgUpdateBase<TTable, TQueryResult>,
       false,
       "where"
     >;
 }
```

📝 Committable suggestion
‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
where(
  where: SQL,
): PgUpdateWithout<PgUpdateBase<TTable, TQueryResult>, false, "where"> {
  if (this.endCursor?.orderKey === undefined) {
    throw new Error('End cursor order key is required');
  }
  return this.db
    .update(this.table)
    .set({
      _cursor: sql`int8range(lower(_cursor), ${Number(this.endCursor.orderKey)}, '[)')`,
    })
    .where(where) as PgUpdateWithout<
      PgUpdateBase<TTable, TQueryResult>,
      false,
      "where"
    >;
}
```
| "withdrawals": [], | ||
| "transactions": [], | ||
| "receipts": [], |
💡 Codebase verification
Empty arrays can be safely removed
The withdrawals, transactions, and receipts arrays are consistently empty in all blocks within the cassette file, and they are not referenced in any test files. These arrays appear to be part of a standard block structure but don't contain any data relevant to the test cases, which focus on USDC transfer logs. Removing these empty arrays will reduce file size and improve readability without affecting functionality.
🔗 Analysis chain
Verify the necessity of empty arrays.
The withdrawals, transactions, and receipts arrays are consistently empty across all blocks. If this data is not needed for the test cases, consider removing these fields to reduce the file size and improve readability.
Also applies to: 119-121, 189-191, 287-289
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```bash
#!/bin/bash
# Check if these arrays are used in tests
rg -l "withdrawals|transactions|receipts" --type ts examples/cli/test/
```

Length of output: 72

Script:

```bash
#!/bin/bash
# Check if these arrays are used in any test files or source files
rg -l "withdrawals|transactions|receipts" examples/cli/
# Check the structure and usage of the cassette file
cat examples/cli/cassettes/ethereum-usdc-transfers.json
```

Length of output: 16789
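If the team does decide to drop them, a cleanup along these lines could strip the empty arrays; this is a hedged sketch that walks the JSON recursively rather than assuming the exact block nesting:

```typescript
import fs from "node:fs";

// Hypothetical cleanup: recursively delete `withdrawals`, `transactions`,
// and `receipts` keys whose value is an empty array.
const EMPTY_KEYS = new Set(["withdrawals", "transactions", "receipts"]);

function stripEmptyArrays(value: unknown): void {
  if (Array.isArray(value)) {
    for (const item of value) stripEmptyArrays(item);
  } else if (value && typeof value === "object") {
    const record = value as Record<string, unknown>;
    for (const key of Object.keys(record)) {
      const child = record[key];
      if (EMPTY_KEYS.has(key) && Array.isArray(child) && child.length === 0) {
        delete record[key];
      } else {
        stripEmptyArrays(child);
      }
    }
  }
}

const path = "examples/cli/cassettes/ethereum-usdc-transfers.json";
const cassette = JSON.parse(fs.readFileSync(path, "utf8"));
stripEmptyArrays(cassette);
fs.writeFileSync(path, JSON.stringify(cassette, null, 2));
```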
```typescript
expect(rows).toMatchInlineSnapshot(`
  [
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800001,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x006dcfffad0aebb8e496bf1af95b97ab181e9b86bd61a07c2926407ea70c0d0e",
      "number": 800001,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x07b2c7d48d92eb6110b5039a112110b4ee5b59b9249764d22441c988f2a3730d",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x0288e4f00a93f4ea8f61bc0c218df8070c2a4ecd219e18a8fe39d473e62dff08",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x029959d46cfadb2b0c8cc823a9e4c03b7f22f8c596c424c9e11db62ee7fa96ba",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x0247f793e0525ed7d9d369b1b701942e55c4d41a3116733f0c77af3c0dfd21ad",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x06683c568f2e977e3d4b6f4da70b3926950c83f5a7c6119b3f2a643643db8fed",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x041a41f4fdda921c6049188312aaaa28f63e6bb628d6dd604891449d4d76b395",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x0393e0fbc25313322c088d5d457961f72bf6f6f31cc846f3e8aa96c1e53922c3",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x05042aeb03be2ec2ae48d78dcc69e9d79221807cc70eedcf083db70adfd88eac",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800002,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x06ecfe783686b0dc2747acfcc77e7dcda2b2baea6e65bc8624537e14ca024343",
      "number": 800002,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800003,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x0359aeed2843c282006cff7326fdce682efd3e6c43170df8fad02f1e0d1a9446",
      "number": 800003,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800004,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x0725361f6a2bafeb496b0b879ed7711d8ae3afd9dfd973bfac97ff0e3cb0cd4b",
      "number": 800004,
    },
    {
      "_cursor": Int8Range {
        "range": Range {
          "lower": 800005,
          "mask": 36,
          "upper": null,
        },
      },
      "hash": "0x079bdefcacc75dd1db6dfcdf3b29e9c30d2b70002d9d85a3d1b2032e2c72251c",
      "number": 800005,
    },
  ]
`);
```
🛠️ Refactor suggestion
Consider adding error case tests
The test only covers the happy path. Consider adding test cases for error scenarios such as:
- Invalid block range
- Network errors
- Invalid transactions
Also, the snapshot is quite large. Consider either:
- Testing specific properties instead of the entire dataset
- Reducing the test block range
- Using a smaller subset of transactions for verification
Example of testing specific properties:
```typescript
const rows = await database.select().from(starknetUsdcTransfers);
expect(rows).toHaveLength(13);
expect(rows[0]).toMatchObject({
  number: 800001,
  hash: expect.any(String),
});
```
```typescript
const rows = await database.select().from(starknetUsdcTransfers);
```
💡 Codebase verification
Schema is missing critical USDC transfer fields
The schema definition in examples/cli/lib/schema.ts shows that the `starknetUsdcTransfers` table only tracks:
- `number`: block number
- `hash`: transaction hash

The schema is missing essential fields for USDC transfers:
- `from` address
- `to` address
- `amount`
- `timestamp`
This confirms that the test cannot verify these important transfer details as they are not even part of the schema.
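A hedged sketch of what the extended schema could look like; the column names and the `pgIndexerTable` import path are assumptions:

```typescript
import { bigint, text, timestamp } from "drizzle-orm/pg-core";
// Import path assumed; use whatever module exports pgIndexerTable in this repo.
import { pgIndexerTable } from "@apibara/indexer/sinks/drizzle";

export const starknetUsdcTransfers = pgIndexerTable("starknet_usdc_transfers", {
  number: bigint("number", { mode: "number" }).notNull(),
  hash: text("hash").notNull(),
  // Hypothetical columns for the missing transfer details:
  fromAddress: text("from_address").notNull(),
  toAddress: text("to_address").notNull(),
  amount: bigint("amount", { mode: "bigint" }).notNull(),
  transferredAt: timestamp("transferred_at"),
});
```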
🔗 Analysis chain
Verify complete USDC transfer data coverage
The current test only verifies block numbers and transaction hashes. Consider verifying other important USDC transfer fields such as:
- from/to addresses
- transfer amounts
- token decimals
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```bash
#!/bin/bash
# Check schema definition for USDC transfers table
ast-grep --pattern 'export const starknetUsdcTransfers = $table({
  $$$
})'
```

Length of output: 77

Script:

```bash
#!/bin/bash
# Let's try a broader search for table definition
rg "starknetUsdcTransfers" -A 10
```

Length of output: 4322
Looks good! |