Tg identities v2 #467
Conversation
- Moved the incentive point awarding logic for wallet and social media linking to be more readable and maintainable.
- Updated the simulate flag in endpointHandlers to true
- Deleted the IncentiveController class to streamline the incentive point awarding process.
- Updated GCRIdentityRoutines to utilize IncentiveManager for wallet and social media linking.
- Removed references to IncentiveController in endpointHandlers and handleIdentityRequest for cleaner code structure.
Incentives
- Updated DAHR to include action type in web2Request for better request management.
- Simplified Proxy request handling by removing unnecessary CONNECT method logic and streamlining response processing.
- Enhanced error handling and response data collection in Proxy class.
- Adjusted proxy server configuration for better target URL management.
…r turnstile verification to use verifier endpoint
Remove .env file and update demo SDK version in package.json; refacto…
- Introduced methods for adding and removing PQC identities in GCRIdentityRoutines.
- Implemented PQC payload verification in IdentityManager.
- Updated handleIdentityRequest to support PQC identity assignments and removals.
- Added validation for adding and removing PQC identities, ensuring proper data structure and existence checks.
- Updated StoredIdentities type to include PQC identities structure.
- Improved error handling for missing or duplicate identities during operations.
…te manageGCRRoutines to eliminate Turnstile verification logic
Remove SecurityManager and turnstile verification utility files; upda…
… remove unused imports
+ upgrade sdk
- Refactored the `verifyWeb2Proof` function to include error handling and improved success messaging.
- Updated `GCRIdentityRoutines` to handle identities as objects, including address, signature, public key, and timestamp.
- Modified `IdentityManager` to accept the sender's ed25519 address for payload verification.
- Adjusted `handleIdentityRequest` to correctly process transaction data and sender information.
- Improved type definitions for stored identities to accommodate new structure.
…to streamline identity request processing
…AHR and handleWeb2ProxyRequest
…t, and loopback addresses; update error handling in DAHR and handleWeb2ProxyRequest
…ivate, link-local, and loopback addresses; update dependencies
…l IPv6 multicast and unspecified addresses; enhance disallowed address checks for IPv4
add URL scheme validation in DAHR and handleWeb2ProxyRequest
Implements full telegram identity system integration including:
- Add telegram incentive methods to IncentiveManager (telegramLinked/telegramUnlinked)
- Add getAccountByTelegramUsername RPC endpoint for external system queries
- Implement GCR.getAccountByTelegramUsername() with database JSONB queries
- Add complete PointSystem telegram support:
  - LINK_TELEGRAM: 2 points value
  - awardTelegramPoints() with ownership verification
  - deductTelegramPoints() with ownership verification
  - Updated socialAccounts structure to include telegram
- Add telegram support to GCR_Main entity type definitions

The telegram bot can now submit identity transactions that are fully processed by the node's incentive and storage systems. Users receive 2 points for linking telegram accounts with proper anti-abuse measures.

🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
… it is done by bots
Walkthrough

This PR introduces major platform updates: unified crypto (ucrypto) across consensus/network/tx flows, a new MCP server feature, expanded multichain (Aptos) executors, a Rubic API-based bridges refactor, incentives with referrals and Telegram verification, repository-based Chain refactor and genesis handling, a Web2 proxy/DAHR overhaul with SSRF hardening, and extensive docs/config updates.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
autonumber
participant Node
participant MCP as MCPServerManager
participant Tools as DemosTools
Note over Node: Startup (index.ts)
Node->>Node: Read env (MCP_ENABLED, ports)
alt MCP enabled
Node->>MCP: createDemosMCPServer(options)
MCP->>Tools: createDemosNetworkTools(config)
Tools-->>MCP: Tool set
MCP-->>Node: start() success
Node->>Node: set isMCPServerStarted=true
else MCP disabled or error
Node->>Node: Skip / set false
end
```

```mermaid
sequenceDiagram
autonumber
participant Bot as Telegram Bot
participant User
participant Node
participant Genesis as Genesis Balances
Note over User,Bot: Off-chain signing at bot
User->>Bot: Sign challenge (local)
Bot->>Bot: Create TelegramSignedAttestation
Bot->>Bot: Sign attestation with bot key
Bot->>Node: Submit attestation in tx
Node->>Node: verifyTelegramProof(payload, sender)
Node->>Node: Verify bot signature (ucrypto)
Node->>Genesis: checkBotAuthorization(botAddress)
Genesis-->>Node: Authorized?
Node-->>Bot: Accept or reject
```

```mermaid
sequenceDiagram
autonumber
participant SDK
participant GCR as GCRIdentityRoutines/PointSystem
participant Ref as Referrals
participant DB as GCRMain Repository
SDK->>GCR: link (wallet/twitter/telegram) with referralCode?
GCR->>GCR: isFirstConnection?
alt referralCode present
GCR->>Ref: processReferral(newAccount, code, repo)
Ref->>DB: Update referrer/new user points & referralInfo
end
GCR->>DB: Update points breakdown & totals
DB-->>SDK: RPCResponse
```

```mermaid
sequenceDiagram
autonumber
participant Client
participant Node
participant Rubic as Rubic API v2
Client->>Node: manageBridges(get_trade|execute_trade)
alt get_trade
Node->>Rubic: POST /routes/quoteBest
Rubic-->>Node: Quote
Node-->>Client: Quote
else execute_trade
Node->>Rubic: POST /routes/swap (quoteId, addresses)
Rubic-->>Node: Swap data/raw tx
Node-->>Client: Swap data
end
```

Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120+ minutes

Possibly related PRs
Suggested labels
Suggested reviewers
Poem
Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✨ Finishing touches
🧪 Generate unit tests
Tip

👮 Agentic pre-merge checks are now available in preview! Pro plan users can now enable pre-merge checks in their settings to enforce checklists before merging PRs.
Please see the documentation for more information. Example:

```yaml
reviews:
  pre_merge_checks:
    custom_checks:
      - name: "Undocumented Breaking Changes"
        mode: "warning"
        instructions: |
          Pass/fail criteria: All breaking changes to public APIs, CLI flags, environment variables, configuration keys, database schemas, or HTTP/GraphQL endpoints must be documented in the "Breaking Change" section of the PR description and in CHANGELOG.md. Exclude purely internal or private changes (e.g., code not exported from package entry points or explicitly marked as internal).
```

Please share your feedback with us on this Discord post. Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
PR Reviewer Guide 🔍

Here are some key observations to aid the review process:
This PR modifies Files that will be reverted:
Actionable comments posted: 95
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (17)
src/libs/blockchain/routines/Sync.ts (1)
219-249: Do not call process.exit in library code; return false (or throw) instead

Hard-exiting the process from a sync routine is a production-blocker. Surface failure to the caller and decide at the edge (CLI/service) how to act.
```diff
-async function verifyLastBlockIntegrity(
+async function verifyLastBlockIntegrity(
     peer: Peer,
     ourLastBlockNumber: number,
     ourLastBlockHash: string,
-) {
+): Promise<boolean> {
     // INFO: Verify genesis hash matches our genesis hash
     const genesisBlock = await getRemoteBlock(peer, 0)
-    if (!genesisBlock) {
-        log.error("[fastSync] Could not get genesis block from peer")
-        process.exit(1)
-    }
+    if (!genesisBlock) {
+        log.error("[fastSync] Could not get genesis block from peer")
+        return false
+    }
     const ourGenesis = await Chain.getGenesisBlock()
-    if (genesisBlock.hash !== ourGenesis.hash) {
-        log.error("[fastSync] Genesis hash is not coherent")
-        log.info("[fastSync] Our hash: " + ourGenesis.hash)
-        log.info("[fastSync] Peer hash: " + genesisBlock.hash)
-        process.exit(1)
-    }
+    if (genesisBlock.hash !== ourGenesis.hash) {
+        log.error("[fastSync] Genesis hash is not coherent")
+        log.info("[fastSync] Our hash: " + ourGenesis.hash)
+        log.info("[fastSync] Peer hash: " + genesisBlock.hash)
+        return false
+    }
     // Verify if the last block hash is coherent
     const lastSyncedBlock = await getRemoteBlock(peer, ourLastBlockNumber)
-    if (!lastSyncedBlock) {
-        log.error("[fastSync] Could not get last block from peer")
-        process.exit(1)
-    }
+    if (!lastSyncedBlock) {
+        log.error("[fastSync] Could not get last block from peer")
+        return false
+    }
     return lastSyncedBlock.hash === ourLastBlockHash
 }
```

src/features/multichain/routines/executors/contract_read.ts (2)
56-62: Argument ordering bug: Object.values may scramble ABI arg order.

Smart-contract calls require positional args. Expect an array; avoid Object.values on objects.
```diff
-    const methodParams = JSON.parse(params.params)
+    const methodParams = typeof params.params === "string" ? JSON.parse(params.params) : params.params
     console.log("calling SC method: " + params.method)
     console.log("calling SC with args: " + params.params)
     console.log("params.params contents:", methodParams)
-    // Convert the object values into an array
-    const argsArray = Object.values(methodParams)
-    const result = await contractInstance[params.method](...argsArray) // REVIEW Big IF
+    // Expect an array of positional arguments; fallback to [] if absent
+    const argsArray = Array.isArray(methodParams) ? methodParams : []
+    if (!Array.isArray(methodParams)) {
+        return { result: "error", error: "params.params must be a JSON array of positional arguments" }
+    }
+    if (typeof contractInstance[params.method] !== "function") {
+        return { result: "error", error: `Method not found: ${params.method}` }
+    }
+    const result = await contractInstance[params.method](...argsArray)
```
23-27: Wrap EVM call path in try/catch and add minimal timeout where applicable.

Prevent unhandled rejections from connect/getContractInstance/call.
```diff
-    await evmInstance.connect()
+    try {
+        await evmInstance.connect()
+    } catch (e) {
+        return { result: "error", error: `EVM connect failed: ${e.toString()}` }
+    }
     ...
-    const contractInstance = await evmInstance.getContractInstance(
+    const contractInstance = await evmInstance.getContractInstance(
         params.address,
         params.abi,
     )
     ...
-    const result = await contractInstance[params.method](...argsArray) // REVIEW Big IF
+    let result
+    try {
+        result = await contractInstance[params.method](...argsArray)
+    } catch (e) {
+        return { result: "error", error: `Contract call failed: ${e.toString()}` }
+    }
```

Also applies to: 50-55, 62-69
src/features/InstantMessagingProtocol/signalingServer/signalingServer.ts (2)
172-183: Missing early return after invalid register schema.

If validation fails, execution continues and calls handleRegister(...) with invalid fields.

```diff
         if (
             registerMessage.type !== "register" ||
             !registerMessage.payload.clientId ||
             !registerMessage.payload.publicKey ||
             !registerMessage.payload.verification
         ) {
             this.sendError(
                 ws,
                 ImErrorType.INVALID_MESSAGE,
                 "Invalid message schema",
             )
+            return
         }
```
296-305: Bind the signature proof to the submitted registration details.

You verify the signature but don't ensure the signed message actually commits to clientId (and optionally the encryption public key). Without binding, an attacker could replay a valid proof for different registration data.

```diff
         const verified = await ucrypto.verify(deserializedProof)
         if (!verified) {
             this.sendError(ws, ImErrorType.INVALID_PROOF, "Invalid proof")
             return
         }
+        // Ensure the signed message commits to this registration's claims
+        try {
+            const msgText = new TextDecoder().decode(deserializedProof.message)
+            const claims = JSON.parse(msgText)
+            if (claims?.clientId !== clientId) {
+                this.sendError(ws, ImErrorType.INVALID_PROOF, "Proof does not bind to clientId")
+                return
+            }
+            // Optional: also bind the submitted encryption public key (e.g., hash/fingerprint) if the protocol specifies it.
+        } catch {
+            this.sendError(ws, ImErrorType.INVALID_PROOF, "Malformed signed message")
+            return
+        }
```

.eslintrc.cjs (1)
41-49: Tighten naming rules: allow UPPER_CASE only for const variables.

Current variableLike also covers properties/params; permitting UPPER_CASE there is too broad.
Apply:
```diff
-            selector: "variableLike",
-            format: ["camelCase", "UPPER_CASE"],
-            leadingUnderscore: "allow",
-            trailingUnderscore: "allow",
+            selector: "variableLike",
+            format: ["camelCase"],
+            leadingUnderscore: "allow",
+            trailingUnderscore: "allow",
+        },
+        {
+            selector: "variable",
+            modifiers: ["const"],
+            format: ["UPPER_CASE", "camelCase"],
```

src/libs/blockchain/gcr/gcr_routines/identityManager.ts (1)
48-49: Remove the TypeScript error suppression.

Using @ts-expect-error suppresses important type checking. The BTC module should be properly typed or wrapped to conform to the expected interface.

Consider properly typing the BTC module or creating a wrapper that conforms to the DefaultChain interface instead of suppressing the error.
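For illustration only, a minimal sketch of the wrapper approach; DefaultChainLike, the module path, and the method names below are assumptions rather than the project's actual API:

```ts
// Hypothetical adapter: confine the untyped BTC module to one boundary instead of
// sprinkling @ts-expect-error at call sites. Names and signatures are placeholders.
import * as btcModule from "./btc" // assumed path to the untyped module

interface DefaultChainLike {
    getAddress(publicKey: string): Promise<string>
    verifySignature(message: string, signature: string, publicKey: string): Promise<boolean>
}

export const btcChain: DefaultChainLike = {
    // The casts happen once here, so the rest of the codebase sees a typed surface.
    getAddress: async publicKey => (btcModule as any).getAddress(publicKey),
    verifySignature: async (message, signature, publicKey) =>
        (btcModule as any).verifySignature(message, signature, publicKey),
}
```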
src/libs/blockchain/gcr/gcr_routines/GCRBalanceRoutines.ts (3)
24-26: Avoid number/bigint comparisons; normalize amount once.
editOperation.amount is later cast to BigInt, but comparisons use mixed types, risking TypeError.

Apply:
```diff
-// Safeguarding the operation by checking if the amount is positive
-if (editOperation.amount <= 0) {
+// Normalize amount to bigint once
+const amount = typeof editOperation.amount === "bigint"
+    ? editOperation.amount
+    : BigInt(editOperation.amount)
+// Safeguarding the operation by checking if the amount is positive
+if (amount <= 0n) {
     return { success: false, message: "Invalid amount" }
 }
```
50-66: Use normalized bigint for balance arithmetic and checks.

Continue using the normalized amount to avoid mixed-type ops.

```diff
-const actualBalance = accountGCR.balance
+const actualBalance = BigInt(accountGCR.balance)
 ...
-if (editOperation.operation === "add") {
-    accountGCR.balance =
-        BigInt(accountGCR.balance) + BigInt(editOperation.amount)
+if (operationToApply === "add") {
+    accountGCR.balance = actualBalance + amount
-} else if (editOperation.operation === "remove") {
+} else if (operationToApply === "remove") {
 ...
-    if (
-        (actualBalance < editOperation.amount ||
-            actualBalance === 0n) &&
-        getSharedState.PROD
-    ) {
+    if ((actualBalance < amount || actualBalance === 0n) && getSharedState.PROD) {
         return { success: false, message: "Insufficient balance" }
     }
-    accountGCR.balance =
-        BigInt(accountGCR.balance) - BigInt(editOperation.amount)
+    accountGCR.balance = actualBalance - amount
```
36-39: Don't mutate the input editOperation; derive an operationToApply.

Side-effects on arguments can surprise callers.
```diff
-// Reversing the operation if it is a rollback
-if (editOperation.isRollback) {
-    editOperation.operation =
-        editOperation.operation === "add" ? "remove" : "add"
-}
+// Decide operation without mutating input
+const operationToApply =
+    editOperation.isRollback
+        ? (editOperation.operation === "add" ? "remove" : "add")
+        : editOperation.operation
```

And replace later uses with operationToApply.

src/libs/network/manageExecution.ts (1)
58-69: Fragile payload detection via try/catch—replace with a type guard.

Current logic may swallow shape errors and pass invalid data to execution. Use structural checks instead.
```diff
-    var validityDataPayload: ValidityData
-    // If content.data.response.rpc_public_key exists, we assign validityDataPayload to response
-    try {
-        if (content.data.response.rpc_public_key) {
-            validityDataPayload = content.data.response
-        } else {
-            validityDataPayload = content.data
-        }
-    } catch (e) {
-        validityDataPayload = content.data
-    }
+    let validityDataPayload: ValidityData
+    const d: any = content?.data
+    if (
+        d &&
+        typeof d === "object" &&
+        "response" in d &&
+        d.response &&
+        typeof d.response === "object" &&
+        "rpc_public_key" in d.response
+    ) {
+        validityDataPayload = d.response as ValidityData
+    } else {
+        validityDataPayload = d as ValidityData
+    }
```

src/libs/consensus/v2/routines/broadcastBlockHash.ts (1)
121-126: Restore original return semantics: return [pro, con].

PoRBFT destructures the result as [pro, con] and relies on those vote/rejection counts (src/libs/consensus/v2/PoRBFT.ts:495 and 505); the current code returns signatureCount taken from getSharedState.candidateBlock.validation_data.signatures (not the per-call pro counter / local block.validation_data), so values can diverge and change behavior. Revert the return to the measured pro/con.
```diff
-    const signatureCount = Object.keys(
-        getSharedState.candidateBlock.validation_data.signatures,
-    ).length
-    // INFO: Return the candidate block signature count
-    return [signatureCount, shard.length - signatureCount]
+    return [pro, con]
```

src/features/web2/handleWeb2.ts (1)
21-40: Avoid logging full Web2 payloads (PII/secret leakage); also standardize on project logger.

Raw request logs can include headers/cookies/tokens. Replace console usage and redact.
```diff
 export async function handleWeb2(
     web2Request: IWeb2Request,
 ): Promise<string | DAHR> {
     // TODO Remember that web2 could need to be signed and could need a fee
-    console.log("[PAYLOAD FOR WEB2] [*] Received a Web2 Payload.")
-    console.log("[PAYLOAD FOR WEB2] [*] Beginning sanitization checks...")
-    log.debug("[PAYLOAD FOR WEB2] [*] Web2 Request: " + JSON.stringify(web2Request, null, 2))
+    log.info("[PAYLOAD FOR WEB2] Received a Web2 payload")
+    log.info("[PAYLOAD FOR WEB2] Beginning sanitization checks...")
+    log.debug("[PAYLOAD FOR WEB2] Web2 Request received (details redacted)")
@@
-    console.log(
-        "[REQUEST FOR WEB2] [+] Found and loaded payload.message as expected...",
-    )
+    log.debug("[REQUEST FOR WEB2] Found and loaded payload.message")
@@
-    console.log("[handleWeb2] DAHR instance created.")
+    log.info("[handleWeb2] DAHR instance created")
@@
-    } catch (error: any) {
-        console.error("Error in handleWeb2:", error)
-        return error.message
+    } catch (error: any) {
+        log.error(`Error in handleWeb2: ${error?.stack || error}`)
+        return error?.message ?? "Unknown error"
     }
```

src/libs/network/manageGCRRoutines.ts (1)
10-13: Consider making the params type more specific.

The params: any[] type provides no type safety. Consider defining specific parameter types for each method or using a discriminated union.

```diff
-interface GCRRoutinePayload {
-    method: string
-    params: any[] // ? Define the params type or nah
-}
+type GCRRoutinePayload =
+    | { method: "identity_assign_from_write"; params: [any] }
+    | { method: "getIdentities"; params: [string] }
+    | { method: "getWeb2Identities"; params: [string] }
+    | { method: "getXmIdentities"; params: [string] }
+    | { method: "getPoints"; params: [string] }
+    | { method: "getReferralInfo"; params: [string] }
+    | { method: "validateReferralCode"; params: [string] }
+    | { method: "getAccountByTwitterUsername"; params: [string] }
+    | { method: "getAccountByTelegramUsername"; params: [string] }
```
347-372: Identity transaction error handling improvements needed.

The identity handler catches errors but doesn't differentiate between different types of failures (e.g., network issues vs validation failures). Also, the success status is used inconsistently.
Apply this diff to improve error handling:
```diff
             case "identity":
                 try {
                     const identityResult = await handleIdentityRequest(
                         tx,
                         sender,
                     )
                     const status = identityResult.success
                         ? "applied"
                         : "not applied"
                     result.success = identityResult.success
-                    result.extra = {
-                        message:
-                            identityResult.message + `. Transaction ${status}.`,
-                    }
+                    result.response = {
+                        message: identityResult.message,
+                        status: status
+                    }
+                    result.extra = identityResult.success ? null : {
+                        error: identityResult.message
+                    }
                 } catch (e) {
                     console.error(e)
-                    log.error("[handleverifyPayload] Error in identity: " + e)
+                    log.error("[handleExecuteTransaction] Error in identity: " + e.toString())
                     result.success = false
                     result.response = {
-                        message: "Failed to verify signature",
+                        message: "Failed to process identity request",
                     }
                     result.extra = {
                         error: e.toString(),
                     }
                 }
                 break
```

src/libs/blockchain/gcr/gcr_routines/manageNative.ts (1)
26-32: Catch ensureGCRForUser errors and propagate failures in manageNative balance routines

ensureGCRForUser throws (e.g. "Invalid public key") and setBalance currently lets exceptions escape while callers assume success.
- Wrap ensureGCRForUser (and gcrRepository.save) in setBalance with try/catch and return [false, err.message] on failure; see the sketch after this list. (src/libs/blockchain/gcr/gcr_routines/manageNative.ts — setBalance)
- Make addBalance, removeBalance and transferBalance check setBalance's returned [boolean,string] and propagate failures (return the tuple) instead of unconditionally returning success. (src/libs/blockchain/gcr/gcr_routines/manageNative.ts — addBalance, removeBalance, transferBalance)
- Reference: ensureGCRForUser throws on invalid pubkey. (src/libs/blockchain/gcr/gcr_routines/ensureGCRForUser.ts)
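A minimal sketch of the propagation pattern described above, with simplified signatures (the real functions take repositories and richer account types):

```ts
// Sketch only: make setBalance absorb thrown errors and let callers propagate the tuple.
// Assumed ambient helpers from the surrounding module, with simplified shapes:
declare function ensureGCRForUser(publicKey: string): Promise<{ balance: bigint }>
declare const gcrRepository: { save(account: { balance: bigint }): Promise<void> }

async function setBalance(publicKey: string, balance: bigint): Promise<[boolean, string]> {
    try {
        const account = await ensureGCRForUser(publicKey) // may throw "Invalid public key"
        account.balance = balance
        await gcrRepository.save(account)
        return [true, "ok"]
    } catch (err: any) {
        return [false, err?.message ?? String(err)]
    }
}

async function addBalance(publicKey: string, amount: bigint): Promise<[boolean, string]> {
    // Simplified: the real routine reads the current balance first.
    const [ok, message] = await setBalance(publicKey, amount)
    return ok ? [true, "balance updated"] : [false, message]
}
```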
src/libs/network/manageBridge.ts (1)
7-11: Tighten BridgePayload typing; deprecate unused chain; align params with new static API shapes

chain is no longer used (the line 20 destructure drops it). Also, method and params can be strongly typed to prevent runtime shape errors for the two paths.

Apply this diff to update the types locally (keeps chain optional for backward compat):

```diff
-interface BridgePayload {
-    method: string
-    chain: string
-    params: any[] // ? Define the params type or nah
-}
+type BridgeMethod = "get_trade" | "execute_trade"
+
+// Minimal local typing to match RubicService static API payloads
+type TradePayload = {
+    fromChainId: number
+    toChainId: number
+    fromToken: "NATIVE" | "USDC" | "USDT"
+    toToken: "NATIVE" | "USDC" | "USDT"
+    amount: string | number | bigint
+}
+type ExecuteTradePayload = TradePayload & {
+    fromAddress: string
+    quoteId: string
+    toAddress?: string
+}
+
+interface BridgePayload {
+    method: BridgeMethod
+    // Deprecated: no longer used. Keep optional for now to avoid breaking callers.
+    chain?: string
+    params: [TradePayload] | [ExecuteTradePayload]
+}
```

Also applies to: 20-20
All protected endpoints require transmissions to be coming from a sudo keypair. Connect an authorized mnemonic as shown before proceeding:

```ts
const demos = new Demos()

await demos.connect("https://demosnode.discus.sh")
await demos.connectWallet("authorized mnemonic here")
```
🛠️ Refactor suggestion
Do not encourage hardcoding mnemonics; show a safe pattern.
Replace inline mnemonic with env/secret manager guidance and avoid pinning a production URL in examples.
All protected endpoints require transmissions to be coming from a sudo keypair. Connect an authorized mnemonic as shown before proceeding:
```ts
-const demos = new Demos()
-
-await demos.connect("https://demosnode.discus.sh")
-await demos.connectWallet("authorized mnemonic here")
+const demos = new Demos()
+// TIP: Read from env/secret manager; never paste mnemonics in code or terminals.
+await demos.connect(process.env.DEMOS_NODE_URL!)
+await demos.connectWallet(process.env.SUDO_MNEMONIC!)
```

> Security note: Prefer a hardware wallet signer when possible. Rotate sudo keys regularly and keep them scoped to the minimal permissions required.
🤖 Prompt for AI Agents
documentation/protected-endpoints.md lines 3-10: the example currently
encourages hardcoding a mnemonic and pins a production URL; change the snippet
to read the node URL and sudo mnemonic from environment variables or a secret
manager (e.g. DEMOS_NODE_URL and SUDO_MNEMONIC) instead of inline strings, and
swap the hardcoded production URL for a generic placeholder or env var; also add
a short security note recommending hardware wallets, key rotation, and
least-privilege scoping for sudo keys.
```ts
interface Response {
    success: boolean
    message: string
    error?: string
    txhash?: string
    confirmationBlock: number
}
```
🛠️ Refactor suggestion
Tighten the response contract; align casing and optionality.
Make txHash camelCase, model status explicitly, and don’t require confirmationBlock on submission.
-interface Response {
- success: boolean
- message: string
- error?: string
- txhash?: string
- confirmationBlock: number
-}
+interface AwardPointsResponse {
+ success: boolean
+ message: string
+ status: "submitted" | "confirmed" | "failed"
+ txHash?: string
+ confirmationBlock?: number
+ error?: string
+}
🤖 Prompt for AI Agents
In documentation/protected-endpoints.md around lines 38 to 44, the Response
interface needs tightening: change txhash to camelCase txHash, replace the
boolean success with an explicit status field (e.g., string union like
"submitted" | "confirmed" | "failed" or an enum) and make confirmationBlock
optional (it should not be required on initial submission); also ensure optional
fields use ? consistently (error?: string, txHash?: string, confirmationBlock?:
number) and update any example usages in the doc to match these names and
optionality.
```ts
const result = await demos.call("awardPoints", [
    {
        username: "username1",
        points: 1,
    },
    {
        username: "username2",
        points: 2,
    },
]) as Response;
```
💡 Verification agent
🧩 Analysis chain
Document input constraints and batching semantics for awardPoints.
Define the request type, limits, and validation to avoid ambiguity (points int >= 0, username format, max batch size, atomicity).
-const result = await demos.call("awardPoints", [
+type AwardPointsRequest = Array<{ username: string; points: number }>
+// Constraints:
+// - username: case-insensitive, no leading '@' (normalize internally)
+// - points: integer >= 0 (no decimals), max per operation: 10_000 (verify)
+// - batch size: max 100 usernames (verify)
+// - atomicity: partial successes return per-item status (see below)
+
+const result = await demos.call("awardPoints", [
{
username: "username1",
points: 1,
},
{
username: "username2",
points: 2,
},
-]) as Response;
+]) as AwardPointsResponse;

🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== rg 'awardPoints' ==="
rg -n --hidden -S 'awardPoints' || true
echo -e "\n=== rg 'demos.call' ==="
rg -n --hidden -S 'demos.call' || true
echo -e "\n=== show documentation/protected-endpoints.md (lines 1-160) ==="
if [ -f documentation/protected-endpoints.md ]; then
nl -ba documentation/protected-endpoints.md | sed -n '1,160p'
else
echo "documentation/protected-endpoints.md not found"
fi
echo -e "\n=== rg 'AwardPointsRequest|AwardPointsResponse|AwardPoints' ==="
rg -n --hidden -S 'AwardPointsRequest|AwardPointsResponse|AwardPoints' || true
echo -e "\n=== rg 'award_points|awardPointsResponse' ==="
rg -n --hidden -S 'award_points|awardPointsResponse' || true

Length of output: 916
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== documentation/protected-endpoints.md (1-200) ==="
if [ -f documentation/protected-endpoints.md ]; then
sed -n '1,200p' documentation/protected-endpoints.md || true
else
echo "documentation/protected-endpoints.md not found"
fi
echo -e "\n=== src/libs/network/server_rpc.ts (1-140) ==="
if [ -f src/libs/network/server_rpc.ts ]; then
sed -n '1,140p' src/libs/network/server_rpc.ts || true
else
echo "src/libs/network/server_rpc.ts not found"
fi
echo -e "\n=== src/libs/network/server_rpc.ts (260-320) ==="
if [ -f src/libs/network/server_rpc.ts ]; then
sed -n '260,320p' src/libs/network/server_rpc.ts || true
fi
echo -e "\n=== src/libs/blockchain/gcr/gcr.ts (740-860) ==="
if [ -f src/libs/blockchain/gcr/gcr.ts ]; then
sed -n '740,860p' src/libs/blockchain/gcr/gcr.ts || true
else
echo "src/libs/blockchain/gcr/gcr.ts not found"
fi
echo -e "\n=== rg: find awardPoints signatures/uses ==="
rg -n --hidden -S "awardPoints\\s*\\(" || true
echo -e "\n=== rg: find twitterUsernames occurrences ==="
rg -n --hidden -S "twitterUsernames" || true

Length of output: 13780
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== src/libs/blockchain/gcr/gcr.ts (660-980) ==="
if [ -f src/libs/blockchain/gcr/gcr.ts ]; then
sed -n '660,980p' src/libs/blockchain/gcr/gcr.ts || true
else
echo "src/libs/blockchain/gcr/gcr.ts not found"
fi
echo -e "\n=== src/libs/network/server_rpc.ts (240-320) ==="
if [ -f src/libs/network/server_rpc.ts ]; then
sed -n '240,320p' src/libs/network/server_rpc.ts || true
else
echo "src/libs/network/server_rpc.ts not found"
fi
echo -e "\n=== rg -n -C2 'getAddressesByTwitterUsernames|applyToTx' src || true ==="
rg -n -C2 "getAddressesByTwitterUsernames|applyToTx" src || true

Length of output: 18732
Document input constraints and fix RPC payload for awardPoints.
- Define request type: type AwardPointsRequest = Array<{ username: string; points: number }> and update documentation/protected-endpoints.md example and response schema (GCR.awardPoints returns { success: boolean, message: string, txhash?: string, confirmationBlock: number }).
- Enforce server-side validation: points must be numeric > 0 (current code rejects <= 0), optionally require integer (no decimals), username must exactly match stored Twitter identity (no implicit normalization), add configurable limits (recommend: per-item max 10_000; batch max 100) and return clear per-request errors; a validation sketch follows this list.
- Fix bug: src/libs/network/server_rpc.ts (case "awardPoints") currently treats payload.params[0].message as string[] and passes it to GCR.awardPoints, which expects objects — change the RPC to forward AwardPointsRequest (or adapt GCR to accept legacy string[]).
- Document atomicity: current flow builds one transaction and calls HandleGCR.applyToTx which aborts the whole tx if any edit fails (operation is effectively atomic) — document this behavior or change to per-item partial reporting if needed.
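As a rough illustration of the validation bullet above, a hedged sketch follows; the limits and the error format are assumptions to be confirmed, not the node's current behavior:

```ts
// Hypothetical server-side validation for the awardPoints RPC payload.
type AwardPointsRequest = Array<{ username: string; points: number }>

const MAX_POINTS_PER_ITEM = 10_000 // assumed configurable limit
const MAX_BATCH_SIZE = 100 // assumed configurable limit

function validateAwardPointsRequest(
    input: unknown,
): { ok: true; value: AwardPointsRequest } | { ok: false; errors: string[] } {
    if (!Array.isArray(input)) return { ok: false, errors: ["payload must be an array"] }
    if (input.length === 0 || input.length > MAX_BATCH_SIZE)
        return { ok: false, errors: [`batch size must be between 1 and ${MAX_BATCH_SIZE}`] }

    const errors: string[] = []
    input.forEach((item, i) => {
        if (typeof item?.username !== "string" || item.username.length === 0)
            errors.push(`item ${i}: username must be a non-empty string`)
        if (!Number.isInteger(item?.points) || item.points <= 0 || item.points > MAX_POINTS_PER_ITEM)
            errors.push(`item ${i}: points must be an integer between 1 and ${MAX_POINTS_PER_ITEM}`)
    })
    return errors.length > 0 ? { ok: false, errors } : { ok: true, value: input as AwardPointsRequest }
}
```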
🤖 Prompt for AI Agents
In documentation/protected-endpoints.md (around lines 46-55) and server code
src/libs/network/server_rpc.ts (case "awardPoints"), define and use a typed
AwardPointsRequest = Array<{ username: string; points: number }>, update the
example payload and the documented response schema to show GCR.awardPoints
returns { success: boolean; message: string; txhash?: string; confirmationBlock:
number }, and change the RPC handler to forward the structured
AwardPointsRequest (not payload.params[0].message as string[]) to
GCR.awardPoints; add server-side validation enforcing numeric points > 0
(optionally integer), username must exactly match stored Twitter identity,
enforce configurable limits (per-item max 10_000; batch max 100) and return
clear per-item error objects, and either document that HandleGCR.applyToTx is
atomic (aborts whole tx on any edit failure) or change to per-item partial
reporting as required.
```json
    "typescript": "^5.8.3"
  },
  "dependencies": {
    "@cosmjs/encoding": "^0.33.1",
```
Missing deps for new Identity flow; add bip39 libs explicitly.
src/libs/identity/identity.ts imports bip39 and @scure/bip39 but package.json doesn’t declare them. This will fail at runtime.
Add:
"dependencies": {
+ "@scure/bip39": "^1.4.0",
+ "bip39": "^3.1.0",
"@cosmjs/encoding": "^0.33.1",Optionally remove one and standardize on @scure/bip39 (see identity.ts comment).
Also applies to: 53-55, 79-83, 91-93
🤖 Prompt for AI Agents
In package.json around lines 49 (and also apply same change to ranges 53-55,
79-83, 91-93), the new identity flow imports bip39 and @scure/bip39 but those
packages are not declared as dependencies; update package.json to add "bip39"
and "@scure/bip39" to dependencies (or choose one and standardize on
@scure/bip39 as noted in identity.ts), then run the package manager to update
lockfile (npm install / yarn install) so the runtime will have the required
modules.
```ts
    mainnet: "https://polygon-rpc.com",
    testnet: "https://polygon-amoy.drpc.org",
    mumbai: "https://rpc.ankr.com/polygon_mumbai",
```
💡 Verification agent
🧩 Analysis chain
Polygon testnet now Amoy, but chainIds still points to Mumbai — align mappings.
You switched provider to Amoy but chainIds.polygon lacks amoy (80002) and still has mumbai (80001). This mismatch can break lookups/chainID routing.
Proposed alignments:
- Either revert provider to Mumbai, or add polygon.amoy: 80002 in chainIds and migrate “testnet” to Amoy consistently.
Verification script:
Suggested change in chainIds.ts (separate commit):
polygon: {
mainnet: 137,
- mumbai: 80001,
+ mumbai: 80001,
+ amoy: 80002,
},

And ensure callers don't use ambiguous "testnet" for polygon.
🏁 Script executed:
#!/bin/bash
rg -nP -C2 '\bpolygon\b.*\b(testnet|mumbai|amoy)\b'
rg -nP -C2 'chainIds\s*=\s*{(?s).*polygon' sdk/localsdk/multichain/configs/chainIds.ts

Length of output: 425
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "== configs listing =="
ls -la sdk/localsdk/multichain/configs || true
echo
echo "== evmProviders.ts snippet =="
sed -n '1,200p' sdk/localsdk/multichain/configs/evmProviders.ts || true
echo
echo "== Searching repo for 'chainIds' symbols =="
rg -n --hidden -S '\bchainIds\b' || true
echo
echo "== Searching for files named chainIds* =="
find . -type f -iname "*chainids*.ts" -o -iname "chainIds.*" -o -iname "*chain-ids*.ts" || true
echo
echo "== Searching for polygon mentions across repo =="
rg -n --hidden -S '\bpolygon\b' -C3 || true
echo
echo "== Searching for 'amoy' or '80002' or 'mumbai' =="
rg -n --hidden -S '\bamoy\b|\b80002\b|\bmumbai\b' -C3 || true

Length of output: 5775
Polygon testnet mapping mismatch — add Amoy (80002) or revert provider to Mumbai
evmProviders.ts sets polygon.testnet → https://polygon-amoy.drpc.org but chainIds.ts only defines polygon.mumbai = 80001; add amoy: 80002 to chainIds or change the provider back to Mumbai to avoid broken chainID lookups.
Files: sdk/localsdk/multichain/configs/evmProviders.ts (polygon.testnet) and sdk/localsdk/multichain/configs/chainIds.ts (polygon entry).
Suggested diff:
polygon: {
mainnet: 137,
- mumbai: 80001,
+ mumbai: 80001,
+ amoy: 80002,
},

Ensure callers reference explicit subchain keys (mumbai or amoy) instead of the ambiguous "testnet".
```ts
    // Parse the operation to get the right compiled operation content
    const derivedContent: bridge.NativeBridgeOperationCompiled["content"] = parseOperation(operation)
    // eslint-disable-next-line prefer-const
    let compiledOperation: bridge.NativeBridgeOperationCompiled = {
        content: derivedContent,
        signature: "",
        rpc: getSharedState.identity.ed25519_hex.publicKey,
    }
    // TODO Generate the validUntil value based on current block + 3
    // Incorporate the compiled operation into a RPCResponse
    response.response = compiledOperation
    // TODO Return the response
    return response
}
```
Null/undefined content can leak into compiled payload; add validation and error mapping.
parseOperation can yield undefined (unsupported chain) or null (TODO stubs). Returning a compiled op with content: null will likely violate the contract.
Apply:
- const derivedContent: bridge.NativeBridgeOperationCompiled["content"] = parseOperation(operation)
- // eslint-disable-next-line prefer-const
- let compiledOperation: bridge.NativeBridgeOperationCompiled = {
- content: derivedContent,
- signature: "",
- rpc: getSharedState.identity.ed25519_hex.publicKey,
- }
- // TODO Generate the validUntil value based on current block + 3
- // Incorporate the compiled operation into a RPCResponse
- response.response = compiledOperation
- // TODO Return the response
- return response
+ try {
+ const derivedContent = parseOperation(operation)
+ if (derivedContent == null) {
+ return {
+ ...response,
+ result: 400,
+ response: null,
+ extra: { error: "Invalid or unimplemented operation content" },
+ }
+ }
+ const pubKey = getSharedState?.identity?.ed25519_hex?.publicKey
+ if (!pubKey) {
+ return {
+ ...response,
+ result: 500,
+ response: null,
+ extra: { error: "Identity public key unavailable" },
+ }
+ }
+ const compiledOperation: bridge.NativeBridgeOperationCompiled = {
+ content: derivedContent,
+ signature: "", // TODO: sign
+ rpc: pubKey,
+ }
+ // TODO: add validUntil based on current block + 3 (per demosdk contract)
+ response.response = compiledOperation
+ return response
+ } catch (err: any) {
+ return {
+ ...response,
+ result: 500,
+ response: null,
+ extra: { error: err?.message ?? "Internal error" },
+ }
+  }

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/libs/network/manageNativeBridge.ts around lines 21 to 34, parseOperation
can return null/undefined which may leak into the compiled payload; validate
derivedContent immediately after parseOperation and handle invalid results by
setting a proper error on the RPC response (e.g., response.error = { code:
"UNSUPPORTED_CHAIN" | "INVALID_OPERATION", message: "...", details:
derivedContent } or similar) and return the response instead of embedding null
into content; only build and attach compiledOperation when derivedContent is a
valid object, and ensure the error mapping uses consistent codes/messages
expected by the caller.
```ts
// eslint-disable-next-line prefer-const
let compiledOperation: bridge.NativeBridgeOperationCompiled = {
    content: derivedContent,
    signature: "",
    rpc: getSharedState.identity.ed25519_hex.publicKey,
}
```
🛠️ Refactor suggestion
Unsigned compiled operation and fragile identity access.
signature: "" is a security gap; callers may assume server-side signed payloads. getSharedState.identity.ed25519_hex.publicKey can throw if any segment is undefined.
Proposed follow‑ups:
- Gate output until signing is wired (or set require_reply: true with clear client expectation).
- Guard identity access (as in the diff above) and consider caching the pubkey at startup with health checks; a guarded-access sketch follows this list.
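A minimal sketch of the guarded access, assuming the shared-state shape quoted above (the helper name is hypothetical):

```ts
// Hypothetical helper: resolve the node's ed25519 public key without throwing
// when any segment of the shared state is missing.
function getNodePublicKey(state: {
    identity?: { ed25519_hex?: { publicKey?: string } }
}): string | null {
    return state?.identity?.ed25519_hex?.publicKey ?? null
}

// Usage sketch inside the compile path:
// const pubKey = getNodePublicKey(getSharedState)
// if (!pubKey) { /* return an error response instead of emitting an unsigned payload */ }
```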
🤖 Prompt for AI Agents
In src/libs/network/manageNativeBridge.ts around lines 23-28, the
compiledOperation is created with an empty signature and directly accessing
getSharedState.identity.ed25519_hex.publicKey (which can throw); update this so
you do not emit unsigned payloads and you safely access the identity: only
populate the signature field after a successful server-side signing step (or
mark the message require_reply: true and document client expectations) and
replace the direct property access with guarded logic that checks/getters for
identity and ed25519_hex.publicKey (or use a cached startup-validated pubkey),
logging and aborting/returning an error if the key is missing. Ensure the code
paths explicitly handle missing identity (no exception) and only send
compiledOperation when signing or require_reply semantics are satisfied.
```ts
function parseOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
    let derivedContent: bridge.NativeBridgeOperationCompiled["content"]
    if (operation.originChain === "EVM") {
        derivedContent = parseEVMOperation(operation)
    } else if (operation.originChain === "SOLANA") {
        derivedContent = parseSOLANAOperation(operation)
    }
    return derivedContent
}
```
Non‑exhaustive branch; potential undefined return and TS definite‑assignment issue.
derivedContent is declared then conditionally assigned; non‑EVM/SOLANA paths return undefined.
Apply:
-function parseOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
- let derivedContent: bridge.NativeBridgeOperationCompiled["content"]
- if (operation.originChain === "EVM") {
- derivedContent = parseEVMOperation(operation)
- } else if (operation.originChain === "SOLANA") {
- derivedContent = parseSOLANAOperation(operation)
- }
- return derivedContent
-}
+function parseOperation(
+ operation: bridge.NativeBridgeOperation
+): bridge.NativeBridgeOperationCompiled["content"] {
+ switch (operation.originChain) {
+ case "EVM":
+ return parseEVMOperation(operation)
+ case "SOLANA":
+ return parseSOLANAOperation(operation)
+ default:
+ throw new Error(`Unsupported originChain: ${String(operation.originChain)}`)
+ }
+}
🤖 Prompt for AI Agents
In src/libs/network/manageNativeBridge.ts around lines 41 to 49, the function
parseOperation declares derivedContent then only assigns it for originChain
"EVM" or "SOLANA", leaving other cases to return undefined and causing a TS
definite‑assignment error; make the switch exhaustive by either returning
directly from each branch and adding a final throw for unknown originChain
(e.g., throw new Error(`Unsupported originChain: ${operation.originChain}`)), or
add an explicit default branch that assigns a valid value or throws; ensure the
function always returns bridge.NativeBridgeOperationCompiled["content"] (or
narrows the input type so only EVM|SOLANA are possible) so the compiler and
runtime cannot encounter an undefined return.
```ts
if (operation.originChain === "EVM") {
    derivedContent = parseEVMOperation(operation)
} else if (operation.originChain === "SOLANA") {
    derivedContent = parseSOLANAOperation(operation)
}
```
🛠️ Refactor suggestion
Prefer typed discriminants over raw strings for originChain.
If the demosdk exposes an enum/union, use it to avoid typos and ease refactors.
Example:
// if available:
// import { OriginChain } from "@kynesyslabs/demosdk"
switch (operation.originChain /* as OriginChain */) { /* ... */ }

🤖 Prompt for AI Agents
In src/libs/network/manageNativeBridge.ts around lines 43 to 47, the code
compares operation.originChain to raw strings ("EVM"/"SOLANA"); replace this
with the demosdk's typed discriminant (e.g., import OriginChain from the demosdk
or declare a local union type) and change the if/else to a switch on
operation.originChain typed as OriginChain (or cast) to leverage exhaustiveness
checking; also add a default/unreachable branch or an exhaustive check to handle
unexpected values so TypeScript will flag missing cases during refactors.
```ts
function parseEVMOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
    // TODO Implement the parsing
    return null
}

function parseSOLANAOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
    // TODO Implement the parsing
    return null
}
```
Return type mismatch; replace null stubs with explicit Not Implemented errors.
Returning null violates most content contracts; fail fast.
Apply:
function parseEVMOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
- // TODO Implement the parsing
- return null
+ // TODO: Implement EVM operation parsing
+ throw new Error("parseEVMOperation not implemented")
}
function parseSOLANAOperation(operation: bridge.NativeBridgeOperation): bridge.NativeBridgeOperationCompiled["content"] {
- // TODO Implement the parsing
- return null
+ // TODO: Implement SOLANA operation parsing
+ throw new Error("parseSOLANAOperation not implemented")
}
🤖 Prompt for AI Agents
In src/libs/network/manageNativeBridge.ts around lines 51 to 59, the
parseEVMOperation and parseSOLANAOperation functions currently return null which
violates the declared return type and should be replaced with explicit "not
implemented" failures; change each stub to immediately throw a clear Not
Implemented error (for example throw new Error('Not implemented:
parseEVMOperation') and throw new Error('Not implemented:
parseSOLANAOperation')) so the functions fail fast and avoid returning null
values that break the contract.
Review continued from previous batch...
```bash
find .serena -type f -exec grep -l "^<<<<<<< HEAD" {} \; | while read -r file; do
    echo "Fixing: $file"
    # Keep everything between <<<<<<< HEAD and =======, remove rest
    sed -i '/^<<<<<<< HEAD$/,/^=======$/{ /^<<<<<<< HEAD$/d; /^=======$/d; }; /^>>>>>>> /d' "$file"
done
```
Sed command may not handle all conflict marker patterns correctly.
The current sed approach has several issues:
- It doesn't handle the case where
<<<<<<< HEADappears mid-line - The pattern doesn't preserve content between HEAD and ======= markers correctly
- It doesn't handle
>>>>>>> branch_namewith varying branch names
Consider using a more robust conflict resolution approach:
- find .serena -type f -exec grep -l "^<<<<<<< HEAD" {} \; | while read -r file; do
- echo "Fixing: $file"
- # Keep everything between <<<<<<< HEAD and =======, remove rest
- sed -i '/^<<<<<<< HEAD$/,/^=======$/{ /^<<<<<<< HEAD$/d; /^=======$/d; }; /^>>>>>>> /d' "$file"
- done
+ for file in $CONFLICT_FILES; do
+ echo "Fixing: $file"
+ # Create a temporary file with resolved conflicts
+ awk '
+ /^<<<<<<< HEAD/ { in_head = 1; next }
+ /^=======/ { if(in_head) { in_head = 0; in_other = 1; next } }
+ /^>>>>>>> / { in_other = 0; next }
+ in_head { print }
+ !in_head && !in_other { print }
+ ' "$file" > "$file.tmp" && mv "$file.tmp" "$file"
+ doneCommittable suggestion skipped: line range outside the PR's diff.
| public static async getQuoteFromApi( | ||
| payload: BridgeTradePayload, | ||
| ): Promise<WrappedCrossChainTrade | RubicSdkError> { | ||
| await this.waitForInitialization() | ||
| const { fromChainId, toChainId, fromToken, toToken, amount } = payload | ||
|
|
||
| if (!this.sdk) { | ||
| const error = new Error("SDK not initialized") as RubicSdkError | ||
| const fromBlockchain = RubicService.getBlockchainName(fromChainId) | ||
| const toBlockchain = RubicService.getBlockchainName(toChainId) | ||
|
|
||
| return error | ||
| const fromTokenAddress = RubicService.getTokenAddress( | ||
| fromChainId, | ||
| fromToken, | ||
| ) | ||
| const toTokenAddress = RubicService.getTokenAddress(toChainId, toToken) | ||
|
|
||
| const quoteParams = { | ||
| srcTokenBlockchain: fromBlockchain, | ||
| srcTokenAddress: fromTokenAddress, | ||
| srcTokenAmount: amount.toString(), | ||
| dstTokenBlockchain: toBlockchain, | ||
| dstTokenAddress: toTokenAddress, | ||
| referrer: RUBIC_API_REFERRER_ADDRESS, | ||
| integratorAddress: RUBIC_API_INTEGRATOR_ADDRESS, | ||
| } | ||
|
|
||
| try { | ||
| const fromTokenAddress = this.getTokenAddress( | ||
| payload.fromChainId, | ||
| payload.fromToken, | ||
| ) | ||
| const toTokenAddress = this.getTokenAddress(payload.toChainId, payload.toToken) | ||
|
|
||
| const trades = await this.sdk.crossChainManager.calculateTrade( | ||
| { | ||
| address: fromTokenAddress, | ||
| blockchain: this.getBlockchainName(payload.fromChainId), | ||
| }, | ||
| payload.amount, | ||
| { | ||
| address: toTokenAddress, | ||
| blockchain: this.getBlockchainName(payload.toChainId), | ||
| }, | ||
| { | ||
| fromAddress: this.signer.address, | ||
| bridgeTypes: Object.values(BRIDGE_PROTOCOLS) | ||
| .filter(p => p !== "all") | ||
| .map(p => p.toLowerCase()), | ||
| gasCalculation: "enabled", | ||
| } as ExtendedCrossChainManagerCalculationOptions, | ||
| ) | ||
|
|
||
| console.log(`Received ${trades.length} trade options`) | ||
|
|
||
| if (trades.length === 0) { | ||
| const error = new Error("No trades found") as RubicSdkError | ||
|
|
||
| return error | ||
| const quoteResponse = await fetch(RUBIC_API_V2_ROUTES.QUOTE_BEST, { | ||
| method: "POST", | ||
| headers: { "Content-Type": "application/json" }, | ||
| body: JSON.stringify(quoteParams), | ||
| }) | ||
|
|
||
| if (!quoteResponse.ok) { | ||
| const errorText = await quoteResponse.text() | ||
| throw new Error( | ||
| `Rubic API v2 (quoteBest) error: ${quoteResponse.status} ${errorText}`, | ||
| ) | ||
| } | ||
|
|
||
| const filteredTrades = trades.filter( | ||
| trade => trade !== undefined && trade !== null && !trade.error, | ||
| ) | ||
|
|
||
| const bestTrade = filteredTrades[0] | ||
|
|
||
| return bestTrade | ||
| } catch (error: any) { | ||
| console.error("Error getting trade:", error) | ||
|
|
||
| return error as RubicSdkError | ||
| return await quoteResponse.json() | ||
| } catch (error) { | ||
| console.error("Error fetching quote from Rubic API v2:", error) | ||
| throw error | ||
| } |
🛠️ Refactor suggestion
Type mismatch with Rubic API v2 responses.
getQuoteFromApi returns raw API JSON but is typed as WrappedCrossChainTrade | RubicSdkError. Define API-specific response types (Quote/Swap) to avoid misleading consumers.
- ): Promise<WrappedCrossChainTrade | RubicSdkError> {
+ ): Promise<RubicV2QuoteResponse> {

(Define RubicV2QuoteResponse per API schema.)
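For illustration, a hedged sketch of such a response type and a runtime guard; the field names are assumptions and must be checked against the actual Rubic API v2 schema:

```ts
// Assumed shape; verify against the real quoteBest response before relying on it.
interface RubicV2QuoteResponse {
    id: string // quoteId later passed to /routes/swap (assumed)
    estimate?: {
        destinationTokenAmount?: string
        durationInMinutes?: number
    }
    [extra: string]: unknown // tolerate fields not modelled here
}

function isRubicV2QuoteResponse(value: unknown): value is RubicV2QuoteResponse {
    return typeof value === "object" && value !== null && typeof (value as { id?: unknown }).id === "string"
}
```

Returning a dedicated type like this (or mapping it into the internal trade type) keeps callers from assuming SDK-era fields that the HTTP API no longer provides.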
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/features/bridges/rubic.ts around lines 35 to 77, the function currently
returns raw Rubic v2 JSON while its signature promises WrappedCrossChainTrade |
RubicSdkError; define a RubicV2QuoteResponse (and any nested types) matching the
Rubic API v2 schema, use those types when parsing the response, validate
required fields at runtime (or use a small type guard), and map/convert the
validated RubicV2QuoteResponse into the internal WrappedCrossChainTrade or
RubicSdkError before returning; update the function to throw or return a typed
RubicSdkError on API errors and never return raw unvalidated JSON to callers.
```ts
export async function setupRemoteMCPServer(port = 3001, host = "0.0.0.0") {
    try {
        // Create MCP server instance with SSE transport for remote access
        const mcpServer = createDemosMCPServer({
            port,
            host,
            transport: "sse",
        })

        // Create and register Demos Network tools
        const demosTools = createDemosNetworkTools({
            enableBlockchainTools: true,
            enableNetworkTools: true,
            enablePeerTools: true,
        })

        // Register all tools
        for (const tool of demosTools) {
            mcpServer.registerTool(tool)
        }

        // Start the server
        await mcpServer.start()

        log.info(`[MCP] Remote MCP server started on http://${host}:${port}`)
        log.info(`[MCP] SSE endpoint: http://${host}:${port}/sse`)
        log.info("[MCP] Remote clients can now connect to this MCP server")

        return mcpServer

    } catch (error) {
        log.error(`[MCP] Failed to setup remote MCP server: ${String(error)}`)
        throw error
    }
}
```
Security concern: Binding to 0.0.0.0 exposes the MCP server to all network interfaces.
The default host binding to 0.0.0.0 makes the server accessible from any network interface, which could be a security risk if not intended. Consider defaulting to localhost or 127.0.0.1 for security, and only use 0.0.0.0 when explicitly needed for remote access.
-export async function setupRemoteMCPServer(port = 3001, host = "0.0.0.0") {
+export async function setupRemoteMCPServer(port = 3001, host = "localhost") {

Also, update the documentation to clarify the security implications:
* @param port - Port number to listen on (default: 3001)
-* @param host - Host address to bind to (default: "0.0.0.0" for all interfaces)
+* @param host - Host address to bind to (default: "localhost" for local access only. Use "0.0.0.0" with caution as it exposes the server to all network interfaces)
🤖 Prompt for AI Agents
In src/features/mcp/examples/remoteExample.ts around lines 33 to 67, the
function defaults host to "0.0.0.0" which exposes the MCP server to all network
interfaces; change the default host to "127.0.0.1" (or "localhost") and add an
explicit parameter or environment flag (e.g., allowRemote=true) that must be
truthy for callers to pass "0.0.0.0"; validate the host parameter at function
start to reject or require the flag when a non-loopback address is provided,
update the log messages to indicate when the server is bound for remote access
versus local-only, and add a short doc comment above the function explaining the
security implications and when to use 0.0.0.0.
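A small sketch of that guard, assuming a hypothetical allowRemote flag driven by an environment variable (names are illustrative):

const LOOPBACK_HOSTS = new Set(["localhost", "127.0.0.1", "::1"])

// Refuse non-loopback binds unless remote access was explicitly requested.
function assertHostAllowed(host: string, allowRemote: boolean): void {
    if (!LOOPBACK_HOSTS.has(host) && !allowRemote) {
        throw new Error(
            `Refusing to bind MCP server to non-loopback host "${host}" without allowRemote=true`,
        )
    }
}

// Illustrative usage inside setupRemoteMCPServer:
// assertHostAllowed(host, process.env.MCP_ALLOW_REMOTE === "true")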
const demosTools = createDemosNetworkTools({
    enableBlockchainTools: true,
    enableNetworkTools: true,
    enablePeerTools: true,
})
🛠️ Refactor suggestion
Inconsistent tool configuration between remote and local setups.
The remote server explicitly enables all tool categories, while the local server uses defaults. This inconsistency could lead to unexpected behavior differences.
// Create and register Demos Network tools
-const demosTools = createDemosNetworkTools()
+const demosTools = createDemosNetworkTools({
+ enableBlockchainTools: true,
+ enableNetworkTools: true,
+ enablePeerTools: true,
+})

Also applies to: 94-94
🤖 Prompt for AI Agents
In src/features/mcp/examples/remoteExample.ts around lines 43-47 (and also line
94), the remote example explicitly passes
enableBlockchainTools/enableNetworkTools/enablePeerTools=true while the local
setup relies on defaults, causing inconsistent behavior; update the remote
example to mirror the local configuration by either removing the explicit
options so both use the shared defaults, or explicitly set the same option
values used by the local server so both examples match; apply the same change at
line 94 to keep both remote occurrences consistent.
export async function setupDemosMCPServer(options?: {
    port?: number
    host?: string
    transport?: "stdio" | "sse"
}) {
    try {
        // Create MCP server instance with optional transport configuration
        const mcpServer = createDemosMCPServer({
            port: options?.port,
            host: options?.host,
            transport: options?.transport,
        })
🛠️ Refactor suggestion
Add explicit return type for stronger typing.
Avoid any/implicit return types.
-export async function setupDemosMCPServer(options?: {
+export async function setupDemosMCPServer(options?: {
port?: number
host?: string
transport?: "stdio" | "sse"
-}) {
+}): Promise<ReturnType<typeof createDemosMCPServer>> {
🤖 Prompt for AI Agents
In src/features/mcp/examples/simpleExample.ts around lines 35 to 46, the async
function setupDemosMCPServer lacks an explicit return type; update the signature
to declare the precise Promise return type (e.g. Promise<YourMCPServerType> or
Promise<void>), import or reference the concrete server type (or use
ReturnType<typeof createDemosMCPServer>), and ensure the function actually
returns the created mcpServer (or nothing) to match the annotated type.
let score = 0

// Check for suspicious keywords
const suspiciousKeywords = ["nexyai.io", "$fan", "maxi"]
🛠️ Refactor suggestion
Avoid hardcoding suspicious keywords.
The suspicious keywords list is hardcoded and includes specific domains/terms that might change.
Consider moving these to a configuration file or environment variables for easier maintenance:
- const suspiciousKeywords = ["nexyai.io", "$fan", "maxi"]
+ const suspiciousKeywords = process.env.TWITTER_SUSPICIOUS_KEYWORDS?.split(',') ||
+ ["nexyai.io", "$fan", "maxi"]
🤖 Prompt for AI Agents
In src/libs/identity/tools/twitter.ts around line 183, the suspiciousKeywords
array is hardcoded; move this list into configuration by reading from a config
module or an environment variable (e.g., SUSPICIOUS_KEYWORDS) that accepts a
comma-separated string, parse and trim entries at module init, and fall back to
a sensible default if unset; update any imports or usage sites to reference the
new config value and add a note in README/config docs and tests to cover the
env/config behavior.
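One possible shape for that helper, assuming a SUSPICIOUS_KEYWORDS environment variable (the name and defaults are illustrative):

const DEFAULT_SUSPICIOUS_KEYWORDS = ["nexyai.io", "$fan", "maxi"]

// Parse the comma-separated env value once at module init, falling back to the defaults.
function loadSuspiciousKeywords(): string[] {
    const raw = process.env.SUSPICIOUS_KEYWORDS
    if (!raw) return DEFAULT_SUSPICIOUS_KEYWORDS
    const parsed = raw
        .split(",")
        .map(keyword => keyword.trim().toLowerCase())
        .filter(keyword => keyword.length > 0)
    return parsed.length > 0 ? parsed : DEFAULT_SUSPICIOUS_KEYWORDS
}

export const suspiciousKeywords = loadSuspiciousKeywords()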
demos_twitter_username = "demos_network"
api_key = process.env.RAPID_API_KEY
api_host = process.env.RAPID_API_HOST
api_url = "https://" + this.api_host
Validate environment variables on initialization.
The API configuration relies on environment variables but doesn't validate they exist.
Add validation in the constructor or getInstance:
static getInstance() {
if (!Twitter.instance) {
+ const apiKey = process.env.RAPID_API_KEY
+ const apiHost = process.env.RAPID_API_HOST
+ if (!apiKey || !apiHost) {
+ throw new Error("Missing required environment variables: RAPID_API_KEY and RAPID_API_HOST")
+ }
Twitter.instance = new Twitter()
}
return Twitter.instance
}
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/libs/identity/tools/twitter.ts around lines 410 to 413, the code reads
RAPID_API_KEY and RAPID_API_HOST from process.env without validation; update the
class constructor or getInstance to validate that RAPID_API_KEY and
RAPID_API_HOST are present (and non-empty), throwing a clear error or logging
and exiting if they are missing, and only then set api_key, api_host and compute
api_url as "https://" + api_host; this ensures initialization fails fast with a
descriptive message when required environment variables are not provided.
const signatureValid = await ucrypto.verify({
    algorithm: validatedData.signature.type as SigningAlgorithm,
    message: new TextEncoder().encode(hashedData),
    publicKey: hexToUint8Array(
        validatedData.rpc_public_key.data,
    ) as any,
    signature: hexToUint8Array(validatedData.signature.data) as any,
})
Add null checks for signature fields.
The code accesses validatedData.signature.type and validatedData.signature.data without checking if they exist first.
Apply this diff to add proper validation:
+ if (!validatedData.signature || !validatedData.signature.type || !validatedData.signature.data) {
+ log.error("[handleExecuteTransaction] Missing or invalid signature structure")
+ result.success = false
+ result.response = false
+ result.extra = "Invalid signature structure"
+ return result
+ }
+
const hashedData = Hashing.sha256(JSON.stringify(validatedData.data))
const signatureValid = await ucrypto.verify({
algorithm: validatedData.signature.type as SigningAlgorithm,
🤖 Prompt for AI Agents
In src/libs/network/endpointHandlers.ts around lines 225 to 232, the code uses
validatedData.signature.type and validatedData.signature.data without checking
for a null/undefined signature; add a guard that verifies
validatedData.signature exists and that both signature.type and signature.data
are present and of expected types before calling ucrypto.verify, and if any are
missing return or throw a validation error (e.g., respond with 400 / throw a
descriptive Error) so the function never attempts to access properties on
undefined; optionally narrow types for signature in the validatedData type to
satisfy TypeScript.
export async function manageExecution(
    content: BundleContent,
    sender: string,
): Promise<RPCResponse> {
💡 Verification agent
🧩 Analysis chain
Breaking change: new sender param—add runtime validation and verify all call sites are updated.
Reject empty/whitespace senders up front to avoid ambiguous identity handling. Also ensure every caller now passes sender.
Apply:
export async function manageExecution(
content: BundleContent,
sender: string,
): Promise<RPCResponse> {
+ if (!sender || typeof sender !== "string" || sender.trim() === "") {
+ return { ...emptyResponse, result: 400, response: "Invalid sender", require_reply: false }
+ }Run to find/update call sites:
🏁 Script executed:
#!/bin/bash
rg -nP --type=ts -C2 '\bmanageExecution\s*\('
rg -nP --type=ts -C2 '\bhandleValidateTransaction\s*\('
rg -nP --type=ts -C2 '\bhandleExecuteTransaction\s*\('
Length of output: 2331
Enforce non-empty sender in manageExecution.
Reject empty/whitespace senders at runtime; call sites are already updated (src/libs/network/server_rpc.ts, ServerHandlers methods).
export async function manageExecution(
content: BundleContent,
sender: string,
): Promise<RPCResponse> {
+ if (!sender || typeof sender !== "string" || sender.trim() === "") {
+ return { ...emptyResponse, result: 400, response: "Invalid sender", require_reply: false }
+ }
🤖 Prompt for AI Agents
In src/libs/network/manageExecution.ts around lines 13–16, add an early runtime
validation that rejects empty or whitespace-only sender values: trim the sender
and if it is empty, return a failed RPCResponse (or throw a consistent RPC-level
error used in this module) indicating "invalid/empty sender" so callers receive
a clear rejection; keep this check at the top of manageExecution before any
other processing.
case "getReferralInfo": {
    const account = await ensureGCRForUser(params[0])
    response.response = account.referralInfo
    break
}
🛠️ Refactor suggestion
Add error handling for database operations.
The getReferralInfo and validateReferralCode cases could throw errors if database operations fail. Consider wrapping in try-catch blocks.
case "getReferralInfo": {
- const account = await ensureGCRForUser(params[0])
- response.response = account.referralInfo
+ try {
+ const account = await ensureGCRForUser(params[0])
+ response.response = account.referralInfo
+ } catch (error) {
+ response.result = 500
+ response.response = `Failed to get referral info: ${error.message}`
+ }
break
}
case "validateReferralCode": {
- const account = await Referrals.findAccountByReferralCode(params[0])
- response.response = {
- isValid: account !== null,
- referrerPubkey: account?.pubkey || null,
- message: account
- ? "Referral code is valid"
- : "Referral code is invalid",
+ try {
+ const account = await Referrals.findAccountByReferralCode(params[0])
+ response.response = {
+ isValid: account !== null,
+ referrerPubkey: account?.pubkey || null,
+ message: account
+ ? "Referral code is valid"
+ : "Referral code is invalid",
+ }
+ } catch (error) {
+ response.result = 500
+ response.response = `Failed to validate referral code: ${error.message}`
}
break
}

Also applies to: 60-70
🤖 Prompt for AI Agents
In src/libs/network/manageGCRRoutines.ts around lines 54-58 and also 60-70, the
database calls in the "getReferralInfo" and "validateReferralCode" cases are not
protected and may throw — wrap each case's async DB operations (ensureGCRForUser
and any DB lookups) in a try-catch, log the caught error with contextual info,
set response.error (or response.response to an error object/message) so the
caller receives a failure instead of an unhandled exception, and ensure the case
still breaks/returns after handling the error.
const twitterUser = (account.identities.web2["twitter"] || []).find(
    (twitterIdentity: Web2GCRData["data"]) =>
        twitterIdentity.userId === twitterUserId,
)

if (twitterUser && twitterUser.username) {
    const isFollowingDemos = await twitter.checkFollow(
        twitterUser.username,
    )

    if (isFollowingDemos) {
        account.points.breakdown.demosFollow = pointValues.FOLLOW_DEMOS
        account.points.totalPoints += pointValues.FOLLOW_DEMOS
    }
}
External API call without error handling or timeout
The Twitter API call to check if a user follows Demos is made without error handling, timeout, or caching. This could cause point awarding to fail silently or hang.
Wrap the Twitter API call in proper error handling:
+try {
const twitter = Twitter.getInstance()
const twitterUser = (account.identities.web2["twitter"] || []).find(
(twitterIdentity: Web2GCRData["data"]) =>
twitterIdentity.userId === twitterUserId,
)
if (twitterUser && twitterUser.username) {
- const isFollowingDemos = await twitter.checkFollow(
- twitterUser.username,
- )
+ const isFollowingDemos = await Promise.race([
+ twitter.checkFollow(twitterUser.username),
+ new Promise<boolean>((resolve) => setTimeout(() => resolve(false), 5000))
+ ])
if (isFollowingDemos) {
account.points.breakdown.demosFollow = pointValues.FOLLOW_DEMOS
account.points.totalPoints += pointValues.FOLLOW_DEMOS
}
}
+} catch (error) {
+ log.warn(`Failed to check Twitter follow status: ${error}`)
+ // Continue without awarding follow points
+}
🤖 Prompt for AI Agents
In src/features/incentive/PointSystem.ts around lines 226-240 the external call
twitter.checkFollow(...) is invoked without error handling, timeout, or fallback
which can hang or throw and prevent points being awarded; wrap the call in a
try/catch, enforce a timeout (e.g. Promise.race with a configurable timeout) so
the call fails fast, log any errors with context (account id/twitterUserId) and
ensure the code treats failures as "not following" (do not throw), and
optionally consult or populate a short-lived cache before/after the call to
avoid repeated external calls.
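One way to combine the timeout with a short-lived cache; the checkFollow signature is taken from the snippet above, while the cache TTL and helper name are assumptions:

const followCache = new Map<string, { value: boolean; expiresAt: number }>()
const FOLLOW_CACHE_TTL_MS = 10 * 60 * 1000 // 10 minutes

async function checkFollowCached(
    twitter: { checkFollow(username: string): Promise<boolean> },
    username: string,
): Promise<boolean> {
    const cached = followCache.get(username)
    if (cached && cached.expiresAt > Date.now()) {
        return cached.value
    }
    let isFollowing = false
    try {
        // Fail fast and treat a slow or failing API call as "not following".
        isFollowing = await Promise.race([
            twitter.checkFollow(username),
            new Promise<boolean>(resolve => setTimeout(() => resolve(false), 5000)),
        ])
    } catch {
        isFollowing = false
    }
    followCache.set(username, {
        value: isFollowing,
        expiresAt: Date.now() + FOLLOW_CACHE_TTL_MS,
    })
    return isFollowing
}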
let walletIsAlreadyLinked = false
let hasExistingWalletOnChain = false
const walletIsAlreadyLinkedMessage = "This wallet is already linked"
const hasExistingWalletOnChainMessage = `A ${chain} wallet is already linked. Please disconnect it first.`
try {
    // Get current points and identities from GCR
    const userPointsWithIdentities = await this.getUserPointsInternal(
        userId,
    )

    if (
        userPointsWithIdentities.linkedWallets.includes(
            `${chain}:${walletAddress}`,
        )
    ) {
        walletIsAlreadyLinked = true
    }

    // Check if any wallet of this chain type is already linked
    const hasExistingChainWallet =
        userPointsWithIdentities.linkedWallets.some(wallet =>
            wallet.startsWith(`${chain}:`),
        )

    if (hasExistingChainWallet) {
        hasExistingWalletOnChain = true
    }
Logic flaw: Points awarded despite validation failures
The method continues to award points even when walletIsAlreadyLinked or hasExistingWalletOnChain is true. The points are awarded at line 316-322 before these checks affect the response.
Move the validation checks before awarding points:
try {
// Get current points and identities from GCR
const userPointsWithIdentities = await this.getUserPointsInternal(
userId,
)
if (
userPointsWithIdentities.linkedWallets.includes(
`${chain}:${walletAddress}`,
)
) {
walletIsAlreadyLinked = true
+ return {
+ result: 400,
+ response: {
+ pointsAwarded: 0,
+ totalPoints: userPointsWithIdentities.totalPoints,
+ message: walletIsAlreadyLinkedMessage,
+ },
+ require_reply: false,
+ extra: {},
+ }
}
// Check if any wallet of this chain type is already linked
const hasExistingChainWallet =
userPointsWithIdentities.linkedWallets.some(wallet =>
wallet.startsWith(`${chain}:`),
)
if (hasExistingChainWallet) {
hasExistingWalletOnChain = true
+ return {
+ result: 400,
+ response: {
+ pointsAwarded: 0,
+ totalPoints: userPointsWithIdentities.totalPoints,
+ message: hasExistingWalletOnChainMessage,
+ },
+ require_reply: false,
+ extra: {},
+ }
}
// Award points by updating the GCR
await this.addPointsToGCR(
userId,
pointValues.LINK_WEB3_WALLET,
"web3Wallets",
chain,
referralCode,
)
🤖 Prompt for AI Agents
In src/features/incentive/PointSystem.ts around lines 287 to 313, the code
computes walletIsAlreadyLinked and hasExistingWalletOnChain but the function
still proceeds to award points later even when those validations fail; move
these validation checks to occur before any call that grants points (i.e.,
compute userPointsWithIdentities, set walletIsAlreadyLinked and
hasExistingWalletOnChain, then immediately return or throw the appropriate
error/response when either flag is true), ensuring no awardPoints or
points-creation logic runs if a wallet is already linked or a chain wallet
exists.
await this.addPointsToGCR(
    userId,
    -pointValues.LINK_WEB3_WALLET,
    "web3Wallets",
    chain,
)
Negative points can corrupt account state
The deduction methods don't validate that the resulting total points won't go negative, which could corrupt the account state.
Add validation to prevent negative totals:
await this.addPointsToGCR(
userId,
-pointValues.LINK_WEB3_WALLET,
"web3Wallets",
chain,
)
+
+// Verify points didn't go negative
+const account = await ensureGCRForUser(userId)
+if (account.points.totalPoints < 0) {
+ account.points.totalPoints = 0
+ const db = await Datasource.getInstance()
+ const repository = db.getDataSource().getRepository(GCRMain)
+ await repository.save(account)
+}
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/features/incentive/PointSystem.ts around lines 523 to 528, the code calls
addPointsToGCR with a negative value but does not validate that the resulting
user total won't become negative; update the deduction flow to first fetch the
user's current point total for the relevant category (or the aggregate GCR
total) and compute the post-deduction balance, then if the result would be
negative either throw a controlled error or prevent the deduction (e.g., return
early or clamp to zero) and ensure the change is persisted transactionally so
the database cannot end up with negative points; add appropriate error messages
and unit tests to cover attempting deductions that would underflow the balance.
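A sketch of the underflow guard in isolation; the PointsState shape and helper name are assumptions based on the totalPoints/breakdown fields used in this file:

interface PointsState {
    totalPoints: number
    breakdown: Record<string, number>
}

// Subtract at most what the account actually holds, so neither counter can go negative.
function applyDeduction(points: PointsState, category: string, amount: number): PointsState {
    if (amount < 0) {
        throw new Error("amount must be positive; pass the value to subtract")
    }
    const current = points.breakdown[category] ?? 0
    const applied = Math.min(amount, current, points.totalPoints)
    return {
        totalPoints: points.totalPoints - applied,
        breakdown: { ...points.breakdown, [category]: current - applied },
    }
}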
const cleanKey = publicKey.startsWith("0x")
    ? publicKey.slice(2)
    : publicKey
const checksumHash = Hashing.sha256(cleanKey + "CHECKSUM")
const checksumBytes = Buffer.from(checksumHash.slice(0, 4), "hex")
const checksumB58 = bs58.encode(checksumBytes)
return checksumB58.substring(0, 2)
💡 Verification agent
🧩 Analysis chain
Use SHA-256 from ucrypto for consistency.
The generateChecksum method uses a different SHA-256 implementation than the main hash generation, which could lead to inconsistencies. Consider using the same hashing utility throughout.
Let me verify which hashing implementation is standard across the codebase:
🏁 Script executed:
#!/bin/bash
# Check for ucrypto usage in hashing contexts
rg -n "ucrypto.*sha256|sha256.*ucrypto" --type ts
# Check if Hashing utility uses ucrypto internally
cat src/libs/crypto/hashing.ts
Length of output: 674
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== Hashing.sha256 usages ==="
rg -n --hidden --type ts "Hashing\.sha256\(" -S || true
echo
echo "=== Hashing.sha256Bytes usages ==="
rg -n --hidden --type ts "Hashing\.sha256Bytes\(" -S || true
echo
echo "=== Any direct crypto.createHash usages ==="
rg -n --hidden --type ts "createHash\(" -S || true
echo
echo "=== Files referencing 'ucrypto' ==="
rg -n --hidden --type ts "ucrypto" -S || true
echo
echo "=== referrals.ts (first 200 lines) ==="
sed -n '1,200p' src/features/incentive/referrals.ts || true
echo
echo "=== hashing.ts ==="
sed -n '1,240p' src/libs/crypto/hashing.ts || true
Length of output: 16267
Unify SHA-256 implementation — use Node's crypto consistently.
Hashing.sha256 uses node‑forge while sha256Bytes uses crypto.createHash('sha256'); generateChecksum (src/features/incentive/referrals.ts) calls Hashing.sha256. Consolidate to a single implementation (e.g., make Hashing.sha256 delegate to crypto.createHash or route callers to sha256Bytes) and add tests asserting identical hex outputs for representative inputs.
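A minimal sketch of the consolidation, assuming the Hashing class keeps both a string and a byte variant as described above:

import { createHash } from "node:crypto"

class Hashing {
    // Single SHA-256 entry point backed by Node's crypto module.
    static sha256(data: string): string {
        return createHash("sha256").update(data, "utf8").digest("hex")
    }

    static sha256Bytes(data: Uint8Array): Buffer {
        return createHash("sha256").update(data).digest()
    }
}

// Consistency check: both paths should produce the same digest for the same bytes.
console.assert(
    Hashing.sha256("abc") ===
        Hashing.sha256Bytes(Buffer.from("abc", "utf8")).toString("hex"),
)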
newUserAccount: GCRMain,
gcrMainRepository: any,
): Promise<void> {
    if (!referrerAccount.points.breakdown.referrals) {
        referrerAccount.points.breakdown.referrals = 0
    }

    if (!newUserAccount.points.breakdown.referrals) {
        newUserAccount.points.breakdown.referrals = 0
    }
Fix potential null pointer exception when initializing points breakdown.
The code assumes referrerAccount.points and newUserAccount.points exist, but doesn't verify this. If the points object or its breakdown property is null/undefined, this will cause a runtime error.
Apply this defensive initialization:
private static async awardReferralPoints(
referrerAccount: GCRMain,
newUserAccount: GCRMain,
gcrMainRepository: any,
): Promise<void> {
+ // Ensure points structure exists
+ if (!referrerAccount.points) {
+ referrerAccount.points = {
+ totalPoints: 0,
+ breakdown: {},
+ lastUpdated: new Date()
+ }
+ }
+ if (!referrerAccount.points.breakdown) {
+ referrerAccount.points.breakdown = {}
+ }
if (!referrerAccount.points.breakdown.referrals) {
referrerAccount.points.breakdown.referrals = 0
}
+ if (!newUserAccount.points) {
+ newUserAccount.points = {
+ totalPoints: 0,
+ breakdown: {},
+ lastUpdated: new Date()
+ }
+ }
+ if (!newUserAccount.points.breakdown) {
+ newUserAccount.points.breakdown = {}
+ }
if (!newUserAccount.points.breakdown.referrals) {
newUserAccount.points.breakdown.referrals = 0
}
🤖 Prompt for AI Agents
In src/features/incentive/referrals.ts around lines 191 to 200, the code
accesses referrerAccount.points.breakdown.referrals and
newUserAccount.points.breakdown.referrals without verifying points or breakdown
exist; add defensive initialization by ensuring referrerAccount.points and
referrerAccount.points.breakdown exist (create them as empty objects if missing)
before setting referrals, and do the same for newUserAccount; explicitly set
referrals to 0 only after guaranteeing the parent objects exist so no
null/undefined property access occurs.
if (context === "telegram") {
    // Telegram uses dual signature validation (user + bot signatures)
    // The proof is a TelegramSignedAttestation object, not a URL
    try {
        // Import verifyWeb2Proof which handles telegram verification
        const { verifyWeb2Proof } = await import("@/libs/abstraction")

        const verificationResult = await verifyWeb2Proof(
            {
                context: "telegram",
                username: data.username,
                userId: data.userId,
                proof: data.proof,
            },
            accountGCR.pubkey, // sender's ed25519 address
        )

        proofOk = verificationResult.success

        if (!proofOk) {
            log.error(
                `Telegram verification failed: ${verificationResult.message}`,
            )
            return {
                success: false,
                message: verificationResult.message,
            }
        }

        log.info(
            `Telegram identity verified: ${data.username} (${data.userId})`,
        )
    } catch (error) {
        log.error(`Telegram proof verification failed: ${error}`)
        proofOk = false
    }
🛠️ Refactor suggestion
Dynamic import inside a conditional could impact performance.
The dynamic import of verifyWeb2Proof happens inside the Telegram verification path, which could cause delays. Consider importing it at module level or caching the import.
Move the import outside the conditional or cache it:
+ // Cache the dynamic import
+ let verifyWeb2ProofCache: typeof import("@/libs/abstraction").verifyWeb2Proof | null = null
+
static async applyWeb2IdentityAdd(
editOperation: any,
gcrMainRepository: Repository<GCRMain>,
simulate: boolean,
): Promise<GCRResult> {
// ... existing code ...
if (context === "telegram") {
try {
- const { verifyWeb2Proof } = await import("@/libs/abstraction")
+ if (!verifyWeb2ProofCache) {
+ const module = await import("@/libs/abstraction")
+ verifyWeb2ProofCache = module.verifyWeb2Proof
+ }
- const verificationResult = await verifyWeb2Proof(
+ const verificationResult = await verifyWeb2ProofCache(
// ... rest of the code
const signature_ = await ucrypto.sign(
    getSharedState.signingAlgorithm,
    new TextEncoder().encode(JSON.stringify(tx.content)),
)
if (!signature) {
    return [false, "Failed to sign transaction"]
}
return [true, signature]
}

// INFO Given a signed transaction, verify it against the address of the sender
// Returns [result, message]
static verify(tx: Transaction) {
    // Check sanity of the structure of the tx object
    if (!tx.content) {
        return [false, "Missing tx.content"]
    }
    if (!tx.signature) {
        return [false, "Missing tx.signature"]
    if (!signature_) {
        return [false, "Failed to sign transaction"]
    }
    // verify using identity.cryptography.verify(tx.content, tx.signature, publicKey)
    const verified = Cryptography.verify(
        JSON.stringify(tx.content),
        tx.signature.data.toString("hex"),
        tx.content.from.toString("hex"),
    )
    return [verified, "Result of verify()"]
    return [
        true,
        {
            type: getSharedState.signingAlgorithm,
            data: uint8ArrayToHex(signature_.signature),
        },
    ]
🛠️ Refactor suggestion
Async signing without error context
The signing operation returns a tuple but doesn't provide meaningful error context when signing fails.
Enhance error handling with more context:
const signature_ = await ucrypto.sign(
getSharedState.signingAlgorithm,
new TextEncoder().encode(JSON.stringify(tx.content)),
)
if (!signature_) {
- return [false, "Failed to sign transaction"]
+ return [false, `Failed to sign transaction with algorithm: ${getSharedState.signingAlgorithm}`]
}
+
+if (!signature_.signature || signature_.signature.length === 0) {
+ return [false, "Signature generation returned empty signature"]
+}
🤖 Prompt for AI Agents
In src/libs/blockchain/transaction.ts around lines 80 to 94, the async sign call
can fail silently and the current code returns a generic failure tuple without
error context; wrap the ucrypto.sign call in a try/catch, check that the
resolved value contains a valid signature property before using it, and on error
return a failure tuple that includes the error message (or serialized error) for
debugging, e.g. catch the thrown error and return [false, `Failed to sign
transaction: ${error.message || String(error)}`]; also ensure successful return
still returns the algorithm and hex-encoded signature as before.
signature: JSON.stringify(tx.signature), // REVIEW This is a horrible thing, if it even works
ed25519_signature: tx.ed25519_signature,
🛠️ Refactor suggestion
REVIEW comment indicates known technical debt
The comment explicitly marks this as "horrible" - serializing the entire signature object as JSON is indeed problematic for performance and maintainability.
Consider using a proper serialization format or storing signature components separately:
-signature: JSON.stringify(tx.signature), // REVIEW This is a horrible thing, if it even works
+signature: tx.signature.data,
+signatureType: tx.signature.type,Then update the RawTransaction type definition and deserialization accordingly.
case "nativeBridge":
    payload = tx.content.data
    var nativeBridgeResult = await handleNativeBridgeTx(
        payload[1] as NativeBridgeOperationCompiled,
    )
    if (nativeBridgeResult === null) {
        result.success = false
        result.response = false
        result.extra = {
            error: "Failed to handle native bridge transaction",
        }
    }
    result.response = nativeBridgeResult
    break
🛠️ Refactor suggestion
Native bridge transaction error handling is incomplete.
The native bridge handler doesn't handle all error cases properly. When handleNativeBridgeTx returns null, the error is set but result.response still contains null, which might cause issues downstream.
Apply this diff to fix the error handling:
case "nativeBridge":
payload = tx.content.data
- var nativeBridgeResult = await handleNativeBridgeTx(
+ const nativeBridgeResult = await handleNativeBridgeTx(
payload[1] as NativeBridgeOperationCompiled,
)
if (nativeBridgeResult === null) {
result.success = false
- result.response = false
+ result.response = {
+ message: "Native bridge transaction processing failed"
+ }
result.extra = {
error: "Failed to handle native bridge transaction",
}
+ } else {
+ result.response = nativeBridgeResult
+ result.success = true
}
- result.response = nativeBridgeResult
break
🧰 Tools
🪛 Biome (2.1.2)
[error] 376-379: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
The declaration is defined in this switch clause:
Safe fix: Wrap the declaration in a block.
(lint/correctness/noSwitchDeclarations)
🤖 Prompt for AI Agents
In src/libs/network/endpointHandlers.ts around lines 374 to 387, the native
bridge error path sets result.response to null after marking failure; change the
control flow so that when handleNativeBridgeTx returns null you set
result.success = false, result.response = false, and result.extra with the
error, and then skip assigning result.response = nativeBridgeResult (eg. use an
else or early break/return); when nativeBridgeResult is non-null set
result.success = true (or leave unchanged) and assign result.response =
nativeBridgeResult.
const response = _.cloneDeep(emptyResponse)
response.result = 200
Add error handling around upstream API calls and set correct RPC status codes
Calls to RubicService can throw; currently unhandled, which may bubble and bypass RPC error mapping. Also result is set to 200 before we know the outcome.
Apply this diff to validate input, catch upstream errors, and map statuses deterministically:
const response = _.cloneDeep(emptyResponse)
- response.result = 200
@@
- switch (method) {
- case "get_trade":
- response.response = await RubicService.getQuoteFromApi(params[0])
- break
- case "execute_trade": {
- response.response = await RubicService.getSwapDataFromApi(params[0])
- break
- }
- default:
- response.response = false
- break
- }
+ if (!Array.isArray(params) || params.length === 0 || params[0] == null) {
+ response.response = false
+ response.result = 400
+ response.extra = "Missing required params[0] payload"
+ return response
+ }
+
+ try {
+ switch (method) {
+ case "get_trade":
+ response.response = await RubicService.getQuoteFromApi(params[0])
+ break
+ case "execute_trade":
+ response.response = await RubicService.getSwapDataFromApi(params[0])
+ break
+ default:
+ response.response = false
+ response.extra = `Unknown method '${method}'`
+ break
+ }
+ } catch (err) {
+ response.response = false
+ response.result = 502
+ response.extra = `Upstream Rubic API error: ${(err as Error)?.message ?? "unknown"}`
+ return response
+ }
@@
- if (response.response === false) {
- response.result = 400
- response.extra = "Payload failed to execute"
- }
+ if (response.response) {
+ response.result = 200
+ } else {
+ response.result = 400
+ response.extra = response.extra || "Payload failed to execute"
+ }

Also applies to: 22-35, 37-42
User description
test
PR Type
Enhancement, Documentation
Description
• Major cryptography migration: Migrated from Forge to unified crypto system with Post-Quantum Cryptography (PQC) support and ed25519 fallback
• Complete incentive system: Implemented comprehensive point system for Web3 wallet, Twitter, GitHub, and Telegram linking with referral support
• Identity management overhaul: Added Telegram identity verification, Twitter bot detection, and multi-chain validation with enhanced security
• Server infrastructure upgrade: Replaced Fastify with Bun-based HTTP server featuring rate limiting, enhanced security, and protected endpoints
• MCP server integration: Implemented Model Context Protocol server manager for blockchain tools with stdio and SSE transport support
• Enhanced Web2 proxy security: Added SSRF protection, DNS validation, private IP blocking, and deterministic response hashing
• Blockchain improvements: Refactored chain management with static repositories, transaction history retrieval, and genesis block recovery
• Social platform integrations: Added comprehensive Twitter API integration with bot detection and GitHub authentication support
• Cross-chain enhancements: Implemented Ethereum and Solana transaction analysis, Aptos contract operations, and native bridge handling
• Consensus strengthening: Enhanced PoRBFT with genesis anchoring, improved validator seed algorithm, and better error handling
Diagram Walkthrough
File Walkthrough
33 files
PointSystem.ts
Complete incentive point system implementation
src/features/incentive/PointSystem.ts
• Added comprehensive point system for user incentives with Web3
wallet, Twitter, GitHub, and Telegram linking
• Implemented point
awarding and deduction methods with referral code support
• Added
identity verification from GCR and Twitter follow checking
functionality
• Created singleton pattern with methods for managing
user points and breakdown tracking
server_rpc.ts
Server migration to Bun with enhanced security
src/libs/network/server_rpc.ts
• Replaced Fastify server with Bun-based HTTP server implementation
•
Added rate limiting middleware and IP-based transaction limits per
block
• Enhanced signature validation with support for multiple
algorithms (ed25519, falcon, ml-dsa)
• Added protected endpoints with
sudo access control and new RPC methods for campaign data and point
awarding
chain.ts
Blockchain chain management refactoring and enhancements
src/libs/blockchain/chain.ts
• Refactored to use static repositories for better performance and
consistency
• Added transaction history retrieval method with
filtering capabilities
• Enhanced genesis block generation with user
account restoration from recovery data
• Improved block and
transaction management with shared state integration
GCRIdentityRoutines.ts
Complete identity management system with incentives
src/libs/blockchain/gcr/gcr_routines/GCRIdentityRoutines.ts
• Added comprehensive identity management for XM, Web2, and PQC
identities
• Implemented incentive point integration for identity
linking/unlinking operations
• Added Telegram identity verification
with dual signature validation
• Enhanced identity verification with
first-connection checking and point awarding logic
findGenesisBlock.ts
Genesis block recovery and restoration enhancements
src/libs/blockchain/routines/findGenesisBlock.ts
• Added GCR recovery data integration for genesis block restoration
•
Implemented latest recovery file detection and balance merging logic
•
Enhanced genesis block initialization with user account data
restoration
• Added maintenance hooks support for genesis block
operations
twitter.ts
Twitter Bot Detection and API Integration System
src/libs/identity/tools/twitter.ts
• Added comprehensive Twitter bot detection system with
TwitterBotDetector class
• Implemented multiple bot detection
algorithms including profile analysis, activity patterns, and follower
analysis
• Added Twitter API integration methods for timeline,
followers, and tweet data retrieval
• Created singleton Twitter class
with URL parsing and bot checking capabilities
gcr.ts
GCR Account Management and Campaign Analytics Enhancement
src/libs/blockchain/gcr/gcr.ts
• Added methods to query accounts by Twitter and Telegram usernames
using JSONB queries
• Implemented campaign data aggregation
functionality for user statistics and points
• Added point awarding
system for Twitter users with transaction creation and validation
•
Removed legacy GCR status getter methods and added new account
management features
rubic.ts
Rubic Bridge API v2 Integration Refactor
src/features/bridges/rubic.ts
• Refactored from SDK-based approach to direct Rubic API v2
integration
• Removed complex custom EVM provider and SDK
initialization code
• Added static methods for quote retrieval, swap
data fetching, and raw transaction sending
• Simplified architecture
with direct HTTP API calls instead of SDK wrapper
rateLimiter.ts
Advanced Rate Limiting Middleware Implementation (src/libs/network/middleware/rateLimiter.ts)
• Created comprehensive rate limiting middleware with IP-based tracking (a minimal sketch follows below)
• Implemented configurable limits per method with blocking and cleanup mechanisms
• Added IP persistence, whitelisting, and statistics tracking features
• Included proper client IP detection with proxy header support
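A minimal sketch of the ideas above, assuming hypothetical function names, window sizes, and limits; the actual middleware also persists IPs and tracks statistics, which is omitted here.

```typescript
// Minimal sketch of IP-based rate limiting with per-method limits, a whitelist,
// and proxy-aware client IP detection. All names and limit values are illustrative.
interface RateLimitRule { windowMs: number; max: number }

const RULES: Record<string, RateLimitRule> = {
    default: { windowMs: 60_000, max: 60 },
    confirmTx: { windowMs: 60_000, max: 10 },
};

const WHITELIST = new Set<string>(["127.0.0.1"]);
const hits = new Map<string, number[]>(); // `${ip}:${method}` -> request timestamps

export function isAllowed(ip: string, method: string, now = Date.now()): boolean {
    if (WHITELIST.has(ip)) return true;
    const rule = RULES[method] ?? RULES.default;
    const key = `${ip}:${method}`;
    const recent = (hits.get(key) ?? []).filter(t => now - t < rule.windowMs);
    if (recent.length >= rule.max) {
        hits.set(key, recent);
        return false; // over the limit for this window
    }
    recent.push(now);
    hits.set(key, recent);
    return true;
}

// Client IP detection behind a reverse proxy: trust the first X-Forwarded-For hop
// only when the direct peer is a known proxy (an assumption; adjust to the deployment).
export function clientIp(headers: Headers, remoteAddr: string, trustedProxies = WHITELIST): string {
    const fwd = headers.get("x-forwarded-for");
    if (fwd && trustedProxies.has(remoteAddr)) return fwd.split(",")[0].trim();
    return remoteAddr;
}
```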
MCPServer.ts
MCP Server Manager for Blockchain Tools (src/features/mcp/MCPServer.ts)
• Implemented Model Context Protocol (MCP) server manager for blockchain tools
• Added support for both stdio and SSE transport types with Express integration
• Created tool registration system with Zod schema validation (see the sketch below)
• Included server lifecycle management and graceful shutdown capabilities
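To illustrate "tool registration with Zod schema validation" in the abstract, here is a generic registry pattern. It deliberately does not use the MCP SDK's own API, and all tool and function names are hypothetical.

```typescript
import { z } from "zod";

// Generic tool-registration pattern with Zod validation: each tool carries a schema,
// and arguments are parsed (and rejected) before the handler ever runs.
type ToolHandler<T> = (args: T) => Promise<unknown>;

interface RegisteredTool<S extends z.ZodTypeAny = z.ZodTypeAny> {
    description: string;
    schema: S;
    handler: ToolHandler<z.infer<S>>;
}

const tools = new Map<string, RegisteredTool>();

export function registerTool<S extends z.ZodTypeAny>(
    name: string,
    description: string,
    schema: S,
    handler: ToolHandler<z.infer<S>>
) {
    tools.set(name, { description, schema, handler });
}

export async function callTool(name: string, rawArgs: unknown) {
    const tool = tools.get(name);
    if (!tool) throw new Error(`Unknown tool: ${name}`);
    const args = tool.schema.parse(rawArgs); // throws a ZodError on invalid input
    return tool.handler(args);
}

// Example registration (hypothetical tool name and handler):
registerTool(
    "getBlockByNumber",
    "Fetch a block by its height",
    z.object({ height: z.number().int().nonnegative() }),
    async ({ height }) => ({ height, hash: "0x..." })
);
```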
beforeFindGenesisHooks.ts
Blockchain Maintenance Hooks and Account Review System (src/libs/blockchain/routines/beforeFindGenesisHooks.ts)
• Added maintenance hooks for awarding Demos follow points to Twitter users
• Implemented account review system with bot detection and transaction validation
• Created batch processing for account management and invalid account removal
• Included referral system cleanup and point normalization features
transaction.ts
Transaction System Cryptography Migration and PQC Support (src/libs/blockchain/transaction.ts)
• Migrated from Forge cryptography to unified crypto system with multiple signature algorithms
• Added support for PQC (Post-Quantum Cryptography) signatures with ed25519 fallback
• Enhanced signature validation with identity verification and dual signature support
• Updated transaction structure to include from_ed25519_address and ed25519_signature fields
endpointHandlers.ts
Endpoint Handlers Cryptography Migration and Enhancement (src/libs/network/endpointHandlers.ts)
• Updated transaction validation and execution to use unified crypto system
• Added sender parameter validation and enhanced signature verification
• Integrated identity request handling and native bridge transaction support
• Removed Forge cryptography dependencies and updated to new signature format
index.ts
Main Application MCP Integration and Crypto Migration (src/index.ts)
• Integrated MCP server startup with configuration and port management
• Migrated to unified crypto system initialization and Bun RPC server
• Added Chain setup and last block initialization
• Updated identity management to use new keypair system
handleNativeBridgeTx.ts
Native Bridge Transaction Handler Stub (src/libs/network/routines/transactions/handleNativeBridgeTx.ts)
• Added placeholder handler for native bridge transactions
• Created basic function structure for NativeBridgeOperationCompiled processing
manageNodeCall.ts
Enhanced RPC endpoints with new Twitter integration and mempool updates (src/libs/network/manageNodeCall.ts)
• Added new imports for encryption utilities, Twitter tools, and mempool management
• Replaced Chain.getPendingPool() with Mempool.getMempool() for mempool operations
• Updated peer identity response to use new uint8ArrayToHex conversion
• Added new RPC endpoints: getTransactionHistory, getTweet with Twitter bot detection
• Replaced direct GCR calls with ensureGCRForUser utility function
index.ts
Added Telegram identity verification and enhanced Web2 proof validation (src/libs/abstraction/index.ts)
• Added comprehensive Telegram proof verification with dual signature validation
• Implemented bot authorization checking against genesis block balances
• Added Twitter bot detection to prevent bot account connections
• Updated signature verification to use new ucrypto encryption utilities
demosTools.ts
New MCP tools module for blockchain and network operations (src/features/mcp/tools/demosTools.ts)
• Created comprehensive MCP tools for Demos Network operations
• Implemented blockchain query tools (blocks, chain height, last block)
• Added network status and node identity tools
• Included peer management tools for connection monitoring
referrals.ts
Complete referral system implementation with point rewards (src/features/incentive/referrals.ts)
• Implemented referral code generation using Base58 encoding (see the sketch below)
• Added referral processing with bonus point awards
• Created eligibility checking and duplicate referral prevention
• Included database operations for referral tracking
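As a rough illustration of Base58-encoded referral codes (not the project's actual derivation), one common approach is to Base58-encode a few random bytes with the bs58 package:

```typescript
import { randomBytes } from "node:crypto";
import bs58 from "bs58";

// Generate a short, URL-safe referral code by Base58-encoding random bytes.
// The entropy size (6 bytes here) is an illustrative choice, not the project's value.
export function generateReferralCode(bytes = 6): string {
    return bs58.encode(randomBytes(bytes));
}

// A 6-byte code yields roughly 8-9 Base58 characters.
console.log(generateReferralCode());
```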
identityManager.ts
Enhanced identity management with multi-chain validation and filtering (src/libs/blockchain/gcr/gcr_routines/identityManager.ts)
• Added connection filtering with Twitter account requirements
• Implemented EVM and Solana chain validation with testnet restrictions
• Enhanced payload verification with crosschain address activity checks
• Added PQC identity verification for post-quantum cryptography support
aptos_contract_read.ts
New Aptos blockchain contract read executor implementation (src/features/multichain/routines/executors/aptos_contract_read.ts)
• Created Aptos smart contract read functionality using REST API (see the sketch below)
• Implemented Move view function calling with type arguments support
• Added comprehensive parameter validation and error handling
• Included both direct REST API and SDK-based implementations
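Calling a Move view function over an Aptos fullnode REST API generally boils down to a POST against the /v1/view endpoint. The node URL, function identifier, and helper name below are illustrative; this is not the executor's actual code.

```typescript
// Minimal sketch of an Aptos Move view-function call via the fullnode REST API.
// Request shape: POST <node>/v1/view with { function, type_arguments, arguments }.
const APTOS_NODE = "https://fullnode.mainnet.aptoslabs.com"; // illustrative node URL

export async function aptosView(
    func: string,               // e.g. "0x1::coin::balance"
    typeArguments: string[],    // e.g. ["0x1::aptos_coin::AptosCoin"]
    args: unknown[]             // view function arguments, JSON-encoded
): Promise<unknown[]> {
    const res = await fetch(`${APTOS_NODE}/v1/view`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ function: func, type_arguments: typeArguments, arguments: args }),
    });
    if (!res.ok) throw new Error(`Aptos view call failed: ${res.status} ${await res.text()}`);
    return (await res.json()) as unknown[];
}

// Usage example (hypothetical account address):
// const [balance] = await aptosView("0x1::coin::balance", ["0x1::aptos_coin::AptosCoin"], ["0xabc..."]);
```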
crosschain.ts
Cross-chain transaction analysis tools for Ethereum and Solana (src/libs/identity/tools/crosschain.ts)
• Created Ethereum transaction querying using Etherscan API
• Implemented Solana transaction fetching via Helius API
• Added transaction counting utilities for both chains (a sketch of the Ethereum side follows below)
• Included comprehensive error handling and API key management
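As an illustration of the Ethereum side only, counting an address's transactions with Etherscan's account txlist endpoint looks roughly like this; the helper name, error handling, and environment variable are assumptions, not the module's actual code.

```typescript
// Count transactions for an address via the Etherscan "account txlist" endpoint.
// Requires an ETHERSCAN_API_KEY; pagination is ignored here for brevity, so very
// active addresses would need paging to get an exact count.
export async function countEthereumTransactions(address: string): Promise<number> {
    const apiKey = process.env.ETHERSCAN_API_KEY;
    if (!apiKey) throw new Error("ETHERSCAN_API_KEY is not set");

    const url =
        `https://api.etherscan.io/api?module=account&action=txlist` +
        `&address=${address}&startblock=0&endblock=99999999&sort=asc&apikey=${apiKey}`;

    const res = await fetch(url);
    if (!res.ok) throw new Error(`Etherscan request failed: ${res.status}`);

    const body = (await res.json()) as { status: string; message: string; result: unknown[] };
    // status "0" with message "No transactions found" is a valid empty result
    if (body.status !== "1" && body.message !== "No transactions found") {
        throw new Error(`Etherscan error: ${body.message}`);
    }
    return Array.isArray(body.result) ? body.result.length : 0;
}
```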
sharedState.ts
Enhanced shared state with crypto upgrades and rate limiting (src/utilities/sharedState.ts)
• Added new signing algorithm configuration and keypair management
• Implemented rate limiting configuration with IP whitelisting
• Added MCP server status tracking and SUDO public key support
• Enhanced identity management with unified crypto support
validateTransaction.ts
Modernized transaction validation with new cryptography system (src/libs/blockchain/routines/validateTransaction.ts)
• Updated transaction validation to use new ucrypto signature verification
• Replaced legacy cryptography with modern encryption utilities
• Enhanced validity data signing with algorithm-specific signatures
• Improved error handling and validation messaging
Sync.ts
Enhanced blockchain sync with genesis validation and integrity checks (src/libs/blockchain/routines/Sync.ts)
• Added genesis block validation during sync integrity checks
• Implemented remote block fetching utility function
• Enhanced sync verification with genesis hash comparison
• Improved error handling with process exit on critical failures
manageConsensusRoutines.ts
Enhanced consensus routine management with improved validation (src/libs/network/manageConsensusRoutines.ts)
• Updated consensus routine management with sender parameter
• Simplified signature verification by removing redundant checks
• Enhanced shard membership validation for consensus participants
• Improved secretary identity verification in greenlight operations
DAHR.ts
Enhanced DAHR proxy with URL validation and simplified responses (src/features/web2/dahr/DAHR.ts)
• Updated DAHR proxy handling with URL validation integration
• Simplified response structure by removing attestation complexity
• Enhanced proxy startup with normalized URL handling
• Improved error handling for invalid URLs
bunServer.ts
New Bun-based HTTP server implementation with middleware support (src/libs/network/bunServer.ts)
• Created lightweight HTTP server implementation using Bun runtime
• Implemented middleware chain pattern for request processing (see the sketch below)
• Added CORS and JSON response utilities
• Included route handling for GET and POST methods
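The middleware-chain idea on top of Bun.serve can be sketched as follows; the handler shapes, helper names, and port are assumptions rather than the file's actual interface.

```typescript
// Minimal middleware chain over Bun.serve. Each middleware can short-circuit by
// returning a Response, or call next() to pass control down the chain.
type Middleware = (req: Request, next: () => Promise<Response>) => Promise<Response>;

const middlewares: Middleware[] = [];
export const use = (mw: Middleware) => middlewares.push(mw);

function runChain(req: Request, final: (req: Request) => Promise<Response>): Promise<Response> {
    let i = -1;
    const dispatch = (idx: number): Promise<Response> => {
        if (idx <= i) return Promise.reject(new Error("next() called twice"));
        i = idx;
        const mw = middlewares[idx];
        if (!mw) return final(req);
        return mw(req, () => dispatch(idx + 1));
    };
    return dispatch(0);
}

export const json = (data: unknown, status = 200) =>
    new Response(JSON.stringify(data), { status, headers: { "Content-Type": "application/json" } });

// Example middleware: permissive CORS headers on every response.
use(async (_req, next) => {
    const res = await next();
    res.headers.set("Access-Control-Allow-Origin", "*");
    return res;
});

Bun.serve({
    port: 3005, // illustrative port
    fetch: (req) => runChain(req, async () => json({ ok: true })),
});
```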
identity.ts
Enhanced identity system with BIP39 mnemonic and unified crypto (src/libs/identity/identity.ts)
• Added BIP39 mnemonic support for identity generation (a sketch of the idea follows below)
• Implemented unified cryptography system with master seed
• Enhanced identity loading with mnemonic-based key derivation
• Added comprehensive seed generation and validation
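To show the general BIP39 flow only (not the project's derivation path or key scheme), the bip39 package can generate a mnemonic, validate it, and turn it into a master seed:

```typescript
import * as bip39 from "bip39";

// Generate a new 12-word mnemonic (128 bits of entropy) and derive a master seed.
// How that seed is then split into signing keypairs is project-specific and omitted here.
export function createIdentitySeed(existingMnemonic?: string) {
    const mnemonic = existingMnemonic ?? bip39.generateMnemonic();
    if (!bip39.validateMnemonic(mnemonic)) {
        throw new Error("Invalid BIP39 mnemonic");
    }
    const masterSeed: Buffer = bip39.mnemonicToSeedSync(mnemonic); // 64 bytes
    return { mnemonic, masterSeed };
}

// Usage: persist the mnemonic securely, keep the seed in memory only.
const { mnemonic, masterSeed } = createIdentitySeed();
console.log(mnemonic.split(" ").length, masterSeed.length); // 12 words, 64 bytes
```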
PeerManager.ts
Modernized peer management with new cryptography system (src/libs/peer/PeerManager.ts)
• Updated peer management to use new cryptography utilities
• Replaced legacy signature operations with ucrypto functions
• Enhanced peer authentication with modern encryption
• Improved connection string signing and verification
IncentiveManager.ts
New incentive management system with multi-platform integration (src/libs/blockchain/gcr/gcr_routines/IncentiveManager.ts)
• Created comprehensive incentive management system
• Implemented hooks for wallet, Twitter, GitHub, and Telegram linking
• Added point award and deduction operations
• Included referral code support for bonus rewards
DAHRFactory.ts
Enhanced DAHR factory with improved session management (src/features/web2/dahr/DAHRFactory.ts)
• Enhanced DAHR factory with improved cleanup operations
• Extended session timeout to 24 hours for better user experience
• Added proper proxy cleanup during expired session removal
• Improved memory management with async cleanup operations
PoRBFT.ts
Enhanced PoRBFT consensus with improved error handling (src/libs/consensus/v2/PoRBFT.ts)
• Enhanced consensus routine with improved error handling
• Added process exit on critical consensus failures
• Improved mempool merging debug output formatting
• Strengthened consensus reliability with fail-fast approach
3 files
Proxy.ts
Enhanced Web2 proxy security and reliability (src/features/web2/proxy/Proxy.ts)
• Enhanced SSRF protection with DNS resolution validation and private IP blocking
• Added deterministic response hashing for request/response integrity verification (see the sketch below)
• Improved SSL certificate handling with configurable verification settings
• Refactored proxy server lifecycle management with proper cleanup and error handling
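Deterministic response hashing usually means canonicalizing the pieces of a response before hashing, so two nodes hash the same bytes for the same response. The field selection below is an assumption about what such a hash might cover, not the proxy's actual scheme.

```typescript
import { createHash } from "node:crypto";

// Hash a proxied response deterministically: lowercase and sort header keys, fix the
// field order, then hash the canonical JSON. Which fields to include is an illustrative choice.
export function hashProxyResponse(status: number, headers: Record<string, string>, body: string): string {
    const canonicalHeaders = Object.entries(headers)
        .map(([k, v]) => [k.toLowerCase(), v] as const)
        .sort(([a], [b]) => a.localeCompare(b));
    const canonical = JSON.stringify({ status, headers: canonicalHeaders, body });
    return createHash("sha256").update(canonical).digest("hex");
}
```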
validator.ts
Secure URL validation module with SSRF protection (src/features/web2/validator.ts)
• Implemented comprehensive URL validation with SSRF protection (a simplified sketch follows below)
• Added private IP address and localhost blocking
• Included URL normalization with canonical formatting
• Created IPv4/IPv6 address validation with security checks
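A simplified version of such a check (scheme allow-list, DNS resolution, and private/loopback/link-local range blocking) might look like this; the range list is not exhaustive and the function name is hypothetical.

```typescript
import { lookup } from "node:dns/promises";
import { isIP } from "node:net";

// Reject URLs whose scheme is not http(s) or whose host resolves to a private,
// loopback, or link-local address. Simplified sketch, not an exhaustive SSRF defense.
const PRIVATE_V4 = [/^10\./, /^127\./, /^169\.254\./, /^192\.168\./, /^172\.(1[6-9]|2\d|3[01])\./, /^0\./];

function isDisallowedAddress(addr: string): boolean {
    if (isIP(addr) === 4) return PRIVATE_V4.some(re => re.test(addr));
    // IPv6: loopback, unspecified, link-local (fe80::/10), unique-local (fc00::/7), v4-mapped
    const a = addr.toLowerCase();
    return a === "::1" || a === "::" || a.startsWith("fe80:") || a.startsWith("fc") ||
        a.startsWith("fd") || a.startsWith("::ffff:");
}

export async function validateTargetUrl(raw: string): Promise<URL> {
    const url = new URL(raw); // throws on malformed input
    if (url.protocol !== "http:" && url.protocol !== "https:") {
        throw new Error(`Disallowed URL scheme: ${url.protocol}`);
    }
    const host = url.hostname.replace(/^\[|\]$/g, ""); // strip IPv6 literal brackets
    // Resolve hostnames so DNS-based redirection to private ranges is caught at validation time.
    const addresses = isIP(host) ? [host] : (await lookup(host, { all: true })).map(r => r.address);
    if (addresses.some(isDisallowedAddress)) {
        throw new Error(`Target resolves to a disallowed address: ${host}`);
    }
    return url;
}
```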
getCommonValidatorSeed.ts
Strengthened validator seed algorithm with genesis anchoring (src/libs/consensus/v2/routines/getCommonValidatorSeed.ts)
• Enhanced CVSA algorithm with genesis block anchoring for security (see the sketch below)
• Improved seed generation using block hash and number combinations
• Added comprehensive security analysis documentation
• Strengthened chain integrity validation against tampering
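Genesis anchoring in a common validator seed typically means the deterministic seed mixes in the genesis hash alongside recent block data; the exact inputs below are an assumption used only to illustrate the idea, not the CVSA implementation.

```typescript
import { createHash } from "node:crypto";

// Derive a deterministic validator seed that every honest node can recompute.
// Mixing in the genesis hash anchors the seed to this specific chain, so a forked or
// tampered chain with a different genesis produces a different validator selection.
export function commonValidatorSeed(genesisHash: string, lastBlockHash: string, lastBlockNumber: number): string {
    return createHash("sha256")
        .update(genesisHash)
        .update(lastBlockHash)
        .update(String(lastBlockNumber))
        .digest("hex");
}
```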
1 file
.env.example
Environment configuration template for social integrations (.env.example)
• Added environment variable template for Twitter and GitHub integration
• Included credentials placeholders for social media authentication
2 files
twitter.ts
Refactored Twitter proof parser to use centralized Twitter utilities (src/libs/abstraction/web2/twitter.ts)
• Simplified Twitter proof parser by delegating to Twitter utility class
• Removed complex cookie management and login logic
• Updated to use centralized Twitter tools for tweet operations
• Enhanced error handling for tweet validation
handleWeb2.ts
Simplified Web2 request handling with streamlined proxy creation (src/features/web2/handleWeb2.ts)
• Simplified Web2 request handling by removing attestation logic
• Streamlined DAHR creation process with direct proxy handling
• Removed complex quorum and signature verification code
• Enhanced error handling and logging
2 files
remoteExample.ts
MCP server examples for remote and local network access (src/features/mcp/examples/remoteExample.ts)
• Created remote MCP server setup example with SSE transport
• Implemented network-accessible MCP server configuration
• Added comprehensive documentation and usage examples
• Included both remote and local server setup patterns
simpleExample.ts
Simple MCP server examples with transport configuration options (src/features/mcp/examples/simpleExample.ts)
• Created basic MCP server setup examples with configuration options
• Implemented both stdio and SSE transport examples
• Added server lifecycle management with graceful shutdown
• Included comprehensive documentation and usage patterns
89 files
Summary by CodeRabbit
New Features
Improvements
Documentation