8 changes: 6 additions & 2 deletions magicblock-chainlink/src/chainlink/mod.rs
@@ -5,7 +5,7 @@ use errors::ChainlinkResult;
use fetch_cloner::FetchCloner;
use log::*;
use magicblock_core::traits::AccountsBank;
use solana_account::AccountSharedData;
use solana_account::{AccountSharedData, ReadableAccount};
use solana_pubkey::Pubkey;
use solana_sdk::{
commitment_config::CommitmentConfig, transaction::SanitizedTransaction,
@@ -137,7 +137,11 @@ impl<T: ChainRpcClient, U: ChainPubsubClient, V: AccountsBank, C: Cloner>
let blacklisted_accounts =
blacklisted_accounts(&self.validator_id, &self.faucet_id);
let removed = self.accounts_bank.remove_where(|pubkey, account| {
!account.delegated() && !blacklisted_accounts.contains(pubkey)
(!account.delegated()
// This fixes the edge-case of accounts that were in the process of
// being undelegated but never completed while the validator was running
|| account.owner().eq(&dlp::id()))
Contributor
Will this be fixed if we set the delegation flag to false in magic-program right when undelegation is requested? @thlorenz @GabrielePicco

Collaborator Author
It would be an improper fix, since we don't know yet whether the account was properly undelegated.
There is a reason for this two-step process.

We will provide a more proper fix soon; this is just an unblocker.

Contributor @taco-paco, Oct 28, 2025
Why is it improper? In the context of the ER it was undelegated; everything else is a matter of committing it properly. We should not bring network- or Intent-service-related issues into the account state.
In the context of the ER, an account is either delegated or not.

Collaborator
I think setting the delegation flag to false is necessary but not sufficient to properly handle all edge cases. We also need additional context to determine when an account is awaiting_undelegation, so we can correctly handle transactions that attempt to use that account while it is in the process of being undelegated versus after it has already been undelegated. In the first case, transactions in the ER should fail. In the second case, the cloning pipeline should "re-fetch" the account.
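For illustration only, here is a minimal Rust sketch of that tri-state idea; the names DelegationStatus and handle_account_use are hypothetical and not part of the current codebase.

// Hypothetical sketch: replace the single delegated flag with a tri-state status
// so the validator can distinguish "undelegation requested" from "undelegation completed".
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum DelegationStatus {
    Delegated,
    // Undelegation was requested but has not been finalized on chain yet.
    AwaitingUndelegation,
    Undelegated,
}

// How a transaction touching the account could be handled in each state.
fn handle_account_use(status: DelegationStatus) -> Result<(), &'static str> {
    match status {
        // Normal path: the ER may keep mutating the account.
        DelegationStatus::Delegated => Ok(()),
        // Undelegation in flight: transactions in the ER should fail.
        DelegationStatus::AwaitingUndelegation => {
            Err("account is awaiting undelegation; reject the transaction")
        }
        // Already undelegated: the cloning pipeline should re-fetch the account.
        DelegationStatus::Undelegated => {
            Err("account was undelegated; re-fetch it via the cloning pipeline")
        }
    }
}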

&& !blacklisted_accounts.contains(pubkey)
});

debug!("Removed {removed} non-delegated accounts");
@@ -99,7 +99,6 @@ async fn ixtest_get_existing_account_for_valid_slot() {

let pubkey = random_pubkey();
let rpc_client = remote_account_provider.rpc_client();
airdrop(rpc_client, &pubkey, 1_000_000).await;

{
// Fetching immediately does not return the account yet
@@ -108,6 +107,12 @@
assert!(!remote_account.is_found());
}

// Fetching account after airdrop will add it to validator and create subscription
airdrop(rpc_client, &pubkey, 1_000_000).await;
sleep_ms(500).await;

// Airdrop again and wait for subscription update to be processed
airdrop(rpc_client, &pubkey, 1_000_000).await;
info!("Waiting for subscription update...");
sleep_ms(1_500).await;

@@ -118,6 +123,7 @@
remote_account_provider.try_get(pubkey).await.unwrap();
assert!(remote_account.is_found());
assert!(remote_account.slot() >= cs);
assert_eq!(remote_account.fresh_lamports().unwrap(), 2_000_000);
}
}

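As a design note on the fixed sleep_ms(...) waits in the test above: a small polling helper could make the wait less timing-sensitive on slow CI machines. This is only a sketch; it assumes the test already runs on a tokio runtime, and wait_until is a hypothetical helper, not part of this PR.

use std::time::Duration;

// Poll an async condition until it holds or the timeout elapses, instead of
// sleeping for a fixed duration and hoping the subscription update has arrived.
async fn wait_until<F, Fut>(mut condition: F, timeout: Duration) -> bool
where
    F: FnMut() -> Fut,
    Fut: std::future::Future<Output = bool>,
{
    let deadline = tokio::time::Instant::now() + timeout;
    while tokio::time::Instant::now() < deadline {
        if condition().await {
            return true;
        }
        tokio::time::sleep(Duration::from_millis(100)).await;
    }
    false
}

The test could then poll until try_get(pubkey) reports the expected 2_000_000 fresh lamports rather than sleeping for a fixed 1.5 seconds.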
4 changes: 2 additions & 2 deletions test-integration/test-cloning/tests/01_program-deploy.rs
@@ -178,12 +178,12 @@ async fn test_clone_mini_v4_loader_program_and_upgrade() {
)
.await;

const MAX_RETRIES: usize = 20;
const MAX_RETRIES: usize = 50;
let mut remaining_retries = MAX_RETRIES;
loop {
ctx.wait_for_delta_slot_ephem(5).unwrap();

let bump = remaining_retries - MAX_RETRIES + 1;
let bump = (remaining_retries - MAX_RETRIES) + 1;
Contributor
⚠️ Potential issue | 🔴 Critical

Fix the incorrect bump calculation formula.

The current formula only works on the first iteration; because both operands are usize, every later iteration underflows:

  • Iteration 1: remaining_retries=50, bump = (50-50)+1 = 1
  • Iteration 2: remaining_retries=49, (49-50) underflows, panicking in debug builds (or wrapping to a huge value in release)

Either way, the retry counter never produces the intended 1, 2, 3, ... sequence of "Hola Mundo" messages.

Apply this diff to fix the formula:

-            let bump = (remaining_retries - MAX_RETRIES) + 1;
+            let bump = (MAX_RETRIES - remaining_retries) + 1;

This will produce incrementing values: 1, 2, 3, 4, ... as expected for a retry counter.
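As a quick sanity check, the following standalone snippet (using the same usize constants as the test) confirms that the corrected formula counts up; it is illustrative only and not part of the PR.

fn main() {
    const MAX_RETRIES: usize = 50;
    let mut remaining_retries = MAX_RETRIES;
    // Simulate the first few retry iterations.
    for expected in 1..=3usize {
        // Corrected formula: grows 1, 2, 3, ... as retries are consumed.
        let bump = (MAX_RETRIES - remaining_retries) + 1;
        assert_eq!(bump, expected);
        println!("Hola Mundo {bump}");
        remaining_retries -= 1;
    }
}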


let msg = format!("Hola Mundo {bump}");
let ix = sdk.log_msg_instruction(&payer.pubkey(), &msg);
let (sig, found) = ctx