FRAME: Move pallets over to use fungible traits #226

Open
7 of 33 tasks
Tracked by #327
gavofyork opened this issue Jan 27, 2023 · 5 comments
Labels
I6-meta A specific issue for grouping tasks or bugs of a specific category. T1-FRAME This PR/Issue is related to core FRAME, the framework.

Comments

gavofyork (Member) commented Jan 27, 2023

  • Move over all pallets using Currency to the fungible traits (a sketch of the target Config shape follows this list):
    • Basic code switch-over: pallets should provide a utility for cancelling a lock/reserve and instating a freeze/hold.
    • The Staking pallet should use a hold instead of a lock.
    • A permissionless, fee-free dispatchable in all pallets to move existing locks/reserves over.
    • Mark all Currency traits as deprecated.
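
For orientation, here is a minimal sketch (not any particular pallet's actual code; `HoldReason::Deposit` and the `Currency` name are illustrative) of what a pallet `Config` tends to look like after the switch-over: the `Currency` bound is replaced by the `fungible` traits, and holds are identified by a pallet-local `HoldReason` composed into the runtime-wide `RuntimeHoldReason`.

```rust
// Minimal sketch of a post-migration pallet config; everything other than the
// frame_support items is illustrative.
#[frame_support::pallet]
pub mod pallet {
	use frame_support::pallet_prelude::*;
	use frame_support::traits::fungible::{Inspect, Mutate, MutateHold};

	#[pallet::pallet]
	pub struct Pallet<T>(_);

	#[pallet::config]
	pub trait Config: frame_system::Config {
		/// The overarching hold reason, composed by the runtime from every pallet's
		/// own `HoldReason` enum.
		type RuntimeHoldReason: From<HoldReason>;

		/// The asset this pallet operates on; these bounds replace the old `Currency` trait.
		type Currency: Inspect<Self::AccountId>
			+ Mutate<Self::AccountId>
			+ MutateHold<Self::AccountId, Reason = Self::RuntimeHoldReason>;
	}

	/// Reasons this pallet places holds on funds (replaces ad-hoc reserves).
	#[pallet::composite_enum]
	pub enum HoldReason {
		/// Funds held as a deposit (illustrative variant).
		Deposit,
	}
}
```

The utility for cancelling a legacy lock/reserve and instating the equivalent freeze/hold would then be exposed through the permissionless, fee-free migration dispatchable mentioned above.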

Eventually, we will also want to deprecate and remove Balances dispatchables:

  • Substitute all usages of the Balances dispatchables with the fungible API (see the sketch after this list).
  • Mark the dispatchables as deprecated.
  • Remove them (this can only be done once XCM execution from private accounts is safe and enabled).
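
As a rough illustration of what that substitution looks like at a call site (the `pay_out` helper and its names are mine, not an agreed API): a direct `Balances::transfer_keep_alive` dispatch becomes a call through `fungible::Mutate`.

```rust
use frame_support::traits::{fungible::Mutate, tokens::Preservation};
use sp_runtime::DispatchError;

/// Transfer `amount` from `from` to `to` through the fungible API instead of
/// dispatching a Balances call. `F` can be e.g. `pallet_balances::Pallet<Runtime>`.
fn pay_out<AccountId: Eq, F: Mutate<AccountId>>(
	from: &AccountId,
	to: &AccountId,
	amount: F::Balance,
) -> Result<F::Balance, DispatchError> {
	// `Preservation::Preserve` keeps `from` above the existential deposit,
	// mirroring the old `transfer_keep_alive` semantics; the amount actually
	// transferred is returned.
	F::transfer(from, to, amount, Preservation::Preserve)
}
```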

Pallet migrations needed

Here's the list of pallets to migrate, turning this into a tracking issue for the migration process.

shawntabrizi (Member) commented:

I made this comment elsewhere, but will bring it here too.

While changing from the Currency trait to the fungible traits, I think you should also update the name of the associated type.

For example, this is weird to me:

/// Currency type that this works on.
type Currency: FunInspect<Self::AccountId, Balance = Self::CurrencyBalance>
	+ FunMutate<Self::AccountId>
	+ FunBalanced<Self::AccountId>
	+ FunHoldMutate<Self::AccountId, Reason = Self::RuntimeHoldReason>;

Perhaps we should use the name NativeBalance for the native token balance (the Balances pallet), and TokenBalance for balances coming from the Assets pallet and the like.
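
For concreteness, a hedged sketch of the rename being suggested (the trait bounds mirror the snippet above; `NativeBalance` and `TokenBalance` are just the proposed names, not an agreed convention):

```rust
use frame_support::traits::fungible::{
	Balanced as FunBalanced, Inspect as FunInspect, Mutate as FunMutate,
	MutateHold as FunHoldMutate,
};

pub trait Config: frame_system::Config {
	/// The native token, normally provided by the Balances pallet.
	type NativeBalance: FunInspect<Self::AccountId>
		+ FunMutate<Self::AccountId>
		+ FunBalanced<Self::AccountId>
		+ FunHoldMutate<Self::AccountId>;
	// A separate `TokenBalance` (or similar) associated type would then cover assets
	// coming from the Assets pallet via the `fungibles` traits.
}
```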

juangirini transferred this issue from paritytech/substrate Aug 24, 2023
the-right-joyce added the I6-meta and T1-FRAME labels and removed the J1-meta label Aug 25, 2023
Ank4n added a commit that referenced this issue Sep 29, 2023
Closes #158. Partially addresses #226.

Instead of the fragile calculation of the current balance by looking at `free
balance - ED`, the Nomination Pool now freezes the ED in the pool reward account
to prevent the account from going below the minimum balance. This also has a
nice side effect: if the ED changes, we know the imbalance between the ED frozen
in the pool and the currently required ED. A pool operator can then diligently
top up the pool with the ED deficit or, vice versa, withdraw the excess they
transferred to the pool (see the sketch after the notable changes below).

## Notable changes
- New call `adjust_pool_deposit`: allows topping up the deficit or withdrawing
the excess funds deposited to the pool.
- Uses the fungible traits (instead of the Currency trait). Since the Nomination
Pools pallet was not doing any locking/reserving previously, no migration is needed for this.
- One-time migration freezing the ED in each of the existing pools (not very
PoV-friendly, but fine for the relay chain).
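
A hedged sketch of the core idea described above (the helper and the `reason` parameter are illustrative, not the pallet's exact code): the ED is frozen in the reward account through the fungible freeze API, and re-running the same call after an ED change realigns the frozen amount.

```rust
use frame_support::traits::fungible::MutateFreeze;
use sp_runtime::DispatchResult;

/// Freeze exactly the minimum balance (ED) in the pool's reward account.
/// `set_freeze` overwrites any existing freeze under the same `reason`, so calling
/// this again after an ED change adjusts the frozen amount accordingly.
fn freeze_pool_deposit<AccountId: Eq, F: MutateFreeze<AccountId>>(
	reason: &F::Id,
	reward_account: &AccountId,
) -> DispatchResult {
	F::set_freeze(reason, reward_account, F::minimum_balance())
}
```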
Ank4n pushed a commit that referenced this issue Apr 9, 2024
Part of #226 
Related #1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by #1296,
needs the `Unbalanced::decrease_balance` fix
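
For the runtime side of the commit above, the swap is essentially one associated type. A hedged sketch follows; `Runtime`, `Balances` and `DealWithFees` are assumed to already exist in the runtime, and the remaining transaction-payment config items stay as they were.

```rust
use pallet_transaction_payment::FungibleAdapter;

impl pallet_transaction_payment::Config for Runtime {
	// Before (deprecated): CurrencyAdapter<Balances, DealWithFees>
	// Fees are now withdrawn via the Balances pallet's `fungible` implementation and
	// the resulting credit is handed to `DealWithFees` (an `OnUnbalanced`-style handler).
	type OnChargeTransaction = FungibleAdapter<Balances, DealWithFees>;
	// ...other associated types as before...
}
```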
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I specify that it's only _an instance_ instead of _instances_ since the currency-exchange
pallet does not support multiple instances itself. What this commit does is make it so
that the different instances of the PoA chains we currently have are compatible with the
currency-exchange pallet through the implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange instances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
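
For readers less familiar with FRAME instancing, the "Instance type parameter" work in the commit above boils down to making the pallet generic over an instance, roughly like this general sketch (not the bridge pallets' actual code):

```rust
// A pallet generic over an instance `I`, with `()` as the default instance, so the
// runtime can include several independent copies (e.g. one per bridged chain).
#[frame_support::pallet]
pub mod pallet {
	#[pallet::pallet]
	pub struct Pallet<T, I = ()>(_);

	#[pallet::config]
	pub trait Config<I: 'static = ()>: frame_system::Config {}
}
```

In `construct_runtime!` the runtime can then list the pallet more than once via `Instance1`, `Instance2`, and so on, which is what lets a single runtime work with several PoA chains using the same pallet code.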
serban300 added a commit to paritytech/parity-bridges-common that referenced this issue Apr 9, 2024
* Migrate fee payment from `Currency` to `fungible` (#2292)

Part of paritytech/polkadot-sdk#226
Related paritytech/polkadot-sdk#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech/polkadot-sdk#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75ac49786a7246531cf729b25c208cd38e6)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (#3982)

- What does this PR do?
1. Upgrades `trie-db` to the latest release. This release includes, among other
things, an implementation of `DoubleEndedIterator` for the `TrieDB` struct,
allowing iteration both backwards and forwards over the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the `trie-db` upgrade, this specifically adds the ability
to iterate backwards over the leaves of a trie with `sp-trie`. In a project we're
currently working on, this comes in very handy for verifying a Merkle proof that
is the response to a challenge. The challenge is a random hash that (most likely)
will not be an existing leaf in the trie, so the challenged user has to provide a
Merkle proof of the previous and next existing leaves in the trie, which surround
the random challenged hash.

Without double-ended iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With double-ended iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed to expose this functionality is bumping the `trie-db` version
number in all the applicable `Cargo.toml`s and re-exporting some additional
structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0fcd37e4e8c14aeb83b5c9e680981e16079)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 added a commit to paritytech/parity-bridges-common that referenced this issue Apr 9, 2024
* Migrate fee payment from `Currency` to `fungible` (#2292)

Part of paritytech/polkadot-sdk#226
Related paritytech/polkadot-sdk#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech/polkadot-sdk#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75ac49786a7246531cf729b25c208cd38e6)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing to iterate both backwards and forwards
within the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the functionality of iterating back on the leafs of a trie, with
`sp-trie`. In a project we're currently working on, this comes very
handy to verify a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user, has to provide a Merkle proof
of the previous and next existing leafs in the trie, that surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed for this functionality to be exposed is changing the
version number of `trie-db` in all the `Cargo.toml`s applicable, and
re-exporting some additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0fcd37e4e8c14aeb83b5c9e680981e16079)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I specify that it's only _an instance_ instead of _instances_ since the currency-exchange
pallet does not support multiple instances itself. What this commit does is make it so
that the different instances of the PoA chains we currently have are compatible with the
currency-exchange pallet through the implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange intances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I specify that it's only _an instance_ instead of _instances_ since the currency-exchange
pallet does not support multiple instances itself. What this commit does is make it so
that the different instances of the PoA chains we currently have are compatible with the
currency-exchange pallet through the implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange intances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
serban300 added a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Migrate fee payment from `Currency` to `fungible` (paritytech#2292)

Part of paritytech#226
Related paritytech#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (paritytech#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing to iterate both backwards and forwards
within the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the functionality of iterating back on the leafs of a trie, with
`sp-trie`. In a project we're currently working on, this comes very
handy to verify a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user, has to provide a Merkle proof
of the previous and next existing leafs in the trie, that surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed for this functionality to be exposed is changing the
version number of `trie-db` in all the `Cargo.toml`s applicable, and
re-exporting some additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 added a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Migrate fee payment from `Currency` to `fungible` (paritytech#2292)

Part of paritytech#226
Related paritytech#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (paritytech#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing to iterate both backwards and forwards
within the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the functionality of iterating back on the leafs of a trie, with
`sp-trie`. In a project we're currently working on, this comes very
handy to verify a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user, has to provide a Merkle proof
of the previous and next existing leafs in the trie, that surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed for this functionality to be exposed is changing the
version number of `trie-db` in all the `Cargo.toml`s applicable, and
re-exporting some additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I specify that it's only _an instance_ instead of _instances_ since the currency-exchange
pallet does not support multiple instances itself. What this commit does is make it so
that the different instances of the PoA chains we currently have are compatible with the
currency-exchange pallet through the implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange intances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
serban300 added a commit to serban300/polkadot-sdk that referenced this issue Apr 9, 2024
* Migrate fee payment from `Currency` to `fungible` (paritytech#2292)

Part of paritytech#226
Related paritytech#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (paritytech#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing to iterate both backwards and forwards
within the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the functionality of iterating back on the leafs of a trie, with
`sp-trie`. In a project we're currently working on, this comes very
handy to verify a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user, has to provide a Merkle proof
of the previous and next existing leafs in the trie, that surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed for this functionality to be exposed is changing the
version number of `trie-db` in all the `Cargo.toml`s applicable, and
re-exporting some additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 10, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I specify that it's only _an instance_ instead of _instances_ since the currency-exchange
pallet does not support multiple instances itself. What this commit does is make it so
that the different instances of the PoA chains we currently have are compatible with the
currency-exchange pallet through the implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange intances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
serban300 added a commit to serban300/polkadot-sdk that referenced this issue Apr 10, 2024
* Migrate fee payment from `Currency` to `fungible` (paritytech#2292)

Part of paritytech#226
Related paritytech#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by paritytech#1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (paritytech#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing to iterate both backwards and forwards
within the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the functionality of iterating back on the leafs of a trie, with
`sp-trie`. In a project we're currently working on, this comes very
handy to verify a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user, has to provide a Merkle proof
of the previous and next existing leafs in the trie, that surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed to expose this functionality is bumping the `trie-db`
version number in all applicable `Cargo.toml` files and re-exporting some
additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
serban300 pushed a commit to serban300/polkadot-sdk that referenced this issue Apr 10, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I say _an instance_ rather than _instances_ because the currency-exchange
pallet does not itself support multiple instances. This commit makes the
different PoA chain instances we currently have compatible with the
currency-exchange pallet through an implementation of the PeerBlockchain trait.
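
For readers unfamiliar with pallet instancing, here is a minimal, self-contained sketch of the general pattern (an instance type parameter with a default instance). All names below are illustrative only; this is not the actual bridge or currency-exchange pallet API.
```rust
use core::marker::PhantomData;

// Marker types acting as instance identifiers.
pub struct DefaultInstance;
pub struct Instance1;

// A pallet configurable per instance: each instance can point at a different peer blockchain.
pub trait Config<I = DefaultInstance> {
    type PeerBlockchain;
}

// The pallet itself is generic over the runtime `T` and the instance `I`.
pub struct Pallet<T, I = DefaultInstance>(PhantomData<(T, I)>);

#[allow(dead_code)]
struct Runtime;
#[allow(dead_code)]
struct RialtoPoA;
#[allow(dead_code)]
struct KovanPoA;

// One runtime configures two instances of the same pallet, each bound to its own chain.
impl Config for Runtime {
    type PeerBlockchain = RialtoPoA; // default instance
}
impl Config<Instance1> for Runtime {
    type PeerBlockchain = KovanPoA; // additional instance
}

fn main() {
    let _rialto: Pallet<Runtime> = Pallet(PhantomData);
    let _kovan: Pallet<Runtime, Instance1> = Pallet(PhantomData);
}
```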

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange instances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
serban300 added a commit to serban300/polkadot-sdk that referenced this issue Apr 10, 2024
* Migrate fee payment from `Currency` to `fungible` (paritytech#2292)

Part of paritytech#226
Related paritytech#1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter` (see the sketch below)
- [x] Blocked by paritytech#1296,
needs the `Unbalanced::decrease_balance` fix
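
For illustration, a minimal sketch of the kind of runtime change this describes. It assumes a surrounding runtime where `Runtime`, `RuntimeEvent`, `Balances` and `DealWithFees` already exist, and it omits the remaining associated types, so it is not a complete, compilable configuration:
```rust
use pallet_transaction_payment::FungibleAdapter;

impl pallet_transaction_payment::Config for Runtime {
    type RuntimeEvent = RuntimeEvent;
    // Previously `CurrencyAdapter<Balances, DealWithFees<Runtime>>`; `CurrencyAdapter`
    // is now deprecated in favour of the `fungible`-based adapter.
    type OnChargeTransaction = FungibleAdapter<Balances, DealWithFees<Runtime>>;
    // ...remaining associated types unchanged...
}
```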

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (paritytech#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing iteration both backwards and forwards
over the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the ability to iterate backwards over the leaves of a trie with
`sp-trie`. In a project we're currently working on, this comes in very
handy for verifying a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user has to provide a Merkle proof
of the previous and next existing leaves in the trie, which surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed to expose this functionality is bumping the `trie-db`
version number in all applicable `Cargo.toml` files and re-exporting some
additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
bkchr pushed a commit that referenced this issue Apr 10, 2024
* Migrate fee payment from `Currency` to `fungible` (#2292)

Part of #226
Related #1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by #1296,
needs the `Unbalanced::decrease_balance` fix

(cherry picked from commit bda4e75)

* Upgrade `trie-db` from `0.28.0` to `0.29.0` (#3982)

- What does this PR do?
1. Upgrades `trie-db`'s version to the latest release. This release
includes, among others, an implementation of `DoubleEndedIterator` for
the `TrieDB` struct, allowing iteration both backwards and forwards
over the leaves of a trie.
2. Upgrades `trie-bench` to `0.39.0` for compatibility.
3. Upgrades `criterion` to `0.5.1` for compatibility.
- Why are these changes needed?
Besides keeping up with the upgrade of `trie-db`, this specifically adds
the ability to iterate backwards over the leaves of a trie with
`sp-trie`. In a project we're currently working on, this comes in very
handy for verifying a Merkle proof that is the response to a challenge. The
challenge is a random hash that (most likely) will not be an existing
leaf in the trie. So the challenged user has to provide a Merkle proof
of the previous and next existing leaves in the trie, which surround the
random challenged hash.

Without having DoubleEnded iterators, we're forced to iterate until we
find the first existing leaf, like so:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        println!("RECONSTRUCTED TRIE {:#?}", trie);

        // Create an iterator over the leaf nodes.
        let mut iter = trie.iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        let mut prev_key = None;
        for element in &mut iter {
            if element.is_ok() {
                let (key, _) = element.unwrap();
                prev_key = Some(key);
                break;
            }
        }
        assert!(prev_key.is_some());

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        assert!(prev_key.unwrap() <= challenge_hash.to_vec());

        // The next element should exist (meaning there is no other existing leaf between the
        // previous and next leaf) and it should be greater than the challenged hash.
        let next_key = iter.next().unwrap().unwrap().0;
        assert!(next_key >= challenge_hash.to_vec());
```

With DoubleEnded iterators, we can avoid that, like this:
```rust
        // ************* VERIFIER (RUNTIME) *************
        // Verify proof. This generates a partial trie based on the proof and
        // checks that the root hash matches the `expected_root`.
        let (memdb, root) = proof.to_memory_db(Some(&root)).unwrap();
        let trie = TrieDBBuilder::<LayoutV1<RefHasher>>::new(&memdb, &root).build();

        // Print all leaf node keys and values.
        println!("\nPrinting leaf nodes of partial tree...");
        for key in trie.key_iter().unwrap() {
            if key.is_ok() {
                println!("Leaf node key: {:?}", key.clone().unwrap());

                let val = trie.get(&key.unwrap());

                if val.is_ok() {
                    println!("Leaf node value: {:?}", val.unwrap());
                } else {
                    println!("Leaf node value: None");
                }
            }
        }

        // println!("RECONSTRUCTED TRIE {:#?}", trie);
        println!("\nChallenged key: {:?}", challenge_hash);

        // Create an iterator over the leaf nodes.
        let mut double_ended_iter = trie.into_double_ended_iter().unwrap();

        // First element with a value should be the previous existing leaf to the challenged hash.
        double_ended_iter.seek(&challenge_hash.to_vec()).unwrap();
        let next_key = double_ended_iter.next_back().unwrap().unwrap().0;
        let prev_key = double_ended_iter.next_back().unwrap().unwrap().0;

        // Since hashes are `Vec<u8>` ordered in big-endian, we can compare them directly.
        println!("Prev key: {:?}", prev_key);
        assert!(prev_key <= challenge_hash.to_vec());

        println!("Next key: {:?}", next_key);
        assert!(next_key >= challenge_hash.to_vec());
```
- How were these changes implemented and what do they affect?
All that is needed to expose this functionality is bumping the `trie-db`
version number in all applicable `Cargo.toml` files and re-exporting some
additional structs from `trie-db` in `sp-trie`.

---------

Co-authored-by: Bastian Köcher <git@kchr.de>
(cherry picked from commit 4e73c0f)

* Update polkadot-sdk refs

* Fix Cargo.lock

---------

Co-authored-by: Liam Aharon <liam.aharon@hotmail.com>
Co-authored-by: Facundo Farall <37149322+ffarall@users.noreply.github.com>
jonathanudd pushed a commit to jonathanudd/polkadot-sdk that referenced this issue Apr 10, 2024
* Add Instance type parameter to pallet

* Sketch out what the runtime could look like

* Allow runtime to compile with multiple bridge pallets

* Cargo Fmt

* Allow an instance of a PoA chain to be used with currency-exchange

I say _an instance_ rather than _instances_ because the currency-exchange
pallet does not itself support multiple instances. This commit makes the
different PoA chain instances we currently have compatible with the
currency-exchange pallet through an implementation of the PeerBlockchain trait.

* Add Instance type parameter to Currency Exchange pallet

* Wire up currency exchange instances in runtime

* Rust Fmt

* Show sccache

* Allow Eth pallet to use a default instance

* Use a default instance in Eth pallet tests

* Remove Rialto and Kovan feature flags

Through some discussions it has been decided that the `bridge-node` should, like
Substrate's `node-template`, be a showcase of the different pallets available in
a project. Because of this I've removed the feature flags for the Rialto and Kovan
networks in favour of having both of them included in the runtime.

* Update the chain_spec to use both Rialto and Kovan configs

* Update pallet level calls used by Substrate client

Allows the project to compile. However, it should be noted that in reality
we shouldn't be hardcoding the pallet we're calling.

* Allow currency-exchange pallet to use a default instance

* Support benchmarking an instance of the Eth pallet

* Update currency exchange benchmarks to work with instances

* Fix test helpers which now need a PoA instance

* Remove Actions for checking Rialto and Kovan features

* Add missing comments

* Update Runtime API string constants

* Add issue number for generic chain support in relay

* Add Runtime APIs for instances of the currency-exchange pallet

* Rust Fmt

Co-authored-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
EgorPopelyaev pushed a commit that referenced this issue May 27, 2024
Part of #226
Related #1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by #1296,
needs the `Unbalanced::decrease_balance` fix
EgorPopelyaev pushed a commit that referenced this issue May 27, 2024
Part of #226
Related #1833

- Deprecate `CurrencyAdapter` and introduce `FungibleAdapter`
- Deprecate `ToStakingPot` and replace usage with `ResolveTo`
- Required creating a new `StakingPotAccountId` struct that implements
`TypedGet` for the staking pot account ID
- Update parachain common utils `DealWithFees`, `ToAuthor` and
`AssetsToBlockAuthor` implementations to use `fungible`
- Update runtime XCM Weight Traders to use `ResolveTo` instead of
`ToStakingPot`
- Update runtime Transaction Payment pallets to use `FungibleAdapter`
instead of `CurrencyAdapter`
- [x] Blocked by #1296,
needs the `Unbalanced::decrease_balance` fix
rustadot pushed a commit to rustadot/recurrency that referenced this issue Sep 5, 2024
# Goal
The goal of this PR is to replace the `Currency` trait with the
`fungible` trait in the `capacity` pallet.

Closes #942 
Closes #1532 

# Discussion
The following Parity issues/PRs were used as references for changes:
[Deprecate Currency - PR
12951](paritytech/substrate#12951) (Explanation
of necessary changes.)
[FRAME: Move pallets over to use fungible
traits](paritytech/polkadot-sdk#226) (Issue to
track Parity's efforts to update their pallets.)
[pallet vesting / update to use
fungible](https://github.com/paritytech/polkadot-sdk/pull/1760/files)
(Example of some necessary changes.)
[Fungibles: migrate Democracy
pallet](paritytech/polkadot-sdk#1861) (Example
of storage migration for Locks->Freezes.)

# Changes
- `.cargo-deny.toml` Added "multiformats" for `cid` crate to fix
warning.
- `vscode/settings.json` Added script that sets up the source map for
the rust std library files for debugging.
-  `Cargo.lock` cargo updated.
-  `e2e/package-lock.json` npm updated.
-  Replaced traits as needed: Use `tokens::fungible::` 
    - `InspectFungible` for `balance()` and `reducible_balance()`
    - `InspectFreeze` for `balance_frozen()`
    - `Mutate` for `set_balance()`, `mint_into()`
    - `MutateFreeze` for `set_freeze()` and `thaw()`
- Added `pub enum FreezeReason` to support `freezes` (see the sketch after this list)
- Updated error handling as `set_freeze()` and `thaw()` can fail, so
errors needed to be propagated.
- Updated pallets/mocks to set MaxFreezes to 2, one for Capacity, one for TimeRelease
- Updated runtime pallet configs to use BalancesMaxXXXXXs
- Updated tests with `.expect()` where `set_freeze()` or `thaw()` can
fail.
- `FreezeIdentifier` and `RuntimeFreezeReason` configured with defaults.
- Added v3 migration to Capacity.
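
In rough terms, the freeze-based calls listed above end up looking like the sketch below. It assumes `T::Currency` is bound to the listed `fungible` traits and that `BalanceOf<T>` is defined; the surrounding `#[frame_support::pallet]` module, the enum derives and the variant name are illustrative, not the pallet's exact code:
```rust
use frame_support::{
    pallet_prelude::DispatchResult,
    traits::fungible::{InspectFreeze, MutateFreeze},
};

// A named reason for freezing funds; in the pallet this is declared with
// `#[pallet::composite_enum]` so it converts into the runtime-wide freeze reason.
pub enum FreezeReason {
    CapacityStaking,
}

fn freeze_staked<T: Config>(who: &T::AccountId, amount: BalanceOf<T>) -> DispatchResult {
    // Unlike the old `Currency::set_lock`, `set_freeze` can fail, so the error is propagated.
    T::Currency::set_freeze(&FreezeReason::CapacityStaking.into(), who, amount)
}

fn thaw_staked<T: Config>(who: &T::AccountId) -> DispatchResult {
    T::Currency::thaw(&FreezeReason::CapacityStaking.into(), who)
}

fn frozen_for_staking<T: Config>(who: &T::AccountId) -> BalanceOf<T> {
    T::Currency::balance_frozen(&FreezeReason::CapacityStaking.into(), who)
}
```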

# Storage Migrations
The value of `BalancesMaxFreezes` has been updated, which will impact
the storage of the Balances pallet by changing `T::MaxFreezes`; see the
code here:
[substrate/frame/balances/src/lib.rs:480](https://github.com/paritytech/polkadot-sdk/blob/release-polkadot-v1.1.0/substrate/frame/balances/src/lib.rs#L480)

```rust
	/// Freeze locks on account balances.
	#[pallet::storage]
	pub type Freezes<T: Config<I>, I: 'static = ()> = StorageMap<
		_,
		Blake2_128Concat,
		T::AccountId,
		BoundedVec<IdAmount<T::FreezeIdentifier, T::Balance>, T::MaxFreezes>,
		ValueQuery,
	>;
```
The previous value of `T::MaxFreezes` was `0`, so no data could be stored
in `Freezes`; therefore, no storage migration for `Freezes` is needed for
this change. Even if there were data in storage, it would only need to be
migrated if `T::MaxFreezes` were *decreased*.

However, the current chain has data in `Locks` that needs to be migrated to
`Freezes`. Testing has shown that these `Locks` will no longer be
accessible once the new traits are in place.

The `Balances` pallet is configured to store account data using the
`System` pallet. Therefore, these two pallets must be included when
using `try-runtime` for testing.

The migration for `Capacity` will access its storage to determine which
accounts have `Locks` that need to be translated to `Freezes`. Then, the
old `Currency` trait is used to remove the `Locks` and the new
`fungible` trait is used to set the `Freeze`.
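
The per-account step of that migration looks roughly like the sketch below. The lock identifier, the `OldCurrency`/`FreezeReason` names and the error handling are illustrative; the real migration also iterates the pallet's staking storage and accumulates weight:
```rust
use frame_support::traits::{fungible::MutateFreeze, LockIdentifier, LockableCurrency};

// Illustrative 8-byte lock identifier; the pallet's real identifier may differ.
const STAKING_ID: LockIdentifier = *b"capacity";

fn translate_lock_to_freeze<T: Config>(who: &T::AccountId, amount: BalanceOf<T>) {
    // Remove the old `Currency` lock...
    T::OldCurrency::remove_lock(STAKING_ID, who);
    // ...and re-instate the same amount as a `fungible` freeze, which can fail.
    if let Err(err) = T::Currency::set_freeze(&FreezeReason::CapacityStaking.into(), who, amount) {
        log::warn!("failed to freeze {:?} for {:?}: {:?}", amount, who, err);
    }
}
```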

# How to Review
- [ ] Read through [Deprecate Currency - PR
12951](paritytech/substrate#12951) to understand
context and check that Currency traits were properly replaced with
fungible traits.
- [ ] Check the impact of changing to `set_freeze()` and `thaw()`, which can
now fail, and make sure all error states are propagated correctly without the
possibility of a `panic`
- [ ] Check whether `balance()` is used correctly or should be changed to
`reducible_balance()`. The calculations evaluating the Existential
Deposit (ED) have been updated, and Parity's comments indicate that
`reducible_balance()` is most likely the value needed (see the sketch after this list).
- [ ] Ensure that the migration weights calculations are correct.
(Please let me know if you would like to walk through the migration code
path together).
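
For reference, a sketch of the `balance()` vs `reducible_balance()` distinction mentioned above. The exact parameters depend on the frame-support version (the `Preservation`/`Fortitude` arguments shown are from recent releases), and `T::Currency`/`BalanceOf<T>` are assumed:
```rust
use frame_support::traits::{
    fungible::Inspect,
    tokens::{Fortitude, Preservation},
};

fn balances<T: Config>(who: &T::AccountId) -> (BalanceOf<T>, BalanceOf<T>) {
    // Total balance, including the existential deposit and any frozen funds.
    let total = T::Currency::balance(who);
    // What can actually be spent right now while keeping the account alive.
    let spendable =
        T::Currency::reducible_balance(who, Preservation::Preserve, Fortitude::Polite);
    (total, spendable)
}
```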

# How to Test Runtime Migrations
[Install the CLI version of
try-runtime](https://paritytech.github.io/try-runtime-cli/try_runtime/#installation),
then run try-runtime to test the migration against Frequency Rococo:
```bash
cargo build --release --features frequency-rococo-testnet,try-runtime && \
try-runtime --runtime ./target/release/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade live --uri wss://rpc.rococo.frequency.xyz:443 --pallet Capacity --pallet Balances --pallet System
```
Alternatively, you can use the non-release version for faster compiles:
```bash
cargo build --features frequency-rococo-testnet,try-runtime && \
try-runtime --runtime ./target/debug/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade live --uri wss://rpc.rococo.frequency.xyz:443 --pallet Capacity --pallet Balances --pallet System
```
You should see output like this:
```bash
[2023-12-16T17:03:57Z INFO  runtime::capacity] migrated 344
[2023-12-16T17:03:57Z INFO  runtime::capacity] migrated 345
[2023-12-16T17:03:57Z INFO  runtime::capacity] migrated 346
[2023-12-16T17:03:57Z INFO  runtime::capacity] migrated 347
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migration finished  
[2023-12-16T17:03:57Z INFO  runtime::capacity] Migration calculated weight = Weight { ref_time: 78200000000, proof_size: 0 }
[2023-12-16T17:03:57Z INFO  runtime::capacity] ✅ migration post_upgrade checks passed

// end of Capacity v2 migration, v3 migration follows

[2023-12-16T17:03:57Z INFO  runtime::capacity] Running pre_upgrade...
[2023-12-16T17:03:57Z INFO  runtime::capacity] Finish pre_upgrade for 347 records
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 Capacity Locks->Freezes migration started
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0xac4e66be328c4d235be27fcb0af8ccbe12dce375236f9ccc5516780522bc8870, amount:1500000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x3048b40a2e7185e510b695c7ba15f31218a1d4a501c6a71596e4f60317fc180f, amount:800000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x663ea97a6b40a51ed79f3328f9629423e5525c84bb178e63a854e4ce497fde25, amount:900000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x1488b10710285607257cbaa5f2ee273b50613358c50e443b8e43790ba7e15705, amount:900000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x283ac37222b6e34afb4b187e41e8e828c2e2d2b7f5bc027a2bf98eccbca76d47, amount:200000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x1a038c038ece2448a22096697a35905a98cc14138666432a1767142e67f76414, amount:900000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0xa848c35ccb975405ceec7d275d2cbbfe0df577f54a4e87caee5a4bf64f68ad01, amount:500000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x2aa39545e21212127c97ce1059988eb7f8dabe94f61e023bde1b85fd9625a638, amount:1000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x9404948a1b48c6818117f5803e5b868515fc88d1bcbd6fa27a1083089e5b5604, amount:900000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0xa0e340ec2c110a262b3f730c1cd333d52d6bb9fcb0b149bdd7216fb8cd4b2468, amount:100000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x28d9dadd32d1864fa4aa553a74ff82637acb131079bfec5b2777df7dea4a2e5d, amount:900000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0x4c54252a95ae9e334be70375d14d51d4d89a6ae2deeaf9ca47091d797f319620, amount:200000000
[2023-12-16T17:03:57Z INFO  runtime::capacity] 🔄 migrated account 0xe2133699b4fdc8444b98f9fddc7039374302a9a41e9ce53eafab113fd236c739, amount:200000000
```

The total weight calculated for the Capacity migration on testnet:
```bash
[2023-12-18T14:50:36Z INFO  runtime::capacity] total accounts migrated from locks to freezes: 347
[2023-12-18T14:50:36Z INFO  runtime::capacity] 🔄 Capacity Locks->Freezes migration finished
[2023-12-18T14:50:36Z INFO  runtime::capacity] Capacity Migration calculated weight = Weight { ref_time: 260375000000, proof_size: 0 }
[2023-12-18T14:50:36Z INFO  runtime::capacity] ✅ migration post_upgrade checks passed
```

The total weight calculated for the Capacity migration on main net:
```bash
[2023-12-18T14:57:04Z INFO  runtime::capacity] 🔄 Capacity Locks->Freezes migration finished
[2023-12-18T14:57:04Z INFO  runtime::capacity] Capacity Migration calculated weight = Weight { ref_time: 1625000000, proof_size: 0 }
[2023-12-18T14:57:04Z INFO  runtime::capacity] ✅ migration post_upgrade checks passed
```

# Upgrade Notes

1. `scripts/upgrade_accounts.py` should be executed to ensure that all
accounts have been upgraded before running the migration.

# Checklist
- [x] Chain spec updated
- [ ] Custom RPC OR Runtime API added/changed? Updated js/api-augment.
- [ ] Design doc(s) updated
- [x] Tests added
- [ ] Benchmarks added
- [x] Weights updated

---------

Co-authored-by: Matthew Orris <--help>
Co-authored-by: Enddy Dumbrique <enddy.dumbrique@amplica.com>
Co-authored-by: Frequency CI [bot] <do-not-reply@users.noreply.github.com>
rustadot pushed a commit to rustadot/recurrency that referenced this issue Sep 5, 2024
…1818)

# Goal
The goal of this PR is to replace the `Currency` trait with the
`fungible` trait in the `time-release` pallet.
This work was split from PR #1779.

Closes #942 
Closes #1833 

# Discussion
The following Parity issues/PRs were used as references for changes:
[Deprecate Currency - PR
12951](paritytech/substrate#12951) (Explanation
of necessary changes.)
[FRAME: Move pallets over to use fungible
traits](paritytech/polkadot-sdk#226) (Issue to
track Parity's efforts to update their pallets.)
[pallet vesting / update to use
fungible](https://github.com/paritytech/polkadot-sdk/pull/1760/files)
(Example of some necessary changes.)
[Fungibles: migrate Democracy
pallet](paritytech/polkadot-sdk#1861) (Example
of storage migration for Locks->Freezes.)

# Changes
-  Replaced traits as needed: Use `tokens::fungible::` 
    - `InspectFungible` for `balance()` and `reducible_balance()`
    - `InspectFreeze` for `balance_frozen()`
    - `Mutate` for `set_balance()`, `mint_into()`
    - `MutateFreeze` for `set_freeze()` and `thaw()`
- Added `pub enum FreezeReason` to support `freezes`
- Updated error handling as `set_freeze()` and `thaw()` can fail, so
errors needed to be propagated.
- Updated runtime pallet configs to use BalancesMaxXXXXXs
- Updated tests with `.expect()` where `set_freeze()` or `thaw()` can
fail.
- `FreezeIdentifier` and `RuntimeFreezeReason` configured with defaults.
- Updated time-release pallet to propagate errors from set_freeze/thaw.
- Added v2 migration to TimeRelease.

# Storage Migrations
The value of `BalancesMaxFreezes` has been updated, which will impact
the storage of the Balances pallet by changing `T::MaxFreezes`; see the
code here:
[substrate/frame/balances/src/lib.rs:480](https://github.com/paritytech/polkadot-sdk/blob/release-polkadot-v1.1.0/substrate/frame/balances/src/lib.rs#L480)

```rust
	/// Freeze locks on account balances.
	#[pallet::storage]
	pub type Freezes<T: Config<I>, I: 'static = ()> = StorageMap<
		_,
		Blake2_128Concat,
		T::AccountId,
		BoundedVec<IdAmount<T::FreezeIdentifier, T::Balance>, T::MaxFreezes>,
		ValueQuery,
	>;
```
The previous value of `T::MaxFreezes` was `0`, so no data could be stored
in `Freezes`; therefore, no storage migration for `Freezes` is needed for
this change. Even if there were data in storage, it would only need to be
migrated if `T::MaxFreezes` were *decreased*.

However, the current chain has data in `Locks` that needs to be migrated to
`Freezes`. Testing has shown that these `Locks` will no longer be
accessible once the new traits are in place.

The `Balances` pallet is configured to store account data using the
`System` pallet. Therefore, these two pallets must be included when
using `try-runtime` for testing.

The migration for `TimeRelease` will access its storage to determine
which accounts have `Locks` that need to be translated to `Freezes`.
Then, the old `Currency` trait is used to remove the `Locks` and the new
`fungible` trait is used to set the `Freeze`.

# How to Review
- [x] Read through [Deprecate Currency - PR
12951](paritytech/substrate#12951) to understand
context and check that Currency traits were properly replaced with
fungible traits.
- [ ] Check the impact of changing to `set_freeze()` and `thaw()`, which can
now fail, and make sure all error states are propagated correctly without the
possibility of a `panic`
- [ ] Check whether `balance()` is used correctly or should be changed to
`reducible_balance()`. The calculations evaluating the Existential
Deposit (ED) have been updated, and Parity's comments indicate that
`reducible_balance()` is most likely the value needed.
- [ ] Ensure that the migration weights calculations are correct.
(Please let me know if you would like to walk through the migration code
path together).

# How to Test Runtime Migrations
[Install the CLI version of
try-runtime](https://paritytech.github.io/try-runtime-cli/try_runtime/#installation),
then run try-runtime to test the migration against Frequency Rococo:
```bash
cargo build --release --features frequency-rococo-testnet,try-runtime && \
try-runtime --runtime ./target/release/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade live --uri wss://rpc.rococo.frequency.xyz:443 --pallet TimeRelease --pallet Balances --pallet System
```
Alternatively, you can use the non-release version for faster compiles:
```bash
cargo build --features frequency-rococo-testnet,try-runtime && \
try-runtime --runtime ./target/debug/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade live --uri wss://rpc.rococo.frequency.xyz:443 --pallet TimeRelease --pallet Balances --pallet System
```
And for testing on main-net:
```bash
cargo build --features frequency,try-runtime && \
try-runtime --runtime ./target/debug/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade live --uri wss://1.rpc.frequency.xyz:443  --pallet Balances --pallet System --pallet TimeRelease
```
Testing with a snapshot:
```bash
# create a snapshot (or use existing one)
try-runtime create-snapshot --uri https://rpc.rococo.frequency.xyz:443 testnet-all-pallets.state

# use the testnet snapshot 
cargo build --features frequency-rococo-testnet,try-runtime && \
try-runtime --runtime ./target/debug/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade \
snap --path testnet-all-pallets.state

# use the mainnet snapshot
cargo build --features frequency,try-runtime && \
try-runtime --runtime ./target/debug/wbuild/frequency-runtime/frequency_runtime.wasm on-runtime-upgrade \
snap --path mainnet-all-pallets.state
```
You should see output like this:
```bash
[2023-12-20T22:06:50Z INFO  runtime::time-release] Running pre_upgrade...
[2023-12-20T22:06:50Z INFO  runtime::time-release] Finish pre_upgrade for 6 records
[2023-12-20T22:06:50Z INFO  runtime::time-release] Running storage migration...
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 Time Release Locks->Freezes migration started
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0x702cfcc9149d3c6f65728d3a5312d66a83fb70ed942cedb8e6450f4198ce7a77, amount:1000000000
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0xce3bcb8ac19cdeb3ee14173be5f474292ff11ae56d4d0fa3f2bdaf24b4ef5842, amount:10000000000000
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0x041a99f3614052bdd5b0aed6ed5805f592aacbcc0d5a443821f3b4339c44c11f, amount:400000050
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0x26db9b4eeb5b5d511abd26903a25578f355e34e33316aeb2d34e846045cc7e45, amount:10000000000
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0xc8d6262ff9fc322e59bcd36a36e310cfc7c50134e309a82f4330648e2eff7368, amount:1411
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 migrated account 0x485dc3b17bcebba3013e47150e588c941a1f9778378367125e98a2e8f140325e, amount:3000000000
[2023-12-20T22:06:50Z INFO  runtime::time-release] total accounts migrated from locks to frozen 6
[2023-12-20T22:06:50Z INFO  runtime::time-release] 🔄 Time Release migration finished
[2023-12-20T22:06:50Z INFO  runtime::time-release] Time Release Migration calculated weight = Weight { ref_time: 4525000000, proof_size: 0 }
[2023-12-20T22:06:50Z INFO  runtime::time-release] ✅ migration post_upgrade checks passed
```

The total weight calculated for the TimeRelease migration on testnet:
```bash
[2023-12-18T14:50:36Z INFO  runtime::time-release] total accounts migrated from locks to frozen 6
[2023-12-18T14:50:36Z INFO  runtime::time-release] 🔄 Time Release migration finished
[2023-12-18T14:50:36Z INFO  runtime::time-release] Time Release Migration calculated weight = Weight { ref_time: 4525000000, proof_size: 0 }
[2023-12-18T14:50:36Z INFO  runtime::time-release] ✅ migration post_upgrade checks passed
```

The total weight calculated for the TimeRelease migration on mainnet:
```bash
[2023-12-18T14:57:04Z INFO  runtime::time-release] 🔄 Time Release migration finished
[2023-12-18T14:57:04Z INFO  runtime::time-release] Time Release Migration calculated weight = Weight { ref_time: 16525000000, proof_size: 0 }
[2023-12-18T14:57:04Z INFO  runtime::time-release] ✅ migration post_upgrade checks passed
```
# Upgrade Notes

1. `scripts/upgrade_accounts.py` should be executed to ensure that all
accounts have been upgraded before running the migration. This step may
be a no-op if all accounts have previously been upgraded.

# Checklist
- [x] Chain spec updated
- [ ] Custom RPC OR Runtime API added/changed? Updated js/api-augment.
- [ ] Design doc(s) updated
- [ ] Tests added
- [ ] Benchmarks added
- [ ] Weights updated

---------

Co-authored-by: Matthew Orris <--help>
Co-authored-by: Aramik <aramikm@gmail.com>
Co-authored-by: Robert La Ferla <robert@onemsg.co>
Labels
I6-meta A specific issue for grouping tasks or bugs of a specific category. T1-FRAME This PR/Issue is related to core FRAME, the framework.
Projects
Status: Draft
Status: In Progress
Development

No branches or pull requests

8 participants