Increase system scalability by improving internal_resolve and admin_destroy_market #101

Closed
sea212 opened this issue Apr 26, 2021 · 6 comments · Fixed by #806
Labels: p:high (High priority, prioritize the resolution of this issue), t:enhancement (The issue described an enhancement)

Comments

@sea212
Member

sea212 commented Apr 26, 2021

Issue
Currently the resolution of a market has a high complexity: for every asset of the market in question, the system iterates over every existing account as well as over every account holding at least one of those assets. This leaves us with a time complexity of ac + bc ∈ O(ac), which ultimately limits the system's scalability by the total number of existing accounts.

a = num. accounts
b = num. accounts with assets for the market in question
c = num. assets for the market in question

Problematic snippet in admin_destroy_market:

for asset in Self::outcome_assets(market_id, market).iter() {
    let accounts = T::Shares::accounts_by_currency_id(*asset);
    T::Shares::destroy_all(*asset, accounts.iter().cloned());
}

Problematic snippet in internal_resolve (used by on_finalize and admin_move_market_to_resolved):

for asset in assets.iter() {
    if let Asset::CategoricalOutcome(_, inner_index) = asset {
        if index == *inner_index {
            continue;
        }
        let accounts = T::Shares::accounts_by_currency_id(*asset);
        T::Shares::destroy_all(*asset, accounts.iter().cloned());
    }
}

---
Both functions use accounts_by_currency_id, which iterates over every account:

fn accounts_by_currency_id(
    currency_id: Self::CurrencyId,
) -> Vec<(T::AccountId, AccountData<T::Balance>)> {
    <Accounts<T>>::iter()
        .filter_map(|(k0, k1, v)| {
            if k1 == currency_id {
                Some((k0, v))
            } else {
                None
            }
        })
        .collect()
}

Potential solution
Use a suitable data structure to efficiently store, search and delete the accounts that hold assets of a specific market, such as a search trie. A StorageMap (using the identity "hasher") should do the job (roughly O(log b) time complexity for every store, search and delete, and O(b) space complexity). With a dedicated, efficient data structure, we iterate over each asset-holding account once per asset of the current market, which removes the need to additionally iterate over every existing account for every asset.
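
A minimal sketch of what such a dedicated index could look like in a FRAME pallet is shown below. The storage name AccountsByAsset, the AssetOf<T> type alias, the hashers and the helper function are assumptions for illustration only, not part of the existing codebase; the actual implementation may choose different keys (e.g. keyed by market id) or hashers.

// Hypothetical index: (outcome asset, account) -> (). All holders of an
// asset can then be visited via `iter_prefix` and removed in bulk,
// without ever touching accounts that do not hold the asset.
#[pallet::storage]
pub type AccountsByAsset<T: Config> = StorageDoubleMap<
    _,
    Blake2_128Concat,
    AssetOf<T>,   // first key: the outcome asset (assumed type alias)
    Blake2_128Concat,
    T::AccountId, // second key: an account holding a non-zero balance
    (),
    ValueQuery,
>;

// Hypothetical resolution helper: only the b holders of `asset` are
// visited, i.e. O(b) instead of O(a) accounts.
fn destroy_holdings_of<T: Config>(asset: AssetOf<T>) {
    for (_account, ()) in AccountsByAsset::<T>::iter_prefix(asset) {
        // Burn `_account`'s balance of `asset` here via the pallet's
        // currency abstraction.
    }
    // Afterwards, drop the index entries for `asset` in one go (the exact
    // API, e.g. `clear_prefix`, depends on the Substrate version).
}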

@lsaether added the p:medium (Medium priority, this issue can wait but should be done fairly soon) label and removed the enhancement label on May 24, 2021
@maltekliemann added this to the v0.3.4 milestone on May 4, 2022
@sea212 modified the milestones: v0.3.4, v0.3.3, v0.3.5 on May 31, 2022
@sea212 added the t:enhancement (The issue described an enhancement) label on Jul 10, 2022
@vivekvpandya
Contributor

vivekvpandya commented Aug 3, 2022

@sea212 As per your solution, will the storage item be in the PredictionMarket pallet? Also, once the storage is populated from the Accounts storage of orml_tokens, why won't it require any updates when the original source, i.e. orml_tokens' storage, gets updated?

It will, and that's a huge problem with that potential solution.

@vivekvpandya
Contributor

One solution is to have a fork of orml_tokens with an extra storage map which maps from currency_id to a BoundedVec (or, if possible, from (market_id, currency_id) to a BoundedVec) and to add the logic that keeps this map updated by doing the required extra work in the orml_tokens traits and extrinsics.

Or fork the orml_tokens pallet and provide a trait from PredictionMarket to update the storage for currency_id -> accounts whenever an account's balance of the given currency_id reaches zero or the account becomes dead.

@zeitgeistpm/substratedevs any thoughts ?
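
A rough sketch of the kind of hook such a fork could expose is given below; the trait name, method names and signatures are purely illustrative assumptions, not an existing orml_tokens API.

/// Hypothetical hook a forked orml_tokens could call whenever a balance
/// changes, so that a consumer pallet (e.g. PredictionMarket) can keep
/// its own currency_id -> accounts index in sync.
pub trait OnHoldersChanged<AccountId, CurrencyId> {
    /// Called when `who` first obtains a non-zero balance of `currency_id`.
    fn on_holder_added(currency_id: &CurrencyId, who: &AccountId);
    /// Called when `who`'s balance of `currency_id` drops to zero or the
    /// account is reaped.
    fn on_holder_removed(currency_id: &CurrencyId, who: &AccountId);
}

/// No-op implementation for runtimes that do not need the index.
impl<AccountId, CurrencyId> OnHoldersChanged<AccountId, CurrencyId> for () {
    fn on_holder_added(_: &CurrencyId, _: &AccountId) {}
    fn on_holder_removed(_: &CurrencyId, _: &AccountId) {}
}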

@sea212
Member Author

sea212 commented Aug 3, 2022

We have discussed this option once and came to the conclusion that it is viable, but it is coupled with an enormous ongoing effort to maintain the orml fork. It might be a last resort; other solutions that require less ongoing maintenance would be preferred.

@vivekvpandya
Contributor

We can possibly create a wrapper pallet which is almost similar to https://github.com/open-web3-stack/open-runtime-module-library/blob/master/currencies/src/lib.rs but provides additional data storage for currency_id -> Vec and speeds up on_resolution().
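
A minimal sketch of that wrapper idea, under stated assumptions, follows: Config, T::Tokens, CurrencyIdOf<T>, BalanceOf<T> and the HoldersOf storage item are hypothetical names of the wrapper pallet, and the code simply delegates to an orml_tokens-style MultiCurrency implementation while recording holders on deposit.

use frame_support::dispatch::DispatchResult;
use orml_traits::MultiCurrency;

/// Sketch of a helper in the wrapper pallet: perform the deposit through
/// the underlying tokens pallet and record `who` as a holder of
/// `currency_id` in the wrapper's own index (`HoldersOf` is assumed to be
/// a StorageDoubleMap declared in this pallet).
fn deposit_and_track<T: Config>(
    currency_id: CurrencyIdOf<T>,
    who: &T::AccountId,
    amount: BalanceOf<T>,
) -> DispatchResult {
    <T::Tokens as MultiCurrency<T::AccountId>>::deposit(currency_id, who, amount)?;
    HoldersOf::<T>::insert(currency_id, who, ());
    Ok(())
}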

@vivekvpandya linked a pull request on Aug 15, 2022 that will close this issue
@sea212 added the p:high (High priority, prioritize the resolution of this issue) label and removed the p:medium (Medium priority, this issue can wait but should be done fairly soon) label on Sep 29, 2022
@vivekvpandya
Contributor

Reopen for future use.

@vivekvpandya reopened this on Sep 30, 2022
@sea212
Member Author

sea212 commented Sep 30, 2022

This issue is resolved. Resolution complexity is now bounded and no longer depends on the number of accounts. A future task that resulted from PR #806 is creating a garbage collection mechanism for worthless outcome tokens and pool shares that applies multi-block cleanup whenever some weight is available (#792).

@sea212 closed this as completed on Sep 30, 2022
@sea212 removed this from the v0.3.6 milestone on Oct 6, 2022