Transactions with blacklisted eth receivers are included in a batch profitability comparison #40
Comments
This is an excellent catch! Thank you for taking the time to investigate the issue. I'll look into it myself today and follow up with more comments.
So I've dug into this, and it seems very straightforward to add the same check from here to here. This looks like a 20-minute patch, maybe an hour or two if you want to make the blacklist check more generic and avoid code duplication. From your comments I assume you're digging into reducing the repeated fee computation that's currently a big part of batch creation? Batch creation is pretty complex, and optimizing it will quickly balloon into a lot of work. Not that I don't think it's important, just wondering about priorities. Thanks for finding and reporting this bug!
Agreed that refactoring should be avoided at this point, and the proposed solution works. One issue is related to the GetFees function (the sum will always be 0).
GT or GTE: either should work there. Good catch on GetFees as well. If you have the fixes locally, could you put them up as PRs? I'd love to review/merge so that we can have them ready for the next upgrade.
…rofitability comparison Gravity-Bridge#40 fix
resolved in #57
When building a batch, transactions are first collected from the outgoing transaction pool in order to calculate the total bridge fee and to check whether a more profitable batch of the same token type already exists in the pool. During this step there is no check of whether the transaction's destination address is on the blacklist. Later, when transactions are collected for inclusion in the batch, additional blacklist filtering is applied. This can lead to the creation of a batch that is less profitable than the same-token-type batch with the highest nonce already in the pool. The double retrieval of the transactions and repeated fee calculation also seems unnecessary.