Move mining logic from NodeJS to Rust #916
Conversation
    header_bytes, initial_randomness, target, batch_size
);

// Return result
Added these because I found this code unnecessarily hard to parse at a glance, but I'm not going to fight for it.
@@ -122,14 +122,14 @@ describe('Miner', () => {

describe('mineHeader', () => {
  beforeEach(() => {
    mocked(blockHeaderModule.hashBlockHeader).mockReset()
    mocked(nativeModule.mineHeaderBatch).mockReset()
Git makes this diff pretty gross; I just rewrote the block entirely, so it might be easier to review without a diff view.
Nice, looks good!
pub fn randomize_header(initial_randomness: i64, i: i64, mut header_bytes: &mut [u8]) -> i64 {
    // The intention here is to wrap randomness between 0 inclusive and Number.MAX_SAFE_INTEGER inclusive
    let randomness = if i > MAX_SAFE_INTEGER + initial_randomness {
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Not necessary for this PR obviously, but I realized it might be easier to support the whole 64-bit randomness space with this Rust miner. Randomness is treated as a number for convenience, but the goal is just to flip bits in this 64-bit value to try to manipulate the hash.
Hmm, I guess the director would need to treat randomness as a bigint if it still wanted to use numbers in that case. Anyway, nothing actionable in this PR here, just a thought that came to mind 😄
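To illustrate that thought: if randomness were treated as a full 64-bit value, the wrap would reduce to native modular arithmetic rather than clamping at `Number.MAX_SAFE_INTEGER`. This is a hypothetical sketch only (the function name `randomize_u64` is made up, not the PR's code), and the JS side would then need to carry the value as a `BigInt` rather than a `Number`:

```rust
// Hypothetical sketch, not the code in this PR: with the full 64-bit
// randomness space, wrapping is just u64 arithmetic modulo 2^64 and no
// MAX_SAFE_INTEGER bound is needed.
fn randomize_u64(initial_randomness: u64, i: u64) -> u64 {
    // wrapping_add wraps on overflow instead of panicking in debug builds
    initial_randomness.wrapping_add(i)
}

fn main() {
    assert_eq!(randomize_u64(5, 10), 15);
    assert_eq!(randomize_u64(u64::MAX, 1), 0); // wraps around 2^64
    println!("ok");
}
```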
* Add TCP Secure flag to allow secure config over TCP (#908)
* Don't set config overrides unless not default value (#910)
  We found a bug where the base command was setting config overrides for values that are the same as the flag's default value, causing the config file to be ignored. Now we only set the config override if it differs from the default, so the config isn't overridden by the flag default.
* Fix the Sapling related links (#907)
* Move mining logic from NodeJS to Rust (#916)
  * Move mining logic from NodeJS to Rust
  * Linter/license cleanup
  * Fix one more lint check
  * Add trait since it already implements it
* Change ironfish-cli build dir to ./ from ./src (#915)
  Needed so we can include package.json in the source.
* Fix up ChainProcessor and use in Accounts store for processing blocks (#914)
  * Remove unused name field from chainProcessor
  * Tag the chainProcessor logger
  * Throw error if chain processor head not found in chain
  * Make chainProcessor add/remove private — these should only be called inside chainProcessor when updating the head
  * Set hash to previousBlockHash when removing blocks
  * Use chainProcessor in Accounts
  * Add hashChanged to chainProcessor update
  * Remove unnecessary test
  * Fix circular dependency import
  * Clear the chain processor when resetting accounts
  Co-authored-by: NullSoldier <nullprogrammer@gmail.com>
* Allow version to be driven by lib consumer (#917)
  Previously, the version and agent string sent out everywhere were driven by the library. That doesn't tell the full story, because the real version belongs to whoever is consuming the library and potentially writing custom code on top of it. This lets consumers override the package and produce their own agent and version using the new Package type, which can be produced from a package.json file and passed down into the node.
* feat(cli,ironfish): Show heap total in status and rename RSS to Memory (#918)
  * Fix test
* Format miner hash rate using metric system (#921)
  Formats the hash rate in metric increments, similar to file sizes, while still printing the original value. Also adds tests for FileUtils and fixes a few rounding bugs that produced odd values.
  * Removed double license
* Add debug command with accounts head info (#920)
  * Add more system info, including whether telemetry is enabled
  * Consistent casing and name fix
  * Update Package usage to match new code
* Minor typos in README.md (#919)
  `More details on our documentation` -> `More details in our documentation`, and hyphenating `command line interface` -> `command-line interface`. Documentation update, not a breaking change.
* Bump version to 0.1.20

Co-authored-by: mat-if <97762857+mat-if@users.noreply.github.com>
Co-authored-by: Andrei Singular/Kitezh <77510705+andreysingular@users.noreply.github.com>
Co-authored-by: Derek Guenther <derek@ironfish.network>
Co-authored-by: Rohan Jadvani <5459049+rohanjadvani@users.noreply.github.com>
Co-authored-by: Ben Holden-Crowther <4144334+blrhc@users.noreply.github.com>
Summary
Moves the actual logic of hashing and comparing against the target into Rust for a straightforward performance improvement. Napkin-math comparison on my machine shows a ~10-15x speedup.
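The batch-mining idea being moved to Rust can be sketched roughly as follows. This is a minimal illustration with hypothetical names (`mine_batch`, `toy_hash`) and a toy stand-in hash, not the PR's actual implementation, which hashes real block header bytes:

```rust
// Stand-in for a real cryptographic hash (FNV-1a style mixing); the actual
// miner uses a proper block-header hash.
fn toy_hash(bytes: &[u8]) -> u64 {
    bytes.iter().fold(0xcbf29ce484222325u64, |h, &b| {
        (h ^ b as u64).wrapping_mul(0x0000_0100_0000_01b3)
    })
}

/// Hypothetical batch loop: for each candidate randomness in
/// [initial, initial + batch_size), write it into the header, hash, and
/// compare against the target (lower hash is better). Returns the first
/// randomness that meets the target, if any.
fn mine_batch(header: &mut [u8; 8], initial: u64, batch_size: u64, target: u64) -> Option<u64> {
    for i in 0..batch_size {
        let randomness = initial.wrapping_add(i);
        header.copy_from_slice(&randomness.to_be_bytes()); // randomize the header
        if toy_hash(header) <= target {
            return Some(randomness);
        }
    }
    None
}

fn main() {
    let mut header = [0u8; 8];
    // An all-ones target accepts any hash, so the first candidate wins.
    assert_eq!(mine_batch(&mut header, 42, 10, u64::MAX), Some(42));
    println!("ok");
}
```

Doing this whole loop on the native side avoids crossing the NodeJS/native boundary once per hash attempt, which is where the bulk of the speedup comes from.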
Testing Plan
Ran side-by-side with the original NodeJS implementation using 2 separate nodes.
Breaking Change
Is this a breaking change? If yes, add notes below on why this is breaking and
what additional work is required, if any.