large file #6
Hi, thanks for using my code. Are you using subtracted keys for the 120 puzzle? BSGS works best with fewer public keys in the file: every 2^X public keys in the file will cut the speed in half, or in other words will double the time of each thread cycle. Let's suppose that a single public key takes 10 seconds per cycle, two keys would take 20 seconds, and so on; the math behind the speed is per public key, so make your own time estimates. Good luck, best regards!
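The scaling described above can be sketched numerically; the 10-second base time is the hypothetical figure from the comment, not a measured value:

```python
# Estimate the BSGS thread-cycle time as the number of public keys in the
# file grows. Per the comment above, each doubling of the key count doubles
# the cycle time, i.e. time scales linearly with the number of keys.
BASE_CYCLE_SECONDS = 10  # hypothetical: one public key -> 10 s per cycle

def cycle_time(num_pubkeys: int) -> float:
    """Linear scaling: N keys take N times the single-key cycle time."""
    return BASE_CYCLE_SECONDS * num_pubkeys

for n in (1, 2, 4, 1024):
    print(f"{n} pubkeys -> ~{cycle_time(n)} s per cycle")
```

So a file with 20 million keys would make each cycle roughly 20 million times longer than a single-key file, which is why the maintainer suggests far fewer keys per file below.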
Thank you.
Try first with 50 or 100 public keys; it will take some time, but it will work. For 20 million keys I recommend you use xpoint mode, like this: ./keyhunt -m xpoint -f random_xvalues.txt -b 120 -s 60 -R. The file must contain only the 32-byte X value of each public key in hexadecimal format, one per line.
The speed of this mode is not as fast, but if you generate 20 million subtracted keys, the effective speed is the program speed multiplied by the number of subtracted keys.
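A minimal sketch of what a subtracted-key generator could look like, using pure-Python secp256k1 arithmetic (the curve constants are the standard secp256k1 parameters; the function names, the step/count parameters, and the demo key 12345 are illustrative assumptions, not keyhunt internals):

```python
# Generate "subtracted" public keys P - i*G for a target public key P and
# print only the X coordinates in hex, the format xpoint mode expects.

P_FIELD = 2**256 - 2**32 - 977  # secp256k1 field prime
N_ORDER = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def inv(a, p=P_FIELD):
    # Modular inverse via Fermat's little theorem (p is prime).
    return pow(a, p - 2, p)

def add(p1, p2):
    # Affine point addition; None represents the point at infinity.
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    x1, y1 = p1
    x2, y2 = p2
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1) * inv(2 * y1) % P_FIELD
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P_FIELD
    x3 = (lam * lam - x1 - x2) % P_FIELD
    return (x3, (lam * (x1 - x3) - y1) % P_FIELD)

def mul(k, point=G):
    # Double-and-add scalar multiplication.
    result, addend = None, point
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

def subtracted_xvalues(pubkey, count, step):
    """Yield hex X values of pubkey - i*step*G for i = 0..count-1."""
    neg_step = mul(N_ORDER - step)  # -step*G, since N_ORDER*G is infinity
    point = pubkey
    for _ in range(count):
        yield format(point[0], "064x")
        point = add(point, neg_step)

# Demo only: pretend private key 12345 is the unknown key.
target = mul(12345)
for line in subtracted_xvalues(target, 5, 1):
    print(line)
```

In practice you would redirect the output to a file (e.g. `> random_xvalues.txt`) and, if a subtracted key is ever found by keyhunt, recover the original key by adding back i*step. This pure-Python version is slow; for 20 million keys a C implementation or a compiled library would be the practical choice.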
Yes, that is what I was looking for, thank you.
@thearthouse Can you post a script to automate generation of subtracted keys by distance and number of keys to be generated? Thanks. I'm also reachable at my username's Gmail address.
I have generated 20 million random BSGS public keys in the range 800000000000000000000000000000:FFFFFFFFFFFFFFFFFFFFFFFFFFFFFF, loaded the file into keyhunt, and run the command:
./keyhunt -m bsgs -f random_pubs.txt -r 800000000000000000000000000000:FFFFFFFFFFFFFFFFFFFFFFFFFFFFFF -s 60 -R
and it is stuck here:
[+] Version 0.1.20210112 BSGS
[+] Setting mode BSGS
[+] Stats output every 60 seconds
[+] Setting random mode.
[+] Opening file random_pubs.txt
[+] Added 19999999 points from file
[+] Setting N up to 17592186044416.
[+] Init bloom filter for 4194304 elements : 7.00 MB
[+] Allocating 128.00 MB for aMP Points
[+] Precalculating 4194304 aMP points
[+] Allocating 160.00 MB for bP Points
[+] precalculating 4194304 bP points
[+] Sorting 4194304 elements
[+] Thread 0: 0000000000000000000000000000000000ec842d5ee3e39c0f384e482569bc7f  <- running for 1 hour with no update
Why: I want to test whether keyhunt can load a large file, and I also want to test the randomness.