What is the upper bound of "imported new state entries"? #14647

Closed · sonulrk opened this issue Jun 18, 2017 · 207 comments

@sonulrk commented Jun 18, 2017

System information

Geth version: 1.6.5
OS & Version: Windows 7 x64
geth Command: geth --fast --cache 8192

Expected behaviour

Geth should start in full mode.

Actual behaviour

After nearing the current block, geth continuously reports "Imported new state entries".

Steps to reproduce the behaviour

It has been running for 10 days now.

Geth console info

eth.blockNumber
6
eth.syncing
{
currentBlock: 3890742,
highestBlock: 3890893,
knownStates: 17124512,
pulledStates: 17105895,
startingBlock: 3890340
}
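
The numbers above also show why there is no fixed upper bound to watch for: knownStates counts only the trie entries discovered so far and keeps growing as the sync proceeds, so the gap to pulledStates is a moving target rather than a percentage of a known total. A minimal progress read-out for the geth attach console, as a sketch (the wording of the output is mine):

// Paste into `geth attach`: prints block and state progress from eth.syncing.
var s = eth.syncing;
if (s) {
  console.log("blocks " + s.currentBlock + "/" + s.highestBlock +
              ", states " + s.pulledStates + "/" + s.knownStates +
              " (" + (s.knownStates - s.pulledStates) + " known entries still pending)");
} else {
  console.log("eth.syncing is false: either fully synced or sync has not started");
}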

Backtrace

INFO [06-18|10:10:31] Imported new state entries count=384 elapsed=22.001ms processed=17118951 pending=24263
INFO [06-18|10:10:32] Imported new state entries count=384 elapsed=33.001ms processed=17119335 pending=23819
INFO [06-18|10:10:33] Imported new state entries count=384 elapsed=111.006ms processed=17119719 pending=23875
INFO [06-18|10:10:34] Imported new state entries count=384 elapsed=131.007ms processed=17120103 pending=23855
INFO [06-18|10:10:35] Imported new state entries count=384 elapsed=116.006ms processed=17120487 pending=23978
INFO [06-18|10:10:36] Imported new state entries count=384 elapsed=134.007ms processed=17120871 pending=24186
INFO [06-18|10:10:38] Imported new state entries count=384 elapsed=305.017ms processed=17121255 pending=27727
INFO [06-18|10:10:42] Imported new state entries count=384 elapsed=448.025ms processed=17121639 pending=33614
INFO [06-18|10:10:46] Imported new state entries count=384 elapsed=441.025ms processed=17122023 pending=39642
INFO [06-18|10:10:48] Imported new state entries count=384 elapsed=44.002ms processed=17122407 pending=39170
INFO [06-18|10:10:52] Imported new state entries count=384 elapsed=427.024ms processed=17122791 pending=45142
INFO [06-18|10:10:55] Imported new state entries count=384 elapsed=473.027ms processed=17123175 pending=51166
INFO [06-18|10:10:58] Imported new state entries count=384 elapsed=448.025ms processed=17123559 pending=57128
INFO [06-18|10:11:01] Imported new state entries count=384 elapsed=444.025ms processed=17123943 pending=63129
INFO [06-18|10:11:04] Imported new state entries count=384 elapsed=441.025ms processed=17124327 pending=69173
INFO [06-18|10:11:04] Imported new state entries count=1 elapsed=0s processed=17124328 pending=69172
INFO [06-18|10:11:07] Imported new state entries count=384 elapsed=442.025ms processed=17124712 pending=75182
INFO [06-18|10:11:10] Imported new state entries count=384 elapsed=470.026ms processed=17125096 pending=81186
INFO [06-18|10:11:11] Imported new state entries count=384 elapsed=335.019ms processed=17125480 pending=81736
INFO [06-18|10:11:14] Imported new state entries count=384 elapsed=440.025ms processed=17125864 pending=87718
INFO [06-18|10:11:15] Imported new state entries count=384 elapsed=140.008ms processed=17126248 pending=87812
INFO [06-18|10:11:16] Imported new state entries count=384 elapsed=31.001ms processed=17126632 pending=87226
INFO [06-18|10:11:18] Imported new state entries count=384 elapsed=88.005ms processed=17127016 pending=87040
INFO [06-18|10:11:19] Imported new state entries count=384 elapsed=39.002ms processed=17127400 pending=86803
INFO [06-18|10:11:20] Imported new state entries count=384 elapsed=36.002ms processed=17127784 pending=86585
INFO [06-18|10:11:23] Imported new state entries count=1 elapsed=0s processed=17127785 pending=86272
INFO [06-18|10:11:23] Imported new state entries count=384 elapsed=1.610s processed=17128169 pending=86271
INFO [06-18|10:11:25] Imported new state entries count=384 elapsed=143.008ms processed=17128553 pending=87792
INFO [06-18|10:11:28] Imported new state entries count=384 elapsed=183.010ms processed=17128937 pending=90117
INFO [06-18|10:11:28] Imported new state entries count=1 elapsed=1ms processed=17128938 pending=90120
INFO [06-18|10:11:28] Imported new state entries count=1 elapsed=0s processed=17128939 pending=90118
INFO [06-18|10:11:29] Imported new state entries count=384 elapsed=102.005ms processed=17129323 pending=90022
INFO [06-18|10:11:30] Imported new state entries count=384 elapsed=184.010ms processed=17129707 pending=92320
INFO [06-18|10:11:32] Imported new state entries count=384 elapsed=185.010ms processed=17130091 pending=94665
INFO [06-18|10:11:34] Imported new state entries count=384 elapsed=187.010ms processed=17130475 pending=97053
INFO [06-18|10:11:36] Imported new state entries count=384 elapsed=194.011ms processed=17130859 pending=99550
INFO [06-18|10:11:38] Imported new state entries count=384 elapsed=183.010ms processed=17131243 pending=101954
INFO [06-18|10:11:40] Imported new state entries count=384 elapsed=202.011ms processed=17131627 pending=104395
INFO [06-18|10:11:42] Imported new state entries count=384 elapsed=196.011ms processed=17132011 pending=106904
INFO [06-18|10:11:44] Imported new state entries count=384 elapsed=186.010ms processed=17132395 pending=109176
INFO [06-18|10:11:47] Imported new state entries count=384 elapsed=184.010ms processed=17132779 pending=111554
INFO [06-18|10:11:47] Imported new state entries count=2 elapsed=184.010ms processed=17132781 pending=111554
INFO [06-18|10:11:48] Imported new state entries count=384 elapsed=34.002ms processed=17133165 pending=110760
INFO [06-18|10:11:50] Imported new state entries count=384 elapsed=193.011ms processed=17133549 pending=113172

@ghost commented Jun 28, 2017

Yes, exact same effect here.
Command is: geth --syncmode=fast --cache=4096 console

@fjl (Contributor) commented Jun 28, 2017

Please try geth v1.6.6.

@n0-m4d commented Jun 30, 2017

in my case v1.6.6 does not fix it

@pebwindkraft commented Jul 2, 2017

Same here; I described the detailed status in #14571.

@egortsaryk9 commented Jul 4, 2017

The same for me with --testnet (Ropsten) on macOS. The geth version is 1.6.6.
I'm running:
geth --testnet --syncmode "fast" --rpc --rpcapi db,eth,net,web3,personal --cache=1024 --rpcport 8545 --rpcaddr 127.0.0.1 --rpccorsdomain "*" --bootnodes "enode://20c9ad97c081d63397d7b685a412227a40e23c8bdc6688c6f37e97cfbc22d2b4d1db1510d8f61e6a8866ad7f0e17c02b14182d37ea7c3c8b9c2683aeb6b733a1@52.169.14.227:30303,enode://6ce05930c72abc632c58e2e4324f7c7ea478cec0ed4fa2528982cf34483094e9cbc9216e7aa349691242576d552a2a56aaeae426c5303ded677ce455ba1acd9d@13.84.180.240:30303"
and when it comes near the latest blocks (per https://ropsten.etherscan.io/) it continuously "imports new state entries". If I restart with the command above, it fetches more recent blocks but never reaches the latest ones.

@porfavorite commented Jul 8, 2017

Same here... geth is 1.6.6, running geth --testnet --fast --cache=1024

@adhicl commented Jul 11, 2017

I am in the same state after a week of using geth --fast --cache=1024. Does anyone know what I should do now?

@thecopy commented Jul 17, 2017

Same situation on testnet. OSX, geth 1.6.7-stable-ab5646c5. Started with --fast and --cache 1500

@sonulrk (Author) commented Jul 17, 2017

What I could understand from this geth fast mode problem is that:

  1. You must use a quad-core processor with 4 GB or more of RAM.
  2. You must use an SSD instead of an HDD.
  3. Your internet connection must be at least 2 Mbps and reliable.

If all of the above are checked, then you should try geth. In full mode you would most probably be synced in a week or two at most. In fast mode it depends on your luck: you are most probably synced in 2-3 days, or never.

@clowestab commented Aug 23, 2017

Encountering similar issues. Geth --fast sync does not sync, let alone fast.

Using 1.6.7-stable on Ubuntu, and it gets within 100 blocks and then endlessly imports state entries.

@alfkors commented Sep 1, 2017

@clowestab did you ever get it to sync?

@kashkalik commented Sep 2, 2017

Having the same issue. Also using Ubuntu, and geth syncs endlessly.

@mboehler commented Sep 7, 2017

I am having the same issue. Has anyone been able to solve this problem?

@brennino commented Sep 8, 2017

Same problem here. I started geth with a 1024 cache and fast sync three days ago and, after reaching the last block number one day ago, it has never left the "Imported new state entries" phase.

During this "state entries" phase, however, my balance changed from 0 to the correct value and the reported block number changed from 0 to a number near the highestBlock value.

This is the geth version on my Ubuntu machine:

Geth
Version: 1.6.7-stable
Git Commit: ab5646c
Architecture: amd64
Protocol Versions: [63 62]
Network Id: 1
Go Version: go1.8.1
Operating System: linux
GOPATH=
GOROOT=/usr/lib/go-1.8

and this is the result now every half second:
...
INFO [09-08|10:05:34] Imported new state entries count=11 flushed=7 elapsed=216.807ms processed=25170515 pending=1852 retry=1 duplicate=5293 unexpected=6261
INFO [09-08|10:05:34] Imported new state entries count=2 flushed=5 elapsed=521.989µs processed=25170517 pending=1847 retry=0 duplicate=5293 unexpected=6261
INFO [09-08|10:05:34] Imported new state entries count=7 flushed=1 elapsed=263.738ms processed=25170524 pending=1862 retry=2 duplicate=5293 unexpected=6261
INFO [09-08|10:05:34] Imported new state entries count=2 flushed=4 elapsed=6.934ms processed=25170526 pending=1859 retry=0 duplicate=5293 unexpected=6261
INFO [09-08|10:05:34] Imported new state entries count=1 flushed=0 elapsed=27.828ms processed=25170527 pending=1861 retry=1 duplicate=5293 unexpected=6261
INFO [09-08|10:05:34] Imported new state entries count=5 flushed=5 elapsed=19.440ms processed=25170532 pending=1860 retry=0 duplicate=5293 unexpected=6261
...

this is the result of eth.syncing and other geth tools:

eth.syncing
{
currentBlock: 4249131,
highestBlock: 4250814,
knownStates: 25172364,
pulledStates: 25170517,
startingBlock: 0
}

net.peerCount
25

eth.blockNumber
4244762

The ether balance of my wallet is not 0; it is reported correctly and reflects the last transaction, made one day ago.

How long is this state supposed to last?

@brennino commented Sep 8, 2017

One more piece of information: my .ethereum folder is currently 41 GB, maybe a little too large for a proper fast sync.

@sonulrk (Author) commented Sep 8, 2017

I think never.

@brennino commented Sep 8, 2017

UPDATE: I stopped geth with CTRL-D and reopened it. Now the "Imported new state entries" phase seems to have stopped and geth is working correctly, importing only new blocks.
It seems the problem is that fast sync keeps downloading states forever and is not aware that the blockchain is already in a correct, consistent state.

So, for now, until the issue is solved, this is my advice:

  • Start geth with fast sync.
  • Wait until eth.syncing reports a currentBlock near the highestBlock.
  • After that, every few hours run eth.blockNumber; at first it will probably return 0, but keep waiting (see the console sketch after this list).
  • When eth.blockNumber returns a value different from 0 and near the currentBlock value, close geth (with CTRL-D or the "exit" command, so it shuts down cleanly and under control) and wait until the program closes properly and your operating system shell comes back.
  • Reopen geth with the fast option. You will see the warning "Blockchain not empty, fast sync disabled"; this is the correct behavior, telling you that fast sync has finished.
  • Now the "Imported new state entries" messages disappear and you just see messages like this every few seconds:
    INFO [09-08|21:12:06] Imported new chain segment blocks=1 txs=131 mgas=5.379 elapsed=13.855s mgasps=0.388 number=4251422 hash=697652…d85ce8
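
A minimal console sketch of the check described in the steps above (the function name and the 100-block margin are arbitrary choices, not anything geth provides):

// Paste into the geth console; true means it looks safe to stop and restart
// per the procedure above (eth.blockNumber > 0 and close to the sync head).
function readyToRestart() {
  var s = eth.syncing;
  var head = s ? s.highestBlock : eth.blockNumber;
  var ok = eth.blockNumber > 0 && (head - eth.blockNumber) < 100; // arbitrary margin
  console.log("blockNumber=" + eth.blockNumber + " head=" + head + " ready=" + ok);
  return ok;
}
readyToRestart();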

Your problem is now solved, and geth probably avoids downloading a lot of extra states, reducing the hard disk space it uses too.

This holds until the issue is fixed and geth becomes aware of when the blockchain is correctly synced.

This is my experience; it may or may not work for you. For me it worked.
Hope this helps,
Marco

@sonulrk (Author) commented Sep 8, 2017

Congrats, but I had run it for more than 8 hours and eth.blockNumber always showed 6. I had to switch to Parity to sync the blockchain.
Edit: Fast sync works only on the first run; on consecutive runs you have to run in full mode, or geth automatically gives the warning "Blockchain not empty, fast sync disabled" and continues in full mode.

@brennino commented Sep 9, 2017

I think you just have to wait until eth.blockNumber shows a number near currentBlock before closing it and starting it again.
I forgot to say: remove the old blockchain before starting the procedure I described.
Yes, fast sync can only be used with an empty blockchain, and only the first time.
The command to clear the whole blockchain is "geth removedb" in the operating system shell; it removes everything that has been downloaded before.
After that you are able to start a fast sync again from an empty blockchain and follow the procedure from my previous post, hoping it works.

I'm not a geth developer, I just use it, so I can't solve the problem or tell you what it does internally, why the command returns "6", or what you have to do. But it seems that it downloads a lot of states and, when it finds the head state, it is able to build the full blockchain. For me this happened when eth.syncing showed a knownStates near 20,000,000, but it can happen before or after.

During my test, after fast sync finished downloading all the block headers, it took more than 24 additional hours before eth.blockNumber showed 4244762. I ran geth on a server with a 100 Mb/s connection.

When it showed "0" I let it keep working, and after 24 hours the command returned 4244762. I didn't run the command in between, so I don't know whether it returns other numbers before reaching the last block.

I have never used Parity, but it seems good and uses less disk space than geth, so it's worth a try.
Maybe some geth dev can make things clearer.

@fjl (Contributor) commented Sep 10, 2017

We believe this is fixed on the master branch. Fast sync takes a while (especially with the mainnet), but will terminate eventually.

@vincentvc commented Sep 12, 2017

@brennino
My eth.blockNumber shows 0 after several days of syncing. I am wondering whether the fast sync will fail if I stop and restart the sync process several times in the middle.

@manicprogrammer (Contributor) commented Sep 12, 2017

@vincentvc fast sync in geth only works when the database is empty, so you get one chance to fast sync and then it will be full sync after that. [STATED IN ERROR THE FOLLOWING - THIS IS INCORRECT AS POINTED OUT BY FJL: thus yes, if you stop and restart any time before that first fast sync finishes, you won't do a fast sync from that point.] My experience with the scenario listed here was twofold: I used the latest build off of master and I made sure I had my database on an SSD. It doesn't seem like the SSD vs. HDD thing should matter so much, but in my experience, until I put it on the SSD I could never get that first sync to finish up. Not to say that is true for everyone, just my experience.

@skarn01 commented Sep 13, 2017

I'm having the same problem, continuous state imports. I'm currently trying what @brennino suggests and will come back with results later... currently 350k states processed.

Here is some info:

> eth.syncing
{
  currentBlock: 4269853,
  highestBlock: 4270000,
  knownStates: 357664,
  pulledStates: 348163,
  startingBlock: 4268019
}

net.peerCount 10

eth.blockNumber 0

Update: almost 24h later, here are the numbers:
blockNumber: 0

eth.syncing
{
currentBlock: 4270728,
highestBlock: 4270793,
knownStates: 6879452,
pulledStates: 6875584,
startingBlock: 4268019
}

Imported state entries still going; gonna check back tomorrow...

@brennino commented Sep 14, 2017

Hi @skarn01, I don't know if this happens only to me, but when I start fast sync with an empty blockchain, startingBlock always shows 0. You can see my eth.syncing in my previous post, which I repeat here:

eth.syncing
{
currentBlock: 4249131,
highestBlock: 4250814,
knownStates: 25172364,
pulledStates: 25170517,
startingBlock: 0
}

Maybe something went wrong during fast sync, or you closed geth before eth.blockNumber showed a number near the last block. Blockchain sync is really time consuming, and you shouldn't stop fast sync until it finishes, or until eth.blockNumber != 0 and is near highestBlock.

What can I say to help you... it seems you are not starting fast sync from an empty blockchain, so if I were you I would start again from the beginning.

If you don't want to start again (and I understand you, I went crazy for days before getting any results...) you have these options:

1 - If you just want to make a transaction because ether is falling in price right now and you are in a panic, you can use light sync, make your transaction and, when you stop praying and shouting in your room for a price peak, calmly try to sync your blockchain with fast sync again.
2 - Wait two more days in your current situation and see if something changes.

About point 1, I think light sync is not an experimental feature any more (but maybe someone else can confirm), and I succeeded in making a transaction from a light-synced node without problems.
If you go with point 2 and want to start again later, you can close geth (because your current state is already corrupted) and rename your blockchain directory .ethereum/geth. If instead you are fine with clearing your current blockchain, just run geth removedb in the operating system shell.
After that, to start a light sync (different from fast sync!), I started geth with the command:
geth --light --cache=1024
and waited just 2 or 3 hours to download a ~600 MB light chain. After that you can make your transaction (see the console sketch below). Again, this is just my experience, shared to help people; no responsibility if you lose your ether.
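
A minimal sketch of the kind of transaction described above, from the geth console; the recipient address, amount and passphrase are placeholders, so double-check everything before sending real ether:

// Unlock the first local account for 60 seconds, then send 0.01 ether.
personal.unlockAccount(eth.accounts[0], "your-passphrase", 60)
eth.sendTransaction({
  from: eth.accounts[0],
  to: "0x1111111111111111111111111111111111111111", // placeholder recipient
  value: web3.toWei(0.01, "ether")
})
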
Hope it helps
Marco

@skarn01 commented Sep 15, 2017

Hi @brennino, you're right, it is strange that my starting block is not 0, as I haven't stopped the geth daemon...

What I want to do is develop my own services using the chain (I got some experience creating an ICO for my boss, and now I'm interested in the technology ^^), so I don't worry about the money I currently keep here; there's currently none, haha.

I'll try the --light option after geth removedb. Does --light still give me the possibility to work with the chain and see full blocks after the first sync?

Thank you, and I'll come back with an update on my situation.

@jochem-brouwer (Member) commented Aug 18, 2020

INFO [08-18|22:42:47.091] Deallocated fast sync bloom items=581136598 errorrate=0.173

Size ~271 GB

@sssubik commented Aug 19, 2020

Hey @jochem-brouwer, is 581136598 your final state entry count?
Because for me it has already gone higher than yours and it is still continuing..
Imported new state entries count=412 elapsed=8.904ms processed=673602785 pending=8320 retry=0 duplicate=0 unexpected=30

@steelep commented Aug 20, 2020

I am going on 9 days now for a full "fast" sync: fast i9 machine, fiber internet, SSD and --cache=2048...

@wellttllew commented Aug 23, 2020

> @wellttllew I would like to know your server configuration. My server configuration is: ten-core processor, 32M memory, 20M bandwidth. The fast sync took about 48 hours.

About 3 days, with an AWS Lightsail $80 plan.
4 cores, 16 GB RAM, 320 GB SSD.

@sjlnk commented Aug 25, 2020

Fast sync took about 10 hours on Amazon i3.xlarge instance (4 vCPUs, 30GB RAM, NVMe hard drive).

575809113 state entries as of 2020-08-25.

I used the following flags:
--cache=4096 (essential for speed improvement, but requires more RAM)
--maxpeers=50

The final .ethereum directory size is about 253 GB.
geth version: 1.9.20-unstable-7b5107b7-20200824

@garyng2000 commented Aug 25, 2020

What is the size of the db directory after a full sync? I haven't done this for a while (didn't have the freezer feature back then). TIA.

@sirnicolas21 commented Aug 25, 2020

> What is the size of the db directory after a full sync? I haven't done this for a while (didn't have the freezer feature back then). TIA.

Full sizes without anything moved out:
mainnet = 402 GB
testnet = 96 GB

@farhadjafari385 commented Aug 26, 2020

I started a fast sync and it has taken 3 days so far.
On the first day every block was synced, but it is still syncing states and now it's really slow!

Here is the syncing result:

{
  "currentBlock": 10735343,
  "highestBlock": 10735450,
  "knownStates": 584833661,
  "pulledStates": 584754308,
  "startingBlock": 10734477
}

Here is the console output:

INFO [08-26|12:13:23.330] Imported new state entries               count=384  elapsed=6.877ms     processed=584758628 pending=77408  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:27.602] Imported new state entries               count=636  elapsed=7.330ms     processed=584759264 pending=77425  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:32.088] Imported new block headers               count=2    elapsed=11.847ms    number=10735455 hash="71d3b7…0e55db"
INFO [08-26|12:13:32.300] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=193.43KiB throttle=339
INFO [08-26|12:13:33.786] Imported new state entries               count=680  elapsed=7.854ms     processed=584759944 pending=77018  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:40.582] Imported new state entries               count=618  elapsed=10.019ms    processed=584760562 pending=76876  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:46.097] Imported new state entries               count=768  elapsed=14.378ms    processed=584761330 pending=76011  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:52.082] Imported new state entries               count=588  elapsed=10.860ms    processed=584761918 pending=75758  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:13:54.640] Imported new state entries               count=384  elapsed=7.061ms     processed=584762302 pending=75829  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:00.586] Imported new state entries               count=620  elapsed=10.992ms    processed=584762922 pending=75610  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:04.091] Imported new state entries               count=384  elapsed=5.460ms     processed=584763306 pending=75509  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:05.281] Imported new block headers               count=1    elapsed=10.060ms    number=10735456 hash="261277…ec1ae9"
INFO [08-26|12:14:05.316] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=195.86KiB throttle=335
INFO [08-26|12:14:09.136] Imported new state entries               count=638  elapsed=6.682ms     processed=584763944 pending=75456  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:09.166] Imported new block headers               count=1    elapsed=11.364ms    number=10735457 hash="6cd8bc…d18c5a"
INFO [08-26|12:14:11.997] Imported new state entries               count=384  elapsed=1.012ms     processed=584764328 pending=76285  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:17.521] Imported new state entries               count=768  elapsed=2.411ms     processed=584765096 pending=77453  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:20.346] Imported new state entries               count=384  elapsed=1.352ms     processed=584765480 pending=78070  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:22.298] Imported new block headers               count=1    elapsed=12.015ms    number=10735458 hash="c50745…4caf1a"
INFO [08-26|12:14:22.325] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=179.93KiB throttle=365
INFO [08-26|12:14:23.382] Imported new state entries               count=384  elapsed=1.053ms     processed=584765864 pending=78578  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:26.121] Imported new state entries               count=384  elapsed=1.387ms     processed=584766248 pending=79245  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:29.057] Imported new state entries               count=395  elapsed=1.074ms     processed=584766643 pending=79953  retry=4   duplicate=1090 unexpected=4261
INFO [08-26|12:14:32.295] Imported new state entries               count=384  elapsed=1.125ms     processed=584767027 pending=80679  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:32.435] Imported new block headers               count=1    elapsed=12.931ms    number=10735459 hash="7d63b2…ff65b6"
INFO [08-26|12:14:33.331] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=187.34KiB throttle=350
INFO [08-26|12:14:35.170] Imported new state entries               count=384  elapsed=1.509ms     processed=584767411 pending=81137  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:37.587] Imported new state entries               count=384  elapsed="939.928µs" processed=584767795 pending=82376  retry=2   duplicate=1090 unexpected=4261
INFO [08-26|12:14:42.914] Imported new state entries               count=768  elapsed=5.270ms     processed=584768563 pending=83624  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:45.925] Imported new state entries               count=384  elapsed=1.518ms     processed=584768947 pending=84116  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:48.339] Imported new state entries               count=384  elapsed=1.183ms     processed=584769331 pending=85239  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:51.306] Imported new state entries               count=433  elapsed=4.434ms     processed=584769764 pending=85273  retry=0   duplicate=1090 unexpected=4261
INFO [08-26|12:14:52.549] Imported new block headers               count=1    elapsed=10.824ms    number=10735460 hash="499459…5ccd22"
INFO [08-26|12:14:52.972] Imported new block headers               count=1    elapsed=10.718ms    number=10735461 hash="c25179…457362"
INFO [08-26|12:14:53.342] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=187.89KiB throttle=349
INFO [08-26|12:14:54.843] Imported new state entries               count=384  elapsed=1.205ms     processed=584770148 pending=85616  retry=0   duplicate=1090 unexpected=4261

Does anyone know how many state entries exist now?
And what can I do?

@sssubik commented Aug 26, 2020

Hey @sjlnk, is your sync completed? What are the logs showing now?
I have been at it for weeks and am still getting imported state entries.

@sjlnk commented Aug 26, 2020

> Hey @sjlnk, is your sync completed? What are the logs showing now?
> I have been at it for weeks and am still getting imported state entries.

Yes, it's completed. I already reported the results above; please scroll up. ^^

The --cache setting makes a huge difference but is more demanding on the hardware.

@sssubik commented Aug 26, 2020

Hey @sjlnk, did you get "Imported new chain segment"?
I have been getting the following for weeks now:

INFO [08-26|17:09:11.398] Imported new block headers               count=3    elapsed=30.970ms   number=10736801 hash=8df84b…d6ab09
INFO [08-26|17:09:15.920] Initializing fast sync bloom             items=11047301 errorrate=0.000 elapsed=13m20.633s
INFO [08-26|17:09:22.638] Imported new state entries               count=719  elapsed=19.118ms   processed=698898959 pending=3397 retry=0  duplicate=0 unexpected=13
INFO [08-26|17:09:23.924] Initializing fast sync bloom             items=11136084 errorrate=0.000 elapsed=13m28.637s
INFO [08-26|17:09:31.931] Initializing fast sync bloom             items=11215396 errorrate=0.000 elapsed=13m36.644s
INFO [08-26|17:09:32.719] Imported new block headers               count=1    elapsed=31.119ms   number=10736802 hash=01f447…129466
INFO [08-26|17:09:38.130] Imported new state entries               count=324  elapsed=14.996ms   processed=698899283 pending=3309 retry=0  duplic

Also, which geth version are you using?

@sjlnk commented Aug 26, 2020

Hey @sjlnk Did you get Imported Chain segment?
I get this for weeks now:

INFO [08-26|17:09:11.398] Imported new block headers               count=3    elapsed=30.970ms   number=10736801 hash=8df84b…d6ab09
INFO [08-26|17:09:15.920] Initializing fast sync bloom             items=11047301 errorrate=0.000 elapsed=13m20.633s
INFO [08-26|17:09:22.638] Imported new state entries               count=719  elapsed=19.118ms   processed=698898959 pending=3397 retry=0  duplicate=0 unexpected=13
INFO [08-26|17:09:23.924] Initializing fast sync bloom             items=11136084 errorrate=0.000 elapsed=13m28.637s
INFO [08-26|17:09:31.931] Initializing fast sync bloom             items=11215396 errorrate=0.000 elapsed=13m36.644s
INFO [08-26|17:09:32.719] Imported new block headers               count=1    elapsed=31.119ms   number=10736802 hash=01f447…129466
INFO [08-26|17:09:38.130] Imported new state entries               count=324  elapsed=14.996ms   processed=698899283 pending=3309 retry=0  duplic

Also Which GETH version are you using?

Yes, I got that message. The version is also reported above.

@sssubik commented Aug 27, 2020

Hey @sjlnk, I don't get why my state entry count is so much greater than yours, and the sync is still not complete:

INFO [08-27|09:38:20.250] Initializing fast sync bloom             items=295539384 errorrate=0.000 elapsed=16h22m25.240s
INFO [08-27|09:38:25.986] Imported new block headers               count=2    elapsed=87.677ms      number=10741287 hash="309adc…db751d"
INFO [08-27|09:38:28.254] Initializing fast sync bloom             items=295539813 errorrate=0.000 elapsed=16h22m33.244s
INFO [08-27|09:38:36.256] Initializing fast sync bloom             items=295540299 errorrate=0.000 elapsed=16h22m41.246s
INFO [08-27|09:38:44.262] Initializing fast sync bloom             items=295540709 errorrate=0.000 elapsed=16h22m49.251s
INFO [08-27|09:38:49.951] Imported new state entries               count=305  elapsed=7.255ms       processed=700468113 pending=77486 retry=0   duplicate=0 unexpected=0

I am already at: 700468113

@farhadjafari385 commented Aug 27, 2020

Hey @sjlnk. I dont get it why my state entries is so much greater than yours. But still the sync is not completed:

INFO [08-27|09:38:20.250] Initializing fast sync bloom             items=295539384 errorrate=0.000 elapsed=16h22m25.240s
INFO [08-27|09:38:25.986] Imported new block headers               count=2    elapsed=87.677ms      number=10741287 hash="309adc…db751d"
INFO [08-27|09:38:28.254] Initializing fast sync bloom             items=295539813 errorrate=0.000 elapsed=16h22m33.244s
INFO [08-27|09:38:36.256] Initializing fast sync bloom             items=295540299 errorrate=0.000 elapsed=16h22m41.246s
INFO [08-27|09:38:44.262] Initializing fast sync bloom             items=295540709 errorrate=0.000 elapsed=16h22m49.251s
INFO [08-27|09:38:49.951] Imported new state entries               count=305  elapsed=7.255ms       processed=700468113 pending=77486 retry=0   duplicate=0 unexpected=0

I am already at: 700468113

After 7 days?

Are you using an SSD or NVMe?

My result after 5 days:


INFO [08-27|10:48:43.404] Imported new state entries               count=384  elapsed="4.217µs"   processed=592286844 pending=65731  retry=0   duplicate=1090 unexpected=4261
INFO [08-27|10:48:51.810] Imported new block headers               count=2    elapsed=13.184ms    number=10741600 hash="0df0e9…8ddd8a"
INFO [08-27|10:48:52.667] Downloader queue stats                   receiptTasks=0 blockTasks=0 itemSize=201.71KiB throttle=325
INFO [08-27|10:48:52.808] Imported new state entries               count=384  elapsed="5.05µs"    processed=592287228 pending=66124  retry=0   duplicate=1090 unexpected=4261
INFO [08-27|10:49:02.164] Imported new state entries               count=384  elapsed="6.18µs"    processed=592287612 pending=66519  retry=0   duplicate=1090 unexpected=4261

@Neurone (Contributor) commented Aug 27, 2020

I'm still at 540M and counting, but I'll tell you when I'm done.

If we take the three stats reported in this thread as good:

DATE COUNT
14/04/2020 487.040.102
10/05/2020 504.344.987
25/08/2020 575.809.113

there are about 500k~600k new state entries every day, so we should now be at around ~580M. To be sure, you would have to count the real entries in the LevelDB. You can do that with a separate script, but I was already doing it in a separate PR for geth, so if you want you can already try it from here.
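
As a worked sketch of that extrapolation (the linear-growth assumption and the target date are mine; the raw samples put the daily growth slightly above the 500k~600k range quoted above):

// Growth of the state-entry count between the samples in the table above.
var perDay = (575809113 - 504344987) / 107;   // 10/05/2020 -> 25/08/2020 is 107 days
var estimate = 575809113 + 2 * perDay;        // two more days brings us to 27/08/2020
console.log("~" + Math.round(perDay) + " new entries/day, estimate for today: " + Math.round(estimate));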

I always suggest building from source, but I also uploaded an already compiled version built on my Ubuntu 20.04.1 64-bit machine.

I will send a PR for master, but please note that if you take the build from there, your DB will be upgraded to the new schema. Nothing to be afraid of, just saying it.

This is, for example, the output for a Goerli testnet fully synced a few minutes ago, where the trie nodes are 24272247. You see the "Contract codes" line because I already upgraded my network for tests.

❯ ./build/bin/geth --goerli --verbosity 3 inspect
INFO [08-27|11:48:29.190] Maximum peer count                       ETH=50 LES=0 total=50
INFO [08-27|11:48:29.190] Smartcard socket not found, disabling    err="stat /run/pcscd/pcscd.comm: no such file or directory"
INFO [08-27|11:48:29.198] Set global gas cap                       cap=25000000
INFO [08-27|11:48:29.198] Allocated cache and file handles         database=/home/developer/.ethereum/goerli/geth/chaindata cache=512.00MiB handles=524288
INFO [08-27|11:48:29.283] Opened ancient database                  database=/home/developer/.ethereum/goerli/geth/chaindata/ancient
INFO [08-27|11:48:29.313] Persisted trie from memory database      nodes=361 size=51.17KiB time=1.37971ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B
INFO [08-27|11:48:29.316] Loaded most recent local header          number=3296898 hash="886b5b…2b6a4b" td=4811621 age=14m5s
INFO [08-27|11:48:29.316] Loaded most recent local full block      number=3296898 hash="886b5b…2b6a4b" td=4811621 age=14m5s
INFO [08-27|11:48:29.316] Loaded most recent local fast block      number=3296898 hash="886b5b…2b6a4b" td=4811621 age=14m5s
INFO [08-27|11:48:29.316] Loaded last fast-sync pivot marker       number=3292045
INFO [08-27|11:48:37.405] Inspecting database                      count=7148000 elapsed=8.006s
INFO [08-27|11:48:45.406] Inspecting database                      count=14589000 elapsed=16.007s
INFO [08-27|11:48:53.408] Inspecting database                      count=22353000 elapsed=24.009s
+-----------------+--------------------+------------+----------+
|    DATABASE     |      CATEGORY      |    SIZE    |  COUNT   |
+-----------------+--------------------+------------+----------+
| Key-Value store | Headers            | 12.86 MiB  |    20838 |
| Key-Value store | Bodies             | 39.91 MiB  |    21008 |
| Key-Value store | Receipts           | 36.77 MiB  |    21008 |
| Key-Value store | Difficulties       | 1.01 MiB   |    21472 |
| Key-Value store | Block number->hash | 924.96 KiB |    20580 |
| Key-Value store | Block hash->number | 128.95 MiB |  3297807 |
| Key-Value store | Transaction index  | 10.40 MiB  |   303011 |
| Key-Value store | Bloombit index     | 102.89 MiB |  1646592 |
| Key-Value store | Contract codes     | 97.18 MiB  |    17941 |
| Key-Value store | Trie nodes         | 2.37 GiB   | 24272247 |
| Key-Value store | Trie preimages     | 3.51 MiB   |    49418 |
| Key-Value store | Account snapshot   | 0.00 B     |        0 |
| Key-Value store | Storage snapshot   | 0.00 B     |        0 |
| Key-Value store | Clique snapshots   | 79.33 KiB  |       98 |
| Key-Value store | Singleton metadata | 151.00 B   |        5 |
| Ancient store   | Headers            | 1.04 GiB   | NA       |
| Ancient store   | Bodies             | 1.09 GiB   | NA       |
| Ancient store   | Receipts           | 596.63 MiB | NA       |
| Ancient store   | Difficulties       | 31.21 MiB  | NA       |
| Ancient store   | Block number->hash | 118.75 MiB | NA       |
| Light client    | CHT trie nodes     | 0.00 B     |        0 |
| Light client    | Bloom trie nodes   | 0.00 B     |        0 |
+-----------------+--------------------+------------+----------+
|                         TOTAL        |  5.65 GIB  |           
+-----------------+--------------------+------------+----------+
ERROR[08-27|11:48:58.711] Database contains unaccounted data       size=100.74KiB count=3514

@sjlnk commented Aug 27, 2020

Hey @sjlnk. I dont get it why my state entries is so much greater than yours. But still the sync is not completed:

INFO [08-27|09:38:20.250] Initializing fast sync bloom             items=295539384 errorrate=0.000 elapsed=16h22m25.240s
INFO [08-27|09:38:25.986] Imported new block headers               count=2    elapsed=87.677ms      number=10741287 hash="309adc…db751d"
INFO [08-27|09:38:28.254] Initializing fast sync bloom             items=295539813 errorrate=0.000 elapsed=16h22m33.244s
INFO [08-27|09:38:36.256] Initializing fast sync bloom             items=295540299 errorrate=0.000 elapsed=16h22m41.246s
INFO [08-27|09:38:44.262] Initializing fast sync bloom             items=295540709 errorrate=0.000 elapsed=16h22m49.251s
INFO [08-27|09:38:49.951] Imported new state entries               count=305  elapsed=7.255ms       processed=700468113 pending=77486 retry=0   duplicate=0 unexpected=0

I am already at: 700468113

That's strange. I would also like to know why this happens. Perhaps someone more knowledgeable could chime in and enlighten us...

@holiman (Contributor) commented Aug 27, 2020

@sjlnk it's because you've stopped and restarted sync several times. Or so I presume, since it's still showing "Initializing fast sync bloom". Every time you restart, geth goes into a mode where, while importing state, it also reads the entire db to create a bloom filter of existing states, which slows down sync. Also, every time you restart, you choose a new pivot point to sync to, so there is more state to download.

@karalabe just merged a great PR that makes fast sync a lot more stable and less prone to crashing due to running out of memory. I would recommend updating to the latest master and either starting from scratch or continuing (yes, with another restart -- can't be helped).

@holiman (Contributor) commented Aug 27, 2020

@sjlnk actually, looking at it again, I suspect you have bad read IO. It has only gone through 29M entries for the bloom filter in 16+ hours, which is very slow indeed. You might want to check that your disk is healthy. It's not an HDD, is it? And if it's an SSD, those can get degraded pretty badly with time.

@cncr04s commented Aug 27, 2020

I synced a 3-month gap since my last sync; it took 2.5 weeks to complete. It's an array of six 10k RPM SAS drives in RAID 5, adding 2 more soon. It took about 2 TB or more of write volume. It really is an SSD killer.

@sjlnk commented Aug 27, 2020

@sjlnk it's because you've stopped and restarted sync several times. Or so I presume, since it's still Initializing fast sync bloom. Every time you restart, geth goes into a mode where, while importing state, also reads the entire db to create a bloom filter of existing states. Which slows down sync. Also, every time you restart, you then choose a new pivot point to sync to, so more state to download.

@karalabe just merged a great PR that makes fast-sync a lot more stable, and less prone to crashing due to out of memory. I would recommend to update to latest master and either start from scratch or continue (yes, with another restart -- can't be helped).

Thanks for clearing that up! Actually my sync completed in 10 hours on a powerful NVMe-backed AWS instance. I just wanted to know why some people have to download more states than others. I never restarted the process, so I'm assuming the other people here did, and that's why they have so many more states to download.

@sssubik commented Aug 28, 2020

Hey @holiman, the logs are from my VPS server. It is an SSD. Does this mean my server is not powerful enough because its I/O speed is too low? Do I need to upgrade it?

@sssubik commented Aug 29, 2020

Hey @sjlnk, could you give the specifications of your server? I too have a good server, but it has not completed in about a month. What is your disk I/O rate, specifically?

@Neurone (Contributor) commented Aug 30, 2020

I'm done, with the counter at 652299180. Please note that in my case the processed counter at the end was about ~12M entries higher than the real entries stored in the LevelDB. I restarted geth a lot of times during the fast sync these past days.

INFO [08-30|10:01:52.403] Imported new state entries               count=1605 elapsed=22.003ms   processed=652296924 pending=3467   retry=0    duplicate=41127 unexpected=121037
INFO [08-30|10:01:53.302] Imported new state entries               count=1157 elapsed=11.998ms   processed=652298081 pending=2059   retry=0    duplicate=41127 unexpected=121037
INFO [08-30|10:01:54.214] Imported new state entries               count=1001 elapsed=14.992ms   processed=652299082 pending=348    retry=0    duplicate=41127 unexpected=121037
INFO [08-30|10:01:54.523] Imported new state entries               count=98   elapsed=2.837ms    processed=652299180 pending=0      retry=0    duplicate=41127 unexpected=121037
INFO [08-30|10:01:54.535] Imported new block receipts              count=1    elapsed=2ms        number=10760885 hash="e5569e…7ead3c" age=29m44s   size=101.58KiB
INFO [08-30|10:01:54.545] Committed new head block                 number=10760885 hash="e5569e…7ead3c"
INFO [08-30|10:01:54.613] Deallocated fast sync bloom              items=639248389 errorrate=0.001
INFO [08-30|10:02:03.025] Imported new chain segment               blocks=12 txs=2533 mgas=149.011 elapsed=8.402s     mgasps=17.735 number=10760897 hash="f8d397…0dafd0" age=26m49s   dirty=18.46MiB
INFO [08-30|10:02:11.409] Imported new chain segment               blocks=14 txs=3369 mgas=174.232 elapsed=8.384s     mgasps=20.780 number=10760911 hash="1adada…67ecc0" age=21m55s   dirty=41.57MiB
INFO [08-30|10:02:19.956] Imported new chain segment               blocks=18 txs=3487 mgas=210.546 elapsed=8.546s     mgasps=24.636 number=10760929 hash="97dcea…9d023b" age=16m20s   dirty=68.14MiB
INFO [08-30|10:02:27.303] New local node record                    seq=39281 id=82090fbe6ea27663 ip=127.0.0.1     udp=30303 tcp=30303
INFO [08-30|10:02:27.389] New local node record                    seq=39282 id=82090fbe6ea27663 ip=93.40.103.165 udp=30303 tcp=30303
INFO [08-30|10:02:28.037] Imported new chain segment               blocks=17 txs=3815 mgas=209.530 elapsed=8.081s     mgasps=25.927 number=10760946 hash="4165bb…e09268" age=11m39s   dirty=94.33MiB
INFO [08-30|10:02:36.621] Imported new chain segment               blocks=12 txs=2487 mgas=136.747 elapsed=8.584s     mgasps=15.930 number=10760958 hash="c59ba3…d33c9e" age=8m32s    dirty=111.89MiB
INFO [08-30|10:02:40.190] New local node record                    seq=39283 id=82090fbe6ea27663 ip=127.0.0.1     udp=30303 tcp=30303
INFO [08-30|10:02:45.795] New local node record                    seq=39284 id=82090fbe6ea27663 ip=93.40.103.165 udp=30303 tcp=30303
INFO [08-30|10:02:45.895] Imported new chain segment               blocks=10 txs=2042 mgas=120.584 elapsed=9.273s     mgasps=13.003 number=10760968 hash="82b2bd…ea0969" age=6m43s    dirty=126.93MiB
INFO [08-30|10:02:46.400] Deep froze chain segment                 blocks=30001 elapsed=18.624s    number=10057008 hash="ffaddb…05d9b1"
INFO [08-30|10:02:54.190] Imported new chain segment               blocks=13    txs=2915 mgas=161.846 elapsed=8.295s     mgasps=19.511 number=10760981 hash="e653ff…70529a" age=3m40s    dirty=146.58MiB
INFO [08-30|10:03:00.113] Imported new chain segment               blocks=7     txs=1296 mgas=87.019  elapsed=5.923s     mgasps=14.691 number=10760988 hash="a655a4…7c6ca5" age=1m31s    dirty=156.69MiB
INFO [08-30|10:03:00.126] Imported new block headers               count=1    elapsed=1m4.052s   number=10760989 hash="31df12…bc9465" age=1m30s
INFO [08-30|10:03:00.139] Imported new block headers               count=1    elapsed=4.998ms    number=10760990 hash="04c374…9a92dc" age=1m27s
INFO [08-30|10:03:00.153] Imported new block headers               count=3    elapsed=6.000ms    number=10760993 hash="c6fffb…265cae"
INFO [08-30|10:03:00.225] Downloader queue stats                   receiptTasks=0    blockTasks=0    itemSize=221.62KiB throttle=296
INFO [08-30|10:03:00.997] Imported new chain segment               blocks=1     txs=235  mgas=12.392  elapsed=763.005ms  mgasps=16.241 number=10760989 hash="31df12…bc9465" age=1m30s    dirty=158.37MiB
INFO [08-30|10:03:01.008] Imported new block headers               count=1    elapsed=572.995ms  number=10760994 hash="e3afed…f9afef"
INFO [08-30|10:03:03.377] Imported new chain segment               blocks=1     txs=200  mgas=12.423  elapsed=2.361s     mgasps=5.261  number=10760990 hash="04c374…9a92dc" age=1m30s    dirty=160.05MiB
INFO [08-30|10:03:04.270] Deep froze chain segment                 blocks=30001 elapsed=17.860s    number=10087009 hash="e2f6c6…fe7f20"
WARN [08-30|10:03:07.574] Fast syncing, discarded propagated block number=10760994 hash="e3afed…f9afef"
INFO [08-30|10:03:08.411] Imported new chain segment               blocks=4     txs=666  mgas=49.755  elapsed=5.023s     mgasps=9.905  number=10760994 hash="e3afed…f9afef" dirty=165.73MiB
INFO [08-30|10:03:08.422] Fast sync complete, auto disabling

Detailed DB stats

> geth --datadir s:\Ethereum inspect
INFO [08-30|12:49:32.043] Maximum peer count                       ETH=50 LES=0 total=50
INFO [08-30|12:49:32.388] Set global gas cap                       cap=25000000
INFO [08-30|12:49:32.393] Allocated cache and file handles         database=s:\Ethereum\geth\chaindata cache=512.00MiB handles=8192
INFO [08-30|12:49:34.040] Opened ancient database                  database=s:\Ethereum\geth\chaindata\ancient
INFO [08-30|12:49:34.074] Disk storage enabled for ethash caches   dir=s:\Ethereum\geth\ethash count=3
INFO [08-30|12:49:34.079] Disk storage enabled for ethash DAGs     dir=C:\Users\giuse\AppData\Local\Ethash count=2
INFO [08-30|12:49:34.090] Loaded most recent local header          number=10761038 hash="95d315…aac7a9" td=17106408363816477706008 age=2h38m10s
INFO [08-30|12:49:34.099] Loaded most recent local full block      number=10760905 hash="a638f7…ecd63c" td=17106043165864029076711 age=3h10m31s
INFO [08-30|12:49:34.106] Loaded most recent local fast block      number=10761038 hash="95d315…aac7a9" td=17106408363816477706008 age=2h38m10s
INFO [08-30|12:49:34.107] Deep froze chain segment                 blocks=18 elapsed=33.995ms number=10670905 hash="8c3832…cb9018"
INFO [08-30|12:49:34.115] Loaded last fast-sync pivot marker       number=10760885
INFO [08-30|12:49:42.130] Inspecting database                      count=6482000 elapsed=8.001s
...
INFO [08-30|13:22:16.661] Counting ancient database receipts       blocknumber=10670906 percentage=100 elapsed=32m42.533s
+-----------------+--------------------+------------+-----------+
|    DATABASE     |      CATEGORY      |    SIZE    |   ITEMS   |
+-----------------+--------------------+------------+-----------+
| Key-Value store | Headers            | 55.00 MiB  |    100554 |
| Key-Value store | Bodies             | 3.64 GiB   |     98505 |
| Key-Value store | Receipts           | 4.20 GiB   |  17751843 |
| Key-Value store | Difficulties       | 6.25 MiB   |    110238 |
| Key-Value store | Block number->hash | 5.16 MiB   |    110222 |
| Key-Value store | Block hash->number | 420.76 MiB |  10761043 |
| Key-Value store | Transaction index  | 27.44 GiB  | 818569194 |
| Key-Value store | Bloombit index     | 1.64 GiB   |   5380096 |
| Key-Value store | Contract codes     | 37.33 MiB  |      6636 |
| Key-Value store | Trie nodes         | 74.51 GiB  | 639023322 |
| Key-Value store | Trie preimages     | 1.07 MiB   |     17051 |
| Key-Value store | Account snapshot   | 0.00 B     |         0 |
| Key-Value store | Storage snapshot   | 0.00 B     |         0 |
| Key-Value store | Clique snapshots   | 0.00 B     |         0 |
| Key-Value store | Singleton metadata | 151.00 B   |         5 |
| Ancient store   | Headers            | 4.51 GiB   |  10670906 |
| Ancient store   | Bodies             | 97.85 GiB  |  10670906 |
| Ancient store   | Receipt lists      | 44.38 GiB  |  10670906 |
| Ancient store   | └ counted receipts | --         | 802441861 |
| Ancient store   | Difficulties       | 166.01 MiB |  10670906 |
| Ancient store   | Block number->hash | 386.71 MiB |  10670906 |
| Light client    | CHT trie nodes     | 0.00 B     |         0 |
| Light client    | Bloom trie nodes   | 0.00 B     |         0 |
+-----------------+--------------------+------------+-----------+
|                         TOTAL        | 259.22 GIB |           |
+-----------------+--------------------+------------+-----------+
ERROR[08-30|13:22:16.783] Database contains unaccounted data       size=121.08KiB count=2632

@sssubik commented Aug 31, 2020

Hey @Neurone, how many days does a full sync take on average? How can I know how close I am? I have been syncing for about a month now..

@sjlnk commented Aug 31, 2020

> Hey @sjlnk, could you give the specifications of your server? I too have a good server, but it has not completed in about a month. What is your disk I/O rate, specifically?

I used i3.xlarge on Amazon EC2.

@splix commented Aug 31, 2020

Finished a fast sync today (block 10770930) with:

processed=608439377

(was running it on a work machine during day time, so the total time is irrelevant in my case)

@Neurone (Contributor) commented Aug 31, 2020

> Hey @Neurone, how many days does a full sync take on average? How can I know how close I am? I have been syncing for about a month now..

Not easy to say, because I tested many things during the sync process, restarted a lot of times (40~50 times), and moved the files around different systems. In any case, I went through the logs I saved between restarts (not all of them) and created the summary below, just to give an idea.

Most of the time is spent on the fast sync of state entries; the block headers are downloaded within a few hours in any case. So you should check your stats and calculate your rate of downloaded state entries per hour: if it is high enough, I suggest you delete the database - not the ancient store - and download the state entries again from the start.

If your rate is like mine at the end (~8.2M entries per hour), you should be able to reach the current state from scratch in ~3 days (a rough way to measure your own rate is sketched at the end of this comment).

| SYSTEM / OS | RAM (GB) | DISK | HOURS ONLINE | RESTARTS | STATE ENTRIES REACHED | STATE ENTRIES PER HOUR | TIME TO INIT FAST SYNC BLOOM WHEN RESTARTED |
|---|---|---|---|---|---|---|---|
| Rock64 / Armbian 20.02.1 Bionic | 4 | HDD | ~1.800 | ~40 | 469.380.958 | ~100k (decreasing over time; mean value over the last 26 days) | 3h16m29 |
| Desktop / Ubuntu 20.04 (WSL2 on Windows 10) | 16 | SSD | 134 | 1 | 578.598.409 | ~815k | 2h38m14 |
| Desktop / Windows 10 | 16 | SSD | 9 | 0 | 652.299.180 | ~8.2M | 30m28 |
  • Desktop:
    • Processor: Intel i7-3820 3.6Ghz (2012)
    • SSD: Crucial CT525MX300SSD1 (2016)
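
A rough way to measure your own rate from the geth console, as a hedged sketch (it assumes the console's admin.sleep(seconds) helper is available and that eth.syncing stays non-false across the window; the ten-minute window and the ~650M target are assumptions, not geth values):

// Sample pulledStates twice, ten minutes apart, and extrapolate to entries/hour.
var a = eth.syncing.pulledStates;
admin.sleep(600);                         // block the console for 600 seconds
var b = eth.syncing.pulledStates;
var perHour = (b - a) * 6;
var target = 650000000;                   // assumed total, roughly where mainnet was in late Aug 2020
console.log("~" + perHour + " entries/hour, rough hours left: " + Math.round((target - b) / perHour));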