
header/p2p: Mainnet nodes getting mocha-4 headers #3071

Closed
mindstyle85 opened this issue Jan 7, 2024 · 9 comments · Fixed by #3086
Assignees
renaynay
Labels
bug (Something isn't working) · external (Issues created by non node team members)

Comments

@mindstyle85

Celestia Node version

12.1

OS

ubuntu

Install tools

No response

Others

No response

Steps to reproduce it

Install and run a light node as described in the docs (defaults to mainnet when no flags are passed, etc.)

Expected result

The node runs on mainnet.

Actual result

2024-01-07T09:04:19.472-0800 ERROR header/sync sync/sync_head.go:175 invalid network header {"height_of_invalid": 894619, "hash_of_invalid": "9E53A68FAC3C4051046A498E58C1C15E59CF306AAE8CC80D459A83EDF23E3BB4", "height_of_subjective": 503228, "hash_of_subjective": "8AD1E4F9C65A163312F344FDBDD17B71444D0EA42A12240F7401634AA45C6DD9", "reason": "wrong chain id: 'mocha-4' != 'celestia'"}

Relevant log output

No response

Notes

No response

@mindstyle85 added the bug (Something isn't working) label Jan 7, 2024
@github-actions bot added the external (Issues created by non node team members) label Jan 7, 2024
@mindstyle85
Author

Note from the person observing this (and he wasn't the only one reporting it; we just figured they had misconfigured): it seems that the error above happens only on that specific header.

@mindstyle85
Author

Some more logs from other reports:

2023-12-29T17:43:26.465Z ERROR header/sync sync/sync_head.go:175 invalid network header {"height_of_invalid": 829037, "hash_of_invalid": "6F0A2391EB372F91482085B4D73B5B0579F5811A2B06855D6839AE5D0DA539C5", "height_of_subjective": 436976, "hash_of_subjective": "2606F33CC027A92A242ABFD4E2957C978449E3777B2EDFF45E4ED97BE653540F", "reason": "wrong chain id: 'mocha-4' != 'celestia'"}
2024-01-05T22:49:07.051+0100 ERROR header/sync sync/sync_head.go:175 invalid network header {"height_of_invalid": 881445, "hash_of_invalid": "729B0D560C699F39EA7F7D42E1662B0A955F1709859E59CEB4B688AD7B67D611", "height_of_subjective": 489952, "hash_of_subjective": "2F1779EC9384A2D370935AA6CBC8C720B5304AE24C17C49F56B0C391DF998285", "reason": "wrong chain id: 'mocha-4' != 'celestia'"}

2024-01-05T23:06:46.957+0100 ERROR header/sync sync/sync_head.go:175 invalid network header {"height_of_invalid": 450443, "hash_of_invalid": "AD2ADE2643F98B8C6DAD14B9349180ABEFD9A0A29FC5DABD3F83DEB595F1107F", "height_of_subjective": 490042, "hash_of_subjective": "DDD38DA8621D01279AB5377AC470EA46F113D086124646E495B203A85DB11E24", "reason": "unordered headers: timestamp '2023-12-31T13:22:48Z' < current '2024-01-05T22:06:30Z'"}
2024-01-05T22:49:07.051+0100 ERROR header/sync sync/sync_head.go:175 invalid network header {"height_of_invalid": 881445, "hash_of_invalid": "729B0D560C699F39EA7F7D42E1662B0A955F1709859E59CEB4B688AD7B67D611", "height_of_subjective": 489952, "hash_of_subjective": "2F1779EC9384A2D370935AA6CBC8C720B5304AE24C17C49F56B0C391DF998285", "reason": "wrong chain id: 'mocha-4' != 'celestia'"}

I'll do a test run of the light node on a fresh server following the docs to see if this also happens (though I think JB tried this yesterday and also ran into it).

@mindstyle85
Author

OK, I did a test. If you follow the docs and run:

celestia light init
celestia light start

you immediately get:

The p2p host is listening on:
*  /ip4/213.246.39.171/tcp/2121/p2p/12D3KooWK3Er86i3wTgpxXFDQt16pR45ZP3oCGeGwZbJqtoYwjX1
*  /ip6/::1/tcp/2121/p2p/12D3KooWK3Er86i3wTgpxXFDQt16pR45ZP3oCGeGwZbJqtoYwjX1

2024-01-08T10:54:52.100Z        INFO    header/store    store/store.go:365      new head     {"height": 513, "hash": "D34533754C7407A3FF14DD3D4044FC08ED9F39884545922DC19D69DCEE498B84"}
2024-01-08T10:54:52.182Z        ERROR   header/sync     sync/sync_head.go:175   invalid network header        {"height_of_invalid": 508707, "hash_of_invalid": "B90E5DA97D17C1D2ED4EB8B4EE4CF89066EDE016B0C87102477A1D3D156397BB", "height_of_subjective": 508707, "hash_of_subjective": "B90E5DA97D17C1D2ED4EB8B4EE4CF89066EDE016B0C87102477A1D3D156397BB", "reason": "known header: '508707' <= current '508707'"}
2024-01-08T10:54:52.268Z        INFO    canonical-log   swarm/swarm_dial.go:559 CANONICAL_PEER_STATUS: peer=12D3KooWJJsaiBmHeDQ9f6NXPK6Rz4hAbKhnNGKnoDamvrssMdrh addr=/ip4/95.156.239.4/tcp/2121 sample_rate=100 connection_status="established" dir="outbound"
2024-01-08T10:54:54.893Z        INFO    canonical-log   swarm/swarm_listen.go:136       CANONICAL_PEER_STATUS: peer=12D3KooWBbSD71D4bFihfQHzyg3c6qsEzUMdpAXgTNs8zpsi2uk1 addr=/ip4/20.240.200.118/tcp/2121 sample_rate=100 connection_status="established" dir="inbound"
2024-01-08T10:54:57.564Z        ERROR   header/sync     sync/sync_head.go:175   invalid network header        {"height_of_invalid": 900060, "hash_of_invalid": "53A044ACFD40913285DA850C41667720F85C5FCBE49C330F00C23540A0552080", "height_of_subjective": 508707, "hash_of_subjective": "B90E5DA97D17C1D2ED4EB8B4EE4CF89066EDE016B0C87102477A1D3D156397BB", "reason": "wrong chain id: 'mocha-4' != 'celestia'"}

@mindstyle85
Author

Final note: after leaving it running, the invalid header messages appear quite frequently in the logs (at least every 20 lines or so), but other than that the node seems to be syncing fine.

@renaynay changed the title from "Light nodes that run default option (celestia mainnet) get errors in logs for mocha-4" to "header/p2p: Mainnet nodes getting mocha-4 headers" Jan 8, 2024
@mindstyle85
Author

Here are some more logs from the light node. I also see a lot of stream resets, and I left in all the logs prior to shutdown when I pressed Ctrl+C:

https://pastebin.com/BsGpiXg7

@walldiss assigned walldiss and renaynay and unassigned walldiss Jan 8, 2024
@renaynay
Member

renaynay commented Jan 8, 2024

Hey @mindstyle85, don't worry about the stream reset logs on shutdown. They aren't related.

I'm going to run a modified version of the node shortly to dump logs of who is sending those headers on mainnet. I suspect it may be a few nodes running over incorrect datastore mounts and accidentally running with default settings (mainnet is the default chain). This seems to be the only explanation at the moment.

Will report back soon.

@renaynay
Member

renaynay commented Jan 8, 2024

Related celestiaorg/go-header#141

@renaynay
Member

renaynay commented Jan 9, 2024

The explanation here is that someone is running a mainnet bridge node over a mocha-4 datastore, still connected to a mocha-4 RPC endpoint, so it is simply piping mocha-4 headers into mainnet.

The solution is to have the core/listener take a chainID on start-up and ensure the blocks it receives belong to that chainID. The current checks (1. chainID validation on store Init via a Get call to the exchange; and 2. the chainID check in Verify, which checks an untrusted new header against a trusted subjective head) do not cover the case where a BN runner started a node on one network, stopped it, and then restarted the node over the same datastore on a different network (still pointed at the old network's RPC).
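
For illustration, a minimal Go sketch of the kind of chainID guard described above. The names here (`Listener`, `Block`, `verifyChainID`, `errInvalidSubscription`) are assumptions made for the example, not the actual celestia-node API:

```go
package core

import (
	"errors"
	"fmt"
)

// errInvalidSubscription is a hypothetical sentinel signalling that the
// listener's subscription is feeding it blocks from the wrong network.
var errInvalidSubscription = errors.New("listener: invalid subscription")

// Block is a stand-in for the raw block type the listener receives from core.
type Block struct {
	ChainID string
	Height  int64
}

// Listener is configured on start-up with the chain ID the node expects.
type Listener struct {
	chainID string // e.g. "celestia" or "mocha-4"
}

// verifyChainID rejects blocks from a different network before they are
// converted into headers and gossiped to the header/p2p network.
func (l *Listener) verifyChainID(b *Block) error {
	if b.ChainID != l.chainID {
		return fmt.Errorf("received block with unexpected chain ID: expected %s, received %s: %w",
			l.chainID, b.ChainID, errInvalidSubscription)
	}
	return nil
}
```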

@mindstyle85
Author

OK great, I'll let you close this whenever you want, given that I see you already made a fix :)

renaynay added a commit that referenced this issue Jan 17, 2024
…rect chain (#3086)

Right now, it is possible for a **bridge** node that was initialised and
started on one chain (e.g. `mocha-4`) to be stopped and restarted on a
different chain (e.g. `mainnet`) and propagate headers from the old
chain (`mocha-4`) into the different network (`mainnet`).

This PR fixes the issue by causing the listener to fatal if it
recognises that it is receiving blocks from a chain other than the one
it expects.

The error will look like this:

```
2024-01-10T16:06:22.001+0100	ERROR	core	core/listener.go:175	listener: received block with unexpected chain ID: expected arabica-11, received mocha-4
2024-01-10T16:06:22.001+0100	INFO	core	core/listener.go:177	listener: listening stopped
2024-01-10T16:06:22.001+0100	FATAL	core	core/listener.go:126	listener: invalid subscription
```

Resolves #3071
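
A rough, self-contained sketch of the behaviour the PR describes (stop listening and fatal instead of gossiping foreign headers). The names below, such as `listener`, `rawBlock`, and the `fatal` hook, are assumptions for illustration rather than the real celestia-node code:

```go
package core

import (
	"fmt"
	"log"
)

// rawBlock stands in for the block type received over the core subscription.
type rawBlock struct {
	ChainID string
}

// listener holds the expected network and a hook that shuts the node down.
type listener struct {
	chainID string      // network the node was started for, e.g. "celestia"
	fatal   func(error) // invoked to stop the node with a fatal error
}

// listen consumes blocks from the subscription and bails out as soon as a
// block with a different chain ID shows up, rather than propagating it.
func (l *listener) listen(blocks <-chan *rawBlock) {
	for b := range blocks {
		if b.ChainID != l.chainID {
			log.Printf("listener: received block with unexpected chain ID: expected %s, received %s",
				l.chainID, b.ChainID)
			log.Println("listener: listening stopped")
			l.fatal(fmt.Errorf("listener: invalid subscription"))
			return
		}
		// otherwise: convert the block into a header and broadcast it as usual
	}
}
```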
renaynay added a commit that referenced this issue Jan 23, 2024
…rect chain (#3086)
