
Node ignores high-peers value defined in the config map #4771

Closed
nenadvasic opened this issue Aug 2, 2023 · 11 comments
Assignees

Comments

@nenadvasic
Contributor

nenadvasic commented Aug 2, 2023

Environment

  • Smapp version: v1.0.11
  • Node version: v1.0.7
  • OS: Ubuntu 22.04

Describe the bug

The Spacemesh node constantly has over 100 connections when run through Smapp. It seems the high-peers value is not applied even though it's present in the config map. The values defined in the config map are:

  "p2p": {
    "disable-reuseport": false,
    "p2p-disable-legacy-discovery": true,
    "bootnodes": [
        "/dns4/mainnet-bootnode-10.spacemesh.network ...",
        "... omitted ...",
        "/dns4/mainnet-bootnode-19.spacemesh.network ..."
    ],
    "min-peers": 10,
    "low-peers": 20,
    "high-peers": 30
  },

The issue started to appear with Smapp version v1.0.11. I didn't have any problems with v1.0.9.

Steps to reproduce

  1. Download spacemesh_app_1.0.11_amd64.deb from the release page
  2. Install
  3. Run Smapp
  4. Observe the number of Connected neighbors in the Network tab

Expected behavior

The number of Connected neighbors should be between 10 and 30.

Actual behavior

The number of Connected neighbors is constantly over 100. The node has been running for 3 hours already and the value is not decreasing. Restarting also doesn't help.

Logs

N/A

Additional info

xxx@xxx:/opt/Spacemesh/node$ ./go-spacemesh version
v1.0.7+65f516c6fca65eaf495b227aad526390eaa8eb20
xxx@xxx:~$ grpcurl -plaintext 0.0.0.0:9092 spacemesh.v1.NodeService.Version
{
  "versionString": {
    "value": "v1.0.7"
  }
}
xxx@xxx:~$ grpcurl -plaintext 0.0.0.0:9092 spacemesh.v1.NodeService.Status
{
  "status": {
    "connectedPeers": "112",
    "isSynced": true,
    ...
  }
}

It could be that the issue is with go-spacemesh and not Smapp, but I'm not sure.
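For quick monitoring, the peer count can be pulled out of the `NodeService.Status` response with standard tools. A sketch (the extraction is demonstrated on a canned response matching the JSON shape shown above, since the live command needs a running node):

```shell
# Extract connectedPeers from a NodeService.Status JSON response.
# Against a live node this would be piped from:
#   grpcurl -plaintext 0.0.0.0:9092 spacemesh.v1.NodeService.Status
# Here a canned response stands in for the live query.
response='{"status":{"connectedPeers":"112","isSynced":true}}'
echo "$response" | sed -n 's/.*"connectedPeers":"\([0-9]*\)".*/\1/p'
```

Running this in a loop (e.g. with `watch`) makes it easy to see whether the peer count ever drops back toward the configured high-peers value.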

@maxpla3

maxpla3 commented Aug 2, 2023

I am experiencing the same issue. Since updating to node v1.0.7, my internet connection is completely overloaded as soon as I start the nodes. I have 12 nodes, and each of them reports having about 50 to 80 peers. I tried to reduce the peer count in the config, without success:

    "p2p": {
        "min-peers": 2,
        "low-peers": 3,
        "high-peers": 4,
        "p2p-disable-legacy-discovery": true,
        "bootnodes": [...]
    }

bors bot referenced this issue Aug 3, 2023
related: https://github.com/spacemeshos/smapp/issues/1424

The total hard cap without autoscaling should be high-peers + min-peers + transient new connections (another min-peers' worth).
Once the node reaches the high-peers watermark, it will disconnect from bootnodes, and it will reconnect only once it crosses the min-peers watermark again.
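The hard-cap formula above can be checked with quick arithmetic. A sketch (this is my reading of the formula, not an official calculation), using the reporter's config of high-peers = 30 and min-peers = 10:

```shell
# Hard cap ≈ high-peers + min-peers + transient connections (another min-peers' worth).
high=30
min=10
echo $((high + min + min))
```

This prints 50, well below the 100+ connections being observed, which is why the reported behavior looks like the config is being ignored rather than the cap merely being loose.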
@pigmej pigmej transferred this issue from spacemeshos/smapp Aug 3, 2023
@pigmej
Member

pigmej commented Aug 3, 2023

We expect an improvement for this to land in the next go-spacemesh release. We will then also include that improvement in the next Smapp release.

@nenadvasic
Contributor Author

Thanks for the updated version, but unfortunately the issue still persists. Several people confirmed this on Discord. It may be a Linux-only issue, but I might be wrong.

@nenadvasic
Contributor Author

The situation became worse. I have 3 nodes and they have 271, 278, and 212 connected peers respectively.

@pigmej
Member

pigmej commented Aug 4, 2023

@nenadvasic what's your current config for peers?

@nenadvasic
Contributor Author

"p2p": {
    "disable-reuseport": false,
    "p2p-disable-legacy-discovery": true,
    "bootnodes": [
        ...
    ],
    "min-peers": 5,
    "low-peers": 10,
    "high-peers": 15
}

@pigmej

@pigmej
Member

pigmej commented Aug 4, 2023

OK, two items.

Those limits are definitely too low to accomplish much, because the moment you connect to a bootnode you'll get peers from it, and there are 10 bootnodes in your config, I'd guess.
So the result will be a constant low → over-high cycle.

That doesn't fully explain 271, though.
What you could do is:

https://github.com/spacemeshos/go-spacemesh/blob/v1.0.9/p2p/README.md

(please note that you need v1.0.9 for that)

And configure one node to be a direct peer for all your other nodes.
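A hedged sketch of what that direct-peer setup might look like in the config map. The `direct` key is my reading of the linked p2p README; the multiaddr, port, and peer ID below are placeholders, so verify the exact keys and format against the README for your go-spacemesh version:

```json
{
    "p2p": {
        "min-peers": 10,
        "low-peers": 20,
        "high-peers": 30,
        "direct": [
            "/ip4/192.168.1.10/tcp/7513/p2p/<peer-id-of-hub-node>"
        ]
    }
}
```

The idea is that the other local nodes peer directly with one "hub" node instead of each independently pulling peers from the bootnodes.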

@nenadvasic
Contributor Author

Thank you, I will take a look. Btw, that node that had 271 connections is running through Smapp, and I just noticed that the config file was updated automatically. Now that node has:

"min-peers": 20,
"low-peers": 50,
"high-peers": 80

I didn't know that the node can update configs by itself 🤔

@dshulyak
Contributor

dshulyak commented Aug 4, 2023

Could you please check which go-spacemesh version you are running? I am running v1.0.8 with the following config; it went above 100 only during the initial bootstrap period, and after that it never goes above 50-60:

{
    "p2p": {
        "min-peers": 30,
        "low-peers": 60,
        "high-peers": 100,


@nenadvasic
Contributor Author

I confirm that the issue has been resolved with go-sm v1.0.11

Thank you @pigmej @dshulyak 🙏

4 participants