
port already used and router config conflict #3470

Closed

amxliuli opened this issue Jun 1, 2023 · 8 comments

Comments

amxliuli commented Jun 1, 2023

Bug Description

After starting the frp client and server, the tunnels work fine at first, but after some time (sometimes a few days, sometimes a few minutes) they stop working. The logs show that the client keeps logging in to the server again every few minutes, and then at some point it suddenly reports a port or router conflict.

frpc Version

0.38.0

frps Version

0.38.0

System Architecture

linux/amd64

Configurations

  1. frps.ini
[common]
bind_port = 7000
tls_enable = true

#http https
vhost_http_port = 86
#vhost_https_port = 443
vhost_http_timeout = 6000

#log
log_file = /usr/local/software/frp/log/frps.log
log_level = trace

heartbeat_timeout = 300
user_conn_timeout = 60

#subdomain
subdomain_host = frp.***.com

max_pool_count = 200
  2. frpc.ini
[common]
server_addr = 182.61.53.***
server_port = 7000

# console or real logFile path like ./frpc.log
log_file = /usr/local/soft/frp/log/frpc.log

# trace, debug, info, warn, error
log_level = trace

log_max_days = 3

tls_enable=true

Logs

  1. frps.log
2023/06/01 14:32:18 [W] [control.go:312] [da8fd4774c1fc492] write message to control connection error: session shutdown
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [control.go:335] [da8fd4774c1fc492] control connection closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:19 [T] [service.go:396] start check TLS connection...
2023/06/01 14:32:19 [T] [service.go:404] success check TLS connection
2023/06/01 14:32:19 [I] [service.go:449] [3cc6c08b2114ce53] client login info: ip [219.145.34.148:1819] version [0.38.0] hostname [] os [linux] arch [amd64]
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [xndc_guoke_test_yanshi_front] error: port already used
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [98_registry_tcp] error: port already used
2023/06/01 14:32:19 [I] [proxy.go:88] [3cc6c08b2114ce53] [99_simulation_http] proxy closing
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [99_simulation_http] error: router config conflict
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [suanfa_ssh_tcp] error: port already used
  2. frpc.log
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [98_mysql] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [98_huaneng] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [minio_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_front_ok] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [95_mysql8] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_harbor_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_ssh] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [zgh_zhny_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [sdprice_forecast_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_back] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [97_ssh_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [96_mysql8] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_keking] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [96_ssh_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [minio_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_maven_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_gitlab] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_yanshi_ok] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_harbor] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [mantis_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_test_yanshi_front] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [98_registry_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [99_simulation_http] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [suanfa_ssh_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [zj_prod_http] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [96_mysql5] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [fp_ocr_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [97_price] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [98_ssh] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_front] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_tcp_yanshi_back] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [99_mysql] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [sdprice_forecast_backup] start error: port already used

Steps to reproduce

  1. Start the frps server
  2. Start the frpc client
  3. After running for a while, the tunnels stop working
    ...

Affected area

  • Docs
  • Installation
  • Performance and Scalability
  • Security
  • User Experience
  • Test and Release
  • Developer Infrastructure
  • Client Plugin
  • Server Plugin
  • Extensions
  • Others
Becods (Contributor) commented Jun 1, 2023

Update your frp.
Run a continuous ping test with mtr.

amxliuli (Author) commented Jun 2, 2023

Since the tunnel is currently serving a production environment, the version cannot be upgraded right away. I only ran the mtr test; the results are as follows:

  1. c -> s (screenshot: c-s)
  2. s -> c (screenshot: s-c)

amxliuli (Author) commented Jun 2, 2023

Today's disconnection error is this one:
(screenshot: frperror)

fatedier (Owner) commented Jun 2, 2023

Most disconnections are caused by network issues. Normally, once the network recovers and a client with the same RunID reconnects, the previous resources are released, so "port already used" should not occur.

If your frpc restarted, the previous RunID is lost and it counts as a new client. frps has to wait until the heartbeat with the previous client times out before it releases the related resources. You can shorten this wait by adjusting the heartbeat-related configuration.

If frpc was not restarted but the RunID changed, that may be a bug. Otherwise, just wait for the network to recover and frpc will reconnect automatically.
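
As a rough, hypothetical sketch of that suggestion (the values below are illustrative assumptions, not settings taken from this thread), lowering heartbeat_timeout on frps from the 300 seconds configured above, and keeping the client's heartbeat settings consistent with it, should let frps release a vanished client's ports sooner:

# frps.ini (sketch)
[common]
bind_port = 7000
# release a dead client's proxies after ~90s instead of 300s
heartbeat_timeout = 90

# frpc.ini (sketch)
[common]
server_addr = 182.61.53.***
server_port = 7000
# heartbeat_interval must stay well below heartbeat_timeout
heartbeat_interval = 30
heartbeat_timeout = 90

The trade-off is that a shorter heartbeat_timeout also declares a client on a flaky link dead (and tears down its proxies) more quickly.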

amxliuli (Author) commented Jun 2, 2023

heartbeat_timeout = 300
user_conn_timeout = 60
So should I set these two values lower? When the network has a problem and the client reconnects, if the timeout is set too long, the old connection has not yet been released on the server side, so the new connection finds the old ports still occupied and reports this error.

yzlnew commented Jun 26, 2023

Ran into a similar problem on 0.44.0; both the server and the client had to be restarted to recover.

github-actions bot commented

Issues go stale after 30d of inactivity. Stale issues rot after an additional 7d of inactivity and eventually close.

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Aug 3, 2023
swzaaaaaaa commented

Is there a solution for this?
