
Using resty within a goroutine can cause inaccurate delay measurements returned by the Time function #699

Closed
Cloverk1t opened this issue Sep 22, 2023 · 2 comments


@Cloverk1t

I used resty to create an HTTP client and used it to test a website, watching how the measured latency changes over 50 requests. When goroutines are not used, the request latency values look normal. However, when each request runs in its own goroutine, the reported latency values increase.

Without goroutines

test code

package main

import (
	"log"

	"github.com/go-resty/resty/v2"
)

func main() {
	r := resty.New()

	for i := 0; i < 50; i++ {
		resp, err := r.R().Get("https://www.baidu.com")
		if err != nil {
			log.Println(err)
			continue // resp is nil on error, so skip the latency log
		}

		log.Println("latency", resp.Time().String())
	}
}

result

2023/09/22 15:11:16 latency 61.541147ms
2023/09/22 15:11:17 latency 13.174302ms
2023/09/22 15:11:17 latency 13.010557ms
2023/09/22 15:11:17 latency 12.856145ms
2023/09/22 15:11:17 latency 13.877801ms
2023/09/22 15:11:17 latency 13.18593ms
2023/09/22 15:11:17 latency 13.87356ms
2023/09/22 15:11:17 latency 12.868666ms
2023/09/22 15:11:17 latency 16.017437ms
2023/09/22 15:11:17 latency 13.013273ms
2023/09/22 15:11:17 latency 13.784125ms
2023/09/22 15:11:17 latency 13.103677ms
2023/09/22 15:11:17 latency 13.047946ms
2023/09/22 15:11:17 latency 12.855595ms
2023/09/22 15:11:17 latency 12.923358ms
2023/09/22 15:11:17 latency 13.17819ms
2023/09/22 15:11:17 latency 12.875555ms
2023/09/22 15:11:17 latency 13.301265ms
2023/09/22 15:11:17 latency 12.585781ms
2023/09/22 15:11:17 latency 11.871888ms
2023/09/22 15:11:17 latency 13.086154ms
2023/09/22 15:11:17 latency 12.955003ms
2023/09/22 15:11:17 latency 13.037251ms
2023/09/22 15:11:17 latency 13.232548ms
2023/09/22 15:11:17 latency 12.564897ms
2023/09/22 15:11:17 latency 12.410482ms
2023/09/22 15:11:17 latency 13.004386ms
2023/09/22 15:11:17 latency 13.618793ms
2023/09/22 15:11:17 latency 12.906188ms
2023/09/22 15:11:17 latency 13.164312ms
2023/09/22 15:11:17 latency 16.874222ms
2023/09/22 15:11:17 latency 12.977787ms
2023/09/22 15:11:17 latency 13.063648ms
2023/09/22 15:11:17 latency 12.830305ms
2023/09/22 15:11:17 latency 12.941254ms
2023/09/22 15:11:17 latency 13.276458ms
2023/09/22 15:11:17 latency 12.807105ms
2023/09/22 15:11:17 latency 13.000529ms
2023/09/22 15:11:17 latency 12.994953ms
2023/09/22 15:11:17 latency 12.244618ms
2023/09/22 15:11:17 latency 12.655327ms
2023/09/22 15:11:17 latency 13.90941ms
2023/09/22 15:11:17 latency 13.038211ms
2023/09/22 15:11:17 latency 11.82425ms
2023/09/22 15:11:17 latency 11.996318ms
2023/09/22 15:11:17 latency 13.219059ms
2023/09/22 15:11:17 latency 13.714828ms
2023/09/22 15:11:17 latency 11.937539ms
2023/09/22 15:11:17 latency 12.007648ms
2023/09/22 15:11:17 latency 12.248625ms

With goroutines

package main

import (
	"log"
	"sync"

	"github.com/go-resty/resty/v2"
)

func main() {
	r := resty.New()
	wg := sync.WaitGroup{}

	for i := 0; i < 50; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()

			resp, err := r.R().Get("https://www.baidu.com")
			if err != nil {
				log.Println(err)
				return // resp is nil on error, so skip the latency log
			}

			log.Println("latency", resp.Time().String())
		}()
	}

	wg.Wait()
}

result

2023/09/22 15:10:07 latency 66.406218ms
2023/09/22 15:10:07 latency 66.277814ms
2023/09/22 15:10:07 latency 66.827802ms
2023/09/22 15:10:07 latency 66.901955ms
2023/09/22 15:10:07 latency 69.37684ms
2023/09/22 15:10:07 latency 69.831271ms
2023/09/22 15:10:07 latency 76.456447ms
2023/09/22 15:10:07 latency 75.843056ms
2023/09/22 15:10:07 latency 76.052466ms
2023/09/22 15:10:07 latency 78.455793ms
2023/09/22 15:10:07 latency 78.448611ms
2023/09/22 15:10:07 latency 80.199273ms
2023/09/22 15:10:07 latency 80.409344ms
2023/09/22 15:10:07 latency 81.160046ms
2023/09/22 15:10:07 latency 82.200197ms
2023/09/22 15:10:07 latency 82.189909ms
2023/09/22 15:10:07 latency 84.97184ms
2023/09/22 15:10:07 latency 87.233604ms
2023/09/22 15:10:07 latency 88.752593ms
2023/09/22 15:10:07 latency 89.143926ms
2023/09/22 15:10:07 latency 90.09053ms
2023/09/22 15:10:07 latency 90.46183ms
2023/09/22 15:10:07 latency 90.30296ms
2023/09/22 15:10:07 latency 90.733339ms
2023/09/22 15:10:07 latency 90.991163ms
2023/09/22 15:10:07 latency 93.018959ms
2023/09/22 15:10:07 latency 93.197034ms
2023/09/22 15:10:07 latency 93.352431ms
2023/09/22 15:10:07 latency 94.316382ms
2023/09/22 15:10:07 latency 94.084157ms
2023/09/22 15:10:07 latency 94.892879ms
2023/09/22 15:10:07 latency 94.57857ms
2023/09/22 15:10:07 latency 95.935302ms
2023/09/22 15:10:07 latency 96.208132ms
2023/09/22 15:10:07 latency 97.826608ms
2023/09/22 15:10:07 latency 99.944758ms
2023/09/22 15:10:07 latency 100.955956ms
2023/09/22 15:10:07 latency 101.148486ms
2023/09/22 15:10:07 latency 101.174599ms
2023/09/22 15:10:07 latency 101.720125ms
2023/09/22 15:10:07 latency 102.852113ms
2023/09/22 15:10:07 latency 102.86166ms
2023/09/22 15:10:07 latency 103.738887ms
2023/09/22 15:10:07 latency 103.847566ms
2023/09/22 15:10:07 latency 104.016016ms
2023/09/22 15:10:07 latency 105.560349ms
2023/09/22 15:10:07 latency 106.38444ms
2023/09/22 15:10:07 latency 105.657236ms
2023/09/22 15:10:07 latency 110.001929ms
2023/09/22 15:10:08 latency 274.658808ms
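
From the numbers above, Time() appears to report wall-clock time for the whole request, so with 50 requests starting at once it likely also includes time spent on DNS, dialing, TLS handshakes, and waiting for a free connection, not only server processing. A minimal sketch that breaks the total down using resty's request tracing (EnableTrace/TraceInfo as documented in the resty v2 README; treat the exact set of fields printed here as an assumption):

package main

import (
	"log"
	"sync"

	"github.com/go-resty/resty/v2"
)

func main() {
	r := resty.New()
	wg := sync.WaitGroup{}

	for i := 0; i < 50; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()

			// EnableTrace asks resty to collect httptrace timings for this request.
			resp, err := r.R().EnableTrace().Get("https://www.baidu.com")
			if err != nil {
				log.Println(err)
				return
			}

			ti := resp.Request.TraceInfo()
			// TotalTime should match resp.Time(); ServerTime is roughly the wait
			// for the first response byte after the request was written. The gap
			// between them is mostly connection setup and pool contention.
			log.Printf("total=%s server=%s conn=%s tls=%s reused=%v",
				ti.TotalTime, ti.ServerTime, ti.ConnTime, ti.TLSHandshake, ti.IsConnReused)
		}()
	}

	wg.Wait()
}

If ServerTime stays flat while TotalTime grows with concurrency, the extra delay is coming from connection setup and contention rather than from the server.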
@jeevatkm
Member

@cloverkits, I need clarification on your goals and objectives, but I will try to provide the facts. Typically, goroutines are used where concurrency/parallelism is required to optimize overall processing time and efficiency.
I don't think the goroutine example above provides meaningful information about time measurement.
Have you tried Go's default HTTP client with your goroutine example?

@Cloverk1t
Author

@jeevatkm Yep! The default HTTP client seems to have the same problem inside goroutines. In your words, "concurrent/parallelism is required to optimize overall processing time", which is exactly my case: I need to read multiple objects in parallel from multiple interfaces of one server.

example code

package main

import (
	"io"
	"log"
	"net/http"
	"sync"
	"time"
)

func main() {
	wg := sync.WaitGroup{}

	for i := 0; i < 50; i++ {

		wg.Add(1)
		go func() {
			defer wg.Done()

			startAt := time.Now()
			resp, err := http.Get("https://www.baidu.com")
			if err != nil {
				log.Println(err)
				return // resp is nil on error, so closing the body would panic
			}
			defer resp.Body.Close()

			log.Println("latency:", time.Since(startAt))

			_, err = io.ReadAll(resp.Body)
			if err != nil {
				log.Println(err)
			}

		}()
	}

	wg.Wait()
}

result

2023/09/25 10:12:23 latency: 69.907033ms
2023/09/25 10:12:23 latency: 69.614386ms
2023/09/25 10:12:23 latency: 70.939928ms
2023/09/25 10:12:23 latency: 70.62763ms
2023/09/25 10:12:23 latency: 71.418996ms
2023/09/25 10:12:23 latency: 72.425473ms
2023/09/25 10:12:23 latency: 78.597691ms
2023/09/25 10:12:23 latency: 81.947911ms
2023/09/25 10:12:23 latency: 82.902086ms
2023/09/25 10:12:23 latency: 84.898575ms
2023/09/25 10:12:23 latency: 84.801553ms
2023/09/25 10:12:23 latency: 84.496353ms
2023/09/25 10:12:23 latency: 85.732495ms
2023/09/25 10:12:23 latency: 85.642949ms
2023/09/25 10:12:23 latency: 85.952062ms
2023/09/25 10:12:23 latency: 86.141165ms
2023/09/25 10:12:23 latency: 87.003269ms
2023/09/25 10:12:23 latency: 87.436695ms
2023/09/25 10:12:23 latency: 91.543479ms
2023/09/25 10:12:23 latency: 91.84506ms
2023/09/25 10:12:23 latency: 92.977854ms
2023/09/25 10:12:23 latency: 93.429503ms
2023/09/25 10:12:23 latency: 94.71919ms
2023/09/25 10:12:23 latency: 95.130987ms
2023/09/25 10:12:23 latency: 97.990181ms
2023/09/25 10:12:23 latency: 98.086186ms
2023/09/25 10:12:23 latency: 98.242818ms
2023/09/25 10:12:23 latency: 98.315098ms
2023/09/25 10:12:23 latency: 98.920861ms
2023/09/25 10:12:23 latency: 99.129425ms
2023/09/25 10:12:23 latency: 100.425497ms
2023/09/25 10:12:23 latency: 101.000571ms
2023/09/25 10:12:23 latency: 101.15033ms
2023/09/25 10:12:23 latency: 100.735866ms
2023/09/25 10:12:23 latency: 101.528584ms
2023/09/25 10:12:23 latency: 101.650535ms
2023/09/25 10:12:23 latency: 101.478554ms
2023/09/25 10:12:23 latency: 104.253945ms
2023/09/25 10:12:23 latency: 104.981183ms
2023/09/25 10:12:23 latency: 105.334097ms
2023/09/25 10:12:23 latency: 105.663269ms
2023/09/25 10:12:23 latency: 106.390947ms
2023/09/25 10:12:23 latency: 106.30004ms
2023/09/25 10:12:23 latency: 107.450104ms
2023/09/25 10:12:23 latency: 110.235126ms
2023/09/25 10:12:23 latency: 110.161177ms
2023/09/25 10:12:23 latency: 110.386901ms
2023/09/25 10:12:23 latency: 110.360874ms
2023/09/25 10:12:23 latency: 110.623886ms
2023/09/25 10:12:23 latency: 111.314326ms
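
One way to keep the per-request numbers closer to the sequential case while still reading objects in parallel is to bound how many requests are in flight at once, so that each measurement is not also counting time spent queued behind dozens of simultaneous connection setups. A minimal sketch, assuming a plain buffered-channel semaphore and an arbitrary limit of 5 concurrent requests:

package main

import (
	"log"
	"sync"

	"github.com/go-resty/resty/v2"
)

func main() {
	r := resty.New()
	wg := sync.WaitGroup{}

	// Buffered channel used as a semaphore: at most 5 requests in flight.
	// The limit of 5 is an arbitrary value chosen for illustration.
	sem := make(chan struct{}, 5)

	for i := 0; i < 50; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()

			sem <- struct{}{}        // acquire a slot before sending the request
			defer func() { <-sem }() // release the slot when done

			resp, err := r.R().Get("https://www.baidu.com")
			if err != nil {
				log.Println(err)
				return
			}

			log.Println("latency", resp.Time().String())
		}()
	}

	wg.Wait()
}

With a bound like this, the transport's connection reuse should kick in after the first few requests and the reported latencies should move back toward the sequential numbers, at the cost of some overall throughput.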
