
[optimize] Chunked upload should limit the number of goroutines; uploading large files consumes a large amount of memory #5

Closed
LinkinStars opened this issue Apr 28, 2021 · 1 comment

Comments

@LinkinStars

	file, err := os.Open(u.LocalFilePath)
	if err != nil {
		return ret, err
	}
	defer file.Close()
	uploadRespChan := make(chan SuperFile2UploadResponse)
	for i := 0; i < sliceNum; i++ {
		buffer := make([]byte, sliceSize)
		n, err := file.Read(buffer[:])
		if err != nil && err != io.EOF {
			log.Println("file.Read failed, err:", err)
			return ret, err
		}
		if n == 0 {
			break
		}
		// If the file has many slices, this loop spawns one goroutine per slice,
		// so a huge amount of memory is held in flight at once
		go func(partSeq int, partByte []byte) {
			uploadResp, err := u.SuperFile2Upload(uploadID, partSeq, partByte)
			uploadRespChan <- uploadResp
			if err != nil {
				log.Printf("SuperFile2UploadFailed, partseq[%d] err[%v]", partSeq, err)
			}
		}(i, buffer[0:n])
	}
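A common way to cap both the goroutine count and the number of slice buffers held in memory is a buffered-channel semaphore combined with a `sync.WaitGroup`. The sketch below is illustrative only, not the library's actual API: `uploadSliced`, `uploadFn`, and `maxWorkers` are hypothetical names standing in for the surrounding method and `u.SuperFile2Upload`, and the response channel from the original snippet is omitted for brevity.

```go
package pan

import (
	"io"
	"log"
	"os"
	"sync"
)

// uploadSliced reads the file in sliceSize chunks and uploads each chunk via
// uploadFn, allowing at most maxWorkers uploads to run concurrently.
// uploadFn is a hypothetical stand-in for u.SuperFile2Upload(uploadID, partSeq, partByte).
func uploadSliced(localFilePath string, sliceSize int64, maxWorkers int,
	uploadFn func(partSeq int, partByte []byte) error) error {

	file, err := os.Open(localFilePath)
	if err != nil {
		return err
	}
	defer file.Close()

	sem := make(chan struct{}, maxWorkers) // counting semaphore bounding concurrency
	var wg sync.WaitGroup

	for partSeq := 0; ; partSeq++ {
		buffer := make([]byte, sliceSize)
		n, err := file.Read(buffer)
		if err != nil && err != io.EOF {
			return err
		}
		if n == 0 {
			break
		}

		sem <- struct{}{} // block until a worker slot frees up
		wg.Add(1)
		go func(partSeq int, partByte []byte) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot when this upload finishes
			if err := uploadFn(partSeq, partByte); err != nil {
				log.Printf("SuperFile2Upload failed, partseq[%d] err[%v]", partSeq, err)
			}
		}(partSeq, buffer[:n])
	}

	wg.Wait()
	return nil
}
```

With this pattern, at most maxWorkers+1 slice buffers of sliceSize bytes are alive at any moment, no matter how many slices the file has. A fixed worker pool reading slice indexes from a channel, or golang.org/x/sync/errgroup with SetLimit, would give the same bound.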


@jsyzchen
Owner

jsyzchen commented May 9, 2021

This has been optimized in v0.0.7.

@jsyzchen jsyzchen closed this as completed May 9, 2021