So how do I download a large file that takes longer than 100 seconds to download?
I'm getting exceptions like:
```
Unhandled Exception: System.AggregateException: One or more errors occurred. (Exception raised by job) ---> System.Exception: Exception raised by job ---> System.Exception: Failed to get response ---> System.AggregateException: One or more errors occurred. (A task was canceled.) ---> System.Threading.Tasks.TaskCanceledException: A task was canceled.
   --- End of inner exception stack trace ---
   at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
   at Hopac.Core.TaskToJobAwaiter`1.DoWork(Worker& wr) in /Users/h/dev/logibit/Hopac/Libs/Hopac.Core/External/Tasks.cs:line 119
   at Hopac.Core.Worker.RunOnThisThread(Scheduler sr, Work work) in /Users/h/dev/logibit/Hopac/Libs/Hopac.Core/Engine/Worker.cs:line 100
   --- End of inner exception stack trace ---
   at HttpFs.Client.getResponseOrFail@974[a,b](FSharpChoice`2 _arg1)
   at Hopac.Core.ContMap`2.DoCont(Worker& wr, X x) in /Users/h/dev/logibit/Hopac/Libs/Hopac.Core/Flow/Job.cs:line 67
   at Hopac.Core.Worker.Run(Scheduler sr, Int32 me) in /Users/h/dev/logibit/Hopac/Libs/Hopac.Core/Engine/Worker.cs:line 160
   --- End of inner exception stack trace ---
   at Hopac.Scheduler.run[x](Scheduler sr, Job`1 xJ) in /Users/h/dev/logibit/Hopac/Libs/Hopac/Hopac.fs:line 489
```
when I try to download large files. It reliably occurs whenever a download takes longer than 100 seconds. My program successfully downloads smaller files, so I don't think it's the program logic itself. I also get the exception if I turn on a VPN that slows my download speed by roughly 10x (thereby pushing downloads past the 100-second limit), whereas the same program works if I turn the VPN off so the download finishes under the limit.
I currently am doing this:
```fsharp
let bodyBytes, statusCode, contentType =
    try
        Hopac.Hopac.job {
            use! response = HttpFs.Client.getResponse request
            let! bodyBytes = HttpFs.Client.Response.readBodyAsBytes response
            let status = response.statusCode
            let contentType = response.headers.[HttpFs.Client.ResponseHeader.ContentTypeResponse]
            return bodyBytes, status, contentType
        }
        |> Hopac.Hopac.run
    with _ ->
        // the real handler is elided here; reraising keeps the snippet compilable
        reraise ()
```
where I write out the bodyBytes using System.IO.File.WriteAllBytes(cachedFilename, bodyBytes), and run that over all the files using Array.Parallel.iter. I figure this isn't optimal behavior (the whole body is buffered in memory), but I was trying to get around the "Failed to get response" exception.
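For concreteness, here is a minimal sketch of that driver loop; `urls`, `makeRequest`, `cachedFilenameFor`, and `download` are hypothetical names standing in for my actual code:

```fsharp
// Hypothetical driver: fetch each URL in parallel and write its body to the cache.
// `download` is assumed to be the job-running function shown above.
urls
|> Array.Parallel.iter (fun url ->
    let request = makeRequest url
    let bodyBytes, _statusCode, _contentType = download request
    System.IO.File.WriteAllBytes(cachedFilenameFor url, bodyBytes))
```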
Earlier I also tried doing it via:
```fsharp
Hopac.Hopac.job {
    use! response = HttpFs.Client.getResponse request
    let! bodyBytes = HttpFs.Client.Response.readBodyAsBytes response
    let contentType = response.headers.[HttpFs.Client.ResponseHeader.ContentTypeResponse]
    use filestream = new System.IO.FileStream(cachedFilename, System.IO.FileMode.Create)
    do! Hopac.Job.awaitUnitTask (response.body.CopyToAsync filestream)
}
```
but I get the same error.
For reference, other people have seen similar problems with timeouts (some in different .NET libraries):

Beginner question: How to correctly download large files #153
(I think a misleading answer was given to issue #153.)

How to change the time-out value of request in HTTP Utilities #802
fsprojects/FSharp.Data#802

Use HttpWebRequest timeout for async getResponse in HTTP utils #920
fsprojects/FSharp.Data#920

"Note that if caller doesn't provide a timeout through customiseHttpRequest, the default value from HttpWebRequest will still be used (100s)."

https://docs.microsoft.com/en-us/dotnet/api/system.net.httpwebrequest.timeout?redirectedfrom=MSDN&view=netframework-4.7.2#System_Net_HttpWebRequest_Timeout

"The Timeout property has no effect on asynchronous requests made with the BeginGetResponse or BeginGetRequestStream method. Caution: In the case of asynchronous requests, the client application implements its own time-out mechanism. Refer to the example in the BeginGetResponse method."

https://docs.microsoft.com/en-us/dotnet/api/system.net.httpwebrequest.begingetresponse?view=netframework-4.7.2
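The docs above say that asynchronous callers must implement their own timeout. A minimal sketch of what that could look like in plain F#, independent of Http.fs (the name `withTimeout` is mine, not from any of the libraries mentioned):

```fsharp
/// Run an async computation, raising System.TimeoutException if it
/// does not complete within the given number of milliseconds.
let withTimeout (timeoutMs: int) (work: Async<'a>) : Async<'a> =
    async {
        // Async.StartChild enforces the timeout on the child computation
        let! child = Async.StartChild(work, timeoutMs)
        return! child
    }
```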
One way around the limit is to build the request with a custom HttpClient whose timeout has been raised:

```fsharp
let myCustomClient =
    let client = new HttpClient(new HttpClientHandler()) // timeout set to 5 min
    client.Timeout <- TimeSpan.FromMinutes(5.0)
    client.DefaultRequestHeaders.Clear()
    client

// ...
let req = Request.createWithClient myCustomClient Get <| Uri(url)
```
"Starting with .NET Core 2.1, the System.Net.Http.SocketsHttpHandler class instead of HttpClientHandler provides the implementation used by higher-level HTTP networking APIs."
Since I am writing a .NET Core 2.1 program, and nothing on the HttpClientHandler was being modified anyway, the client can be constructed as plain `new HttpClient()`. That way I get all the benefits of the new SocketsHttpHandler class.
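Putting it together, here is a sketch of the whole download using the custom client, so the request is no longer subject to the 100-second default; `downloadToFile`, `url`, and `cachedFilename` are names I made up for illustration:

```fsharp
open System
open System.Net.Http
open Hopac
open HttpFs.Client

let downloadToFile (url: string) (cachedFilename: string) =
    job {
        // createWithClient makes the request use myCustomClient's 5-minute timeout
        let req = Request.createWithClient myCustomClient Get (Uri url)
        use! response = getResponse req
        // stream straight to disk instead of buffering the whole body in memory
        use filestream = new System.IO.FileStream(cachedFilename, System.IO.FileMode.Create)
        do! Job.awaitUnitTask (response.body.CopyToAsync filestream)
    }
    |> run
```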