After starting the process, it quickly shot up to over a GB of memory usage and then failed. The folder contains 86.3 GB of data. I'll try to get more details for you and then update this issue.
Update: this also occurs on a folder with 789 MB of data (535 files, 24 folders).
Ran this while debugging. I think it's a timeout issue.
```
System.TimeoutException was unhandled by user code
  Message=Server operation did not finish within user specified timeout '90' seconds, check if operation is valid or try increasing the timeout.
  Source=Microsoft.WindowsAzure.StorageClient
  StackTrace:
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
       at Microsoft.WindowsAzure.StorageClient.TaskImplHelper.ExecuteImplWithRetry(Func`1 impl, RetryPolicy policy)
       at Microsoft.WindowsAzure.StorageClient.CloudBlob.FetchAttributes(BlobRequestOptions options)
       at Microsoft.WindowsAzure.StorageClient.CloudBlob.FetchAttributes()
       at Abc.ATrak.Azure.Exists() in D:\repos\misc\A-Trak\Azure.cs:line 86
       at Abc.ATrak.Program.<>c__DisplayClass3.b__2(IStorageItem from, ParallelLoopState state) in D:\repos\misc\A-Trak\Program.cs:line 115
       at System.Threading.Tasks.Parallel.<>c__DisplayClassf`1.b__c()
  InnerException:
```
I'll try increasing the timeout in the request options and run again.
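For reference, here is a minimal sketch of how the per-request timeout could be raised in the legacy Microsoft.WindowsAzure.StorageClient library. The five-minute value, retry settings, and the blob passed in are illustrative assumptions, not A-Trak's actual configuration:

```csharp
using System;
using Microsoft.WindowsAzure.StorageClient;

class TimeoutExample
{
    static void FetchWithLongerTimeout(CloudBlob blob)
    {
        // Per-request options override the client defaults; the 5-minute
        // timeout and the exponential retry policy are arbitrary examples.
        var options = new BlobRequestOptions
        {
            Timeout = TimeSpan.FromMinutes(5),
            RetryPolicy = RetryPolicies.RetryExponential(
                RetryPolicies.DefaultClientRetryCount,
                RetryPolicies.DefaultClientBackoff)
        };

        // The same call that timed out in the stack trace above,
        // now with explicit options applied.
        blob.FetchAttributes(options);
    }
}
```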
Yes, the files are copied into memory, so it is quite a memory-intensive operation if you have a large amount of data. We are going to move to streams for this type of operation.
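For illustration, a rough sketch of the difference between the two approaches in the same legacy StorageClient API; the blob reference and local path here are hypothetical placeholders:

```csharp
using System.IO;
using Microsoft.WindowsAzure.StorageClient;

class UploadExample
{
    // Buffered upload: the entire file is materialized in memory first,
    // so peak memory grows with file size (the behavior described above).
    static void UploadBuffered(CloudBlob blob, string localPath)
    {
        byte[] bytes = File.ReadAllBytes(localPath);
        blob.UploadByteArray(bytes);
    }

    // Streamed upload: the file is read and sent in chunks, so memory
    // use stays roughly flat regardless of file size.
    static void UploadStreamed(CloudBlob blob, string localPath)
    {
        using (FileStream stream = File.OpenRead(localPath))
        {
            blob.UploadFromStream(stream);
        }
    }
}
```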
Thanks for the stack trace; we will work on this very soon.
You can see the error here: http://wadewegner.blob.core.windows.net/images/Error.png