This repository has been archived by the owner on Oct 31, 2019. It is now read-only.

azurePutBlob does not work with files #42

Open
kjohnson4 opened this issue Jan 23, 2017 · 8 comments

Comments

@kjohnson4

I've read in a csv file (tmpFile), and now want to write that out to blob. Running this:
azurePutBlob(sc, blob = "tsv.txt", file = tmpFile)

Returns:
[1] "blob: directory/tsv.txt Saved: 1 bytes written"

And if I look at the blob file, it has just 1 character in it: '-'.
Looking at the source, there is no handling of the file parameter. If contents is blank, it gets that character '-', and that's it.

I also tried:
azurePutBlob(sc, blob = "tsv.txt", contents = tmpFile)
It runs for a while and looks like it's uploading the data:
[1] "blob: directory/tsv.txt Saved: 545 bytes written" "blob: directory/tsv.txt Saved: 545 bytes written"
[3] "blob: directory/tsv.txt Saved: 699 bytes written" "blob: directory/tsv.txt Saved: 545 bytes written"
[5] "blob: directory/tsv.txt Saved: 545 bytes written" "blob: directory/tsv.txt Saved: 706 bytes written"

It goes up to [37], but then the following is output, the upload stops, and the blob is never created:
Warning messages:
1: In if (nchar(contents) == 0) contents <- "-" :
the condition has length > 1 and only the first element will be used
2: In charToRaw(object) :
argument should be a character vector of length 1
all but the first element will be ignored
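Both warnings can be reproduced locally without Azure, which shows what is going wrong inside azurePutBlob when contents is a data frame rather than a single character value. This is a minimal sketch with a toy data frame, not the package's own code:

```r
# Minimal local reproduction of the two warnings: nchar() on a data
# frame returns one value per column, so the `if (nchar(contents) == 0)`
# condition has length > 1, and charToRaw() can only convert a single
# character value, so everything past the first element is dropped.
df <- data.frame(a = c(1, 2), b = c("x", "y"))

nchar(df)                        # length-2 vector: one count per column
txt <- as.character(df)          # length-2 character vector, not length 1
raw_bytes <- charToRaw(txt[1])   # only the first element survives
rawToChar(raw_bytes)             # the rest of the data frame is lost
```

This is why only a fragment of the data (or the '-' placeholder) ever reaches the blob.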

@Alanwe
Member

Alanwe commented Jan 23, 2017

@kjohnson4 - Yes, the local file function had some bugs and was removed for the initial release. I will remove the parameter from the function and documentation until it can be implemented correctly. Thanks for pointing this out.

The azurePutBlob of an object from contents should work. As per your other post, can you confirm that you are not using classic mode and that other blob functions are working? Thanks

@kjohnson4
Author

Sorry about that; I missed the note that the file parameter was bugged.

Other blob functions are working. I can Get (read) blobs and manipulate the data, and I'm trying to write those changes back, but can't seem to get this working. All our resources are managed, not classic, and I verified that this storage account was created new in Resource Manager, not created in classic and moved.

@kjohnson4
Author

Using your example from the other issue posting:

azureCreateStorageContainer(sc, "myteststorage")
library(RCurl)
flightfile <- getURL("https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/244708/Home_Office_Air_Travel_Data_2011.csv")
azurePutBlob(sc, contents = flightfile, blob = "flightinfo/flightfile.csv")

I noticed that flightfile is stored under "Values" in the RStudio environment. However, all the files I'm reading in (using read.table or read.csv) end up under "Data", because reading stores the result in a dataframe. Would this be causing an issue when putting this data into Blob?
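The "Values" versus "Data" split in the RStudio environment pane reflects exactly the type difference that matters here: getURL() returns a length-1 character vector, while read.csv() returns a data.frame, and azurePutBlob's contents argument wants the former. A small sketch (url_text stands in for actual getURL() output):

```r
# getURL() output is a single character string -> RStudio "Values" pane.
# read.csv() output is a data.frame           -> RStudio "Data" pane.
# azurePutBlob(contents = ...) expects the single-string form.
url_text <- "a,b\n1,x\n2,y\n"        # stand-in for getURL() output
df <- read.csv(text = url_text)

is.character(url_text)   # TRUE, and length(url_text) is 1
class(df)                # "data.frame"
```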

@Sebs030

Sebs030 commented Jan 26, 2018

Hi, I am having a similar issue: when trying to pass in a dataframe as contents, I surprisingly get an authentication error:

No encoding supplied: defaulting to UTF-8.
No encoding supplied: defaulting to UTF-8.
Error: azurePutBlob()
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly
including the signature. RequestId:
The MAC signature found in the HTTP request 'KEY' is not the same as
any computed signature. Server used following string to sign: 'PUT

246

text/plain; charset=UTF-8

x-ms-blob-type:Blockblob
x-ms-date:Fri, 26 Jan 2018 16:30:45 GMT
x-ms-version:2017-04-17
PATH'.
Return code: 403
In addition: Warning message:
In charToRaw(object) : argument should be a character vector of length 1
all but the first element will be ignored

However, if the data is passed in as a single character value by coercing the dataframe with toString(), the function works and the data gets stored in the blob. It would be great if a dataframe could be stored directly as a file.
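A variant of this single-string workaround that preserves the tabular structure better than toString() is to serialize the data frame to one CSV-formatted string first. This is only a sketch: the upload call is commented out, and `sc` stands for an existing azureActiveContext, which is an assumption here:

```r
# Collapse the data frame into a single CSV-formatted character value,
# which is the shape azurePutBlob's contents argument expects.
df <- head(USArrests)
csv_string <- paste(capture.output(write.csv(df)), collapse = "\n")

length(csv_string)        # 1: a single character value
is.character(csv_string)  # TRUE: safe to pass as contents
# azurePutBlob(sc, contents = csv_string, blob = "arrests.csv")
```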

@bdb67

bdb67 commented Apr 27, 2018

Hello, this is an important issue. Is it being worked on? Thanks.

@ericchansen

Any updates on this issue? I'm getting the same problem as @Sebs030.

@balajisubudhi

Not able to store a PDF file using azurePutBlob; it gives an authentication error. Any suggestion on how to store a PDF file in blob using R?

@alexhallam

I also have trouble uploading a dataframe. A solution that worked for me was to convert the dataframe to JSON.

json_USArrests <- USArrests %>% 
  as_tibble() %>% 
  jsonlite::toJSON()

The json_USArrests object can then be passed to azurePutBlob. From that point you can run an Azure Data Factory ingest if the goal is to put the data in an Azure SQL database, or you can read the blob back and convert it to a dataframe if you want to use it again in R.
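The round trip back to a data frame works because toJSON() produces a single character value and fromJSON() can rebuild the frame from it. A sketch with a small toy data frame (the Azure calls are commented out; `sc` is an assumed azureActiveContext, and the azureGetBlob usage is an assumption about reading the blob back as text):

```r
# toJSON() yields one character value that azurePutBlob can upload;
# fromJSON() turns the text read back from the blob into a data frame.
library(jsonlite)

df <- data.frame(city = c("Oslo", "Lima"), temp = c(3.5, 22.1))
json_str <- toJSON(df)
length(json_str)   # 1: a single value, safe for contents =
# azurePutBlob(sc, contents = json_str, blob = "cities.json")

# later, after reading the blob back as text:
# json_str <- azureGetBlob(sc, blob = "cities.json", type = "text")
df_back <- fromJSON(json_str)
identical(dim(df_back), dim(df))   # TRUE: same shape recovered
```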


8 participants