Problem with the data loader #25

Closed
icsd06ogdi opened this Issue Jun 3, 2012 · 14 comments


@icsd06ogdi

I have a serious problem with the last step of running OGDI.
I have tested both v2 and v4, and I face exactly the same problem with the Data Loader utility.
I created all the storage accounts and services, and everything went well, but when I try to upload a simple CSV file (in v4 it is semicolon-separated rather than comma-separated), it seems to be impossible.
The CSV file I am trying to upload has contents like these:

a,b,c,d,e
1,51,35,14,2
2,49,30,14,2
3,47,32,13,2
4,46,31,15,2
5,50,36,14,2
6,54,39,17,4

The cfg that is generated is below (its XML tags are stripped by GitHub's formatting, so only the values remain):

string string string string string (UTC+02:00) Athens, Bucharest, Istanbul b a Name New.Guid EntitySet EntityKind iris iris iris iris 2012-06-03T01:12:46.1453125+03:00 2012-06-03T01:12:46.1453125+03:00 2012-06-04T01:12:46.1453125+03:00 false

Once the cfg is created and I press the Start button, I get the result below:

Entity Start:
Name = 'EntityId' Value = '2825dda9-909a-4e80-bf89-c951727a2b92'
Name = 'primar' Value = '1'
Name = 'a' Value = '51'
Name = 'b' Value = '35'
Name = 'c' Value = '14'
Name = 'd' Value = '2'
Entity End

Value cannot be null.
Parameter name: field

Failed to process entity.

Entity Start:
Name = 'EntityId' Value = 'fb64a28d-f716-4644-9fc9-2398f2c22939'
Name = 'primar' Value = '2'
Name = 'a' Value = '49'
Name = 'b' Value = '30'
Name = 'c' Value = '14'
Name = 'd' Value = '2'
Entity End

Value cannot be null.
Parameter name: field

Failed to process entity.

Entity Start:
Name = 'EntityId' Value = '75cdffbf-7add-4e7d-a283-1e37fa093fc2'
Name = 'primar' Value = '3'
Name = 'a' Value = '47'
Name = 'b' Value = '32'
Name = 'c' Value = '13'
Name = 'd' Value = '2'
Entity End

Value cannot be null.
Parameter name: field

Failed to process entity.

Entity Start:
Name = 'EntityId' Value = 'ca164c9c-9059-4654-b30f-71f6a3b2934a'
Name = 'primar' Value = '4'
Name = 'a' Value = '46'
Name = 'b' Value = '31'
Name = 'c' Value = '15'
Name = 'd' Value = '2'
Entity End

Value cannot be null.
Parameter name: field

Failed to process entity.

Entity Start:
Name = 'EntityId' Value = '2b86197d-731e-4a84-99a6-5f177382aff7'
Name = 'primar' Value = '5'
Name = 'a' Value = '50'
Name = 'b' Value = '36'
Name = 'c' Value = '14'
Name = 'd' Value = '2'
Entity End

Thanks

@nikg
Member
nikg commented Jun 4, 2012

Can you upload / link to your CSV file? What's the table name? Does it happen to have <3 characters?

@icsd06ogdi

Well, unfortunately the cfg's XML tags cannot be displayed here.
The table name is iris, and the data I am trying to upload is:
a,b,c,d,e
1,51,35,14,2
2,49,30,14,2
3,47,32,13,2
4,46,31,15,2
5,50,36,14,2
6,54,39,17,4
....(etc)

I have tried everything you can imagine and face the same problem whatever I do, in both the v2 and v4 OGDI projects.

@icsd06ogdi

Here is another person who faced the same problem two years ago with v2, and unfortunately no one was there to help him:
http://ogdi.codeplex.com/workitem/13684

@nikg
Member
nikg commented Jun 4, 2012

I don't think the v2 issue is what you are experiencing. Try renaming the field names to be longer than 1 character (minimum of 3), and make sure you don't have a blank column in your CSV. Ideally, upload the CSV and cfg somewhere so that we can troubleshoot with the exact files; just 3 rows of data should suffice.
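The header checks suggested above can be run before ever touching the DataLoader. This is a hypothetical helper, not part of OGDI itself; it simply flags the two conditions named in this comment (names shorter than 3 characters, blank columns):

```python
import csv
import io

def validate_csv_headers(text, delimiter=","):
    """Flag header names that may trip up the DataLoader.

    Checks mirror the hints in this thread: names shorter than
    3 characters and blank column names. Returns a list of
    problem descriptions (empty if the header looks fine).
    """
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    header = next(reader)
    problems = []
    for i, name in enumerate(header):
        if not name.strip():
            problems.append(f"column {i} has a blank name")
        elif len(name.strip()) < 3:
            problems.append(f"column {i} name '{name}' is shorter than 3 characters")
    return problems

# The header from the CSV in this issue: every name is one character,
# so every column gets flagged.
print(validate_csv_headers("a,b,c,d,e\n1,51,35,14,2\n"))
```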

@remiolivier
Member

Another hint could be to set the file's encoding to Unicode.
Some characters are not supported by certain encodings, which throws an error in the DataLoader.

With Notepad: Save As -> Encoding -> Unicode
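The same re-save can be scripted if Notepad is not handy. A minimal sketch, assuming the source file is UTF-8 (Notepad's "Unicode" option corresponds to UTF-16 with a byte-order mark; the paths here are placeholders):

```python
def reencode(src_path, dst_path, src_encoding="utf-8", dst_encoding="utf-16"):
    """Rewrite a text file in a different encoding.

    Python's "utf-16" codec writes a byte-order mark, matching
    what Notepad's "Unicode" Save As option produces.
    """
    with open(src_path, "r", encoding=src_encoding) as src:
        text = src.read()
    with open(dst_path, "w", encoding=dst_encoding) as dst:
        dst.write(text)

# Example (placeholder paths):
# reencode("iris.csv", "iris-unicode.csv")
```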

@icsd06ogdi

First of all, I would like to thank you. I have been trying for a long time, and any help is valuable, as I can't finish my diploma thesis without this upload.

I tried both hints, and neither worked.
Here is the CSV: http://www.mediafire.com/?6nroujn4ccswnzi
and here is the cfg that is generated during the procedure: http://www.mediafire.com/?q80ck8e029lfndf

Thanks in advance.

@nikg
Member
nikg commented Jun 4, 2012

@icsd06ogdi this is simple. You shouldn't try to "bind to map" a dataset with no coordinates.

In the DataLoader under "Dataset Columns" uncheck "Bind to Map".

Here's your dataset loaded: http://datadotgc2.cloudapp.net/DataBrowser/Hack%20OpenData/iris#param=NOFILTER--DataView--Results

Let me know if you have any questions!

@icsd06ogdi

Thanks, I had left it at the default. Now it seems the data is uploaded, but I face another problem.
Here is the link to the service where the data should have been uploaded, but I do not see the expected page format (like the one in the video): http://351ca9230784443bade40c0dabe6bd00.cloudapp.net/v1

There is no such XML output, and of course no dataset to search.

Thanks.

@icsd06ogdi

Here is also a screenshot of what I get: http://www.mediafire.com/?yrtgnw5v03id745

@nikg
Member
nikg commented Jun 4, 2012

You need to debug locally, but my guess is: did you run the ConfigTool to set up the DataBrowser connection to the storage?



@icsd06ogdi

For the configuration, I did exactly what the first video here shows: http://ogdi.codeplex.com/documentation
And I am fairly sure there are problems with the order of the steps we need to follow.
I am running v2, but I don't think that is the issue. I only need to upload one single CSV, nothing more.

However, I have a question: I created the two essential storage accounts, and during deployment I was asked to choose a service, as no service existed on Windows Azure. I created a new service to deploy to, and that also created a new (third) storage account.
The name and key of this third storage account are never used in my configuration; all of the configuration refers to the first and second storage accounts. I think there is a problem there.

@icsd06ogdi

To be clearer: I ran the config tool and gave it the information for my second storage account, which is supposed to be the data storage. Why on earth did the deployment create a new storage account, and am I supposed to use its information anywhere in my configuration?

OgdiConfigConnectionString and DiagnosticsConnectionString use the first storage account's information,
and DataConnectionString points to the second storage account.

What about the information of the third storage account created along with the service?
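For reference, the three settings named above typically live in the role's ServiceConfiguration.cscfg. A hypothetical fragment is sketched below; the account names and keys are placeholders, not values from this deployment:

```xml
<!-- Illustrative fragment only; account names and keys are placeholders. -->
<ConfigurationSettings>
  <Setting name="OgdiConfigConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=configstorage;AccountKey=PLACEHOLDER" />
  <Setting name="DiagnosticsConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=configstorage;AccountKey=PLACEHOLDER" />
  <Setting name="DataConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=datastorage;AccountKey=PLACEHOLDER" />
</ConfigurationSettings>
```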

@icsd06ogdi

And here are three screenshots: the storage accounts, the service, and the DataBrowser configuration in the Config tool.
http://www.mediafire.com/?w1zby4155ty5o2w
http://www.mediafire.com/?fkbdbd26boo0hz7
http://www.mediafire.com/?1u0121n5po56cg2

You can't imagine how long I have been trying to upload one single CSV. I don't know why I am facing a problem again.

@nikg
Member
nikg commented Jun 4, 2012

Have you followed the wiki (https://github.com/openlab/DataLab/wiki) and the details here: https://github.com/openlab/DataLab/wiki/Configuring-OGDI-for-Deployment-in-Windows-Azure?

Refer to the service topology for info on different Storage accounts: https://github.com/openlab/DataLab/wiki/OGDI-Service-Topology

You can also refer to a detailed walk-through here: http://myopencity.ca/installing-ogdi-version-5-on-azure-part-3/

I'm closing the issue, as the original "DataLoader" question was addressed.

@nikg nikg closed this Jun 4, 2012