Field API name should be case insensitive in sdl mapping file #43
Really need this issue to be fixed. Is there a problem in the Metadata API?
I'll take a look in the next few days. The original issue makes sense.
Would be fantastic.
Got a fix for this. I didn't fix the UI showing the CSV headers in upper case, though; I'm just making the SDL files case insensitive when loading them. Case sensitivity or casing beyond this is a different issue. I should check in the fix in the next few days, but you'll have to build the jar yourself to get the update until our next release (Fall of '15).
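The fix described above can be sketched roughly as follows. This is an illustrative Python sketch, not the actual Data Loader (Java) code; the function names and SDL contents are hypothetical, and it only shows the idea of keying the mapping case-insensitively at load time.

```python
# Sketch: load an SDL mapping ("csvHeader=ApiFieldName" lines) so that
# lookups by field API name ignore case. Hypothetical helpers, not the
# real Data Loader implementation.
def load_sdl(text):
    """Parse SDL lines into a dict keyed by the lowercased left-hand side."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        mapping[key.strip().lower()] = value.strip()
    return mapping

def lookup(mapping, csv_header):
    # Case-insensitive: 'NAME', 'Name', and 'name' all resolve the same way.
    return mapping.get(csv_header.lower())

sdl = "Name=Name\nts2__Tracking_Url_Address__c=ts2__Tracking_Url_Address__c"
m = load_sdl(sdl)
print(lookup(m, "NAME"))                          # → Name
print(lookup(m, "ts2__tracking_url_address__c"))  # → ts2__Tracking_Url_Address__c
```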
Perfect! Waiting for updates on git.
@rjmazzeo Did you have a chance to test it? Is it possible to push the bug fix this week?
Unfortunately, this bug fix doesn't qualify for pushing to main or snapping a build right away. It should be promoted from the develop branch to the main branch in the next few weeks, with a new build available in the September timeframe. If you want it now, I would encourage you to check out the "develop" branch and build the jar yourself.
Hi Rocky, thank you for this fix. Please let me know if I am wrong, but the issue with case sensitivity of the .sdl mapping file is still in place. Please note that I generate the mapping file automatically from the source CSV header (the first line of the CSV). Please let me know if I am doing something incorrectly. Thanks a lot.
Hmm.. I changed the code to use the CSV and Metadata API as the authority for case and to ignore the incoming case in the SDL. The test team confirmed the fix. Can you send one of your mapping files and describe your usage scenario (a high-level listing of steps)?
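The "authority for case" idea above can be illustrated with a small sketch: given the authoritative field names (as the Metadata API would report them), each name found in the SDL is replaced by its authoritative-case spelling. This is a hypothetical Python illustration, not the Data Loader's Java code.

```python
# Sketch: normalize SDL-mapped API names against an authoritative field
# list, ignoring whatever case the SDL file used. Names here are examples.
def normalize(sdl_mapping, authoritative_fields):
    """Return a copy of sdl_mapping with values in authoritative case."""
    by_lower = {f.lower(): f for f in authoritative_fields}
    return {k: by_lower.get(v.lower(), v) for k, v in sdl_mapping.items()}

fields = ["Name", "ts2__Tracking_Url_Address__c"]
sdl = {
    "NAME": "name",
    "TS2__TRACKING_URL_ADDRESS__C": "ts2__tracking_url_address__c",
}
print(normalize(sdl, fields))
# values come back as 'Name' and 'ts2__Tracking_Url_Address__c'
```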
Please see the .csv source, .sdl mapping, and the .jar data loader built from the develop branch: https://www.dropbox.com/sh/0aa4gvc0kbzposh/AADyXDdZ13XFSXtFZI051DuBa?dl=0

1 - Using Apache Ant and the data loader, I retrieve data (1 record) from the source org (see the .csv in Dropbox):

```xml
<copy file="build/dataloader/ts2__Config__c.csv" tofile="build/dataloader/ts2__Config__c.sdl">
  <filterchain>
    <headfilter lines="1"/>
  </filterchain>
</copy>
```

2 - I generate the .sdl mapping from the copied header line (please see the generated .sdl mapping):

```xml
<replaceregexp file="build/dataloader/ts2__Config__c.sdl" match='"(\w+)",?' byline="true" flags='g'>
  <substitution expression="\1=\1\${line.separator}"/>
</replaceregexp>
```

3 - When I try to insert data from the .csv using the .sdl, it returns an error along the lines of "required field is missing" (the same as I received with the previous version of the data loader).
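The `replaceregexp` transformation above can be reproduced in a few lines of Python, which makes it easy to see what the generated SDL looks like. The header string below is an example matching the fields discussed in this thread; the regex is the same as in the Ant task.

```python
import re

# Sketch reproducing the Ant <replaceregexp> step: turn a quoted CSV header
# line into 'name=name' SDL pairs, one per line.
header = '"Name","ts2__Tracking_Url_Address__c"'
sdl = re.sub(r'"(\w+)",?', r'\1=\1\n', header)
print(sdl)
# Name=Name
# ts2__Tracking_Url_Address__c=ts2__Tracking_Url_Address__c
```

Note that `\w` matches underscores, so namespaced names like `ts2__Tracking_Url_Address__c` survive the transformation intact.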
I'll try to create an entity that roughly corresponds to yours and let you know. |
Umm... how did you create that entity? The double underscores in the API field names are being rejected. I can continue with the test, but I'll have to change your SDL to have one underscore in the API names.
Confirmed... without the second "_", the bug does not repro. Let me know what process you used to get the double underscore. There may be a different bug going on here.
In Salesforce, field and object names that are included in a package must have a prefix followed by a double underscore. For instance, say we have 2 packages installed in Salesforce: Dog (with the dog namespace) and Cat (with the cat namespace). Both packages have a field named Tail, so we will have dog__Tail__c and cat__Tail__c. So in Salesforce the double underscore has 2 meanings:
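The naming scheme described above (optional namespace prefix, developer name, and the `__c` custom suffix) can be sketched as a small parser. This is purely illustrative; the regex and function are hypothetical, not part of the Data Loader.

```python
import re

# Sketch: split a Salesforce custom-field API name into
# (namespace, developer name, suffix). Standard fields like 'Name'
# have no namespace and no '__c' suffix.
def parse_api_name(name):
    m = re.fullmatch(r'(?:([A-Za-z0-9]+)__)?(\w+?)__c', name)
    if not m:
        return (None, name, None)  # standard field, e.g. 'Name'
    return (m.group(1), m.group(2), "__c")

print(parse_api_name("dog__Tail__c"))  # ('dog', 'Tail', '__c')
print(parse_api_name("cat__Tail__c"))  # ('cat', 'Tail', '__c')
print(parse_api_name("Name"))          # (None, 'Name', None)
```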
The ending I knew about. The packaging of entities I have not personally done. Let me fiddle with this a bit more.
But fields with such naming do end up in the CSV file when we use the data loader via the command-line interface:

```xml
<bean id="csvGetConfig" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
  <description>Pull config from trialforce org</description>
  <property name="name" value="csvGetConfig"/>
  <property name="configOverrideMap">
    <map>
      <entry key="sfdc.entity" value="ts2__Config__c"/>
      <entry key="process.operation" value="extract"/>
      <entry key="sfdc.extractionSOQL" value="Select ts2__Tracking_Url_Address__c,Name From ts2__Config__c LIMIT 1"/>
      <entry key="dataAccess.name" value="build/dataloader/ts2__Config__c.csv" />
      <entry key="dataAccess.type" value="csvWrite" />
    </map>
  </property>
</bean>
```
Yeah, I don't know the issue yet. It did not reproduce for me, but I do not have your exact situation replicated. Since it does not reproduce with the non-packaged entity definition I created, I need to try an entity derived from a package definition to see if it behaves differently. Can you confirm that it works if you correct the case in your SDL?
Yes. I have just uploaded an .sdl mapping file that worked for me to the Dropbox folder.
You can install a free package from AppExchange to test how the data loader works with namespaced fields. Just for instance:
Apologies, I did not get back to this. I'll try again on Monday. If we still fail to make traction on this, then you'll have to file a support ticket so our investigation team can work with the account in question. I'll let you know Monday if I see something.
Appreciate your help. Thanks a lot.
Hi Rocky, Did you have a chance to take a look at this issue? |
Not yet. I'll try out the package you recommended to see if I can repro it today. |
Ok, I tried it out with the recommended package and the new code worked like a champ. I'm suspecting it's a stale jar in your situation. If you think there is still a bug, then you'll have to file a support issue; the investigation team will be able to directly use your org to see what's up.
Ok, will try to recompile the jar.
One of the objects created by the package you mentioned was a vacation request. I created a few of those and made sure I could export/import them. Come to think of it, I did not try the command-line parser. I've been under the assumption that the GUI and command line use the same mapping provider. Let me try that one real quick.
Confirmed working with the CLI, and with the SDL (where I capped the case to make exact matches fail).
Recompiled the data loader from the develop branch. It still does not work.
No, I used macOS to build. The jar you sent me had the fixed code, so I don't suspect a bad build; I suspect something else. Either the app is running the wrong jar, or the issue you are experiencing is specific to your scenario. The former you can take care of yourself by making sure there is no other copy in the classpath (make it easy and remove all other copies of the jar). If it still repros, then we'll have to involve support to figure out why your scenario is behaving differently than mine. I will probably be the person helping you once the initial investigation is underway, and the tooling will be in place for me to properly help you.
Thanks a lot for your help. Here is the Ant task:

```xml
<java classname="com.salesforce.dataloader.process.ProcessRunner" classpath="dataloader/dataloader-34.0-uber.jar" failonerror="true" fork="true">
  <sysproperty key="salesforce.config.dir" value="dataloader/conf"/>
  <arg line="process.name=csvInsertConfig"/>
  <arg line="sfdc.endpoint=${sf.target.url}"/>
  <arg line="sfdc.username=${sf.target.un}"/>
  <arg line="sfdc.password=${sf.target.pw.encrypted}"/>
  <arg line="process.encryptionKeyFile=${basedir}/build/dataloader/key.txt"/>
  <arg line="process.mappingFile=${basedir}/build/dataloader/ts2__Config__c.sdl"/>
</java>
```

So I think the classpath is set correctly and points to dataloader-34.0-uber.jar. One more thing: I am trying to load data into Salesforce Custom Settings (not a regular object). Perhaps the problem is in the Salesforce platform; I will check tomorrow.
Closing as it is an old issue and data loader is currently in maintenance mode. |
When I load an existing mapping file in the data loader, the Salesforce field API names are treated as case sensitive. Since API names are generally case insensitive, they should also be so here.