CDAP-15492 Salesforce streaming integration tests fail, jar includes unnecessary Spark #38
Conversation
Thanks for your pull request. It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). 📝 Please visit https://cla.developers.google.com/ to sign. Once you've signed (or fixed any issues), please reply here to let us know.
Force-pushed from 2cbad73 to 8ac9e2e
@albertshau do you think we should keep the jira open after this is merged? I cannot see any viable solution for this, though.
Force-pushed from 8ac9e2e to 529a374
albertshau left a comment:
lgtm. Yes, let's keep the jira open so we don't lose it.
  <commons.csv.version>1.6</commons.csv.version>
  <jackson.version>1.9.13</jackson.version>
- <jackson2.version>2.9.9</jackson2.version>
+ <jackson2.version>2.6.7</jackson2.version>
I don't think we can package this version. Can you try updating Spark to a version that uses a newer Jackson? If that doesn't work, since you mentioned this only breaks the test and not the actual plugin, let's add a note about overriding this when running the test and keep a jira open for fixing it.
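One way to follow that suggestion, assuming the plugin's pom exposes the Jackson 2 version through the `jackson2.version` Maven property shown in the diff, would be to override the property only for the test run instead of changing the packaged default:

```shell
# Sketch, not a documented workflow for this repo: keep 2.9.9 in pom.xml
# for packaging, but run the streaming integration tests against Spark's
# jackson-databind 2.6.7 by overriding the version property on the CLI.
mvn clean test -Djackson2.version=2.6.7
```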
1. This happens because org.apache.spark/spark-core_2.11 depends on com.fasterxml.jackson.core/jackson-databind 2.6.7, while with the patch we use 2.9.9. So I guess that when all the classes are classloaded into a single scope, the wrong jackson-databind gets loaded, causing Spark to fail.
2. The Salesforce jar is currently 32 MB. Without spark-core, which is provided by CDAP and therefore unnecessary, it is 9 MB.
3. Fix a wrong default value in the SFDC widget that caused "unknown value" problems.
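For conflicts like the one in point 1, a small probe can show which jackson-databind jar actually wins once everything shares one classloading scope. This is a generic sketch, not code from the plugin; the class is resolved reflectively so the snippet compiles even where Jackson is absent:

```java
import java.security.CodeSource;

public class JacksonProbe {

    // Report where a class was loaded from. A null CodeSource means the
    // bootstrap/platform loader (core JDK classes); otherwise we get the
    // URL of the jar that supplied the class.
    static String origin(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "bootstrap/platform loader"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // In the failing test, this would tell you whether Spark's 2.6.7
        // or the plugin's 2.9.9 jackson-databind jar was picked up.
        System.out.println(origin("com.fasterxml.jackson.databind.ObjectMapper"));
    }
}
```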
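The size reduction in point 2 is typically achieved by marking the Spark dependency as provided, so Maven leaves it out of the packaged plugin jar. A sketch, assuming the spark-core_2.11 artifact mentioned above; the `spark.version` property name is illustrative:

```xml
<!-- spark-core is supplied by the CDAP runtime at deployment time, so
     declare it provided to keep it out of the packaged jar. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
```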