This repository has been archived by the owner on May 26, 2020. It is now read-only.

Deep integration with Spark version 1.4 #22

Closed
austindsouza opened this issue Aug 24, 2015 · 3 comments

Comments

@austindsouza

Hi,

I was trying to integrate stratio-deep with Spark 1.4.1. Stratio Deep compiles successfully against Spark 1.4.1, but while creating the distribution I get the following error:

[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Networking 1.3.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-network-common_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-network-common_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-network-common_2.10 ---
[INFO] Add Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/scala
[INFO] Add Test Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/test/scala
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-network-common_2.10 ---
[INFO] Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-network-common_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-network-common_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-network-common_2.10 ---
[INFO] Using zinc server for incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[info] Compiling 43 Java sources to /tmp/stratio-deep-distribution/stratiospark/network/common/target/scala-2.10/classes...
[info] Error occurred during initialization of VM
[info] java.lang.Error: Properties init: Could not determine current working directory.
[info] at java.lang.System.initProperties(Native Method)
[info] at java.lang.System.initializeSystemClass(System.java:1119)
[info]
[error] Compile failed at Aug 24, 2015 1:10:20 PM [0.056s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.302 s]
[INFO] Spark Project Networking ........................... FAILURE [ 0.783 s]
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.636 s
[INFO] Finished at: 2015-08-24T13:10:20+05:30
[INFO] Final Memory: 47M/318M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-network-common_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :spark-network-common_2.10
Cannot make Spark distribution
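For reference, the JVM error `Could not determine current working directory` usually means the directory the build (or the zinc incremental-compile server) was launched from has since been deleted or replaced, so the forked compiler JVM cannot resolve `user.dir` at startup. A guarded workaround sketch; the paths and the bundled-zinc location are assumptions taken from the log above and Spark's usual build layout, so adjust them to your machine:

```shell
# Hedged sketch: re-enter a live copy of the build directory, stop any stale
# zinc server, then resume the Maven build from the failed module.
BUILD_DIR=/tmp/stratio-deep-distribution/stratiospark   # path taken from the log above
if [ -d "$BUILD_DIR" ]; then
  cd "$BUILD_DIR"
  # Spark's build typically unpacks zinc under build/; shutting it down makes
  # the next compile start a fresh server with a valid working directory.
  build/zinc-*/bin/zinc -shutdown 2>/dev/null || true
  mvn -rf :spark-network-common_2.10 -DskipTests package
fi
echo "retry script finished"
```

The `mvn -rf :spark-network-common_2.10` resume command is the one Maven itself suggests at the end of the log.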

Meanwhile, I see that Spark 1.5 has been released, so when can we expect this integration?

Thanks & regards.

@mafernandez-stratio
Member

Hi Austin,

Currently, development of Stratio Deep is deprecated. However, Stratio Crossdata has inherited part of the features of Stratio Deep. Please visit https://stratio.atlassian.net/wiki/display/CROSSDATA1x0/Home for more information.

Regards

@austindsouza
Author

Hello Miguel,

Thanks for your reply.
Does the new Crossdata also have Elasticsearch integration?

Regards

@mafernandez-stratio
Member

Hi Austin,

Crossdata can be used with any Spark data source, and it optimises access to Cassandra, MongoDB and Elasticsearch. More information about the Crossdata connectors is available here.

Regards
