
Import error under Scala 2.10 #14

Open
mattpap opened this issue Oct 11, 2014 · 3 comments
@mattpap (Owner) commented Oct 11, 2014

Originally submitted in #7 by @lev112:

What is the right way to compile with Scala 2.10?

I've tried setting scalaVersion := "2.10.4" in Build.scala, and it compiles, but I see some strange behavior.

I try to import the spark package, and in the console it works fine, but in the notebook I get an error:

    error: object spark is not a member of package org.apache

Is this a bug or did I do something wrong? (In IScala 0.1 the same code works fine.)

@mattpap (Owner) commented Oct 11, 2014

Can you show how you run IScala and resolve/import dependencies? I tried org.apache.spark in both the console and the notebook under 2.10, both via the command line (-m option) and through %libraryDependencies and %update, and spark always imports. I used the "org.apache.spark" %% "spark-core" % "0.9.1" dependency.

By the way, although the default target is Scala 2.11, you can build for 2.10 without modifying Build.scala. Just issue ++2.10.4 compile (or release, etc.) in sbt. You can also issue ++2.10.4 on its own, which switches the Scala version for all remaining commands (see [1]).

[1] http://www.scala-sbt.org/0.13/docs/Command-Line-Reference.html
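If you would rather make the 2.10 target part of the build itself, crossScalaVersions is the standard sbt mechanism. A minimal sketch of what this could look like in an sbt 0.13 Build.scala (this is an illustration, not IScala's actual build definition; the project id is a placeholder):

    import sbt._
    import Keys._

    object IScalaBuild extends Build {
      // crossScalaVersions enables `+compile` (build for every listed version)
      // in addition to `++2.10.4 compile` (pin one version explicitly).
      lazy val project = Project("iscala", file(".")).settings(
        scalaVersion       := "2.11.2",
        crossScalaVersions := Seq("2.10.4", "2.11.2")
      )
    }

With something like this in place, +compile builds against every listed version, while ++2.10.4 still pins a single version for the remainder of the sbt session.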

@lev112 commented Oct 11, 2014

While trying to recreate the error, I managed to fix my problem. But I will describe how to recreate it, because I think there is a real issue here. I saw this behavior in both the notebook and the console.

This code works fine:

    %libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
    %update
    import org.apache.spark.SparkContext

(restart the notebook before the next step)

But if I do the import first (it fails because the dependency hasn't been added yet):

    import org.apache.spark.SparkContext
    <console>:7: error: object spark is not a member of package org.apache

then running the same code as before fails:

    %libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
    %update
    import org.apache.spark.SparkContext
    <console>:7: error: object spark is not a member of package org.apache

(the %update logs showed that spark was resolved)

At this point the import is impossible and you have to restart the notebook. It's not a big issue, but it's really annoying that the notebook can get into a broken state that requires a restart.

And thanks for the tip about the ++ command in sbt.

@mattpap (Owner) commented Oct 11, 2014

There is a very unfortunate bug in master at the moment: once the interpreter has started, you can't change its classpath. This used to work, and %reset was sufficient to pick up new dependencies (%update performs a reset automatically). I added a special case: if you run %libraryDependencies and %update before the interpreter is initialized, then %update will set up the classpath and no reset is necessary. Until this is fixed, I wouldn't rely on %reset doing anything useful. I hope to fix this soon, because it's really annoying and confusing.
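For context, the underlying constraint is that scala.tools.nsc.interpreter.IMain fixes its classpath when it is constructed, so the only general way to pick up newly resolved jars is to build a fresh interpreter. A minimal sketch of that technique (an illustration, not IScala's actual code; freshInterpreter is a hypothetical helper):

    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.IMain

    // Hypothetical helper: rebuild the interpreter so newly resolved jars
    // become visible. IMain reads the classpath only at construction time,
    // so appending jars to a running instance has no effect.
    def freshInterpreter(jars: Seq[String]): IMain = {
      val settings = new Settings
      settings.usejavacp.value = true  // keep the JVM's own classpath visible
      settings.classpath.value = jars.mkString(java.io.File.pathSeparator)
      new IMain(settings)              // note: all prior session state is lost
    }

The cost is exactly why the pre-initialization special case exists: recreating IMain discards every definition from the current session.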

If unsure, you can pass library dependencies via command-line arguments, e.g.:

    bin/2.10/notebook -m org.apache.spark::spark-core:0.9.1

(this isn't documented anywhere yet).

@mattpap added the bug label Oct 21, 2014