
Fix discarded return value Scala compiler warnings #970

Closed
jbaiera opened this issue May 3, 2017 · 2 comments · Fixed by #1011

Comments

@jbaiera
Member
jbaiera commented May 3, 2017

Subtask of #969

Fix warnings related to discarding non-Unit return values from method calls. Either the returned values should actually be checked, or the method signatures should be changed so they stop returning values their callers don't need.

/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/core/main/scala/org/elasticsearch/spark/rdd/EsSpark.scala:102: discarded non-Unit value
    rdd.sparkContext.runJob(rdd, new EsRDDWriter(config.save(), hasMeta).write _)
                           ^
/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/core/main/scala/org/elasticsearch/spark/rdd/JavaEsRDD.scala:49: discarded non-Unit value
    InitializationUtils.setValueReaderIfNotSet(settings, classOf[JdkValueReader], log)
                                              ^
/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/core/main/scala/org/elasticsearch/spark/rdd/ScalaEsRDD.scala:51: discarded non-Unit value
    InitializationUtils.setValueReaderIfNotSet(settings, classOf[ScalaValueReader], log)
                                              ^
/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/core/main/scala/org/elasticsearch/spark/serialization/ScalaValueReader.scala:190: discarded non-Unit value
    map.asInstanceOf[Map[AnyRef, Any]].put(key, value)
                                          ^
/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/sql-20/src/main/scala/org/elasticsearch/spark/sql/EsSparkSQL.scala:94: discarded non-Unit value
    sparkCtx.runJob(srdd.toDF().rdd, new EsDataFrameWriter(srdd.schema, esCfg.save()).write _)
                   ^
/Users/james.baiera/Documents/source/elasticsearch-hadoop/spark/sql-20/src/main/scala/org/elasticsearch/spark/sql/SchemaUtils.scala:295: discarded non-Unit value
        info._1.setProperty(level, StringUtils.concatenate(fields, StringUtils.DEFAULT_DELIMITER))
                           ^
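For illustration, the two fixes described above might look like the following sketch. `computeStatus` is a hypothetical stand-in for the library calls in the warnings; it is not part of the elasticsearch-hadoop codebase:

```scala
// Sketch of the two ways to address a "discarded non-Unit value" warning.
// `computeStatus` is a hypothetical method standing in for calls such as
// InitializationUtils.setValueReaderIfNotSet(...) above.
object ValueDiscardExample {
  def computeStatus(): Int = 42

  def main(args: Array[String]): Unit = {
    // Option 1: actually check the returned value.
    val status = computeStatus()
    assert(status == 42)

    // Option 2: explicitly discard it, making the intent visible to readers
    // and silencing -Ywarn-value-discard.
    val _ = computeStatus()
  }
}
```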
@takezoe
Contributor
takezoe commented May 3, 2017

Hm, checking for discarded return values may be overkill here, because all of these methods are defined in Spark or Java libraries. We can't touch them, and we shouldn't change their signatures. The values can be ignored explicitly like this:

val _ = rdd.sparkContext.runJob(rdd, new EsRDDWriter(config.save(), hasMeta).write _)

But that pattern isn't very common in the Scala world either, so removing the "-Ywarn-value-discard" option might be the better solution in this situation.
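For reference, dropping the flag might look like this in an sbt-style build. This is only a sketch: elasticsearch-hadoop's actual build (which is Gradle-based) configures scalac options differently, and the `scalacOptions` line below assumes the flag was previously added there:

```scala
// Hypothetical build.sbt fragment: remove -Ywarn-value-discard from the
// compiler options while leaving any other warning flags in place.
scalacOptions --= Seq("-Ywarn-value-discard")
```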

@jbaiera
Member Author
jbaiera commented May 3, 2017

@takezoe To be honest, I think that makes sense. This issue is just a placeholder for clearing out these kinds of warnings so that we can turn on fatal warnings. If the cleanest way to do that turns out to be removing the flag, I'm fine with that.
