
[SPARK-27346][SQL] Loosen the newline assert condition on 'examples' field in ExpressionInfo #24274

Closed
wants to merge 1 commit into apache:master from HyukjinKwon:SPARK-27346

Conversation

HyukjinKwon
Member

What changes were proposed in this pull request?

I haven't tested this myself on Windows, and I am not 100% sure whether it causes an actual problem.

However, this one line in ExpressionInfo.java:

assert examples.isEmpty() || examples.startsWith(System.lineSeparator() + " Examples:");

made me investigate a lot today.

My speculation is that, if Spark is built on Linux and executed on Windows, it is possible for multiline strings, like this one from mathExpressions.scala,

examples = """
Examples:
> SELECT _FUNC_();
2.718281828459045
""")

to throw an exception, because the newlines compiled into the binary are \n while System.lineSeparator returns \r\n on Windows.
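To make the mismatch concrete, here is a minimal, hypothetical repro (the class name and string literal are illustrative, not Spark code): string literals keep the \n they were written with, while System.lineSeparator() is \r\n on Windows, so the startsWith check fails there.

public class NewlineMismatch {
    public static void main(String[] args) {
        // The literal always contains '\n', no matter which OS later runs the class file.
        String examples = "\n Examples:\n   > SELECT _FUNC_();\n    2.718281828459045\n";
        // Prints true on Linux/macOS (separator "\n") but false on Windows
        // (separator "\r\n"), which is what would trip the original assert.
        System.out.println(examples.startsWith(System.lineSeparator() + " Examples:"));
    }
}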

I think this has not been noticed yet because this particular code has not been released yet (see SPARK-26426).

It looks better to simply loosen the condition and not worry about this.
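For illustration, one way to loosen the check so it no longer depends on the platform line separator is sketched below; this is only a sketch of the idea and may not be the exact condition merged in this PR.

// Accept either a Unix ("\n") or a Windows ("\r\n") newline before "Examples:"
// instead of requiring System.lineSeparator(), so the check holds regardless of
// where the binary was built or where it runs.
assert examples.isEmpty()
    || examples.startsWith("\n Examples:")
    || examples.startsWith("\r\n Examples:");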

This should be backported into branch-2.4 as well.

How was this patch tested?

N/A

Member

@srowen srowen left a comment


Agreed, I don't see a reason to assert that much about the message. As an assertion, it presumably wouldn't even be enabled at runtime anyway, but fine.
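For context on the "wouldn't be enabled at runtime" remark: Java assert statements are disabled by default and only execute when the JVM is started with -ea (or -enableassertions), so a plain production run would normally skip the check. A minimal illustration (hypothetical class name):

public class AssertDemo {
    public static void main(String[] args) {
        // Throws AssertionError only when launched with: java -ea AssertDemo
        // A plain "java AssertDemo" skips the assert entirely.
        assert false : "assertions are enabled";
        System.out.println("assertions are disabled");
    }
}

Test runners, however, often enable assertions (Maven Surefire, for example, does so by default), which may explain why the check can still trip during local test runs.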

@SparkQA

SparkQA commented Apr 2, 2019

Test build #104204 has finished for PR 24274 at commit 7f341e5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member Author

Thanks, @srowen.

@HyukjinKwon
Member Author

Merged to master and branch-2.4.

HyukjinKwon added a commit that referenced this pull request Apr 2, 2019
[SPARK-27346][SQL] Loosen the newline assert condition on 'examples' field in ExpressionInfo (cherry picked from commit 949d712)

@codegorillauk

I develop on Windows, and this is a major problem, as I cannot run any of my tests due to this assert. As a result, the only way forward is to either wait for 2.4.2 or migrate all the devs onto Linux.

This is a bit more than Trivial: any test run outside of Linux will fail with the error "Error while instantiating 'org.apache.spark.sql.internal.SessionStateBuilder'" because of that contentious decision.

@srowen
Member

srowen commented Apr 17, 2019

Yes, I'm not sure building and testing on Windows has ever really been supported or worked. Running on Windows is supposed to work, but it is still not something that's tested much. I'd certainly work on Linux if possible.

@codegorillauk

Fair point - it's been working remarkably well for over 4 years, though. I have moved to Linux, but a lot of the devs are very much "Windows" developers.
I'll see what response that gets back.
Thanks.

@HyukjinKwon
Member Author

HyukjinKwon commented Apr 17, 2019

From which Spark version? I think this PR fixes the problem. IIRC, only Spark 2.4.1 has this issue. 2.4.0 might have it too, but 2.4.0 isn't usable on Windows anyway because of a regression in PySpark (it won't work at all).

Yes, more importantly, I don't think Spark officially supports building and testing on Windows, although it runs on Windows.

@HyukjinKwon
Member Author

Also, this issue was fixed on branch-2.4. I don't think there's any actionable item for previous releases on the Apache Spark side.

@codegorillauk

I'm not saying it's not fixed by this PR - it is.
2.4.0 didn't have this specific problem.

@HyukjinKwon
Member Author

I see. So Spark 2.4.1 has the problem? Spark 2.4.2 is being discussed now; hopefully the new release will come soon. But to be clear, I don't think Apache Spark officially supports building or testing on Windows. It is just good to do when the changes are minimal.

To build Apache Spark on Windows, bash is already required (via Windows's own or Cygwin's).

@codegorillauk

We don't build Spark; we just use it in local[*] for testing our Spark applications in a non-clustered deployment environment. I haven't used Spark deployed on Windows for almost 4 years, and I know that Spark's own build and tests don't run well on Windows. But using it in local mode with mini-cluster implementations of Kafka, HDFS, etc. on Windows has been our general dev use case. For now we will keep our 2.4.1 upgrade in a branch, as the other teams cannot easily move away from a Windows dev environment overnight, and hope that 2.4.2 isn't too far away.

kai-chi pushed a commit to kai-chi/spark that referenced this pull request Jul 23, 2019
[SPARK-27346][SQL] Loosen the newline assert condition on 'examples' field in ExpressionInfo (cherry picked from commit 949d712)

kai-chi pushed a commit to kai-chi/spark that referenced this pull request Jul 25, 2019
[SPARK-27346][SQL] Loosen the newline assert condition on 'examples' field in ExpressionInfo (cherry picked from commit 949d712)

kai-chi pushed a commit to kai-chi/spark that referenced this pull request Aug 1, 2019
[SPARK-27346][SQL] Loosen the newline assert condition on 'examples' field in ExpressionInfo (cherry picked from commit 949d712)

@HyukjinKwon HyukjinKwon deleted the SPARK-27346 branch March 3, 2020 01:18