AnalysisException: resolved attribute(s) missing for blockEntityLinkage #150
Comments
For now I can use a regular link with no blocking, but ideally I'd like to know if it is possible to use the blocking method.
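For context, the non-blocking path mentioned here might look roughly like the following. This is a minimal sketch, assuming `LuceneRDD`'s `link(other, queryGen, topK)` signature; the paths and DataFrame names are illustrative, and only the field `cd` comes from the original report:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.zouzias.spark.lucenerdd._

val spark = SparkSession.builder().getOrCreate()

// Hypothetical inputs standing in for the reporter's DataFrames.
val df1 = spark.read.parquet("/path/to/df1.parquet")
val df2 = spark.read.parquet("/path/to/df2.parquet")

// Index df2 once, then link each row of df1 against the whole index
// (no blocking), querying the shared field `cd` and keeping the top
// 3 hits per row.
val index = LuceneRDD(df2)
val results = index.link(
  df1.rdd,
  (row: Row) => "cd:" + row.getString(row.fieldIndex("cd")),
  3)
```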
Can you try with version 0.3.5 and report back?
Seems to be fixed in 0.3.5.
Actually, it is not. I am still seeing this in 0.3.5.
Can you share the full exception here?
I pushed a hotfix here: https://github.com/zouzias/spark-lucenerdd/pull/151/files (feedback is welcome) and I plan to release it tonight.
Will this be available with Spark 2.1? I will be testing as soon as it is in Maven Central.
I reproduced here. I will try to fix now.
Released a fix. Tests are clean on the CI: https://travis-ci.org/zouzias/spark-lucenerdd/jobs/499546155
I believe it is fixed now. Thank you.
Glad to hear.
This is very strange. When I run it in a cluster (reading from a Hive table), I don't see this error anymore. On the other hand, when I run it locally (reading Parquet files), I see it. I am not sure how to replicate it, and the contents of the files are sensitive, so I can't share them here. I need to test more, but let's leave it closed for now.
Describe the bug
I'd like to link two DataFrames, blocking on a field named `cd`, using `blockEntityLinkage` with the following schemas, but it produces the following exception: `AnalysisException: resolved attribute(s) missing`.
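For reference, the failing call presumably resembles the sketch below. This is an assumption, not taken from the report: the `blockEntityLinkage` parameter order and the queried field `name` are guesses based on the project's linkage examples; only the blocking column `cd` is from the original.

```scala
import org.apache.spark.sql.Row
import org.zouzias.spark.lucenerdd.LuceneRDD

// df1 and df2 as in the earlier sketch. Turn a row of df1 into a
// Lucene query against df2; the queried field `name` is illustrative.
val rowToQuery: Row => String =
  row => "name:" + row.getString(row.fieldIndex("name"))

// Block both sides on the shared column `cd`, so only rows whose
// `cd` values match are ever compared; keep the top 3 hits per row.
val linked = LuceneRDD.blockEntityLinkage(
  df1, df2, rowToQuery,
  Array("cd"), Array("cd"),
  3)
```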
A self join on df1 works just fine.
I tried renaming the column, but it did not work.
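For generic Spark "resolved attribute(s) missing" errors, the usual user-side workarounds look like the sketch below. This is generic Spark, not spark-lucenerdd-specific, and it may not help when the failing join happens inside library code, as it does here:

```scala
// Lighter variant: rename the clashing column on one side before
// handing both DataFrames to the join.
val df2Renamed = df2.withColumnRenamed("cd", "cd_right")

// Heavier variant: rebuild one side from its RDD so it gets fresh
// attribute IDs and no longer shares lineage with the other side.
val df2Fresh = spark.createDataFrame(df2.rdd, df2.schema)
```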
I am running Spark 2.1.0 and lucenerdd 0.3.3.
Edit: An explanation of the issue can be found here, but I don't believe it can be fixed on my end.
Thank you