
[CARBONDATA-657]added support for shared dictionary columns in spark 2.1 #570

Closed
wants to merge 1 commit

Conversation

anubhav100 (Contributor) commented Jan 25, 2017

In Spark 1.6, shared dictionary columns work fine:

0: jdbc:hive2://localhost:10000> CREATE TABLE uniq_shared_dictionary (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2','columnproperties.CUST_ID.shared_column'='shared.CUST_ID','columnproperties.decimal_column2.shared_column'='shared.decimal_column2');
+---------+--+
| Result |
+---------+--+
+---------+--+

but in Spark 2.1 the same statement throws an exception:

0: jdbc:hive2://hadoop-master:10000> CREATE TABLE uniq_shared_dictionary (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2','columnproperties.CUST_ID.shared_column'='shared.CUST_ID','columnproperties.decimal_column2.shared_column'='shared.decimal_column2');
Error: org.apache.carbondata.spark.exception.MalformedCarbonCommandException: Invalid table properties columnproperties.cust_id.shared_column (state=,code=0)
Logs:
ERROR 18-01 13:31:18,147 - Error executing query, currentState RUNNING,
org.apache.carbondata.spark.exception.MalformedCarbonCommandException: Invalid table properties columnproperties.cust_id.shared_column

but if the column names are given in lower case, it works fine:

CREATE TABLE uniq_shared_dictionary (cust_id int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), decimal_column2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2','columnproperties.cust_id.shared_column'='shared.cust_id','columnproperties.decimal_column2.shared_column'='shared.decimal_column2');
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (2.644 seconds)

With this PR, users will be able to create shared dictionary columns in Spark 2.1 regardless of column-name case.
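The root cause is that the table-property key embeds a column name (`columnproperties.CUST_ID.shared_column`), and Spark 2.1 lower-cases column names while the validation compared the key segment literally. A minimal sketch of the kind of case-insensitive validation this PR describes is shown below; the class and method names are illustrative, not CarbonData's actual API:

```java
import java.util.*;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: validate "columnproperties.<col>.shared_column" keys
// case-insensitively by lower-casing both the key's column segment and the
// table's column names before comparing them.
public class SharedColumnValidation {
    private static final Pattern SHARED_COLUMN_KEY =
            Pattern.compile("columnproperties\\.(.+)\\.shared_column");

    // Returns the property keys that reference a column not present in the
    // table schema (empty list means the properties are valid).
    public static List<String> invalidProperties(Map<String, String> tableProperties,
                                                 List<String> columnNames) {
        Set<String> lowerColumns = new HashSet<>();
        for (String c : columnNames) {
            lowerColumns.add(c.toLowerCase(Locale.ROOT));
        }
        List<String> invalid = new ArrayList<>();
        for (String key : tableProperties.keySet()) {
            Matcher m = SHARED_COLUMN_KEY.matcher(key);
            if (m.matches() && !lowerColumns.contains(m.group(1).toLowerCase(Locale.ROOT))) {
                invalid.add(key); // referenced column does not exist in the schema
            }
        }
        return invalid;
    }
}
```

With this normalization, `columnproperties.CUST_ID.shared_column` and `columnproperties.cust_id.shared_column` both resolve to the schema column `cust_id`, so the mixed-case CREATE TABLE above no longer raises MalformedCarbonCommandException.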

@CarbonDataQA

Build Success with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/748/

anubhav100 changed the title from "added support for shard columns for spark 2.1" to "added support for shared dictionary columns in spark 2.1" on Jan 25, 2017
anubhav100 changed the title from "added support for shared dictionary columns in spark 2.1" to "[CARBONDATA-657]added support for shared dictionary columns in spark 2.1" on Jan 25, 2017
@CarbonDataQA

Build Success with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/779/

@asfbot

asfbot commented May 28, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/carbondata-pr-spark-1.6/76/
Test FAILed.

@asfbot

asfbot commented May 28, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/carbondata-pr-spark-2.1/227/

Build result: ABORTED

[...truncated 635.88 KB...]
[INFO] Apache CarbonData :: Parent ........................ SUCCESS [  6.463 s]
[INFO] Apache CarbonData :: Common ........................ SUCCESS [  8.324 s]
[INFO] Apache CarbonData :: Core .......................... SUCCESS [01:49 min]
[INFO] Apache CarbonData :: Processing .................... SUCCESS [ 21.782 s]
[INFO] Apache CarbonData :: Hadoop ........................ SUCCESS [ 23.117 s]
[INFO] Apache CarbonData :: Spark Common .................. SUCCESS [ 34.875 s]
[INFO] Apache CarbonData :: Spark2 ........................ SUCCESS [01:33 min]
[INFO] Apache CarbonData :: Spark Common Test ............. SUCCESS [07:15 min]
[INFO] Apache CarbonData :: Assembly ...................... SUCCESS [ 24.990 s]
[INFO] Apache CarbonData :: Spark2 Examples ............... FAILURE [ 21.502 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:57 min
[INFO] Finished at: 2017-05-28T13:14:45+00:00
[INFO] Final Memory: 107M/1203M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
Build was aborted
Aborted by chenliang613
channel stopped
Setting status of fc1fa0a to FAILURE with url https://builds.apache.org/job/carbondata-pr-spark-2.1/227/ and message: 'Build finished.'
Using context: Jenkins (Spark 2.1): Maven clean install
Test FAILed.

@CarbonDataQA

SDV Build Failed with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/63/

@asfgit

asfgit commented Aug 2, 2017

Can one of the admins verify this patch?

1 similar comment

@CarbonDataQA

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/95/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/632/

@ravipesala
Contributor

SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1263/

@anubhav100 anubhav100 closed this Nov 15, 2017
5 participants