
[CARBONDATA-1537] Fixed version compatabilty issues from V1 to latest carbon version #1398

Closed

ravipesala (Contributor) wants to merge 3 commits.

Conversation

No description provided.

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/218/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/342/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/970/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/343/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/219/

@CarbonDataQA

Build Failed with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/220/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/971/

@CarbonDataQA

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/344/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/972/

@CarbonDataQA

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/345/

@CarbonDataQA

Build Failed with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/221/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/346/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/222/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/973/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/974/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/223/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/347/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/224/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/348/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/349/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/225/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/350/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/354/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/230/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/355/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/231/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/232/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/356/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/357/

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/233/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/985/

@ravipesala ravipesala changed the title [WIP] Fixed version compatabilty issues from V1 to latest carbon version [CARBONDATA-1537] Fixed version compatabilty issues from V1 to latest carbon version Oct 5, 2017
@@ -214,6 +218,42 @@ public void setNumberOfPages(int numberOfPages) {
for (int i = 0; i < mSize; i++) {
output.writeInt(measureChunksLength.get(i));
}
// Serialize datachunks as well for older versions like V1 and V2
Contributor:

Can you wrap this logic into a function and mention that it is for V1 and V2 serialization only? I think it would be more readable.

Contributor Author:

ok

@@ -238,6 +278,20 @@ public void setNumberOfPages(int numberOfPages) {
for (int i = 0; i < measureChunkOffsetsSize; i++) {
measureChunksLength.add(input.readInt());
}

// Deserialize datachunks as well for older versions like V1 and V2
Contributor:

Can you wrap this logic into a function and mention that it is for V1 and V2 deserialization only? I think it would be more readable.

Contributor Author:

ok
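The wrapping the reviewer asks for in both hunks can be sketched as a pair of small, symmetric helpers. This is a minimal, self-contained sketch with hypothetical method names — the real BlockletInfo serialization in CarbonData writes more fields than the chunk lengths shown here:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

public class LegacyChunkIO {

    // Writes chunk lengths in the count-then-values layout that older
    // V1/V2 readers expect. Factored into its own method, as the reviewer
    // suggests, so the version-specific intent is visible at the call site.
    public static byte[] serializeLegacyChunkLengths(List<Integer> lengths) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             DataOutputStream out = new DataOutputStream(bos)) {
            out.writeInt(lengths.size());
            for (int len : lengths) {
                out.writeInt(len);
            }
            out.flush();
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Mirror of the above for the read path: reads the count first, then
    // each length, exactly matching the write order.
    public static List<Integer> deserializeLegacyChunkLengths(byte[] bytes) {
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes))) {
            int size = in.readInt();
            List<Integer> lengths = new ArrayList<>(size);
            for (int i = 0; i < size; i++) {
                lengths.add(in.readInt());
            }
            return lengths;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Keeping the serialize and deserialize helpers side by side makes it easy to check that they round-trip, which is the core requirement of a V1/V2 compatibility fix.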

// internal use only, for variable length data type
BYTE_ARRAY(13, "BYTE_ARRAY", -1),

Contributor:

Remove the empty line.

Contributor Author:

ok

// internal use only, for value compression from integer/long to 3 bytes value
-  SHORT_INT(14, "SHORT_INT", 3);
+  SHORT_INT(14, "SHORT_INT", 3),
+  // Only for internal use for backward compatability
Contributor:

Please mention that it is for the V1 and V2 formats only.

Contributor Author:

ok

@@ -2070,6 +1994,20 @@ public static void dropDatabaseDirectory(String dbName, String storePath)
}
}

public static DataType getDataType(char type) {
Contributor:

I suggest moving all of these conversions into the DataType enum, including this one and ColumnPageEncoderMeta.convertType. Also move CarbonCommonConstants.BIG_INT_MEASURE and the related constants into the DataType enum.

Contributor Author:

ok
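The reviewer's suggestion — letting the enum own the char-code conversion instead of a free-floating `getDataType(char)` utility — can be sketched as below. The member names, ids, and char codes are hypothetical stand-ins, not the actual CarbonData DataType definition:

```java
// Illustrative sketch only: member names, ids, and char codes are
// hypothetical stand-ins for the real CarbonData DataType enum.
public enum DataTypeSketch {
    INT(5, 'd'),
    BIG_INT(6, 'l'),
    DOUBLE(7, 'n'),
    DECIMAL(8, 'b');

    private final int id;
    // Single-char serialization code, previously kept as a constant in a
    // utility class (e.g. CarbonCommonConstants.BIG_INT_MEASURE); storing
    // it here keeps the mapping next to the type it describes.
    private final char code;

    DataTypeSketch(int id, char code) {
        this.id = id;
        this.code = code;
    }

    public int getId() {
        return id;
    }

    public char getCode() {
        return code;
    }

    // Replaces a static getDataType(char) utility method: the enum itself
    // resolves a serialized char code back to a type.
    public static DataTypeSketch fromCode(char code) {
        for (DataTypeSketch t : values()) {
            if (t.code == code) {
                return t;
            }
        }
        throw new IllegalArgumentException("Unknown data type code: " + code);
    }
}
```

With this layout there is a single place to update when a new type (or a legacy V1/V2 code) is added, instead of an enum, a constants class, and a conversion utility that must be kept in sync by hand.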

@@ -123,6 +123,25 @@ private BlockletInfo getBlockletInfo(
}

@Override public List<ColumnSchema> getSchema(TableBlockInfo tableBlockInfo) throws IOException {
-    return null;
+    FileHolder fileReader = null;
Contributor:

Is this change related to this PR? Can you add a comment describing the intention of this function in its abstract interface?

Contributor Author:

Yes, it is part of this PR. We need to get the ColumnSchema.

import org.apache.carbondata.core.util.CarbonProperties

/**
* V1 to V3 compatability test. This test has to be at last
Contributor:

Will you add a V2 to V3 compatibility test in a future PR?

Contributor Author:

Yes, I will add tests for V2 to V3 in another PR, because I need to verify the compatibility of V2 to V3 first.

@@ -43,7 +43,7 @@ class CarbonSession(@transient val sc: SparkContext,
}

@transient
-  override private[sql] lazy val sessionState: SessionState = new CarbonSessionState(this)
+  override lazy val sessionState: SessionState = new CarbonSessionState(this)
Contributor:

Is this change required? Does something outside the sql package read this variable?

Contributor Author:

Yes, it is required for the test case I added.

@CarbonDataQA

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/294/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/418/

@ravipesala
Contributor Author

SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1048/

@jackylk
Contributor

jackylk commented Oct 11, 2017

LGTM

@asfgit asfgit closed this in 133b303 Oct 11, 2017
anubhav100 pushed a commit to anubhav100/incubator-carbondata that referenced this pull request Jun 22, 2018
3 participants