
[CARBONDATA-2753] Fix Compatibility issues while querying on 1.3 Store #2534

Closed

Conversation

dhatchayani
Contributor

Problem:
Currently, in the segment file we write the index files list in the files field only if those files physically exist; otherwise the field is left empty (for instance, when the index files have been merged into a merge index file). In the old store, however, both the files and mergeFileName fields were written even if the index files had been merged.

Solution:
While querying, we have to check the physical existence of the index files listed in the files field and consider those files only if they actually exist.
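
In rough terms, the existence check could look like the following (a minimal sketch for illustration only, not the actual CarbonData code; the class and method names here are hypothetical):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class IndexFileCompatibility {

  /**
   * Hypothetical helper: from the index file names recorded in the segment
   * file's "files" field, keep only the ones that still physically exist.
   * If the returned list is empty (the old 1.3 store may have merged them),
   * the caller should fall back to the merge index file named in
   * "mergeFileName".
   */
  static List<String> resolveIndexFiles(String segmentPath, List<String> filesField) {
    List<String> existing = new ArrayList<>();
    for (String indexFileName : filesField) {
      File indexFile = new File(segmentPath, indexFileName);
      if (indexFile.exists()) {
        existing.add(indexFile.getAbsolutePath());
      }
    }
    return existing;
  }
}
```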

  • Any interfaces changed?

  • Any backward compatibility impacted?

  • Document update required?

  • Testing done
    Manual Testing with Old Store

  • For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

@CarbonDataQA

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6128/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7368/

@dhatchayani
Contributor Author

Retest this please

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7370/

@CarbonDataQA

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6131/

@ravipesala
Contributor

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5941/

@dhatchayani
Contributor Author

retest this please

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7372/

@CarbonDataQA

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6133/

@brijoobopanna
Contributor

retest sdv please

@ravipesala
Contributor

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5945/

        indexFiles.put(indexFilePath, null);
      }
    } catch (IOException e) {
      LOGGER.error(e.getMessage());
Contributor

  1. Don't LOG the error here, throw IO exception back to caller
  2. Mention the compatibility version number and provide a detailed comment in this method for file Existence check including the version numbers
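
Interpreting the first point, the suggested change would look roughly like this (a hedged sketch under assumed surrounding code; the method name and signature are invented for illustration and are not taken from the PR):

```java
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SegmentIndexReader {

  /**
   * Hypothetical reworking of the reviewed snippet: declare IOException and
   * let it propagate to the caller instead of logging it and continuing.
   *
   * Compatibility note (as requested in the review): segment files written by
   * the 1.3 store may list index files in the "files" field that were later
   * merged into a merge index file, so each listed file is checked for
   * physical existence before being used.
   */
  static Map<String, String> collectIndexFiles(String segmentPath, List<String> fileNames)
      throws IOException {
    Map<String, String> indexFiles = new HashMap<>();
    for (String name : fileNames) {
      File indexFile = new File(segmentPath, name);
      if (!indexFile.exists()) {
        // Skip entries that no longer physically exist.
        continue;
      }
      // getCanonicalPath() may throw IOException; it is propagated, not logged.
      indexFiles.put(indexFile.getCanonicalPath(), null);
    }
    return indexFiles;
  }
}
```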

@dhatchayani dhatchayani changed the title [CARBONDATA-2753] Fix Compatibility issues on index Files [CARBONDATA-2753] Fix Compatibility issues while querying on 1.3 Store Jul 23, 2018
@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7385/

@CarbonDataQA

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6146/

@ravipesala
Contributor

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5960/

@manishgupta88
Contributor

LGTM

@asfgit asfgit closed this in 7ab6706 Jul 23, 2018
asfgit pushed a commit that referenced this pull request Jul 30, 2018
Problem:
Currently, in the segment file we write the index files list in the files field only if those files physically exist; otherwise the field is left empty (for instance, when the index files have been merged into a merge index file). In the old store, however, both the files and mergeFileName fields were written even if the index files had been merged.

Solution:
While querying, we have to check the physical existence of the index files listed in the files field and consider those files only if they actually exist.

This closes #2534