[CARBONDATA-2627] removed the dependency of tech.allegro.schema.json2avro #2398
Conversation
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6471/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5302/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6476/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5307/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5394/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5398/
buildAvroTestDataSingleFileArrayDefaultType()
assert(new File(writerPath).exists())
// The Avro 1.8.x Parser does not handle default values; this will be fixed in 1.9.x.
// So for now this will throw an exception. After upgrading Avro we can change this test case.
This test case is required. Along with this, can you upgrade to Avro 1.9.x as well? We cannot have it as a separate activity.
The Avro community knows about this issue; they have said it will be fixed in the 2.x version.
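For context, the failing case involves a schema field that declares a default value. A minimal Avro schema illustrating the shape (record and field names are hypothetical, not from this PR) might look like:

```json
{
  "type": "record",
  "name": "ExampleRecord",
  "fields": [
    {"name": "items", "type": {"type": "array", "items": "int"}, "default": [1, 2]}
  ]
}
```

With Avro 1.8.x, decoding JSON input that omits `items` does not fall back to the declared default, which is why the test currently expects an exception.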
try {
  val schema = new org.apache.avro.Schema.Parser().parse(avroSchema)
  val reader = new GenericDatumReader[GenericRecord](schema)
  input = new ByteArrayInputStream(json.getBytes())
This is duplicate code from SDKwriterTestCase.scala; can you move it to a single test util in the core module and call it from both places?
The test cases are in two different packages, so we should write the util class separately.
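The decoder snippet quoted above could be factored into a small helper. A minimal sketch (object and method names are hypothetical), using Avro's own `JsonDecoder` in place of the removed tech.allegro.schema.json2avro converter, might look like:

```scala
import java.io.ByteArrayInputStream

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory

// Hypothetical test util: parse an Avro schema, then decode a JSON payload
// into a GenericRecord using Avro's built-in JsonDecoder rather than the
// third-party json2avro library this PR removes.
object JsonToAvroUtil {
  def jsonToAvro(json: String, avroSchema: String): GenericRecord = {
    val schema = new Schema.Parser().parse(avroSchema)
    val reader = new GenericDatumReader[GenericRecord](schema)
    val decoder = DecoderFactory.get()
      .jsonDecoder(schema, new ByteArrayInputStream(json.getBytes("UTF-8")))
    reader.read(null, decoder)
  }
}

val schemaStr =
  """{"type": "record", "name": "Person",
    | "fields": [{"name": "name", "type": "string"}]}""".stripMargin
val record = JsonToAvroUtil.jsonToAvro("""{"name": "alice"}""", schemaStr)
println(record.get("name"))
```

Placing such a helper in a shared test-util module would let both packages call it without duplicating the parsing code.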
DataFileWriter writer = null;
Encoder encoder = null;
ByteArrayOutputStream output = null;
try {
This is the third place with the same code; same comment as above. Move it to a test util file in the core module and try to reuse the same code.
The test cases are in two different packages, so we should write the util class separately.
LGTM. Please handle @ajantha-bhat's comments and I will merge it.
done
Be sure to complete all of the following checklist items to help us incorporate
your contribution quickly and easily:
Any interfaces changed?
Any backward compatibility impacted?
Document update required?
Testing done
Please provide details on
- Whether new unit test cases have been added or why no new tests are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance test report.
- Any additional information to help reviewers in testing this change.
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.