[CARBONDATA-3575] Remove redundant exception throws #3597

Closed
wants to merge 6 commits

Conversation

@jackylk (Contributor) commented Jan 29, 2020

Why is this PR needed?

After

What changes were proposed in this PR?

Does this PR introduce any user interface change?

  • No
  • Yes. (please explain the change and update document)

Is any new testcase added?

  • No
  • Yes

@CarbonDataQA1

Build Failed with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1799/

@CarbonDataQA1

Build Failed with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1800/

@CarbonDataQA1

Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1801/

@ajantha-bhat (Member) commented:

@jackylk: how do we make sure this PR has removed all the redundant exception throws,
and that new PRs don't add any new redundant ones?
@lamber-ken: do we have a checkstyle rule for this? We should enable it.

@jackylk (Contributor, Author) commented Jan 30, 2020

I checked it using IntelliJ's Analyzer

@jackylk (Contributor, Author) commented Jan 30, 2020

@ajantha-bhat I do not think checkstyle can check this, since it needs to understand the API semantics.
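
A minimal sketch of the pattern this PR removes (illustrative only, not taken from the CarbonData sources; KeyGenException here stands in for org.apache.carbondata.core.keygenerator.KeyGenException). An IDE analyzer such as IntelliJ's redundant-throws inspection resolves method bodies to see that the declared exception can never be thrown, whereas checkstyle only inspects syntax:

    // A checked exception is declared but no code path in the body can throw
    // it, so the throws clause is redundant.
    class KeyGenException extends Exception {}

    class KeyGeneratorSketch {
      // Before: callers are forced to catch or re-declare KeyGenException.
      byte[] generateKeyBefore(long[] keys) throws KeyGenException {
        return new byte[8 * keys.length];
      }

      // After: the redundant clause is dropped, and caller-side try/catch
      // blocks or throws declarations can be removed as well.
      byte[] generateKeyAfter(long[] keys) {
        return new byte[8 * keys.length];
      }
    }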

@xubo245 (Contributor) commented Jan 31, 2020

@jackylk Please describe the PR in detail.

  */
- byte[][] generateAndSplitKey(long[] keys) throws KeyGenException;
+ byte[][] generateAndSplitKey(long[] keys);
Contributor commented:

The callers that handle the thrown exception should also be updated, for example: org.apache.carbondata.core.keygenerator.columnar.impl.MultiDimKeyVarLengthVariableSplitGeneratorUnitTest#testGenerateAndSplitKeyAndGetKeyArrayWithActualLogicWithInt
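
A hedged sketch of the caller-side cleanup being requested (hypothetical code, not the cited unit test; SplitterSketch stands in for the CarbonData interface that declares generateAndSplitKey):

    interface SplitterSketch {
      byte[][] generateAndSplitKey(long[] keys);  // throws clause removed by this PR
    }

    class CallerSketch {
      // Before the PR a caller looked roughly like this:
      //   try {
      //     return splitter.generateAndSplitKey(keys);
      //   } catch (KeyGenException e) { ... }
      // After the PR the try/catch (or the caller's own throws clause) can go:
      byte[][] split(SplitterSketch splitter, long[] keys) {
        return splitter.generateAndSplitKey(keys);
      }
    }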

@jackylk (Contributor, Author) commented Jan 31, 2020:

fixed

  */
- public static byte[] getMaxKeyBasedOnOrinal(List<Integer> keyOrdinalList, KeyGenerator generator)
-     throws KeyGenException {
+ public static byte[] getMaxKeyBasedOnOrinal(
Contributor commented:

Please rename it to getMaxKeyBasedOnOrdinal.

Contributor commented:

Please check for the Orinal typo across the whole project.

Contributor (Author) commented:

fixed

  */
- byte[] generateKey(long[] keys) throws KeyGenException;
+ byte[] generateKey(long[] keys);
Contributor commented:

Please check all callers, for example: org.apache.carbondata.core.scan.result.iterator.RawResultIterator#convertRow

Contributor (Author) commented:

fixed

@@ -77,7 +73,7 @@ public BitSet isScanRequired(byte[][] blockMaxValue, byte[][] blockMinValue,
   }

   @Override
-  public void readColumnChunks(RawBlockletColumnChunks rawBlockletColumnChunks) throws IOException {
+  public void readColumnChunks(RawBlockletColumnChunks rawBlockletColumnChunks) {
Contributor commented:

Why wasn't the exception removed here? For example: org.apache.carbondata.core.scan.filter.executer.FilterExecuter#readColumnChunks

Contributor (Author) commented:

Because some subclasses need to throw IOException, it can't be removed.
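
A short sketch of why the clause stays on the interface (hypothetical class names, assumed for illustration): an overriding method may declare fewer checked exceptions than the method it overrides, but the interface must keep throws IOException as long as any implementation can actually throw it.

    import java.io.IOException;

    // Stand-in for org.apache.carbondata.core.scan.filter.executer.FilterExecuter.
    interface FilterExecuterSketch {
      void readColumnChunks(Object rawChunks) throws IOException;  // must stay
    }

    // An implementation that does no I/O may narrow its signature and drop
    // the clause, which is what the diff above does.
    class InMemoryExecuterSketch implements FilterExecuterSketch {
      @Override
      public void readColumnChunks(Object rawChunks) {
        // nothing here can throw IOException
      }
    }

    // Another implementation really does I/O, so the interface-level
    // declaration cannot be removed.
    class DiskBackedExecuterSketch implements FilterExecuterSketch {
      @Override
      public void readColumnChunks(Object rawChunks) throws IOException {
        throw new IOException("reads column chunks from disk");
      }
    }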

@xubo245 (Contributor) commented Jan 31, 2020

How are the other redundant exceptions handled? @jackylk

@jackylk (Contributor, Author) commented Jan 31, 2020

@xubo245 all redundant throws are removed

@CarbonDataQA1

Build Failed with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1820/

@CarbonDataQA1

Build Failed with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1822/

@CarbonDataQA1

Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/1823/

@xubo245 (Contributor) commented Feb 2, 2020

LGTM

@asfgit closed this in 6ec8e32 Feb 2, 2020