
[FLINK-15023][core][runtime] Remove on-heap managed memory #10397

Conversation

@xintongsong
Contributor

xintongsong commented Dec 3, 2019

What is the purpose of the change

This PR is part of FLIP-49. It completely removes on-heap managed memory, keeping managed memory always off-heap.

This PR also contains a hotfix commit for FLINK-15047 to fix YarnDistributedCacheITCase.

Brief change log

  • fccbc2c: Commits for FLINK-15047 fixing YarnDistributedCacheITCase
  • 419da58: Code clean-up in ResourceProfile
  • 8c755e7: Remove on-heap managed memory from ResourceProfile
  • 3333587: Remove on-heap managed memory from ResourceSpec
  • b702019: Remove on-heap managed memory fraction from StreamConfig, and its calculation logic from StreamingJobGraphGenerator.
  • 0090333: Remove on-heap managed memory from TaskExecutorResourceSpec and TaskExecutorResourceUtils.
  • 54b4517: Remove config options of managed memory off-heap size / fraction and legacy memory off-heap.
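With the on-heap variant and the off-heap size/fraction options gone, managed memory is configured through a single size or fraction. As an illustration only (assuming the FLIP-49 configuration keys `taskmanager.memory.managed.size` and `taskmanager.memory.managed.fraction`, which this PR series converges on), a `flink-conf.yaml` fragment after this change might look like:

```yaml
# Managed memory is always off-heap after FLINK-15023; there is no
# separate on-heap/off-heap split to configure anymore.
# Either set an absolute size ...
taskmanager.memory.managed.size: 512m
# ... or derive it as a fraction of total Flink memory (the default path):
# taskmanager.memory.managed.fraction: 0.4
```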

Verifying this change

This change is a trivial rework / code cleanup without any test coverage.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (yes / no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (yes)
  • The serializers: (no)
  • The runtime per-record code paths (performance sensitive): (no)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  • The S3 file system connector: (no)

Documentation

  • Does this pull request introduce a new feature? (no)
  • If yes, how is the feature documented? (not applicable)
@flinkbot


flinkbot commented Dec 3, 2019

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit 54b4517 (Wed Dec 04 15:58:10 UTC 2019)

no warnings

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • 1. The [description] looks good.
  • 2. There is [consensus] that the contribution should go into Flink.
  • 3. Needs [attention] from.
  • 4. The change fits into the overall [architecture].
  • 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.


The bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands

The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier
@flinkbot


flinkbot commented Dec 3, 2019

CI report:

Bot commands

The @flinkbot bot supports the following commands:
  • @flinkbot run travis to re-run the last Travis build
@xintongsong xintongsong force-pushed the xintongsong:FLINK-15023-flip49-remove-onheap-managed branch from a33b5d6 to b497bde Dec 3, 2019
@xintongsong

Contributor Author

xintongsong commented Dec 3, 2019

Thanks for the review, @azagrebin.
I've addressed the comments, and rebased to the latest master branch.

Contributor

azagrebin left a comment

Thanks for addressing the comments @xintongsong , LGTM
There are tests failures, could you check whether they are related to the changes?

@xintongsong

Contributor Author

xintongsong commented Dec 4, 2019

Thanks @azagrebin. I don't think the test failures are related to this PR.

According to the travis report (#5151), the failed stages are core_legacy_scheduler and misc.

  • core_legacy_scheduler failed due to some travis problem. It has passed on the same branch and commit in my repository (#317).
  • misc failed on YarnDistributedCacheITCase, which is a new test case added and merged to the master branch yesterday.
    • I found another travis build (#5150), right before ours, that failed on the same problem.
    • To verify that this does not mask other potential test failures, I disabled YarnDistributedCacheITCase and triggered the tests again; they passed this time (#318).
xintongsong added 7 commits Dec 4, 2019
…ctiveResourceManagerFactory.

This fixes YarnDistributedCacheITCase#testPerJobModeWithDistributedCache.
The test case failed because the value of a dynamic property in the generated task executor start command contained a space, which broke parsing of the subsequent properties.
…StreamConfig, and its calculation logic from StreamingJobGraphGenerator.
…rResourceSpec and TaskExecutorResourceUtils.
…ff-heap size / fraction and legacy memory off-heap.
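The parsing failure behind the FLINK-15047 hotfix is easy to reproduce in isolation. The sketch below is illustration only (the names and the naive whitespace split are hypothetical, not actual Flink code): an unquoted `-D` value containing a space is broken into a stray token, so the properties after it can no longer be paired up reliably, which is why the fix quotes dynamic property values in the generated start command.

```java
public class DynamicPropsDemo {
    public static void main(String[] args) {
        // Unquoted: the space inside key1's value splits it apart.
        String unquoted = "-Dkey1=a b -Dkey2=c";
        String[] tokens = unquoted.split(" ");
        // tokens = ["-Dkey1=a", "b", "-Dkey2=c"]: "b" is orphaned,
        // so a parser walking the tokens misreads everything after it.
        System.out.println(tokens.length); // prints 3, not 2 properties

        // Quoting keeps each -D property a single shell word:
        String quoted = "-Dkey1='a b' -Dkey2=c";
        System.out.println(quoted);
    }
}
```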
@xintongsong xintongsong force-pushed the xintongsong:FLINK-15023-flip49-remove-onheap-managed branch from dce97b3 to 54b4517 Dec 4, 2019
@xintongsong

Contributor Author

xintongsong commented Dec 4, 2019

The PR is updated to include a commit fixing the failed YarnDistributedCacheITCase.
Travis now passes: https://travis-ci.org/xintongsong/flink/builds/620614535

@azagrebin

Contributor

azagrebin commented Dec 4, 2019

merged into master by 2142dc7

@azagrebin azagrebin closed this Dec 4, 2019