
[BEAM-2937] Basic PGBK combiner lifting. #4290

Merged: 2 commits into apache:master on Dec 20, 2017

Conversation

robertwb (Contributor) opened the pull request:

Follow this checklist to help us incorporate your contribution quickly and easily:

  • Make sure there is a JIRA issue filed for the change (usually before you start working on it). Trivial changes like typos do not require a JIRA issue. Your pull request should address just this issue, without pulling in other changes.
  • Each commit in the pull request should have a meaningful subject line and body.
  • Format the pull request title like [BEAM-XXX] Fixes bug in ApproximateQuantiles, where you replace BEAM-XXX with the appropriate JIRA issue.
  • Write a pull request description that is detailed enough to understand what the pull request does, how, and why.
  • Run mvn clean verify to make sure basic checks pass. A more thorough check will be performed on your pull request automatically.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

robertwb (Author):

R: @tvalentyn

tvalentyn (Contributor) left a comment:

LGTM, although I don't have a lot of context on model internals that this CL touches.


At sdks/python/apache_beam/runners/portability/fn_api_runner.py:173:

  def __init__(self, use_grpc=False, sdk_harness_factory=None):

Do we want to document in the docstring what objects are expected to be passed as sdk_harness_factory?
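One way the requested docstring might read, as a hypothetical sketch only (the wording actually merged into fn_api_runner.py may differ):

```python
class FnApiRunner(object):
  # Hypothetical docstring addressing the review comment; illustrative,
  # not the actual merged Beam code.
  def __init__(self, use_grpc=False, sdk_harness_factory=None):
    """Creates a Fn API based runner.

    Args:
      use_grpc: if True, talk to the SDK harness over gRPC; otherwise
        invoke the harness in-process.
      sdk_harness_factory: optional callable that constructs the SDK
        harness (worker) that runs user code; None selects the default.
    """
    self._use_grpc = use_grpc
    self._sdk_harness_factory = sdk_harness_factory
```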

@BeamTransformFactory.register_urn(
    urns.PRECOMBINE_TRANSFORM, beam_runner_api_pb2.CombinePayload)
def create(factory, transform_id, transform_proto, payload, consumers):
  # TODO: Combine side inputs.
tvalentyn (Contributor):

Do we want to link a JIRA issue with a description of the TODO?

robertwb (Author):

We're actually talking about removing this in Java, so I don't think it's worth surfacing there for now. The SDK API doesn't even support this yet; this is mostly a marker in case it ever gets added. Changed to a normal comment.


... -> PreCombine -> GBK -> MergeAccumulators -> ExtractOutput -> ...
"""
def new_id(existing_ids, prefix=''):
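The lifted pipeline shape quoted above (PreCombine -> GBK -> MergeAccumulators -> ExtractOutput) can be sketched with plain Python functions. This is a minimal illustration of the combiner-lifting idea only; the names and signatures are assumptions, not the actual Beam phases, which operate on windowed, keyed PCollections:

```python
from collections import defaultdict

def pre_combine(kvs, create, add):
  # Mapper side: fold each value into a per-key accumulator before the GBK,
  # so the shuffle carries one accumulator per key per bundle.
  accs = defaultdict(create)
  for k, v in kvs:
    accs[k] = add(accs[k], v)
  return list(accs.items())

def merge_accumulators(grouped, merge):
  # Reducer side: merge the per-bundle accumulators produced upstream.
  return [(k, merge(accs)) for k, accs in grouped]

def extract_output(kvs, extract):
  return [(k, extract(a)) for k, a in kvs]

# A sum combiner: create=int (i.e. 0), add=+, merge=sum, extract=identity.
add = lambda acc, v: acc + v
bundle1 = pre_combine([('a', 1), ('a', 2), ('b', 3)], int, add)
bundle2 = pre_combine([('a', 4)], int, add)

# Simulate the GBK between PreCombine and MergeAccumulators.
grouped = defaultdict(list)
for k, acc in bundle1 + bundle2:
  grouped[k].append(acc)

merged = merge_accumulators(grouped.items(), sum)
result = dict(extract_output(merged, lambda a: a))  # {'a': 7, 'b': 3}
```

The payoff is that PreCombine shrinks the data before the shuffle: key 'a' ships two accumulators (3 and 4) instead of three raw values.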
tvalentyn (Contributor):

This looks similar to unique_name at the end of the file. Shall we dedupe them, or if not, move the helpers closer together?

robertwb (Author):

Good call. Done.
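For reference, a name-deduplication helper of the shape being discussed is typically written like this (a sketch under assumed semantics, not necessarily the exact Beam implementation):

```python
def unique_name(existing_names, prefix):
  # Return prefix unchanged if it is free; otherwise append _1, _2, ...
  # until the candidate does not collide with anything in existing_names.
  if prefix not in existing_names:
    return prefix
  counter = 1
  while '%s_%d' % (prefix, counter) in existing_names:
    counter += 1
  return '%s_%d' % (prefix, counter)
```

The caller is responsible for adding the returned name to existing_names before requesting the next one.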


windowed_input_coder = pipeline_components.coders[
    input_pcoll.coder_id]
window_coder_id = windowed_input_coder.component_coder_ids[1]
tvalentyn (Contributor):

What determines the order of objects in windowed_input_coder.component_coder_ids?

Perhaps this should be encapsulated in a helper method that is aware of the [0] and [1] indexes?

robertwb (Author):

This is the (albeit undocumented) spec of the various coder urns.

I've changed this to tuple unpacking which, though technically the same, I think makes things more understandable.
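The change described here can be illustrated in isolation. The list literal below stands in for the real proto field, and the coder-id names are made up; the point is only the indexing-versus-unpacking contrast:

```python
# Stand-in for windowed_input_coder.component_coder_ids, whose spec says
# the components are (element coder, window coder), in that order.
component_coder_ids = ['elem_coder_id', 'window_coder_id']

# Before: the expected order is captured only by magic indexes.
input_coder_id = component_coder_ids[0]
window_coder_id = component_coder_ids[1]

# After: tuple unpacking makes the expected arity and order explicit,
# and a component-count mismatch raises ValueError instead of silently
# selecting the wrong coder.
input_coder_id, window_coder_id = component_coder_ids
```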

windowed_input_coder = pipeline_components.coders[
    input_pcoll.coder_id]
window_coder_id = windowed_input_coder.component_coder_ids[1]
input_coder = pipeline_components.coders[
tvalentyn (Contributor):

Do we want to add a helper variable here, something like:

input_coder_id = windowed_input_coder.component_coder_ids[0]
input_coder = pipeline_components.coders[input_coder_id]

robertwb (Author):

I did this as part of the above change :).

    transform.spec.payload, beam_runner_api_pb2.CombinePayload)

input_pcoll = pipeline_components.pcollections[only_element(
    transform.inputs.values())]
tvalentyn (Contributor):

Does transform.inputs.values() always return one value?

robertwb (Author):

It does for COMBINE_PER_KEY_TRANSFORM.
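A helper like only_element, which asserts the single-input invariant being discussed, is commonly written with single-element unpacking (a sketch, not necessarily the exact Beam helper):

```python
def only_element(iterable):
  # Unpack exactly one element; zero or multiple elements raise ValueError,
  # turning a violated single-input assumption into an immediate error.
  element, = iterable
  return element
```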

robertwb (Author) left a review:

Thanks, PTAL.



tvalentyn (Contributor) left a review:

Thanks

@robertwb robertwb merged commit e92f718 into apache:master Dec 20, 2017
Alienero pushed a commit to Alienero/beam that referenced this pull request Dec 21, 2017
nguyent pushed a commit to nguyent/beam that referenced this pull request Dec 29, 2017