
Conversation

@rolfmorel (Contributor):

Simple wrapper around upstream's transform_interpreter API. Invokable directly from Python and via the commandline on separate payload and schedule files.

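For orientation, here is a minimal sketch of what driving the upstream interpreter from Python can look like. It builds only on the upstream mlir Python bindings; the apply_schedule name and the way the schedule's entry sequence is located are illustrative assumptions, not the actual lighthouse API.

# Sketch only: "apply_schedule" is a hypothetical name; apply_named_sequence is the
# upstream transform_interpreter entry point that the wrapper builds on.
from mlir import ir
from mlir.dialects.transform import interpreter as transform_interpreter

def apply_schedule(payload: ir.Module, schedule: ir.Module) -> None:
    # Runs the schedule's entry named_sequence on the payload module in place,
    # assuming the entry sequence is the first operation in the schedule module.
    transform_interpreter.apply_named_sequence(
        payload, schedule.body.operations[0], schedule
    )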

def example_schedule() -> ir.Module:
    schedule = ir.Module.create()
    schedule.operation.attributes["transform.with_named_sequence"] = ir.UnitAttr.get()
Member:

It would be better if the transform module provided this as a helper, so that users only need to add the transforms themselves.

@rolfmorel (Contributor Author), Nov 11, 2025:

Indeed, we should probably make dealing with such details a bit easier. The thing I would not like is to obscure that we are just constructing IR, i.e. to me this code still reflects what the .mlir will look like. We should find a middle ground somehow. So the helpers should probably look like IR builders as well?

Contributor Author:

On returning to the PR: I think this is only non-ergonomic in that one must intersperse operation between schedule and attributes, i.e. write schedule.operation.attributes (an upstream issue). Beyond that, this API usage properly reflects how IR is built, in as terse a manner as upstream supports.

Separately, we could of course have helpers, e.g. to wrap named_sequences up in appropriate modules. As that's not entirely trivial (e.g. multiple sequences will live in the same module, so we cannot simply wrap a single named_sequence), I think we can punt on this until we start observing the kind of replication we get across the repo.

Member:

I'd like to dissociate the transform helpers a bit from the actual IR, because while the underlying IR shape may change, the API should not. If all our APIs follow the IR shape closely, then we'll have to change their usage every time the IR changes, and that doesn't scale.

Contributor Author:

Sure, we can improve this. I don't think the right solution is obvious though (go ahead and play around with it -- I am sure @makslevental agrees), and I don't think this is the PR to solve it. Let's just merge this and address the right kind of wrappers in a dedicated PR (such a more focused PR is also likely to engage the ex-Brium folks more, I imagine).
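To make the point about the code mirroring the eventual .mlir concrete, here is a hedged sketch of how example_schedule might continue. The __transform_main name and the exact builder signatures are illustrative assumptions, not code from this PR.

# Sketch only: assumes the schedule's entry point is a transform.named_sequence
# called "__transform_main" and that an MLIR context and location are active.
from mlir import ir
from mlir.dialects import transform

def example_schedule() -> ir.Module:
    schedule = ir.Module.create()
    schedule.operation.attributes["transform.with_named_sequence"] = ir.UnitAttr.get()
    with ir.InsertionPoint(schedule.body):
        # One named_sequence taking the payload root as its only argument.
        named_seq = transform.NamedSequenceOp(
            "__transform_main", [transform.AnyOpType.get()], []
        )
        with ir.InsertionPoint(named_seq.body):
            # Individual transform ops on named_seq.bodyTarget would go here.
            transform.YieldOp()
    return schedule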


def example_payload() -> ir.Module:
    payload = ir.Module.create()
    with ir.InsertionPoint(payload.body):
Member:

This should also be a helper, eventually becoming part of the MLIR importer module, which should be reused by the schedule too, if we all agree with this design: https://github.com/llvm/lighthouse/wiki/Integrator

Contributor Author:

What would the helper do here? Wrap a payload in a module?

Member:

Abstract away the creation of modules, functions and other global objects. Although these are simple things to do in Python, the idea of having to remember insertion points is one that should not leak into a public API.

Contributor Author:

Same as above, IMHO.
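For illustration, one hypothetical shape such a helper could take (the name and design are assumptions, not part of this PR): a context manager that owns the module and its insertion point, so callers never touch InsertionPoint directly.

# Hypothetical helper, sketched only to illustrate the discussion above.
from contextlib import contextmanager
from mlir import ir

@contextmanager
def new_module():
    # Creates a module and keeps its body as the active insertion point,
    # so the caller just builds ops inside the with-block.
    module = ir.Module.create()
    with ir.InsertionPoint(module.body):
        yield module

# Usage would then read roughly as:
#
#   with new_module() as payload:
#       ...  # build payload ops here; no explicit InsertionPoint needed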

@rengolin requested a review from Groverkss on November 11, 2025 at 13:24
@adam-smnk (Contributor) left a comment:

Really neat and easy to use 👍

Now requires eb9d56c (or later) of llvm-project
@banach-space left a comment:

This is very exciting - thanks for driving this forward!

Before we land it, I think we should get CI set up and confirm that the current changes actually run end-to-end. Right now, I haven’t been able to run anything from Lighthouse. This might just be an issue with my local setup, but without CI we can’t easily tell whether it’s a configuration problem or a deeper design/assumption issue.
