Update flow dependencies #588

Merged
merged 4 commits into fastmachinelearning:master on Jul 7, 2022

Conversation

@vloncar (Contributor) commented on Jul 3, 2022

Description

This PR shuffles the order of flows a bit to help resolve the issues observed in #509 and elsewhere. Essentially, it makes the writer flows depend on the IP flows and expands the functionality of apply_flows() to control how flows are reapplied if they have already been applied. Another effect of the change is that flows are no longer applied multiple times, an issue that many people have observed.
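
Conceptually, the reapply handling works along these lines (an illustrative sketch only, not the actual hls4ml implementation; the policy values 'single'/'all'/'none' and the flow names below are made up for the example):

applied_flows = []  # record of flows already applied, in order

def apply_flow(flow, deps, reapply='single', _top=True):
    # Apply the flow's dependencies first, then the flow itself.
    for dep in deps.get(flow, []):
        apply_flow(dep, deps, reapply, _top=False)
    already_applied = flow in applied_flows
    if not already_applied or reapply == 'all' or (reapply == 'single' and _top):
        applied_flows.append(flow)  # this is where the flow's optimizers would run

deps = {'writer': ['ip'], 'ip': ['convert', 'optimize']}
apply_flow('writer', deps)  # applies convert, optimize, ip, writer
apply_flow('writer', deps)  # with 'single', only writer is re-applied; its deps are skipped

With reapply='all' everything would be re-run, and with reapply='none' nothing already applied would be.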

Type of change

  • Kind of a bug fix, since it fixes the problem of flows being applied multiple times
  • New feature

Tests

No extra tests are needed. To reproduce the old behavior, convert any model, call compile() and build(), and see how many times the same optimizers are applied (in model._applied_flows).
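
For example (a minimal sketch; the Keras model and converter options are only for illustration):

import hls4ml
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# Any small model will do; this one is only for illustration.
model = Sequential([Dense(4, input_shape=(8,), activation='relu')])
config = hls4ml.utils.config_from_keras_model(model, granularity='model')
hls_model = hls4ml.converters.convert_from_keras_model(model, hls_config=config, output_dir='prj')

hls_model.compile()
hls_model.build()

# Before this change, the same optimizers appear here multiple times:
print(hls_model._applied_flows)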

Checklist

  • I have read the guidelines for contributing.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have made corresponding changes to the documentation.
  • My changes generate no new warnings. Yeah, right, as if I could prove this.

- Make writer flows depend on the ip flows
- Add reapply policy to the ModelGraph.apply_flows()
- Move the stamp to the optimizer
@vloncar vloncar requested review from jmitrevs and thesps July 3, 2022 08:32
from random import choice   # imports added here for completeness
from string import hexdigits

def _make_stamp():
    length = 8
    return ''.join(choice(hexdigits) for m in range(length))

model.config.config['Stamp'] = _make_stamp()

Review comment (Contributor):

I know this isn't the point of this PR, but would it be safer to make the stamp a (hidden?) attribute of the model rather than a config entry?

@vloncar (Contributor, Author) replied:

Yes! I was thinking that since layers have attributes, so should the model. Then all the config stuff that must be refactored can be implemented this way. But it's for the next PR.

@thesps (Contributor) commented on Jul 5, 2022

Perhaps some tests with dummy models and dummy flows, with asserts on _applied_flows? That could also be quite useful as a way to express and bake in the intended behaviour for any future developments.

@vloncar (Contributor, Author) commented on Jul 6, 2022

I can add the test, but it would be pretty naive. It will test only the apply_flow() function, which is pretty simple.

@thesps (Contributor) left a review comment:

Added a simple pytest doing dummy flows on a dummy project and checking that the order of applied passes matches expectation. Everything looks good, merging!
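
Schematically, such a test might look like the following (an illustrative sketch only, not the test added in this PR; register_flow's exact signature, the flow names, the dummy_model fixture and the layout of _applied_flows are assumptions):

from hls4ml.model.flow import register_flow

# Two empty dummy flows, the second requiring the first (names are made up).
register_flow('dummy:one', [])
register_flow('dummy:two', [], requires=['dummy:one'])

def test_flow_order(dummy_model):  # dummy_model: a small ModelGraph built in a fixture (not shown)
    dummy_model.apply_flow('dummy:two')
    # Assumed layout: _applied_flows records, in order, each applied flow and its passes.
    applied = [name for group in dummy_model._applied_flows for name in group]
    assert applied == ['dummy:one', 'dummy:two']  # dependency first, each flow applied once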

@thesps thesps merged commit 64694f2 into fastmachinelearning:master Jul 7, 2022
calad0i pushed a commit to calad0i/hls4ml that referenced this pull request Jul 1, 2023
* Shuffle the flows a bit
- Make writer flows depend on the ip flows
- Add reapply policy to the ModelGraph.apply_flows()
- Move the stamp to the optimizer

* Remove the printout

* Add a test for flows

Co-authored-by: Sioni Summers <sioni.summers10@imperial.ac.uk>