
[Dynamo] Small enhancements for graph dump IR and task arguments #172

Merged
merged 33 commits into hidet-org:main on Apr 19, 2023

Conversation

xinli-git
Collaborator

  • When graphs are partitioned by Dynamo, the previously dumped graph gets overwritten. Use a "function static" variable to track the graph sequence (see the sketch after this list).
  • Temporary support for passing additional parameters along to the function. This change enables better support for third-party implementations where shape info or dynamic attributes may be passed in.
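A minimal sketch of the "function static" counter pattern described in the first bullet (save_partitioned_graph and dump_fn are hypothetical names used only for illustration; the actual change lives in hidet's dynamo backend):

from pathlib import Path

def save_partitioned_graph(save_dir: str, dump_fn) -> None:
    # A function attribute serves as the "function static" counter: it persists
    # across calls, so each partitioned graph gets its own numbered directory
    # instead of overwriting the previous dump.
    save_partitioned_graph.counter = getattr(save_partitioned_graph, 'counter', 0) + 1
    graph_dir = Path(save_dir) / 'graph_{}'.format(save_partitioned_graph.counter)
    graph_dir.mkdir(parents=True, exist_ok=True)
    dump_fn(graph_dir)  # caller-supplied routine that writes the graph IR into graph_dir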

Member

@yaoyaoding yaoyaoding left a comment


Thanks @xinli-git. I left some feedback.

Comment on lines 45 to 47
generate_executor.graph_counter = getattr(generate_executor, 'graph_counter', 0) + 1
save_dir: Path = Path(save_dir)
ctx.save_graph_instrument(save_dir / "graph_{}".format(generate_executor.graph_counter))
Member


What do you think about implementing this as a separate function?

Besides, I would prefer to find the first non-existent directory {save_dir}/<id> and use it as the save_dir, instead of tracking the id internally.
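A minimal sketch of that alternative (first_unused_subdir is a hypothetical helper name, not an existing hidet function):

from pathlib import Path

def first_unused_subdir(save_dir: str) -> Path:
    # Probe {save_dir}/0, {save_dir}/1, ... and return the first directory
    # that does not exist yet, so no id has to be tracked in memory.
    idx = 0
    while (Path(save_dir) / str(idx)).exists():
        idx += 1
    return Path(save_dir) / str(idx)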

Collaborator Author


The thing with finding the first non-existent directory is that if I recompile the model and use the same save_dir value, it will not overwrite the previously compiled graphs but keep incrementing. I didn't like this behaviour.

If you think it is OK to disallow overwriting, then I am happy to make it find the first non-existent directory.

What do you think about implementing this as a separate function?

  • You mean make these three lines into a different function?

Member


The thing with finding the first non-existent directory is that if I recompile the model and use the same save_dir value, it will not overwrite the previously compiled graphs but keep incrementing. I didn't like this behaviour.

I see. Makes sense.

What do you think about implementing this as a separate function?

Yes. For example, something like:

def _resolve_save_dir(save_dir: str) -> str:
    _resolve_save_dir.counter = getattr(_resolve_save_dir, 'counter', 0) + 1
    return str(Path(save_dir) / "graph_{}".format(_resolve_save_dir.counter))

@@ -35,7 +35,7 @@ def get_operator_name(op, given_name: Optional[str] = None):
class Operator:
"""An operator that takes tensor as input and output."""

- def __init__(self, inputs: List[Tensor], attributes: Dict[str, Any], task: Optional[Task]):
+ def __init__(self, inputs: List[Tensor], attributes: Optional[Dict[str, Any]] = None, task: Optional[Task] = None):
Member


I prefer to make attributes non-optional. If it is empty, we force the developer to pass an empty dict {}, in case they forget to pass attributes (if they forget to do so, the reforward function would fail).

For the task attribute, let's also use Optional[Task] but not give it a default value. In the case of a third-party kernel, we explicitly pass None to task.
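A minimal sketch of the suggested signature (the body is illustrative only; the real Operator lives in hidet, and the Tensor/Task imports are omitted here):

from typing import Any, Dict, List, Optional

# Tensor and Task are hidet types; exact import paths are omitted in this sketch.

class Operator:
    """An operator that takes tensors as input and produces tensors as output."""

    def __init__(self, inputs: List['Tensor'], attributes: Dict[str, Any], task: Optional['Task']):
        # attributes is mandatory: operators with no attributes pass {} explicitly,
        # so a forgotten argument is a visible error rather than a silent default.
        # task has no default value: third-party kernels pass None explicitly.
        self.inputs = inputs
        self.attributes = attributes
        self.task = task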

Collaborator Author


OK, that sounds good. I added this change because the code below checks whether it is None and assigns an empty dict in that case. This gave me the wrong impression that you meant to make this optional but forgot to do so.

Member


I see. In the very beginning, I set the default value of attributes=None, and some operators that should have passed attributes did not pass them. So I made this parameter mandatory later, but forgot to change the body of __init__. Feel free to delete the check for attributes == None.

@xinli-git
Collaborator Author

It seems that the GitHub Actions VM is acting weird again.

@yaoyaoding
Member

It seems that the GitHub Actions VM is acting weird again.

The NVIDIA driver on our CI server was down. Fixed just now.

@yaoyaoding
Member

Thanks @xinli-git !

@yaoyaoding yaoyaoding merged commit 67cd640 into hidet-org:main Apr 19, 2023
2 checks passed
AndreSlavescu pushed a commit to AndreSlavescu/hidet that referenced this pull request Apr 25, 2023
…det-org#172)

* exp, float

* wip

* chunk, groupnorm, softmax, baddbmm, emmpty

* add interpolate, lint and format

* revert changes of import hidet at top level to minimize changes for PR

* typo

* trigger actions

* trigger actions

* dummy commit

* dummy commit

* add some optimizations to skip certain operations based on alpha beta

* add group norm test

* format

* introduce a fix to torch.compile not dumping graph IR

* Revert "introduce a fix to torch.compile not dumping graph IR"

This reverts commit a1e8df0.

* add interlolate test and group norm test

* accidental push

* remove a random newline added

* minor hot fix

* add a function level static var for multi-graph save ir

* small line change

* revert previous changes

* move save dir to new utility function

* format lint and remove optional attr

* trigger actions

* trigger actions

* Update operator.py

---------

Co-authored-by: Xin Li <xin@centml.ai>
Co-authored-by: Yaoyao Ding <dingyaoyao.cs@gmail.com>