
feat: refactor head layers #130

Merged
hanxiao merged 23 commits into main from feat-new-head-layers on Oct 17, 2021
Conversation

@tadejsv (Contributor) commented Oct 14, 2021

No description provided.

@github-actions github-actions bot added size/l and removed size/m labels Oct 15, 2021
@hanxiao (Member) commented Oct 17, 2021

I think the overall risk of this design is that it is strongly biased toward PyTorch & Paddle, and breaks some previous efforts on unifying the Keras, PyTorch & Paddle implementations. It is true that supporting PyTorch & Paddle is easy, but a pattern that fits all three frameworks is tricky, and that's what this project is about.

# Conflicts:
#	finetuner/tuner/base.py
#	finetuner/tuner/paddle/__init__.py
#	finetuner/tuner/pytorch/__init__.py
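For context, the tension described in the comment above is between per-framework head implementations and a single contract shared by Keras, PyTorch and Paddle. The sketch below is purely illustrative and is not the actual finetuner API: every class and method name in it is hypothetical. It shows one way such a framework-agnostic pattern can look, with a neutral base class declaring the head's contract and thin subclasses mapping it onto each framework's layer type.

```python
# Hypothetical sketch of a framework-agnostic head-layer contract.
# None of these names come from the finetuner codebase; they only
# illustrate the pattern discussed in the comment above.
import abc


class BaseHeadLayer(abc.ABC):
    """Framework-neutral description of a projection head."""

    def __init__(self, in_features: int, out_features: int):
        self.in_features = in_features
        self.out_features = out_features

    @abc.abstractmethod
    def build(self):
        """Return the concrete layer object for the target framework."""
        ...


class PytorchHeadLayer(BaseHeadLayer):
    def build(self):
        import torch.nn as nn  # imported lazily so other backends stay optional

        return nn.Linear(self.in_features, self.out_features)


class KerasHeadLayer(BaseHeadLayer):
    def build(self):
        import tensorflow as tf

        return tf.keras.layers.Dense(
            self.out_features, input_shape=(self.in_features,)
        )


class PaddleHeadLayer(BaseHeadLayer):
    def build(self):
        import paddle.nn as nn

        return nn.Linear(self.in_features, self.out_features)
```

In this layout the PyTorch and Paddle subclasses end up nearly identical, which matches the point that supporting those two is the easy part; keeping the Keras path inside the same contract is where the design gets tricky.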
@tadejsv (Contributor, Author) commented Oct 17, 2021

@hanxiao wait, my PR is not finished yet. I think the same thing can be done in Keras (TensorFlow) as in PyTorch.

@hanxiao (Member) commented Oct 17, 2021

Yes, let me try on this branch; I think I can finish this PR today.

@hanxiao hanxiao marked this pull request as ready for review October 17, 2021 09:51
@hanxiao hanxiao merged commit 84585be into main Oct 17, 2021
@hanxiao hanxiao deleted the feat-new-head-layers branch October 17, 2021 11:30