
change the act.name for LinearActivation() to "linear" so that it won't fail in hl_activetype; also fix the hasinputsset in submodel (#416)

Merged (2 commits into PaddlePaddle:develop) on Nov 11, 2016

Conversation

yu239-zz

No description provided.
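Since the PR provides no description beyond its title, here is a minimal, hypothetical sketch of the failure mode it fixes. All names below (the lookup table, the `to_activetype` helper) are illustrative assumptions, not PaddlePaddle's actual internals: the idea is that an activation object is dispatched to a backend type by looking up `act.name` in a table, so `LinearActivation()` must register a name the table recognizes ("linear"), or the lookup fails.

```python
# Hypothetical sketch (names are illustrative, not Paddle's real code):
# the backend resolves an activation by its registered name, so an
# unregistered or empty name makes the lookup fail.

ACTIVATION_TYPES = {
    "linear": 0,
    "relu": 1,
    "sigmoid": 2,
}

class LinearActivation:
    # The fix amounts to registering the name the lookup table
    # actually knows; "linear" is the recognized key here.
    name = "linear"

def to_activetype(act):
    """Map an activation object to its backend type id by name."""
    if act.name not in ACTIVATION_TYPES:
        raise ValueError(f"unknown activation name: {act.name!r}")
    return ACTIVATION_TYPES[act.name]

print(to_activetype(LinearActivation()))  # -> 0
```

With the old, unrecognized name the same lookup would raise instead of resolving, which is the "fail in hl_activetype" the title refers to.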

@emailweixu (Collaborator) left a comment:

Please fix test

@coveralls
Coverage Status

Coverage increased (+0.2%) to 62.649% when pulling 45f6e1a on yu239:hl_activetype into 8d4c453 on baidu:develop.

@yu239-zz yu239-zz merged commit ebb153b into PaddlePaddle:develop Nov 11, 2016
@yu239-zz yu239-zz deleted the hl_activetype branch November 11, 2016 02:56
thisjiang pushed a commit to thisjiang/Paddle that referenced this pull request Oct 28, 2021
wangxicoding pushed a commit to wangxicoding/Paddle that referenced this pull request Dec 9, 2021
* Fix package data in setup.py for single files.

* Add specification of GPT jit build in README.

* Remove tests from package.

* Fix package data leaving out files with same names.
lizexu123 pushed a commit to lizexu123/Paddle that referenced this pull request Feb 23, 2024

3 participants