
use setuptools_scm instead of setuptools-git-version #56

Merged
amcadmus merged 3 commits into deepmodeling:devel from njzjz:version on Jul 30, 2019

Conversation

@njzjz
Member

@njzjz njzjz commented Jul 30, 2019

Then we can use `python setup.py sdist` to package the source and `twine upload dist/*` to upload it to pypi.org. Users can then install the Python interface directly with `pip install deepmd-kit`.
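For reference, a minimal `setup.py` sketch of how `setuptools_scm` is typically wired in (the package name and layout here are illustrative assumptions, not deepmd-kit's actual build script):

```python
# Hypothetical minimal setup.py using setuptools_scm to derive the package
# version from git tags instead of hardcoding it (names are illustrative).
from setuptools import setup, find_packages

setup(
    name="deepmd-kit",
    use_scm_version=True,               # version is read from the latest git tag
    setup_requires=["setuptools_scm"],  # pulled in at build time
    packages=find_packages(),
)
```

With `use_scm_version=True`, a `python setup.py sdist` run inside a tagged git checkout embeds the tag-derived version into the sdist, so no version string needs to be maintained by hand.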

@njzjz
Member Author

njzjz commented Jul 30, 2019

Why does the test fail...

@njzjz
Member Author

njzjz commented Jul 30, 2019

Why did Travis-CI allocate a machine with gcc=5.4......

@amcadmus
Member

Why does the test fail on a machine with gcc==5.4? Could you paste the error log?

@amcadmus
Member

[screenshot of the CI error log: 2019-07-30-205606_1854x590_scrot]
I see it....

@amcadmus
Member

The official TensorFlow distribution was built with gcc==4.8, and deepmd-kit should be built with the same gcc...
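One way to check which C++ ABI the installed TensorFlow expects is to inspect its compile flags (a sketch; it assumes a TensorFlow 1.x release of that era with `tf.sysconfig` available, and must be run in an environment where TensorFlow is installed):

```python
# Sketch: print the compile flags the installed TensorFlow was built with.
# -D_GLIBCXX_USE_CXX11_ABI=0 in the output indicates the old, gcc-4.x-era
# C++ ABI, which deepmd-kit's C++ code must match.
import tensorflow as tf

for flag in tf.sysconfig.get_compile_flags():
    print(flag)
```

If the old ABI flag is present, building deepmd-kit with a newer gcc defaulting to the C++11 ABI will produce link-time or runtime symbol errors against the TensorFlow libraries.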

@amcadmus amcadmus merged commit 64a1402 into deepmodeling:devel Jul 30, 2019
@njzjz
Member Author

njzjz commented Jul 30, 2019

see tensorflow/tensorflow#27067

@njzjz njzjz deleted the version branch July 30, 2019 13:25
njzjz-bot pushed a commit to njzjz-bot/deepmd-kit that referenced this pull request May 8, 2026
Imported from jinzhezenggroup/computational-chemistry-agent-skills.
Upstream-Commit: jinzhezenggroup/computational-chemistry-agent-skills@3f03f72
Upstream-Paths:
- machine-learning-potentials/deepmd-finetune-dpa3
- machine-learning-potentials/deepmd-python-inference
- machine-learning-potentials/deepmd-train-dpa3
- machine-learning-potentials/deepmd-train-se-e2-a
- molecular-dynamics/lammps-deepmd