Merge pull request #2257 from nd-02110114/fix-docs-build
Improve documents
nissy-dev committed Nov 3, 2020
2 parents d79c5ea + 4f26c4a commit 27a8ef1
Showing 39 changed files with 541 additions and 429 deletions.
13 changes: 7 additions & 6 deletions .gitignore
@@ -71,15 +71,15 @@ target/
datasets/2008-2011_USPTO_reactionSmiles_filtered.zip
datasets/2008-2011_USPTO_reactionSmiles_filtered/
datasets/autodock_vina_1_1_2_mac_catalina_64bit/
datasets/chembl_25-featurized/
datasets/chembl_25.csv.gz
datasets/delaney-featurized/
datasets/from-pdbbind/
datasets/kinase/
datasets/pdbbind/
datasets/pdbbind_v2015.tar.gz
datasets/qm7-featurized/
datasets/qm7.csv
datasets/qm7.mat
datasets/sider-featurized/
datasets/sider.csv.gz
@@ -101,3 +101,4 @@ datasets/pdbbind_v2019_refined.tar.gz
datasets/qm8.csv

.vscode/
.python-version
6 changes: 1 addition & 5 deletions .readthedocs.yml
@@ -7,11 +7,7 @@ version: 2

# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py

# Build documentation with MkDocs
# mkdocs:
# configuration: mkdocs.yml
configuration: docs/source/conf.py

# Optionally build your docs in additional formats such as PDF and ePub
formats: all
1 change: 1 addition & 0 deletions .travis.yml
@@ -44,6 +44,7 @@ script:
- if [[ "$CHECK_ONLY_DOCS" == "true" ]]; then
cd docs && pip install -r requirements.txt;
make clean html;
make doctest_tutorials;
make doctest_examples;
travis_terminate $?;
fi
2 changes: 1 addition & 1 deletion README.md
@@ -30,7 +30,7 @@ materials science, quantum chemistry, and biology.

## Requirements

DeepChem currently supports Python 3.5 through 3.7 and requires these packages on any condition.
DeepChem currently supports Python 3.6 through 3.7 and requires these packages on any condition.

- [joblib](https://pypi.python.org/pypi/joblib)
- [NumPy](https://numpy.org/)
2 changes: 1 addition & 1 deletion deepchem/models/sklearn_models/sklearn_model.py
@@ -35,7 +35,7 @@ class SklearnModel(Model):
perhaps you want to use the hyperparameter tuning capabilities in
`dc.hyper`. The `SklearnModel` class provides a wrapper around scikit-learn
models that allows scikit-learn models to be trained on `Dataset` objects
and evaluated with the same metrics as other DeepChem models.`
and evaluated with the same metrics as other DeepChem models.
Notes
-----
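
For context on the wrapper described in this docstring, here is a minimal usage sketch; it is illustrative only (not part of this commit), and the random toy data, estimator, and metric choices are assumptions:

```python
# Illustrative only: wrap a scikit-learn estimator so it can be trained and
# evaluated like other DeepChem models. The data here is random toy data.
import numpy as np
import deepchem as dc
from sklearn.ensemble import RandomForestRegressor

X = np.random.rand(50, 16)
y = np.random.rand(50, 1)
dataset = dc.data.NumpyDataset(X=X, y=y)

model = dc.models.SklearnModel(RandomForestRegressor(n_estimators=10))
model.fit(dataset)

metric = dc.metrics.Metric(dc.metrics.pearson_r2_score)
print(model.evaluate(dataset, [metric]))
```
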
1 change: 0 additions & 1 deletion docs/.gitignore

This file was deleted.

10 changes: 6 additions & 4 deletions docs/Makefile
@@ -5,8 +5,8 @@
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
@@ -15,8 +15,10 @@ help:
.PHONY: help Makefile

doctest_examples:
export PYTHONWARNINGS=
@$(SPHINXBUILD) -M doctest "$(SOURCEDIR)" "$(BUILDDIR)" examples.rst;
@$(SPHINXBUILD) -M doctest "$(SOURCEDIR)" "$(BUILDDIR)" source/get_started/examples.rst;

doctest_tutorials:
@$(SPHINXBUILD) -M doctest "$(SOURCEDIR)" "$(BUILDDIR)" source/get_started/tutorials.rst;

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
21 changes: 14 additions & 7 deletions docs/README.md
@@ -7,17 +7,24 @@ and examples.
## Building the Documentation

To build the docs, you can use the `Makefile` that's been added to
this directory. (Note that `deepchem` must be installed first.) To
generate docs in html, run
this directory. To generate docs in html, run the following commands.

```
pip install -r requirements.txt
make html
open _build/html/index.html
$ pip install -r requirements.txt
$ make html
# clean build
$ make clean html
$ open build/html/index.html
```

You can generate docs in other formats as well if you like. To clean up past builds run
If you want to see the build logs in more detail,

```
make clean
$ make clean html SPHINXOPTS=-vvv
```

If you want to run the doctests for the examples,

```
$ make doctest_examples
```
161 changes: 0 additions & 161 deletions docs/index.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/requirements.txt
@@ -3,5 +3,4 @@ scikit-learn
sphinx_rtd_theme
tensorflow==2.3.0
transformers
xgboost
torch==1.6.0
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -93,7 +93,7 @@ WeaveFeaturizer
:members:

MACCSKeysFingerprint
^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^

.. autoclass:: deepchem.feat.MACCSKeysFingerprint
:members:
@@ -105,13 +105,13 @@ CircularFingerprint
:members:

PubChemFingerprint
^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^

.. autoclass:: deepchem.feat.PubChemFingerprint
:members:

Mol2VecFingerprint
^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^

.. autoclass:: deepchem.feat.Mol2VecFingerprint
:members:
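
To make the fingerprint classes listed above concrete, a small usage sketch follows; it is illustrative only (not part of this commit) and assumes RDKit is installed:

```python
# Illustrative only: compute ECFP fingerprints for a couple of SMILES strings
# with the CircularFingerprint featurizer documented above.
import deepchem as dc

featurizer = dc.feat.CircularFingerprint(radius=2, size=2048)
features = featurizer.featurize(["CCO", "c1ccccc1O"])
print(features.shape)  # expected: (2, 2048)
```
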
File renamed without changes.
File renamed without changes.
File renamed without changes.
2 changes: 1 addition & 1 deletion docs/metrics.rst → docs/source/api_reference/metrics.rst
@@ -1,6 +1,6 @@
Metrics
=======
Metrics are one of the most import parts of machine learning. Unlike
Metrics are one of the most important parts of machine learning. Unlike
traditional software, in which algorithms either work or don't work,
machine learning models work in degrees. That is, there's a continuous
range of "goodness" for a model. "Metrics" are functions which measure
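
As a concrete illustration of the metric functions described above, a small sketch; it is illustrative only (not part of this commit), and the numbers are made up:

```python
# Illustrative only: score predictions against ground truth with a DeepChem
# metric wrapping a standard score function.
import numpy as np
import deepchem as dc

y_true = np.array([0.0, 0.5, 1.0, 1.5])
y_pred = np.array([0.1, 0.4, 1.1, 1.4])

metric = dc.metrics.Metric(dc.metrics.pearson_r2_score)
print(metric.compute_metric(y_true, y_pred))
```
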
6 changes: 0 additions & 6 deletions docs/models.rst → docs/source/api_reference/models.rst
@@ -199,9 +199,6 @@ Losses
.. autoclass:: deepchem.models.losses.SparseSoftmaxCrossEntropy
:members:

.. autoclass:: deepchem.models.losses.SparseSoftmaxCrossEntropy
:members:

.. autoclass:: deepchem.models.losses.VAE_ELBO
:members:

@@ -241,9 +238,6 @@ Optimizers
.. autoclass:: deepchem.models.optimizers.LinearCosineDecay
:members:

.. autoclass:: deepchem.models.optimizers.LinearCosineDecay
:members:


Keras Models
============
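
For orientation, a sketch of how one of the optimizers documented above is passed to a model; it is illustrative only (not part of this commit), and the model class, layer sizes, and learning-rate schedule are assumptions:

```python
# Illustrative only: configure a DeepChem model with an explicit optimizer
# and learning-rate schedule from deepchem.models.optimizers.
import deepchem as dc
from deepchem.models.optimizers import Adam, ExponentialDecay

lr_schedule = ExponentialDecay(initial_rate=1e-3, decay_rate=0.9, decay_steps=1000)
model = dc.models.MultitaskRegressor(
    n_tasks=1,
    n_features=16,
    layer_sizes=[64, 64],
    optimizer=Adam(learning_rate=lr_schedule))
```
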
File renamed without changes.
File renamed without changes.
@@ -4,7 +4,7 @@ DeepChem :code:`dc.splits.Splitter` objects are a tool to meaningfully
split DeepChem datasets for machine learning testing. The core idea is
that when evaluating a machine learning model, it's useful to create
training, validation and test splits of your source data. The training
split is used to train models, the validatation is used to benchmark
split is used to train models, the validation is used to benchmark
different model architectures. The test is ideally held out till the
very end when it's used to gauge a final estimate of the model's
performance.
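
A minimal sketch of the train/validation/test workflow described above; it is illustrative only (not part of this commit), with random toy data and arbitrary split fractions:

```python
# Illustrative only: split a dataset into train/validation/test subsets.
import numpy as np
import deepchem as dc

X = np.random.rand(100, 8)
y = np.random.rand(100, 1)
dataset = dc.data.NumpyDataset(X=X, y=y)

splitter = dc.splits.RandomSplitter()
train, valid, test = splitter.train_valid_test_split(
    dataset, frac_train=0.8, frac_valid=0.1, frac_test=0.1)
print(train.X.shape, valid.X.shape, test.X.shape)  # expected: (80, 8) (10, 8) (10, 8)
```
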
File renamed without changes.
File renamed without changes.
File renamed without changes.
