Convert demo READMEs to Sphinx index pages (#1391)
This moves us towards using Read the Docs as the main source of demos, by duplicating the indexing information from READMEs into the Sphinx docs.

We've decided to move to using Read the Docs like this because:

- it gives us more control over how things are shown (e.g. #1386)
- the display is faster
- it allows us to version properly, including with internal links to documentation of the same version as the demo

I used [m2r](https://miyakogi.github.io/m2r/index.html) to convert each of our demo READMEs to a reStructuredText `index.txt` file, and then fixed them up by hand. For each index, I've gone with:

```rst
Title
======

... description with subsections etc ...

Table of contents
------------------

... full ToC listing of subfolders/demos of this index ...
```

(Where the description is empty in some cases, for directories without READMEs: #1139.)
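
For reference, a rough sketch of the conversion step (assuming m2r's `convert()` helper and illustrative file paths; each generated file was then hand-edited):

```python
from pathlib import Path

from m2r import convert  # markdown -> reStructuredText

# Illustrative paths: each demo folder's README becomes a Sphinx index page.
readme = Path("demos/basics/README.md")
index = Path("docs/demos/basics/index.txt")

index.write_text(convert(readme.read_text()))
```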

The thinking is we'll replace the README.md files with a much simplified set of information, potentially just a link to the corresponding Read the Docs page. However, the READMEs are still useful as an index if someone has a copy of the demos locally in JupyterLab. Thus, this PR also extends the `demo_table.py` script to render the demo indexing table as both a raw-HTML markdown version in `demos/README.md` and an rST `list-table` version in `docs/demos/index.txt`. This means we can still retain that demo table in an otherwise reduced, not-human-maintained `demos/README.md`.
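
For illustration only (this is not the actual `demo_table.py` code), an rST `list-table` can be emitted from the table rows with a small helper along these lines:

```python
def render_list_table(headers, rows):
    """Render header and data rows as an rST list-table directive (illustrative sketch)."""
    lines = [".. list-table::", "   :header-rows: 1", ""]
    for row in [headers, *rows]:
        lines.append(f"   * - {row[0]}")
        lines.extend(f"     - {cell}" for cell in row[1:])
    return "\n".join(lines) + "\n"


# Example: a two-column table with one header row and one data row.
print(render_list_table(["demo", "algorithm(s)"], [["GCN on Cora", "GCN"]]))
```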

Other than extending `demo_table.py`, this PR does the minimal work needed to get the new index pages working. Future work:

- optimising the display of each of the tables (at the moment they scroll sideways a lot): #1418
- handling abbreviations in the main indexing table, because it doesn't seem easy to get hover text in rST: #1417
- validating links (likely covered by #1360)
- updating the demos to have relative links pointing to the index files, not README.mds (likely covered by #1292 and #1360)
- updating the README files to remove their content: #1419
- updating any links to the demos on GitHub to point to Read the Docs: #1420

Rendered demo landing page: https://stellargraph--1391.org.readthedocs.build/en/1391/demos/index.html

See: #1298
huonw committed Apr 30, 2020
1 parent cba2cbd commit 7610278
Showing 32 changed files with 1,524 additions and 183 deletions.
19 changes: 17 additions & 2 deletions docs/demos/basics/index.txt
@@ -1,5 +1,20 @@
Basics
====================================
StellarGraph basics
===================

`StellarGraph <https://github.com/stellargraph/stellargraph>`_ has support for loading data via Pandas and NetworkX. This folder contains examples of loading data into a ``StellarGraph`` object, which is the format used by the machine learning algorithms in this library.

Find demos for a format
-----------------------


* :doc:`loading-pandas <loading-pandas>` shows the recommended way to load data, using Pandas (supporting any input format that Pandas supports, including CSV files and SQL databases)
* :doc:`loading-networkx <loading-networkx>` shows how to load data from a `NetworkX <https://networkx.github.io>`_ graph
* :doc:`loading-saving-neo4j <loading-saving-neo4j>` shows how to load data from a `Neo4j <https://neo4j.com>`_ database, and save results back to it

See :doc:`all demos for machine learning algorithms <../index>`.

Table of contents
-----------------

.. toctree::
:titlesonly:
14 changes: 12 additions & 2 deletions docs/demos/calibration/index.txt
@@ -1,5 +1,15 @@
Calibration
====================================
Calibration of graph neural network models
==========================================

This folder contains two `Jupyter <http://jupyter.org/>`_ Python notebooks demonstrating ``StellarGraph`` model calibration for binary (``calibration-pubmed-link-prediction.ipynb``) and multi-class classification (``calibration-pubmed-node-classification.ipynb``) problems.

References
----------

**1.** On Calibration of Modern Neural Networks. C. Guo, G. Pleiss, Y. Sun, and K. Q. Weinberger. ICML, 2017 (`link <https://geoffpleiss.com/nn_calibration>`_)

Table of contents
-----------------

.. toctree::
:titlesonly:
34 changes: 32 additions & 2 deletions docs/demos/community_detection/index.txt
@@ -1,5 +1,35 @@
Community Detection
====================================
Community detection on terrorist attack data
=============================================

This is an example of using Unsupervised GraphSAGE embeddings with clustering to solve a community detection problem. The demo walks through the steps and shows the differences between "traditional" community detection (Infomap) and the clustering-of-embeddings approach.

Requirements
------------

This example assumes the ``stellargraph`` library and its requirements have been
installed by following the installation instructions in the README
of the library's `root directory <https://github.com/stellargraph/stellargraph>`_.

These demos require ``python-igraph``, which can be installed via:

.. code-block::

pip install stellargraph[demos,igraph]

Data
----

The dataset used in this demo is available at https://www.kaggle.com/START-UMD/gtd. The Global Terrorism Database (GTD) is an open-source database of terrorist attacks around the world from 1970 through 2017. The GTD includes systematic data on domestic as well as international terrorist incidents, covering more than 180,000 attacks. The database is maintained by researchers at the National Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland. For more information, refer to the original data source: https://www.start.umd.edu/gtd/.

To run the demo notebook, extract the data into a directory and adjust the data path in the notebook so that it points to the raw data.

Issues
------

If you experience problems installing ``python-igraph``, please refer to the installation page https://igraph.org/python/ for help.

Table of contents
-----------------

.. toctree::
:titlesonly:
3 changes: 3 additions & 0 deletions docs/demos/connector/index.txt
@@ -1,6 +1,9 @@
Connector
====================================

Table of contents
-----------------

.. toctree::
:titlesonly:
:glob:
3 changes: 3 additions & 0 deletions docs/demos/connector/neo4j/index.txt
@@ -1,6 +1,9 @@
Neo4j
====================================

Table of contents
-----------------

.. toctree::
:titlesonly:
:glob:
65 changes: 63 additions & 2 deletions docs/demos/embeddings/index.txt
@@ -1,5 +1,66 @@
Embeddings
====================================
Representation learning using StellarGraph
==========================================

`StellarGraph <https://github.com/stellargraph/stellargraph>`_ provides numerous algorithms for doing node and edge representation learning on graphs. This folder contains demos of all of them to explain how they work and how to use them as part of a TensorFlow Keras data science workflow.

A node representation learning task computes a representation or embedding vector for each node in a graph. These vectors capture latent/hidden information about the nodes and edges, and can be used for (semi-)supervised downstream tasks like :doc:`node classification <../node-classification/index>` and :doc:`link prediction <../link-prediction/index>`, or unsupervised ones like :doc:`community detection <../community_detection/index>` or similarity searches. Representation learning is typically an unsupervised task, where the model is trained on data that does not have any ground-truth labels.

Node representations can also be computed from (semi-)supervised models, using the output of a hidden layer as the embedding vector for nodes or edges. StellarGraph provides some :doc:`demonstrations of node classification <../node-classification/index>` and :doc:`link prediction <../link-prediction/index>`, some of which include computing and visualising node or edge embeddings.

Find algorithms and demos for a graph
-------------------------------------

This table lists all representation learning demos, including the algorithms trained, how they are trained, the types of graph used, and the tasks demonstrated.

.. list-table::
:header-rows: 1

* - demo
- algorithm(s)
- training method
- node features
- downstream tasks shown
* - :doc:`Deep Graph Infomax <deep-graph-infomax-cora>`
- GCN, GAT, PPNP, APPNP, GraphSAGE, HinSAGE
- ``DeepGraphInfomax`` (mutual information)
- yes
- visualisation, node classification
* - :doc:`Unsupervised GraphSAGE <embeddings-unsupervised-graphsage-cora>`
- GraphSAGE
- ``UnsupervisedSampler`` (link prediction)
- yes
- visualisation, node classification
* - :doc:`Attri2Vec <stellargraph-attri2vec-citeseer>`
- Attri2Vec
- ``UnsupervisedSampler`` (link prediction)
- yes
- visualisation
* - :doc:`Metapath2Vec <stellargraph-metapath2vec>`
- Metapath2Vec
- natively unsupervised
-
- visualisation
* - :doc:`Node2Vec <stellargraph-node2vec>`
- Node2Vec
- natively unsupervised
-
- visualisation
* - :doc:`Watch Your Step <watch-your-step-cora-demo>`
- Watch Your Step
- natively unsupervised
-
- visualisation, node classification
* - :doc:`GraphWave <graphwave-barbell>`
- GraphWave
- natively unsupervised
-
- visualisation, node classification


See :doc:`the root README <../../README>` or each algorithm's documentation for the relevant citation(s). See :doc:`the demo index <../index>` for more tasks, and a summary of each algorithm.

Table of contents
-----------------

.. toctree::
:titlesonly:
11 changes: 9 additions & 2 deletions docs/demos/ensembles/index.txt
@@ -1,5 +1,12 @@
Ensembles
====================================
Ensemble learning for graph neural network algorithms
=====================================================

This folder contains two `Jupyter <http://jupyter.org/>`_ Python notebooks demonstrating the use of ensemble learning
for node attribute inference (``ensemble-node-classification-example.ipynb``) and
link prediction (``ensemble-link-prediction-example.ipynb``) using ``StellarGraph``'s graph neural network algorithms.

Table of contents
-----------------

.. toctree::
:titlesonly:
Expand Down
31 changes: 29 additions & 2 deletions docs/demos/graph-classification/index.txt
@@ -1,5 +1,32 @@
Graph Classification
====================================
Graph classification using StellarGraph
=======================================

`StellarGraph <https://github.com/stellargraph/stellargraph>`_ provides an algorithm for graph classification. This folder contains a demo to explain how it works and how to use it as part of a TensorFlow Keras data science workflow.

A graph classification task predicts an attribute of each graph in a collection of graphs: for instance, labelling each graph with a categorical class (binary or multiclass classification), or predicting a continuous number (regression). It is supervised or semi-supervised, with the model trained on a subset of graphs that have ground-truth labels.

Find algorithms and demos for a collection of graphs
----------------------------------------------------

This table lists all graph classification demos, including the algorithms trained and the types of graphs used.

.. list-table::
:header-rows: 1

* - demo
- algorithm(s)
- node features
- inductive
* - :doc:`GCN Supervised Graph Classification <supervised-graph-classification>`
- GCN, mean pooling
- yes
- yes


See :doc:`the demo index <../index>` for more tasks, and a summary of each algorithm.

Table of contents
-----------------

.. toctree::
:titlesonly:
