
Merge pull request #39 from praksharma/development
Migrated to JupyterBook
praksharma committed Jul 28, 2023
2 parents 02ca06e + 73706e2 commit 208e340
Showing 31 changed files with 324 additions and 26 deletions.
1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -2,6 +2,7 @@
**/__pycache__/
# Sphinx documentation
docs/_build/
docs_legacy/_build/

# Jupyter notebook
**.ipynb_checkpoints
32 changes: 22 additions & 10 deletions Tutorials/2. BC/1. dirichlet.ipynb

Large diffs are not rendered by default.

7 changes: 7 additions & 0 deletions Tutorials/4. Dataset/1. basic.ipynb
@@ -1,5 +1,12 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# How to create the training dataset?"
]
},
{
"cell_type": "code",
"execution_count": 1,
7 changes: 7 additions & 0 deletions Tutorials/5. FCNN/1. basic.ipynb
@@ -1,5 +1,12 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Basics of network design"
]
},
{
"cell_type": "code",
"execution_count": 3,
7 changes: 7 additions & 0 deletions Tutorials/5. FCNN/3. model.ipynb
@@ -1,5 +1,12 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# FCNN training"
]
},
{
"cell_type": "code",
"execution_count": 1,
4 changes: 0 additions & 4 deletions docs/.vscode/settings.json

This file was deleted.

1 change: 1 addition & 0 deletions docs/Tutorials
33 changes: 33 additions & 0 deletions docs/_config.yml
@@ -0,0 +1,33 @@
# Book settings
# Learn more at https://jupyterbook.org/customize/config.html

title: DeepINN
author: Prakhar Sharma
copyright: "2023"
logo: logo.png

# Force re-execution of notebooks on each build.
# See https://jupyterbook.org/content/execute.html
execute:
execute_notebooks: force

# Define the name of the latex output file for PDF builds
latex:
latex_documents:
targetname: book.tex

# Add a bibtex file so that we can create citations
bibtex_bibfiles:
- references.bib

# Information about where the book exists on the web
repository:
url: https://github.com/praksharma/DeepINN # Online location of your book
path_to_book: docs # Optional path to your book, relative to the repository root
branch: master # Which branch of the repository should be used when creating links (optional)

# Add GitHub buttons to your book
# See https://jupyterbook.org/customize/config.html#add-a-link-to-your-repository
html:
use_issues_button: true
use_repository_button: true
17 changes: 17 additions & 0 deletions docs/_toc.yml
@@ -0,0 +1,17 @@
# Table of contents
# Learn more at https://jupyterbook.org/customize/toc.html

format: jb-book
root: intro
parts:
- caption: Installation and contribution
chapters:
- file: docs_tutorial/installation.md
- file: docs_tutorial/contribution.md
- file: docs_tutorial/docs_contribution.md
# - caption: Contribution
# chapters:
# - file: docs_tutorial/contribution.md
# - caption: Documentation compilation
# chapters:
# - file: docs_tutorial/docs_contribution.md
31 changes: 31 additions & 0 deletions docs/docs_tutorial/contribution.md
@@ -0,0 +1,31 @@
# Contribution
Create a `venv` in the root of the repo. Here the assumption is that the `python` is symlink to `python3`.
```sh
python -m venv .venv
```
Activate the environment.
```sh
source .venv/bin/activate
```
Confirm that the Python path is updated.
```sh
which python
```
The output should point to the `.venv` directory. Now upgrade pip.
```sh
python -m pip install --upgrade pip
```
Install the required packages.
```sh
pip install -r requirements.txt
```
If you also want to build the docs in the same environment, install the docs dependencies.
```sh
pip install -r docs/requirements.txt
```
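
The steps above can be combined into a single setup script. This is a sketch, not part of the repo: the requirements files are installed only if present, and the pip upgrade is skipped quietly when offline.

```sh
#!/bin/sh
# One-shot environment setup, mirroring the steps above.
set -e
python3 -m venv .venv
. .venv/bin/activate
python -m pip install --upgrade pip >/dev/null 2>&1 || true  # skip quietly if offline
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
if [ -f docs/requirements.txt ]; then pip install -r docs/requirements.txt; fi
which python  # should print a path inside .venv
```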

# Testing
Testing is simple: run `test.py` in the current Python virtual environment.
```sh
python test.py
```
64 changes: 64 additions & 0 deletions docs/docs_tutorial/docs_contribution.md
@@ -0,0 +1,64 @@

# Documentation compilation
The documentation is built with [Jupyter Book](https://jupyterbook.org/en/stable/intro.html).

### Setting up Jupyter-books
These steps will give you a basic setup. For more details, visit [here](https://jupyterbook.org/en/stable/start/your-first-book.html).
1. Create a new Python virtual environment or use an existing one.

```sh
python3 -m venv jupyter_env
```

2. Install Jupyter-book.
```sh
pip3 install -U jupyter-book
```

3. Create a template for quick start.
```sh
jupyter-book create docs/
```
This creates a directory named `docs/` in the current working directory. The table of contents is stored in `_toc.yml` and the configuration in `_config.yml`.

4. Build the project.

```sh
jupyter-book build docs/
```

For a full rebuild:

```sh
jupyter-book build --all docs/
```

Use the `--all` flag if the table of contents doesn't update; it rebuilds the entire project.

5. Publish the docs to the `gh-pages` branch. The `-n` flag adds a `.nojekyll` file, `-p` pushes the branch to the remote, and `-f` forces the push.

```sh
ghp-import -n -p -f docs/_build/html
```

6. Deploy the website
Go to "Settings -> Pages" of the repo. Set "Source" to "Deploy from a branch". Under "Branch", select the "gh-pages" branch and "/ (root)" as the folder.

7. Force a GitHub Pages build
GitHub Pages is known for its laziness. To force-deploy the website, go to "Settings -> Pages" and look for the following line:

> Your site was last deployed to the github-pages environment by the pages build and deployment workflow.

Click on "pages build and deployment", then click the "Re-run all jobs" button in the top-right corner.

8. Include notebooks outside the docs/ directory
You can create a soft link in the book directory to the directory with the notebooks you want to include:
```sh
ln -s ../Tutorials ./Tutorials
```

You can also link to a document as follows:

```sh
ln -s ../../README.md ./README.md
```
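
A self-contained illustration of how such a link makes external files reachable from the book directory. This sketch uses a throwaway temp directory and hypothetical file names rather than the real repo layout:

```sh
#!/bin/sh
# Demonstrate linking an external directory into a book directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/Tutorials" "$tmp/docs"
echo "notebook placeholder" > "$tmp/Tutorials/demo.txt"
cd "$tmp/docs"
ln -s ../Tutorials ./Tutorials   # same command as in step 8
cat ./Tutorials/demo.txt         # the linked file is reachable through docs/
rm -rf "$tmp"
```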
53 changes: 53 additions & 0 deletions docs/docs_tutorial/installation.md
@@ -0,0 +1,53 @@
# Installation
## Using pip
DeepINN can be installed via pip using the following command:
```sh
pip install DeepINN
```

## Docker image
Pull the image with a suitable tag. The image is available [here](https://hub.docker.com/r/prakhars962/deepinn).

```sh
docker pull prakhars962/deepinn:tagname
```
### CPU Only
The image starts a Jupyter server by default.
```sh
docker run -p 8888:8888 prakhars962/deepinn:pre-release
```

You can override the jupyter server entrypoint using the following command.
```sh
docker run -it --entrypoint /bin/bash prakhars962/deepinn:pre-release
```
### GPU passthrough
First install `nvidia-docker` using this [guide](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#step-2-install-nvidia-container-toolkit).

Now run the container with `nvidia-docker`.
```sh
nvidia-docker run -it --entrypoint /bin/bash prakhars962/deepinn:pre-release
```
The following command binds the current directory to `/workspace/tutorials` and opens JupyterLab with GPU support.
```sh
nvidia-docker run -v $(pwd):/workspace/tutorials -p 8888:8888 prakhars962/deepinn:pre-release
```
Alternatively, you can run an interactive session.
```sh
nvidia-docker run -v $(pwd):/workspace/tutorials -it --entrypoint /bin/bash prakhars962/deepinn:pre-release
```

### Tagless copy
Each time you pull an updated image, Docker keeps a tagless copy of the old one.
```sh
$ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
prakhars962/deepinn pre-release 886808706155 4 minutes ago 6.99GB
prakhars962/deepinn <none> 0bb744f6159e 38 minutes ago 6.99GB
prakhars962/deepinn <none> 4ffbb67f8447 About an hour ago 6.8GB
prakhars962/deepinn <none> fe16ca34f9d9 About an hour ago 6.8GB
```
You can delete them one by one using the IMAGE_ID.
```sh
docker image rm -f IMAGE_ID
```
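
Not mentioned above, but the standard Docker CLI also offers `docker image prune`, which removes every dangling (`<none>`) image in one pass:

```sh
# Remove all dangling (untagged) images at once; -f skips the confirmation prompt.
# Guarded so the command is a no-op on machines without Docker installed.
command -v docker >/dev/null 2>&1 && docker image prune -f || true
```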
5 changes: 5 additions & 0 deletions docs/intro.md
@@ -0,0 +1,5 @@
# DeepINN
[![DeepINN CI](https://github.com/praksharma/DeepINN/actions/workflows/main.yml/badge.svg)](https://github.com/praksharma/DeepINN/actions/workflows/main.yml) [![docker_container](https://github.com/praksharma/DeepINN/actions/workflows/docker.yml/badge.svg)](https://github.com/praksharma/DeepINN/actions/workflows/docker.yml) [![Codacy Badge](https://app.codacy.com/project/badge/Grade/a5c43d9b9e6a45759061ac654bdc1e3f)](https://www.codacy.com/gh/praksharma/DeepINN/dashboard?utm_source=github.com&amp;utm_medium=referral&amp;utm_content=praksharma/DeepINN&amp;utm_campaign=Badge_Grade)![Travis (.org) branch](https://app.travis-ci.com/praksharma/DeepINN.svg?branch=main)
[![Documentation Status](https://readthedocs.org/projects/deepinn/badge/?version=latest)](https://deepinn.readthedocs.io/en/latest/index.html?badge=latest) [![License](https://img.shields.io/badge/License-AGPL_v3-red.svg)](https://github.com/praksharma/DeepINN/blob/main/LICENSE)

[DeepINN](https://github.com/praksharma/DeepINN) is a deep-learning framework for solving forward and inverse problems involving PDEs using physics-informed neural networks (PINNs).
Binary file added docs/logo.png
56 changes: 56 additions & 0 deletions docs/references.bib
@@ -0,0 +1,56 @@
---
---
@inproceedings{holdgraf_evidence_2014,
address = {Brisbane, Australia, Australia},
title = {Evidence for {Predictive} {Coding} in {Human} {Auditory} {Cortex}},
booktitle = {International {Conference} on {Cognitive} {Neuroscience}},
publisher = {Frontiers in Neuroscience},
author = {Holdgraf, Christopher Ramsay and de Heer, Wendy and Pasley, Brian N. and Knight, Robert T.},
year = {2014}
}

@article{holdgraf_rapid_2016,
title = {Rapid tuning shifts in human auditory cortex enhance speech intelligibility},
volume = {7},
issn = {2041-1723},
url = {http://www.nature.com/doifinder/10.1038/ncomms13654},
doi = {10.1038/ncomms13654},
number = {May},
journal = {Nature Communications},
author = {Holdgraf, Christopher Ramsay and de Heer, Wendy and Pasley, Brian N. and Rieger, Jochem W. and Crone, Nathan and Lin, Jack J. and Knight, Robert T. and Theunissen, Frédéric E.},
year = {2016},
pages = {13654},
file = {Holdgraf et al. - 2016 - Rapid tuning shifts in human auditory cortex enhance speech intelligibility.pdf:C\:\\Users\\chold\\Zotero\\storage\\MDQP3JWE\\Holdgraf et al. - 2016 - Rapid tuning shifts in human auditory cortex enhance speech intelligibility.pdf:application/pdf}
}

@inproceedings{holdgraf_portable_2017,
title = {Portable learning environments for hands-on computational instruction using container-and cloud-based technology to teach data science},
volume = {Part F1287},
isbn = {978-1-4503-5272-7},
doi = {10.1145/3093338.3093370},
abstract = {© 2017 ACM. There is an increasing interest in learning outside of the traditional classroom setting. This is especially true for topics covering computational tools and data science, as both are challenging to incorporate in the standard curriculum. These atypical learning environments offer new opportunities for teaching, particularly when it comes to combining conceptual knowledge with hands-on experience/expertise with methods and skills. Advances in cloud computing and containerized environments provide an attractive opportunity to improve the effciency and ease with which students can learn. This manuscript details recent advances towards using commonly-Available cloud computing services and advanced cyberinfrastructure support for improving the learning experience in bootcamp-style events. We cover the benets (and challenges) of using a server hosted remotely instead of relying on student laptops, discuss the technology that was used in order to make this possible, and give suggestions for how others could implement and improve upon this model for pedagogy and reproducibility.},
booktitle = {{ACM} {International} {Conference} {Proceeding} {Series}},
author = {Holdgraf, Christopher Ramsay and Culich, A. and Rokem, A. and Deniz, F. and Alegro, M. and Ushizima, D.},
year = {2017},
keywords = {Teaching, Bootcamps, Cloud computing, Data science, Docker, Pedagogy}
}

@article{holdgraf_encoding_2017,
title = {Encoding and decoding models in cognitive electrophysiology},
volume = {11},
issn = {16625137},
doi = {10.3389/fnsys.2017.00061},
abstract = {© 2017 Holdgraf, Rieger, Micheli, Martin, Knight and Theunissen. Cognitive neuroscience has seen rapid growth in the size and complexity of data recorded from the human brain as well as in the computational tools available to analyze this data. This data explosion has resulted in an increased use of multivariate, model-based methods for asking neuroscience questions, allowing scientists to investigate multiple hypotheses with a single dataset, to use complex, time-varying stimuli, and to study the human brain under more naturalistic conditions. These tools come in the form of “Encoding” models, in which stimulus features are used to model brain activity, and “Decoding” models, in which neural features are used to generated a stimulus output. Here we review the current state of encoding and decoding models in cognitive electrophysiology and provide a practical guide toward conducting experiments and analyses in this emerging field. Our examples focus on using linear models in the study of human language and audition. We show how to calculate auditory receptive fields from natural sounds as well as how to decode neural recordings to predict speech. The paper aims to be a useful tutorial to these approaches, and a practical introduction to using machine learning and applied statistics to build models of neural activity. The data analytic approaches we discuss may also be applied to other sensory modalities, motor systems, and cognitive systems, and we cover some examples in these areas. In addition, a collection of Jupyter notebooks is publicly available as a complement to the material covered in this paper, providing code examples and tutorials for predictive modeling in python. The aimis to provide a practical understanding of predictivemodeling of human brain data and to propose best-practices in conducting these analyses.},
journal = {Frontiers in Systems Neuroscience},
author = {Holdgraf, Christopher Ramsay and Rieger, J.W. and Micheli, C. and Martin, S. and Knight, R.T. and Theunissen, F.E.},
year = {2017},
keywords = {Decoding models, Encoding models, Electrocorticography (ECoG), Electrophysiology/evoked potentials, Machine learning applied to neuroscience, Natural stimuli, Predictive modeling, Tutorials}
}

@book{ruby,
title = {The Ruby Programming Language},
author = {Flanagan, David and Matsumoto, Yukihiro},
year = {2008},
publisher = {O'Reilly Media}
}
13 changes: 3 additions & 10 deletions docs/requirements.txt
@@ -1,10 +1,3 @@
sphinx==5.0.2
sphinx_rtd_theme==1.0.0
myst-parser==0.18.1
nbsphinx==0.8.9
sphinxcontrib-applehelp==1.0.2
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==2.0.0
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.5
jupyter-book
matplotlib
numpy
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
10 changes: 10 additions & 0 deletions docs_legacy/requirements.txt
@@ -0,0 +1,10 @@
sphinx==5.0.2
sphinx_rtd_theme==1.0.0
myst-parser==0.18.1
nbsphinx==0.8.9
sphinxcontrib-applehelp==1.0.2
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==2.0.0
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.5
1 change: 1 addition & 0 deletions requirements_docs.txt
@@ -0,0 +1 @@
jupyter-book
8 changes: 6 additions & 2 deletions todo.md
Original file line number Diff line number Diff line change
Expand Up @@ -4,19 +4,23 @@ Last work : `DeepINN/constraint/gradients.py`
## Geometry
- [ ] Geometry added but lot of bloats from TorchPhysics. Clean up geometry folder.
- [ ] Clean up utils folder.
- [ ] Implement anchor points.

## Gradients
- [x] Implement basic gradients.
- [ ] Implement gradient for multiple output neurons.
- [ ] Do we need retain_graph=True ?
- [x] Do we need retain_graph=True?

## Constraints
- [X] Implement the prescribed BC part in constraint/ boundary_loss dirichletBC.
- [X] Implement PDE loss constraint.
- [ ] Implement gradients.
- [ ] Implement lazy evaluation of gradients.
- [ ] Implement more constraints.

## Architectures
- [ ] Implement fully connected NN.
- [x] Implement fully connected NN.
- [ ] Implement more neural networks.

## Tutorials
- [x] Basic geometry tutorials.
