A Jupyter kernel for Natural Language Programming
From 1773937eb4b5b3469389f1332ef1ee3d81907760 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Sun, 30 Oct 2022 16:53:07 +0100
Subject: [PATCH 09/21] Updated README
---
README.md | 11 ++++++++---
1 file changed, 8 insertions(+), 3 deletions(-)
diff --git a/README.md b/README.md
index 2f97125..e4aec76 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
-
+
@@ -10,11 +10,16 @@
- A Jupyter kernel for Natural Language Programming
+ A Python library for [soft-code](https://en.wikipedia.org/wiki/Soft_computing) development — program in plain English with AI code generation!
-ICortex is a [Jupyter kernel](https://jupyter-client.readthedocs.io/en/latest/kernels.html) that lets you program using plain English, by generating Python code from natural language prompts:
+ICortex is a [Jupyter kernel](https://jupyter-client.readthedocs.io/en/latest/kernels.html) that lets you develop **soft programs**:
+
+- sets of instructions (or prompts) written in natural language (such as English)
+- that generate Python code
+- to perfom work in different contexts
+- more flexibly than regular software:
https://user-images.githubusercontent.com/2453968/196814906-1a0de2a1-27a7-4aec-a960-0eb21fbe2879.mp4
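The "in goes English, out comes Python" loop the README describes can be sketched minimally as follows. `generate_code` here is a canned stub standing in for a real code-generation API call; none of these names are the actual ICortex implementation.

```python
# Illustrative sketch of the prompt -> code -> execution loop.
# generate_code is a stub standing in for a code-generation API.

def generate_code(prompt: str) -> str:
    """Pretend to send `prompt` to a language model and return Python source."""
    canned = {
        "print the squares of 1 to 5": "print([n * n for n in range(1, 6)])",
    }
    return canned.get(prompt, "pass")

source = generate_code("print the squares of 1 to 5")
exec(source)  # prints [1, 4, 9, 16, 25]
```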
From 9b2228aaa0574a63507141a9c850443538972cfc Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Sun, 30 Oct 2022 17:58:13 +0100
Subject: [PATCH 10/21] Updated README
---
README.md | 40 ++++++++++++++++++++++++++++------------
1 file changed, 28 insertions(+), 12 deletions(-)
diff --git a/README.md b/README.md
index e4aec76..1993061 100644
--- a/README.md
+++ b/README.md
@@ -10,45 +10,61 @@
- A Python library for [soft-code](https://en.wikipedia.org/wiki/Soft_computing) development — program in plain English with AI code generation!
+ A Python library for soft-code development — program in plain English with AI code generation!
ICortex is a [Jupyter kernel](https://jupyter-client.readthedocs.io/en/latest/kernels.html) that lets you develop **soft programs**:
-- sets of instructions (or prompts) written in natural language (such as English)
-- that generate Python code
-- to perfom work in different contexts
-- more flexibly than regular software:
+- sets of instructions (i.e. prompts) [written in natural language](https://en.wikipedia.org/wiki/Natural-language_programming) (such as English)
+- given to language models that generate Python code
+- to perfom useful work in various contexts
+- more flexibly than regular software.
+
+To put simply, in goes English, out comes Python:
https://user-images.githubusercontent.com/2453968/196814906-1a0de2a1-27a7-4aec-a960-0eb21fbe2879.mp4
TODO: Prompts are given using the %prompt magic now, update the video accordingly
-It is ...
+ICortex is ...
- a drop-in replacement for the IPython kernel. Prompts can be executed with the [magic commands](https://ipython.readthedocs.io/en/stable/interactive/magics.html) `%prompt` or `%p` for short.
-- an interface for [Natural Language Programming](https://en.wikipedia.org/wiki/Natural-language_programming) interface—prompts written in plain English automatically generate Python code which can then be executed globally.
- interactive—install missing packages directly, decide whether to execute the generated code or not, and so on, directly in the Jupyter Notebook cell.
- open source and fully extensible—if you think we are missing a model or an API, you can request it by creating an issue, or implement it yourself by subclassing `ServiceBase` under [`icortex/services`](icortex/services).
-ICortex is currently in alpha, so expect breaking changes. We are giving free credits to our first users—[join our Discord](https://discord.textcortex.com/) to help us shape this product.
+It is similar to [GitHub Copilot](https://github.com/features/copilot) but with certain differences that make it stand out:
+
+| | GitHub Copilot | ICortex |
+|---|---|---|
+| Generates code ... | in the text editor | in a [Jupyter kernel](https://docs.jupyter.org/en/latest/projects/kernels.html) (language backend that provides the execution environment) |
+| From ... | existing code and comments | plain English prompts |
+| Level of control over context used to generate code | Low | High |
+| Plain language instructions are ... | just comments | standalone programs |
+| The resulting program is ... | static | dynamic—can adapt to the context it is executed in |
+| Can connect to different code generation APIs | No | Yes |
+
+In other words, the main difference between ICortex and a code-generation plugin like GitHub Copilot is that ICortex is its own programming paradigm similar to [literate programming](https://en.wikipedia.org/wiki/Literate_programming), where the natural language prompt is the first-class citizen, and which allows for fine-grained control over the code-generation context.
+
+ICortex is currently in alpha, so expect breaking changes. We are giving free credits to our first users—[join our Discord](https://discord.textcortex.com/) to help us shape it.
## Installation
-To install the ICortex Kernel, run the following in the main project directory:
+Install directly from PyPI:
```sh
pip install icortex
+# This line is needed to install the kernel spec to Jupyter
+python -m icortex.kernel.install
+# Alternatively, running icortex directly also installs the kernel spec
+icortex
```
-This will install the Python package and the `icortex` command line interface. You will need to run `icortex` once to install the kernel spec to Jupyter.
-
## Using ICortex
Before you can use ICortex in Jupyter, you need to configure it for your current project.
-If you are using the terminal:
+If you are using the terminal, go to your project directory and run:
```bash
icortex init
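The `icortex init` step writes a configuration file, `icortex.toml`, into the project directory. A purely hypothetical example of what such a file might contain (the key names below are illustrative assumptions, not the documented schema):

```toml
# Hypothetical icortex.toml — key names are illustrative only
[service]
name = "textcortex"

[service.textcortex]
api_key = "YOUR_API_KEY"
auto_execute = false
```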
From d66a066f8d0c6e6861423679aca345ebde95666e Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Sun, 30 Oct 2022 20:45:48 +0100
Subject: [PATCH 11/21] wip
---
README.md | 24 ++--
docs/source/_static/logo-bottom-dark-bg.svg | 141 +++++++++++++++++++
docs/source/_static/logo-bottom-light-bg.svg | 141 +++++++++++++++++++
docs/source/conf.py | 45 +++++-
docs/source/index.rst | 30 ++--
docs/source/quickstart.rst | 139 +++++++++++++++++-
docs/source/reference.rst | 32 +++++
docs/source/specification.rst | 5 -
8 files changed, 525 insertions(+), 32 deletions(-)
create mode 100644 docs/source/_static/logo-bottom-dark-bg.svg
create mode 100644 docs/source/_static/logo-bottom-light-bg.svg
create mode 100644 docs/source/reference.rst
delete mode 100644 docs/source/specification.rst
diff --git a/README.md b/README.md
index 1993061..494857a 100644
--- a/README.md
+++ b/README.md
@@ -18,10 +18,10 @@ ICortex is a [Jupyter kernel](https://jupyter-client.readthedocs.io/en/latest/ke
- sets of instructions (i.e. prompts) [written in natural language](https://en.wikipedia.org/wiki/Natural-language_programming) (such as English)
- given to language models that generate Python code
-- to perfom useful work in various contexts
+- to perform useful work in various contexts
- more flexibly than regular software.
-To put simply, in goes English, out comes Python:
+To put it simply—in goes English, out comes Python:
https://user-images.githubusercontent.com/2453968/196814906-1a0de2a1-27a7-4aec-a960-0eb21fbe2879.mp4
@@ -35,16 +35,16 @@ ICortex is ...
It is similar to [GitHub Copilot](https://github.com/features/copilot) but with certain differences that make it stand out:
-| | GitHub Copilot | ICortex |
-|---|---|---|
-| Generates code ... | in the text editor | in a [Jupyter kernel](https://docs.jupyter.org/en/latest/projects/kernels.html) (language backend that provides the execution environment) |
-| From ... | existing code and comments | plain English prompts |
+| Feature | GitHub Copilot | ICortex |
+|---|:---:|:---:|
+| Generates code ... | In the text editor | In a [Jupyter kernel](https://docs.jupyter.org/en/latest/projects/kernels.html) (language backend that provides the execution environment) |
+| From ... | Existing code and comments | Plain English prompts |
| Level of control over context used to generate code | Low | High |
-| Plain language instructions are ... | just comments | standalone programs |
-| The resulting program is ... | static | dynamic—can adapt to the context it is executed in |
+| Plain language instructions are ... | Just comments | Standalone programs |
+| The resulting program is ... | Static | Dynamic—adapts to the context it is executed in |
| Can connect to different code generation APIs | No | Yes |
-In other words, the main difference between ICortex and a code-generation plugin like GitHub Copilot is that ICortex is its own programming paradigm similar to [literate programming](https://en.wikipedia.org/wiki/Literate_programming), where the natural language prompt is the first-class citizen, and which allows for fine-grained control over the code-generation context.
+The main difference between ICortex and a code-generation plugin like GitHub Copilot is that ICortex is a programming paradigm similar to [literate programming](https://en.wikipedia.org/wiki/Literate_programming) or [natural language programming](https://en.wikipedia.org/wiki/Natural-language_programming), where the natural language prompt is the first-class citizen, and which allows for fine-grained control over the code-generation context.
ICortex is currently in alpha, so expect breaking changes. We are giving free credits to our first users—[join our Discord](https://discord.textcortex.com/) to help us shape it.
@@ -54,9 +54,9 @@ Install directly from PyPI:
```sh
pip install icortex
-# This line is needed to install the kernel spec to Jupyter
+# This line is needed to install the kernel spec to Jupyter:
python -m icortex.kernel.install
-# Alternatively, running icortex directly also installs the kernel spec
+# Alternatively, running icortex directly also installs the kernel spec:
icortex
```
@@ -94,8 +94,6 @@ If you use up the starter credits and would like to continue testing out ICortex
You can also try out different services, e.g. OpenAI's Codex API, if you have access. You can run code generation models from HuggingFace locally, which we have optimized to run on the CPU—though these produce lower quality outputs due to being smaller.
-## Usage
-
### Executing prompts
To execute a prompt with ICortex, use the `%prompt` [magic command](https://ipython.readthedocs.io/en/stable/interactive/magics.html) (or `%p` for short) as a prefix. Copy and paste the following prompt into a cell and try to run it:
diff --git a/docs/source/_static/logo-bottom-dark-bg.svg b/docs/source/_static/logo-bottom-dark-bg.svg
new file mode 100644
index 0000000..bcc8810
--- /dev/null
+++ b/docs/source/_static/logo-bottom-dark-bg.svg
@@ -0,0 +1,141 @@
+
+
+
diff --git a/docs/source/_static/logo-bottom-light-bg.svg b/docs/source/_static/logo-bottom-light-bg.svg
new file mode 100644
index 0000000..d6e6f42
--- /dev/null
+++ b/docs/source/_static/logo-bottom-light-bg.svg
@@ -0,0 +1,141 @@
+
+
+
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 9f74933..8f5e8d3 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -3,18 +3,26 @@
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
+import icortex
+
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "ICortex"
copyright = "2022, TextCortex Team"
author = "TextCortex Team"
-release = "0.0.3"
+# release = "0.0.3"
+release = icortex.__version__
+html_title = f"ICortex v{icortex.__version__} docs"
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
-extensions = ["sphinxcontrib.video"]
+extensions = [
+ "sphinxcontrib.video",
+ "sphinx.ext.autodoc",
+ "sphinx.ext.autosummary",
+]
templates_path = ["_templates"]
exclude_patterns = []
@@ -25,3 +33,36 @@
html_theme = "furo"
html_static_path = ["_static"]
+
+html_theme_options = {
+ "source_repository": "https://github.com/textcortex/ICortex/",
+ "source_branch": "main",
+ "source_directory": "docs/source/",
+ "top_of_page_button": None,
+ "light_logo": "logo-bottom-light-bg.svg",
+ "dark_logo": "logo-bottom-dark-bg.svg",
+ # "light_css_variables": {
+ # "color-content-foreground": "#000000",
+ # "color-background-primary": "#ffffff",
+ # "color-background-border": "#ffffff",
+ # "color-sidebar-background": "#f8f9fb",
+ # "color-brand-content": "#1c00e3",
+ # "color-brand-primary": "#192bd0",
+ # "color-link": "#c93434",
+ # "color-link--hover": "#5b0000",
+ # "color-inline-code-background": "#f6f6f6;",
+ # "color-foreground-secondary": "#000",
+ # },
+ # "dark_css_variables": {
+ # "color-content-foreground": "#ffffffd9",
+ # "color-background-primary": "#131416",
+ # "color-background-border": "#303335",
+ # "color-sidebar-background": "#1a1c1e",
+ # "color-brand-content": "#2196f3",
+ # "color-brand-primary": "#007fff",
+ # "color-link": "#51ba86",
+ # "color-link--hover": "#9cefc6",
+ # "color-inline-code-background": "#262626",
+ # "color-foreground-secondary": "#ffffffd9",
+ # },
+}
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 2cd148f..48cf27c 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -7,22 +7,28 @@ ICortex
=======
-ICortex is a `Jupyter kernel `__
-that lets you program using plain English, by generating Python
-code from natural language prompts:
+ICortex is a `Jupyter
+kernel `__
+that lets you develop **soft programs**:
+
+- sets of instructions (i.e. prompts) `written in natural
+ language `__
+ (such as English)
+- given to language models that generate Python code
+- to perform useful work in various contexts
+- more flexibly than regular software.
+
+To put it simply—in goes English, out comes Python:
.. video:: https://user-images.githubusercontent.com/2453968/196814906-1a0de2a1-27a7-4aec-a960-0eb21fbe2879.mp4
:width: 640
-It is …
+ICortex is …
-- a drop-in replacement for the IPython kernel. Prompts start with a
- forward slash ``/``—otherwise the line is treated as regular Python
- code.
-- an interface for `Natural Language
- Programming `__—prompts
- written in plain English automatically generate
- Python code which can then be executed globally.
+- a drop-in replacement for the IPython kernel. Prompts can be executed
+ with the `magic
+ commands `__
+ ``%prompt`` or ``%p`` for short.
- interactive—install missing packages directly, decide whether to
execute the generated code or not, and so on, directly in the Jupyter
Notebook cell.
@@ -45,6 +51,8 @@ Index
:maxdepth: 2
quickstart
+ learn_more
+ reference
.. * :ref:`genindex`
.. * :ref:`modindex`
diff --git a/docs/source/quickstart.rst b/docs/source/quickstart.rst
index e6f22cf..d1f14e3 100644
--- a/docs/source/quickstart.rst
+++ b/docs/source/quickstart.rst
@@ -2,5 +2,142 @@
Quickstart
==========
+Installation
+------------
-Some stuff
\ No newline at end of file
+Install ICortex from PyPI, along with JupyterLab:
+
+.. code:: sh
+
+ pip install icortex jupyterlab
+
+ # This line is needed to install the kernel spec to Jupyter:
+ python -m icortex.kernel.install
+
+ # Alternatively, running icortex in the terminal also installs the kernel spec:
+ icortex
+
+Using ICortex
+-------------
+
+Create a directory for your new project, and start JupyterLab:
+
+::
+
+ mkdir new-icortex-project
+ cd new-icortex-project/
+ jupyter lab
+
+Once JupyterLab is up and running, create a new notebook that uses ICortex. (If you don't see ICortex in the list of available kernels, you may have skipped kernel installation above—run ``python -m icortex.kernel.install`` and restart JupyterLab. If you still don't see ICortex there, `create a new installation issue on GitHub `__.)
+
+In the new notebook, run the following in the first cell:
+
+::
+
+ %icortex init
+
+ICortex will then instruct you step by step and create a configuration
+file ``icortex.toml`` in your project directory.
+
+Alternatively, you can run the following in the terminal to configure ICortex directly without JupyterLab:
+
+.. code:: bash
+
+ icortex init
+
+Choosing a code generation service
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+ICortex supports different code generation services such as the
+TextCortex API, OpenAI Codex API, local HuggingFace transformers, and so
+on.
+
+To use the TextCortex code generation API,
+
+1. `sign up on the website `__,
+2. `generate an API key on the
+ dashboard `__,
+3. and proceed to configure ``icortex`` for your current project:
+
+If you use up the starter credits and would like to continue testing out
+ICortex, `hit us up on our Discord on #icortex
+channel `__ and we will charge your
+account with more free credits.
+
+You can also try out different services e.g. OpenAI's Codex API, if you
+have access. You can run code generation models from HuggingFace
+locally, which we have optimized to run on the CPU—though these produce
+lower quality outputs due to being smaller.
+
+Executing prompts
+~~~~~~~~~~~~~~~~~
+
+To execute a prompt with ICortex, use the ``%prompt`` `magic
+command `__
+(or ``%p`` for short) as a prefix. Copy and paste the following prompt
+into a cell and try to run it:
+
+::
+
+ %p print Hello World. Then print the Fibonacci numbers till 100
+
+Depending on the response, you should see an output similar to the
+following:
+
+::
+
+ print('Hello World.', end=' ')
+ a, b = 0, 1
+ while b < 100:
+ print(b, end=' ')
+ a, b = b, a+b
+
+ Hello World.
+ 1 1 2 3 5 8 13 21 34 55 89
+
+You can also specify variables or options with command line flags,
+e.g. to auto-install packages, auto-execute the returned code and so on.
+To see the complete list of variables for your chosen service, run:
+
+::
+
+ %help
+
+Using ICortex CLI
+~~~~~~~~~~~~~~~~~
+
+ICortex comes with a full-fledged CLI similar to git or Docker CLI,
+which you can use to configure how you generate code in your project. To
+see all the commands you can invoke, run
+
+::
+
+ icortex help
+
+For example the command ``icortex service`` lets you configure the code
+generation service you would like to use. To see how to use each
+command, call them with ``help``
+
+::
+
+ icortex service help
+
+Accessing ICortex CLI inside Jupyter
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+You can still access the ``icortex`` CLI in a Jupyter Notebook or shell
+by using the magic command ``%icortex``. For example running the
+following in the terminal switches to a local HuggingFace model:
+
+::
+
+ icortex service set huggingface
+
+To do the same in a Jupyter Notebook, you can run
+
+::
+
+ %icortex service set huggingface
+
+in a cell, which initializes and switches to the new service directly in
+your Jupyter session.
\ No newline at end of file
diff --git a/docs/source/reference.rst b/docs/source/reference.rst
new file mode 100644
index 0000000..7898a14
--- /dev/null
+++ b/docs/source/reference.rst
@@ -0,0 +1,32 @@
+
+Reference
+=========
+
+ICortex is a `Jupyter kernel `__ that provides a `soft-programming `__ environment, allowing anyone to create programs with informal and imprecise language. Prompts written in natural language are used to generate Python code at runtime.
+
+ICortex overloads the regular `IPython kernel `__ with `magic commands `__ that provide code-generation capabilities and fine-grained control over generation context. As such, ICortex can run an existing IPython notebook without any compatibility issues. Cells that contain regular Python are executed in the global scope as usual.
+
+The history of code generation and execution is saved in hidden variables and is used to construct the context for each new code generation. In other words, each API call to generate code in a cell contains information about previously run prompts, executed cells, their outputs, and other metadata related to the notebook.
+
+Magic commands
+--------------
+
+``%prompt`` or ``%p``
+~~~~~~~~~~~~~~~~~~~~~
+
+The ``%prompt`` magic is used to generate Python code from a natural language prompt:
+
+.. code:: ipython
+
+ %prompt This is a prompt and will be used to generate code
+
+
+.. code:: python
+
+
+
+.. currentmodule:: icortex
+
+.. automodule:: icortex
+ :members:
+ :undoc-members:
\ No newline at end of file
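The context construction that reference.rst describes (prompt history, executed cells, and their outputs feeding each new generation request) can be sketched roughly as follows. Every field name here is an assumption for illustration, not the real ICortex payload:

```python
# Hypothetical sketch of packing notebook history into a generation request.
# None of these field names reflect the actual ICortex schema.

def build_context(history, new_prompt, max_items=10):
    """Bundle the most recent prompts/cells with the new prompt."""
    return {
        "prompt": new_prompt,
        "history": history[-max_items:],  # prior prompts, code, and outputs
    }

history = [
    {"type": "prompt", "text": "print Hello World", "output": "Hello World."},
    {"type": "code", "source": "a, b = 0, 1", "output": ""},
]
request = build_context(history, "print the Fibonacci numbers till 100")
print(len(request["history"]))  # 2
```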
diff --git a/docs/source/specification.rst b/docs/source/specification.rst
deleted file mode 100644
index f1e1d1a..0000000
--- a/docs/source/specification.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-
-ICortex Specification
-=====================
-
-Some stuff
\ No newline at end of file
From 1984003edc8259eb79fe4f3357b7f113869ac08f Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 00:25:01 +0100
Subject: [PATCH 12/21] wip
---
README.md | 2 +-
docs/source/conf.py | 3 +
docs/source/index.rst | 6 +-
docs/source/quickstart.rst | 185 ++++++++++++++++++++-----------
icortex/services/service_base.py | 6 +-
poetry.lock | 55 ++++++---
pyproject.toml | 1 +
7 files changed, 168 insertions(+), 90 deletions(-)
diff --git a/README.md b/README.md
index 494857a..f22d9de 100644
--- a/README.md
+++ b/README.md
@@ -17,7 +17,7 @@
ICortex is a [Jupyter kernel](https://jupyter-client.readthedocs.io/en/latest/kernels.html) that lets you develop **soft programs**:
- sets of instructions (i.e. prompts) [written in natural language](https://en.wikipedia.org/wiki/Natural-language_programming) (such as English)
-- given to language models that generate Python code
+- processed by language models that generate Python code
- to perform useful work in various contexts
- more flexibly than regular software.
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 8f5e8d3..09f3f99 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -22,6 +22,7 @@
"sphinxcontrib.video",
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
+ "sphinx_copybutton",
]
templates_path = ["_templates"]
@@ -41,6 +42,7 @@
"top_of_page_button": None,
"light_logo": "logo-bottom-light-bg.svg",
"dark_logo": "logo-bottom-dark-bg.svg",
+ "announcement": "TextCortex loves Open Source ❤️ Join our Discord to become part of the community!",
# "light_css_variables": {
# "color-content-foreground": "#000000",
# "color-background-primary": "#ffffff",
@@ -66,3 +68,4 @@
# "color-foreground-secondary": "#ffffffd9",
# },
}
+
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 48cf27c..865e1d4 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -14,7 +14,7 @@ that lets you develop **soft programs**:
- sets of instructions (i.e. prompts) `written in natural
language `__
(such as English)
-- given to language models that generate Python code
+- processed by language models that generate Python code
- to perform useful work in various contexts
- more flexibly than regular software.
@@ -40,9 +40,9 @@ ICortex is …
Get started
-----------
-Visit :doc:`Quickstart` to see all the ways you can start using ICortex.
+Visit :doc:`Quickstart` to get started with ICortex.
-If you are experiencing any problems or bugs, `join our Discord `__ to get help.
+If you are experiencing any issues or have found a bug, `join our Discord `__ to get help.
Index
-----
diff --git a/docs/source/quickstart.rst b/docs/source/quickstart.rst
index d1f14e3..0069203 100644
--- a/docs/source/quickstart.rst
+++ b/docs/source/quickstart.rst
@@ -10,16 +10,23 @@ Install ICortex from PyPI, along with JupyterLab:
.. code:: sh
pip install icortex jupyterlab
-
- # This line is needed to install the kernel spec to Jupyter:
python -m icortex.kernel.install
- # Alternatively, running icortex in the terminal also installs the kernel spec:
- icortex
+The second line installs the kernel spec into Jupyter; otherwise, the ICortex kernel may not appear in JupyterLab. To confirm that the kernel spec is installed, run:
+
+.. code:: sh
+
+ jupyter kernelspec list
+
+ICortex should be visible in the list of available kernels.
Using ICortex
-------------
+Start JupyterLab
+~~~~~~~~~~~~~~~~
+
+
Create a directory for your new project, and start JupyterLab:
::
@@ -28,7 +35,10 @@ Create a directory for your new project, and start JupyterLab:
cd new-icortex-project/
jupyter lab
-Once JupyterLab is up and running, create a new notebook that uses ICortex. (If you don't see ICortex in the list of available kernels, you may have skipped kernel installation above—run ``python -m icortex.kernel.install`` and restart JupyterLab. If you still don't see ICortex there, `create a new installation issue on GitHub `__.)
+Once JupyterLab is up and running, create a new notebook that uses ICortex.
+
+.. important::
+ If you don't see ICortex in the list of available kernels, you may have skipped kernel installation above—run ``python -m icortex.kernel.install`` and restart JupyterLab. If you still don't see ICortex there, `create a new installation issue on GitHub `__.
In the new notebook, run the following in the first cell:
@@ -36,108 +46,153 @@ In the new notebook, run in the first cell:
%icortex init
+.. note::
+ To run a cell, press Shift+Enter, or click the play symbol ▶ above.
+
ICortex will then instruct you step by step and create a configuration
file ``icortex.toml`` in your project directory.
-Alternatively, you can run the following in the terminal to configure ICortex directly without JupyterLab:
+.. note::
+ If you are not working in JupyterLab, you can run ``icortex init`` directly in the terminal to configure ICortex in your project directory.
+
+After running ``%icortex init`` you should see the following message:
+
+::
-.. code:: bash
+ Which code generation service would you like to use?
+ Variables: textcortex, huggingface, openai
+ Default [textcortex]
- icortex init
-Choosing a code generation service
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ICortex supports different code generation services such as the TextCortex API, OpenAI Codex API, local HuggingFace transformers, and so on. We recommend that you start with TextCortex. It is already selected by default—**press Enter once** to move on to the next step.
-ICortex supports different code generation services such as the
-TextCortex API, OpenAI Codex API, local HuggingFace transformers, and so
-on.
+In the next step, the dialog will ask whether to use the default parameters for TextCortex's code generation service:
-To use the TextCortex code generation API,
+::
-1. `sign up on the website `__,
-2. `generate an API key on the
- dashboard `__,
-3. and proceed to configure ``icortex`` for your current project:
+ Use default variable values? [Y/n]
-If you use up the starter credits and would like to continue testing out
-ICortex, `hit us up on our Discord on #icortex
-channel `__ and we will charge your
-account with more free credits.
+**Press Enter once** to choose 'yes' and use the default values. You should see the following message:
-You can also try out different services e.g. OpenAI's Codex API, if you
-have access. You can run code generation models from HuggingFace
-locally, which we have optimized to run on the CPU—though these produce
-lower quality outputs due to being smaller.
+.. code:: text
-Executing prompts
-~~~~~~~~~~~~~~~~~
+ api_key (If you don't have an API key already, generate one at
+ https://app.textcortex.com/user/dashboard/settings/api-key)
-To execute a prompt with ICortex, use the ``%prompt`` `magic
-command `__
-(or ``%p`` for short) as a prefix. Copy and paste the following prompt
-into a cell and try to run it:
-::
+Do not type anything yet, and proceed to the next section.
+
+Create a TextCortex account
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+You need a TextCortex account to connect to TextCortex's code generation API.
+
+|signup_link|.
- %p print Hello World. Then print the Fibonacci numbers till 100
+.. note::
+ This does not require you to enter any payment information and your account will receive free credits to try out the service. If you already have an account, you can skip this step.
-Depending on the response, you should see an output similar to the
-following:
+Next, |api_key_link|.
+
+Copy your API key from the dashboard, go back to the Jupyter notebook where you initially ran ``%icortex init``, and paste it in the dialog where it was asked for. Press Enter to continue.
+
+You should finally see:
::
+ Set service to textcortex successfully.
+
+🎉 Congratulations! ICortex is configured for your current project.
+
+.. note::
+
+ If you use up the starter credits and would like to continue testing out
+ ICortex, `hit us up on our Discord on #icortex channel `__, and we will provide your account with more free credits.
+
+
+Generate your first code
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+ICortex uses the standard IPython `magic
+command `__ syntax—i.e. commands that are prefixed with ``%`` and ``%%``—for various operations, such as generating code from prompts.
+
+The ``%prompt`` magic command is used to generate Python code. Copy and paste the following prompt into a cell and try to run it:
+
+.. code:: text
+
+ %prompt print Hello World. Then print the Fibonacci numbers till 100
+
+The response may vary, but you should see an output similar to the following:
+
+.. code:: python
+
print('Hello World.', end=' ')
a, b = 0, 1
while b < 100:
print(b, end=' ')
a, b = b, a+b
+ Proceed to execute? [Y/n]
+
+ICortex printed the code generated by the API and is asking whether it should execute it. Press Enter to choose 'yes':
+
+.. code:: text
+
Hello World.
1 1 2 3 5 8 13 21 34 55 89
-You can also specify variables or options with command line flags,
-e.g. to auto-install packages, auto-execute the returned code and so on.
-To see the complete list of variables for your chosen service, run:
+🎉 Congratulations! You have generated your first Python code using ICortex.
-::
+.. important::
+ ICortex executes the generated code in the notebook's namespace, so any new variable assigned in the generated code becomes immediately available for access in new notebook cells. Try to print any such variables in a new cell:
- %help
+ .. code:: python
-Using ICortex CLI
-~~~~~~~~~~~~~~~~~
+ print(a, b)
-ICortex comes with a full-fledged CLI similar to git or Docker CLI,
-which you can use to configure how you generate code in your project. To
-see all the commands you can invoke, run
+ If your generated code has the same variable names, then this should return:
-::
+ .. code:: text
- icortex help
+ 89, 144
-For example the command ``icortex service`` lets you configure the code
-generation service you would like to use. To see how to use each
-command, call them with ``help``
+.. important::
+ Try to run the cell that starts with ``%prompt ...`` again. You might notice that the response was faster than the first time you ran it. That is because ICortex caches API responses in a file called ``cache.json`` in your project directory, and uses the cache to serve previous responses to identical requests. This helps you avoid unnecessary costs when you re-run the notebook from scratch.
-::
+ To override this default behavior, you can use the ``-r`` or ``--regenerate`` flag at the end of your prompts. This ensures that the TextCortex API is called every time the prompt is run.
- icortex service help
+.. note::
+ ICortex adheres to the POSIX argument syntax as implemented by the `Python argparse library `__, and provides various command line flags you can use to, e.g., auto-install missing packages, auto-execute the returned code, and so on. Moreover, each new code generation service can easily implement its own flags.
+ To see the complete list of options available to your chosen service, run:
-Accessing ICortex CLI inside Jupyter
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ .. code:: text
-You can still access the ``icortex`` CLI in a Jupyter Notebook or shell
-by using the magic command ``%icortex``. For example running the
-following in the terminal switches to a local HuggingFace model:
+ %help
-::
+ which should print out:
- icortex service set huggingface
+ .. code:: text
-To do the same in a Jupyter Notebook, you can run
+ usage: %prompt your prompt goes here [-e] [-r] [-i] [-p] ...
-::
+ TextCortex Python code generator
+
+ positional arguments:
+ prompt The prompt that describes what the generated Python
+ code should perform.
+
+ options:
+ -e, --execute Execute the Python code returned by the TextCortex API.
+ -r, --regenerate Make the kernel ignore cached responses and make a new
+ request to TextCortex API.
+ ...
+
+ and so on.
+
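The flag-parsing behavior summarized above can be sketched with stock ``argparse``. This is an illustrative parser, not ICortex's actual one; only the ``-e``/``-r`` flags and the usage line are taken from the help text shown above.

```python
import argparse

# Minimal sketch of a prompt parser in the spirit of the %prompt magic:
# a free-form natural-language positional plus POSIX-style option flags.
parser = argparse.ArgumentParser(prog="%prompt", add_help=False)
parser.add_argument("prompt", nargs="*", help="what the generated code should do")
parser.add_argument("-e", "--execute", action="store_true")
parser.add_argument("-r", "--regenerate", action="store_true")

args = parser.parse_args("print the first ten primes -e -r".split())
print(" ".join(args.prompt))   # the natural-language part
print(args.execute, args.regenerate)
```

Trailing flags are stripped from the prompt text by the parser, which is what lets a single cell line carry both the instruction and its options.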
+.. |signup_link| raw:: html
+
+ Click here to sign up on the website
- %icortex service set huggingface
+.. |api_key_link| raw:: html
-in a cell, which initializes and switches to the new service directly in
-your Jupyter session.
\ No newline at end of file
+ click here to visit the dashboard and generate an API key
\ No newline at end of file
diff --git a/icortex/services/service_base.py b/icortex/services/service_base.py
index af76f0f..226e5cb 100644
--- a/icortex/services/service_base.py
+++ b/icortex/services/service_base.py
@@ -81,14 +81,14 @@ def __init__(self, config: t.Dict):
"--execute",
action="store_true",
required=False,
- help="Execute the Python code returned by the TextCortex API in the same cell.",
+ help="Execute the Python code returned by TextCortex API.",
)
self.prompt_parser.add_argument(
"-r",
"--regenerate",
action="store_true",
required=False,
- help="Make the kernel ignore cached responses and makes a new request to the TextCortex API.",
+ help="Make the kernel ignore cached responses and make a new request to TextCortex API.",
)
self.prompt_parser.add_argument(
"-i",
@@ -118,7 +118,7 @@ def __init__(self, config: t.Dict):
required=False,
help="Do not print the generated code.",
)
- self.prompt_parser.usage = "%p your prompt goes here [-e] [-r] [-i] [-p] ..."
+ self.prompt_parser.usage = "%%prompt your prompt goes here [-e] [-r] [-i] [-p] ..."
self.prompt_parser.description = self.description
diff --git a/poetry.lock b/poetry.lock
index 0dc82c5..a3acd27 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -2,7 +2,7 @@
name = "alabaster"
version = "0.7.12"
description = "A configurable sidebar-enabled Sphinx theme"
-category = "dev"
+category = "main"
optional = false
python-versions = "*"
@@ -46,7 +46,7 @@ tests_no_zope = ["cloudpickle", "coverage[toml] (>=5.0.2)", "hypothesis", "mypy
name = "Babel"
version = "2.10.3"
description = "Internationalization utilities"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.6"
@@ -134,7 +134,7 @@ python-versions = ">=3.5"
name = "docutils"
version = "0.19"
description = "Docutils -- Python Documentation Utilities"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.7"
@@ -194,7 +194,7 @@ python-versions = ">=3.5"
name = "imagesize"
version = "1.4.1"
description = "Getting image size from png/jpeg/jpeg2000/gif file"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
@@ -202,7 +202,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
name = "importlib-metadata"
version = "5.0.0"
description = "Read metadata from Python packages"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.7"
@@ -319,7 +319,7 @@ testing = ["Django (<3.1)", "colorama", "docopt", "pytest (<7.0.0)"]
name = "Jinja2"
version = "3.1.2"
description = "A very fast and expressive template engine."
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.7"
@@ -395,7 +395,7 @@ python-versions = ">=3.7"
name = "MarkupSafe"
version = "2.1.1"
description = "Safely add untrusted strings to HTML/XML markup."
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.7"
@@ -594,7 +594,7 @@ six = ">=1.5"
name = "pytz"
version = "2022.5"
description = "World timezone definitions, modern and historical"
-category = "dev"
+category = "main"
optional = false
python-versions = "*"
@@ -648,7 +648,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
name = "snowballstemmer"
version = "2.2.0"
description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms."
-category = "dev"
+category = "main"
optional = false
python-versions = "*"
@@ -664,7 +664,7 @@ python-versions = ">=3.6"
name = "Sphinx"
version = "5.3.0"
description = "Python documentation generator"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.6"
@@ -706,11 +706,26 @@ sphinx = ">=4.0"
[package.extras]
docs = ["furo", "ipython", "myst-parser", "sphinx-copybutton", "sphinx-inline-tabs"]
+[[package]]
+name = "sphinx-copybutton"
+version = "0.5.0"
+description = "Add a copy button to each of your code cells."
+category = "main"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+sphinx = ">=1.8"
+
+[package.extras]
+code_style = ["pre-commit (==2.12.1)"]
+rtd = ["ipython", "myst-nb", "sphinx", "sphinx-book-theme"]
+
[[package]]
name = "sphinxcontrib-applehelp"
version = "1.0.2"
description = "sphinxcontrib-applehelp is a sphinx extension which outputs Apple help books"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.5"
@@ -722,7 +737,7 @@ test = ["pytest"]
name = "sphinxcontrib-devhelp"
version = "1.0.2"
description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.5"
@@ -734,7 +749,7 @@ test = ["pytest"]
name = "sphinxcontrib-htmlhelp"
version = "2.0.0"
description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.6"
@@ -746,7 +761,7 @@ test = ["html5lib", "pytest"]
name = "sphinxcontrib-jsmath"
version = "1.0.1"
description = "A sphinx extension which renders display math in HTML via JavaScript"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.5"
@@ -757,7 +772,7 @@ test = ["flake8", "mypy", "pytest"]
name = "sphinxcontrib-qthelp"
version = "1.0.3"
description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.5"
@@ -769,7 +784,7 @@ test = ["pytest"]
name = "sphinxcontrib-serializinghtml"
version = "1.1.5"
description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.5"
@@ -873,7 +888,7 @@ python-versions = ">=3.7"
name = "zipp"
version = "3.10.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
-category = "dev"
+category = "main"
optional = false
python-versions = ">=3.7"
@@ -888,7 +903,7 @@ openai = []
[metadata]
lock-version = "1.1"
python-versions = ">=3.8,<3.11"
-content-hash = "1c83d01d682c42c7e08d734eaf142ede766d3f44966e5d9d45532b12d838ef8e"
+content-hash = "fe29011fe6b139c5405b6d1522047901bf4bd48768a3eef771dbb556970ce7d8"
[metadata.files]
alabaster = [
@@ -1361,6 +1376,10 @@ sphinx-basic-ng = [
{file = "sphinx_basic_ng-1.0.0b1-py3-none-any.whl", hash = "sha256:ade597a3029c7865b24ad0eda88318766bcc2f9f4cef60df7e28126fde94db2a"},
{file = "sphinx_basic_ng-1.0.0b1.tar.gz", hash = "sha256:89374bd3ccd9452a301786781e28c8718e99960f2d4f411845ea75fc7bb5a9b0"},
]
+sphinx-copybutton = [
+ {file = "sphinx-copybutton-0.5.0.tar.gz", hash = "sha256:a0c059daadd03c27ba750da534a92a63e7a36a7736dcf684f26ee346199787f6"},
+ {file = "sphinx_copybutton-0.5.0-py3-none-any.whl", hash = "sha256:9684dec7434bd73f0eea58dda93f9bb879d24bff2d8b187b1f2ec08dfe7b5f48"},
+]
sphinxcontrib-applehelp = [
{file = "sphinxcontrib-applehelp-1.0.2.tar.gz", hash = "sha256:a072735ec80e7675e3f432fcae8610ecf509c5f1869d17e2eecff44389cdbc58"},
{file = "sphinxcontrib_applehelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:806111e5e962be97c29ec4c1e7fe277bfd19e9652fb1a4392105b43e01af885a"},
diff --git a/pyproject.toml b/pyproject.toml
index 77b5f84..039dd08 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -71,6 +71,7 @@ huggingface = [
[tool.poetry.group.dev.dependencies]
Sphinx = "^5.3.0"
sphinxcontrib-video = "^0.0.1.dev3"
+sphinx-copybutton = "^0.5.0"
furo = "^2022.9.29"
pytest = "^7.1.3"
From 3f7f1895013ad9d90a6b307436aee55ef43ea153 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 10:25:52 +0100
Subject: [PATCH 13/21] Minor
---
docs/source/conf.py | 2 +-
docs/source/quickstart.rst | 11 ++++++++---
icortex/context.py | 4 ++++
icortex/services/service_base.py | 2 +-
4 files changed, 14 insertions(+), 5 deletions(-)
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 09f3f99..4c38fa7 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -42,7 +42,7 @@
"top_of_page_button": None,
"light_logo": "logo-bottom-light-bg.svg",
"dark_logo": "logo-bottom-dark-bg.svg",
- "announcement": "TextCortex loves Open Source ❤️ Join our Discord to become part of the community!",
+ "announcement": "TextCortex loves Open Source ❤️ Join our Discord to become part of our developer community!",
# "light_css_variables": {
# "color-content-foreground": "#000000",
# "color-background-primary": "#ffffff",
diff --git a/docs/source/quickstart.rst b/docs/source/quickstart.rst
index 0069203..b8fb874 100644
--- a/docs/source/quickstart.rst
+++ b/docs/source/quickstart.rst
@@ -157,7 +157,7 @@ ICortex printed the code generated by the API and is asking whether it should ex
89, 144
.. important::
- Try to run the cell that starts with ``%prompt ...`` again. You might notice that the response was faster than the first time you ran it. That is because ICortex caches API responses in a file called ``cache.json`` in your project directory, and uses the cache to serve previous responses identical requests. This helps you prevent any unnecessary costs in case you would like to run the notebook from scratch.
+ Try to run the cell that starts with ``%prompt ...`` again. You might notice that the response was faster than the first time you ran it. That is because ICortex caches API responses in a file called ``cache.json`` in your project directory, and uses the cache to serve previous responses for identical requests. This helps you prevent any unnecessary costs in case you would like to run the notebook from scratch.
To override this default behavior, you can use the ``-r`` or ``--regenerate`` flag at the end of your prompts. This will ensure that the TextCortex API will be called every time the prompt is run.
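The caching scheme described in this note can be sketched as follows. The key scheme (SHA-256 of the canonical JSON request) and the in-memory dict standing in for ``cache.json`` are assumptions for illustration, not ICortex's exact on-disk format.

```python
import hashlib
import json

# Identical requests are served from a cache so re-running a notebook
# does not trigger new (billable) API calls; -r/--regenerate bypasses it.
def cache_key(request_dict):
    canonical = json.dumps(request_dict, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

cache = {}

def generate_with_cache(request_dict, call_api, regenerate=False):
    key = cache_key(request_dict)
    if not regenerate and key in cache:
        return cache[key]          # cache hit: no API call made
    response = call_api(request_dict)
    cache[key] = response
    return response

calls = []
fake_api = lambda req: calls.append(req) or {"generated_text": "print('hi')"}
req = {"prompt": "say hi"}
generate_with_cache(req, fake_api)
generate_with_cache(req, fake_api)   # second run is served from the cache
print(len(calls))                    # only one real API call happened
```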
@@ -182,12 +182,17 @@ ICortex printed the code generated by the API and is asking whether it should ex
code should perform.
options:
- -e, --execute Execute the Python code returned by the TextCortex API.
+ -e, --execute Execute the Python code returned by the TextCortex API
+ directly.
-r, --regenerate Make the kernel ignore cached responses and make a new
request to TextCortex API.
...
- and so on.
+
+
+Read and analyze a CSV file
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
.. |signup_link| raw:: html
diff --git a/icortex/context.py b/icortex/context.py
index 213e852..30473ec 100644
--- a/icortex/context.py
+++ b/icortex/context.py
@@ -78,7 +78,11 @@ def add_prompt(
ret = {
"cell_type": "code",
+ # It is actually a prompt, but "code" here refers to the Jupyter cell type
"metadata": {
+ # Any ICortex specific information needs to be stored here to
+ # adhere to the Jupyter notebook format
+ "source_type": "prompt", # This tells that the input was a prompt
"service": service_interaction,
},
"source": prompt,
diff --git a/icortex/services/service_base.py b/icortex/services/service_base.py
index 226e5cb..2330311 100644
--- a/icortex/services/service_base.py
+++ b/icortex/services/service_base.py
@@ -81,7 +81,7 @@ def __init__(self, config: t.Dict):
"--execute",
action="store_true",
required=False,
- help="Execute the Python code returned by TextCortex API.",
+ help="Execute the Python code returned by TextCortex API directly.",
)
self.prompt_parser.add_argument(
"-r",
From 869dd8249f4ebec7112c33c006e042a53c26773b Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:12:43 +0100
Subject: [PATCH 14/21] Added docstrings and more documentation
---
docs/source/api.rst | 36 +++++++++++
docs/source/conf.py | 4 +-
docs/source/index.rst | 2 +-
docs/source/learn_more.rst | 4 ++
docs/source/quickstart.rst | 20 +++----
docs/source/reference.rst | 45 ++++++++++----
icortex/config.py | 2 +-
icortex/context.py | 5 +-
icortex/kernel/__init__.py | 4 ++
icortex/services/__init__.py | 17 +++++-
icortex/services/echo.py | 2 +-
icortex/services/huggingface.py | 10 ++--
icortex/services/openai.py | 8 +--
icortex/services/service_base.py | 100 ++++++++++++++++++++++++++++---
icortex/services/textcortex.py | 10 ++--
15 files changed, 214 insertions(+), 55 deletions(-)
create mode 100644 docs/source/api.rst
create mode 100644 docs/source/learn_more.rst
diff --git a/docs/source/api.rst b/docs/source/api.rst
new file mode 100644
index 0000000..307e1a3
--- /dev/null
+++ b/docs/source/api.rst
@@ -0,0 +1,36 @@
+
+API
+===
+
+ICortex tries to inherit from IPython as much as possible and to adhere to already established standards.
+
+Kernel
+~~~~~~~~
+
+.. automodule:: icortex.kernel
+ :members:
+ :show-inheritance:
+
+Services
+~~~~~~~~
+
+This section explains how to add or extend new code generation services.
+
+.. automodule:: icortex.services.service_base
+ :members:
+ :show-inheritance:
+
+.. automodule:: icortex.services
+ :members:
+ :show-inheritance:
+
+.. automodule:: icortex.services.textcortex
+ :members:
+ :show-inheritance:
+
+Context
+~~~~~~~
+
+.. automodule:: icortex.context
+ :members:
+ :show-inheritance:
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 4c38fa7..f73f3e5 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -22,6 +22,9 @@
"sphinxcontrib.video",
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
+ "sphinx.ext.autodoc",
+ "sphinx.ext.coverage",
+ "sphinx.ext.napoleon",
"sphinx_copybutton",
]
@@ -68,4 +71,3 @@
# "color-foreground-secondary": "#ffffffd9",
# },
}
-
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 865e1d4..41cd26a 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -51,8 +51,8 @@ Index
:maxdepth: 2
quickstart
- learn_more
reference
+ api
.. * :ref:`genindex`
.. * :ref:`modindex`
diff --git a/docs/source/learn_more.rst b/docs/source/learn_more.rst
new file mode 100644
index 0000000..39ccd33
--- /dev/null
+++ b/docs/source/learn_more.rst
@@ -0,0 +1,4 @@
+Learn more
+==========
+
+TODO
\ No newline at end of file
diff --git a/docs/source/quickstart.rst b/docs/source/quickstart.rst
index b8fb874..900183e 100644
--- a/docs/source/quickstart.rst
+++ b/docs/source/quickstart.rst
@@ -46,7 +46,7 @@ In the new notebook, run in the first cell:
%icortex init
-.. note::
+.. tip::
To run a cell, press Shift+Enter, or click the play symbol ▶ above.
ICortex will then instruct you step by step and create a configuration
@@ -134,7 +134,7 @@ The response may vary, but you should see an output similar to the following:
Proceed to execute? [Y/n]
-ICortex printed the code generated by the API and is asking whether it should execute it. Press Enter to choose 'yes':
+ICortex printed the code generated by the API and is now asking whether it should be executed. Press Enter to choose 'yes':
.. code:: text
@@ -156,23 +156,19 @@ ICortex printed the code generated by the API and is asking whether it should ex
89, 144
-.. important::
+.. tip::
Try to run the cell that starts with ``%prompt ...`` again. You might notice that the response was faster than the first time you ran it. That is because ICortex caches API responses in a file called ``cache.json`` in your project directory, and uses the cache to serve previous responses for identical requests. This helps you prevent any unnecessary costs in case you would like to run the notebook from scratch.
- To override this default behavior, you can use the ``-r`` or ``--regenerate`` flag at the end of your prompts. This will ensure that the TextCortex API will be called every time the prompt is run.
+ To override the default behavior, you can use the ``-r`` or ``--regenerate`` flag at the end of your prompts. This will ensure that the TextCortex API will be called every time the prompt is run.
.. note::
- ICortex adheres to the POSIX argument syntax as implemented by the `Python argparse library <https://docs.python.org/3/library/argparse.html>`__, and provides various command line flags you can use to e.g. auto-install missing packages, auto-execute the returned code and so on. Moreover, each new code generation service can easily implement their own flags.
- To see the complete list of options available to your chosen service, run:
+ ICortex adheres to the POSIX argument syntax as implemented by the `Python argparse library <https://docs.python.org/3/library/argparse.html>`__, and provides various command line flags you can use to auto-install missing packages, auto-execute the generated code and so on. Moreover, each new code generation service can easily implement their own flags.
+ To see the complete list of options available to your chosen service, run ``%help``:
.. code:: text
%help
- which should print out:
-
- .. code:: text
-
usage: %prompt your prompt goes here [-e] [-r] [-i] [-p] ...
TextCortex Python code generator
@@ -190,8 +186,8 @@ ICortex printed the code generated by the API and is asking whether it should ex
-Read and analyze a CSV file
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
+.. Read and analyze a CSV file
+.. ~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. |signup_link| raw:: html
diff --git a/docs/source/reference.rst b/docs/source/reference.rst
index 7898a14..30b0c70 100644
--- a/docs/source/reference.rst
+++ b/docs/source/reference.rst
@@ -2,31 +2,54 @@
Reference
=========
-ICortex is a `Jupyter kernel <https://jupyter-client.readthedocs.io/en/latest/kernels.html>`__ that provides a `soft-programming <https://en.wikipedia.org/wiki/Soft_computing>`__ environment, allowing anyone to create programs with informal and imprecise language. Prompts written in natural language are used to generate Python code at runtime.
+ICortex is a `Jupyter kernel <https://jupyter-client.readthedocs.io/en/latest/kernels.html>`__ that provides a `soft-programming <https://en.wikipedia.org/wiki/Soft_computing>`__ environment, allowing anyone to create programs with less formal and precise language. Prompts written in natural language are used to generate Python code at runtime.
-ICortex overloads the regular `IPython kernel `__ with `magic commands `__ that provide code-generation capabilities and fine-grained control over generation context. As such, ICortex can run an existing IPython notebook without any compatibility issues. Cells that contain regular Python are executed in the global scope as usual.
+ICortex overloads the regular `IPython kernel `__ with `magic commands `__ that provide code-generation capabilities and fine-grained control over generation context. As such, ICortex can run an existing IPython notebook without any compatibility issues. Cells that contain regular Python are executed in the current namespace as usual.
-The history of code generation and execution is saved in hidden variables and is used to construct the context for each new code generation. In other words, API calls to generate code in a cell contains information about previously ran prompts, executed cells, their outputs and other metadata related to the notebook.
+Code generation and execution history is saved in hidden variables and is used to construct the context for each new code generation. In other words, API calls to generate code in a cell contain information about previously run prompts, executed cells, their outputs and other metadata related to the notebook.
Magic commands
--------------
-``%prompt`` or ``%p``
-~~~~~~~~~~~~~~~~~~~~~
+``%prompt``
+~~~~~~~~~~~
+
+The ``%prompt`` magic is used to describe what the generated code should perform.
+Running the command passes the prompt to
+:func:`ServiceBase.generate() <icortex.services.service_base.ServiceBase.generate>`.
+Each service defines how to parse the prompt and generate code
+individually—each class that derives from ServiceBase
+provides :attr:`prompt_parser <icortex.services.service_base.ServiceBase.prompt_parser>`
+to parse the prompt and retrieve parameter values.
-The ``%prompt`` magic is used
.. code:: ipython
%prompt This is a prompt and will be used to generate code
+The string that follows the prompt is parsed by
+`argparse.ArgumentParser <https://docs.python.org/3/library/argparse.html#argparse.ArgumentParser>`__:
+
+.. code:: ipython
+
+ %prompt You can change how a prompt is run by providing flags -r --max-tokens 128
+
+You can escape flags by surrounding the prompt text in single or double quotes:
+
+.. code:: ipython
+
+ %prompt "This flag -r is parsed verbatim"
-.. code:: python
+Running ``%help`` prints a list of all the options you can use to generate code
+with your chosen service.
+.. tip::
+ ICortex provides the alias magic ``%p``, to let you write
+ prompts faster:
+ .. code :: ipython
-.. currentmodule:: icortex
+ %p This is the same as calling "%prompt ..."
-.. automodule:: icortex
- :members:
- :undoc-members:
\ No newline at end of file
+.. ``%icortex``
+.. ~~~~~~~~~~~~
\ No newline at end of file
diff --git a/icortex/config.py b/icortex/config.py
index b9a6710..15686ad 100644
--- a/icortex/config.py
+++ b/icortex/config.py
@@ -107,7 +107,7 @@ def set_service(self):
service_config = self.dict[service_name]
service_class = get_service(service_name)
- self.kernel.set_service(service_class(service_config))
+ self.kernel.set_service(service_class(**service_config))
return True
def ask_which_service(self) -> str:
diff --git a/icortex/context.py b/icortex/context.py
index 30473ec..9d1d750 100644
--- a/icortex/context.py
+++ b/icortex/context.py
@@ -35,8 +35,9 @@ class ICortexHistory:
"""Interface to construct a history variable in globals for storing
notebook context.
The constructed dict maps to JSON, and the schema is compatible
- with the Jupyter notebook format:
- https://nbformat.readthedocs.io/en/latest/format_description.html"""
+ with the
+ `Jupyter notebook format <https://nbformat.readthedocs.io/en/latest/format_description.html>`__.
+ """
def __init__(self, scope: t.Dict[str, t.Any]):
self.scope = scope
diff --git a/icortex/kernel/__init__.py b/icortex/kernel/__init__.py
index b90ef94..8fed2b5 100644
--- a/icortex/kernel/__init__.py
+++ b/icortex/kernel/__init__.py
@@ -32,6 +32,10 @@
class ICortexKernel(IPythonKernel, SingletonConfigurable):
+ """Class that implements the ICortext kernel. It is basically
+ :class:`ipykernel.ipkernel.IPythonKernel` with magic commands
+ and logic for handling code generation.
+ """
implementation = "ICortex"
implementation_version = __version__
language = "no-op"
diff --git a/icortex/services/__init__.py b/icortex/services/__init__.py
index 11e5d03..53c1567 100644
--- a/icortex/services/__init__.py
+++ b/icortex/services/__init__.py
@@ -11,7 +11,10 @@
DEFAULT_SERVICE,
)
-service_dict = {
+#: A dictionary that maps unique service names to the import paths of
+#: classes that derive from :class:`icortex.services.service_base.ServiceBase`.
+#: Extend this to add new code generation services to ICortex.
+AVAILABLE_SERVICES: t.Dict[str, str] = {
"echo": "icortex.services.echo.EchoService",
"textcortex": "icortex.services.textcortex.TextCortexService",
"openai": "icortex.services.openai.OpenAIService",
@@ -20,7 +23,15 @@
def get_service(name: str) -> t.Type[ServiceBase]:
- path = service_dict[name]
+ """Get the class corresponding a service name
+
+ Args:
+ name (str): Name of the service in :data:`AVAILABLE_SERVICES`
+
+ Returns:
+ Type[ServiceBase]: A class that derives from ServiceBase
+ """
+ path = AVAILABLE_SERVICES[name]
module_path, service_name = path.rsplit(".", 1)
module = importlib.import_module(module_path)
service = module.__dict__[service_name]
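``get_service`` resolves a dotted import path at runtime, exactly as shown in the hunk above. The same mechanics can be demonstrated with a standard-library class so the snippet runs without ICortex installed; ``collections.OrderedDict`` stands in for a real entry such as ``icortex.services.textcortex.TextCortexService``:

```python
import importlib

# Resolve "package.module.Attribute" to the attribute object itself.
def resolve(path):
    module_path, attr_name = path.rsplit(".", 1)
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)

cls = resolve("collections.OrderedDict")
print(cls.__name__)  # OrderedDict
```

Storing import paths as strings (rather than the classes themselves) keeps the registry import-cheap: a service's module is only imported when that service is actually selected.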
@@ -31,7 +42,7 @@ def get_available_services() -> t.List[str]:
# sorted_services = sorted(
# [key for key, val in service_dict.items() if not val.hidden]
# )
- sorted_services = list(sorted(service_dict.keys()))
+ sorted_services = list(sorted(AVAILABLE_SERVICES.keys()))
sorted_services.remove(DEFAULT_SERVICE)
sorted_services = [DEFAULT_SERVICE] + sorted_services
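The ordering logic above (default service first, the rest alphabetical) can be sketched with stand-in data; the service names are those of the registry shown earlier, but the function itself is illustrative:

```python
# Put the default service at the head of an otherwise sorted list,
# mirroring get_available_services() above.
def available(services, default):
    names = sorted(services)
    names.remove(default)
    return [default] + names

print(available({"echo", "textcortex", "openai"}, "textcortex"))
```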
diff --git a/icortex/services/echo.py b/icortex/services/echo.py
index e6a3d4a..cecac49 100644
--- a/icortex/services/echo.py
+++ b/icortex/services/echo.py
@@ -22,7 +22,7 @@ def generate(
self,
prompt: str,
context: t.Dict[str, t.Any] = {},
- ):
+ ) -> t.List[t.Dict[t.Any, t.Any]]:
argv = shlex.split(prompt)
# Remove the module name flag from the prompt
diff --git a/icortex/services/huggingface.py b/icortex/services/huggingface.py
index fbb5432..3f1b5c0 100644
--- a/icortex/services/huggingface.py
+++ b/icortex/services/huggingface.py
@@ -92,8 +92,8 @@ class HuggingFaceAutoService(ServiceBase):
),
}
- def __init__(self, config: t.Dict):
- super(HuggingFaceAutoService, self).__init__(config)
+ def __init__(self, **kwargs: t.Dict):
+ super(HuggingFaceAutoService, self).__init__(**kwargs)
import torch
from transformers import AutoTokenizer
@@ -101,8 +101,8 @@ def __init__(self, config: t.Dict):
self.device = "cuda" if torch.cuda.is_available() else "cpu"
self.token_id_cache = {}
- if "model" in config:
- model_id = config["model"]
+ if "model" in kwargs:
+ model_id = kwargs["model"]
else:
model_id = DEFAULT_MODEL
@@ -134,7 +134,7 @@ def generate(
self,
prompt: str,
context: t.Dict[str, t.Any] = {},
- ):
+ ) -> t.List[t.Dict[t.Any, t.Any]]:
argv = shlex.split(prompt)
# Remove the module name flag from the prompt
diff --git a/icortex/services/openai.py b/icortex/services/openai.py
index e71e9df..c3739d4 100644
--- a/icortex/services/openai.py
+++ b/icortex/services/openai.py
@@ -95,11 +95,11 @@ class OpenAIService(ServiceBase):
),
}
- def __init__(self, config: t.Dict):
- super(OpenAIService, self).__init__(config)
+ def __init__(self, **kwargs: t.Dict):
+ super(OpenAIService, self).__init__(**kwargs)
try:
- self.api_key = config["api_key"]
+ self.api_key = kwargs["api_key"]
openai.api_key = self.api_key
except KeyError:
print(MISSING_API_KEY_MSG)
@@ -109,7 +109,7 @@ def generate(
self,
prompt: str,
context: t.Dict[str, t.Any] = {},
- ):
+ ) -> t.List[t.Dict[t.Any, t.Any]]:
argv = shlex.split(prompt)
diff --git a/icortex/services/service_base.py b/icortex/services/service_base.py
index 2330311..a66f865 100644
--- a/icortex/services/service_base.py
+++ b/icortex/services/service_base.py
@@ -14,11 +14,27 @@ def is_str_repr(s: str):
class ServiceVariable:
+ """A variable for a code generation service
+
+ Args:
+ type_ (t.Any): Variable type.
+ default (t.Any, optional): Default value, should match :data:`type_`.
+ help (str, optional): Help string for the variable. Defaults to "".
+ secret (bool, optional): When set to
+ True, the variable is omitted from caches and the context. Defaults to False.
+ argparse_args (t.List, optional): Args to
+ be given to :func:`ArgumentParser.add_argument`. Defaults to [].
+ argparse_kwargs (t.Dict, optional): Keywords args to
+ be given to :func:`ArgumentParser.add_argument`. Defaults to {}.
+ require_arg (bool, optional): When set to true,
+ the prompt parser will raise an error if the variable is not specified.
+ Defaults to False.
+ """
def __init__(
self,
- type_: t.Any,
+ type_: type,
default: t.Any = None,
- help: str = None,
+ help: str = "",
secret: bool = False,
argparse_args: t.List = [],
argparse_kwargs: t.Dict = {},
@@ -58,14 +74,50 @@ def set_help(self, help: str):
class ServiceBase(ABC):
+ """Abstract base class for interfacing a code generation service.
+ Its main purpose is to provide a flexible API for connecting user
+ prompts with whatever logic the service
+ provider might choose to implement. User prompts adhere to
+ POSIX argument syntax and are parsed with
+ `argparse `__.
+
+ To create a new service:
+
+ - Assign a unique name to :attr:`name`
+ - Add your class to the dict :data:`icortex.services.AVAILABLE_SERVICES`.
+ Use :attr:`name` as the key and don't forget to include module information.
+ - Determine the parameters that the service will use for code generation and add
+ them to :attr:`variables`.
+ - Implement :func:`generate`.
+
+ Check out :class:`icortex.services.textcortex.TextCortexService` as a
+ reference implementation.
+
+ Attributes
+ ----------
+ variables: Dict[str, ServiceVariable]
+ A dict that maps variable names to :class:`ServiceVariable` s.
+ name: str
+ A unique name.
+ description: str
+ Description string.
+ prompt_parser: argparse.ArgumentParser
+ Parser to parse the prompts.
+ """
+
name: str = "base"
description: str = "Base class for a code generation service"
# Each child class will need to add their specific arguments
# by extending `variables`
variables: t.Dict[str, ServiceVariable] = {}
+ # This has stopped working, fix
hidden: bool = False
- def __init__(self, config: t.Dict):
+ def __init__(self, **kwargs: t.Dict[str, t.Any]):
+ """Classes that derive from ServiceBase are always initialized with
+ keyword arguments that contain values for the service variables.
+ The values can come from the user's service configuration; variables left unspecified fall back to their defaults.
+ """
# Create the prompt parser and add default arguments
self.prompt_parser = argparse.ArgumentParser(
add_help=False,
@@ -118,7 +170,9 @@ def __init__(self, config: t.Dict):
required=False,
help="Do not print the generated code.",
)
- self.prompt_parser.usage = "%%prompt your prompt goes here [-e] [-r] [-i] [-p] ..."
+ self.prompt_parser.usage = (
+ "%%prompt your prompt goes here [-e] [-r] [-i] [-p] ..."
+ )
self.prompt_parser.description = self.description
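The steps listed in the ``ServiceBase`` docstring above can be sketched in a self-contained way. Everything below is a simplified stand-in: ``variables`` holds plain default values rather than ``ServiceVariable`` objects, and the echo service is far simpler than ICortex's real classes.

```python
from abc import ABC, abstractmethod

# Minimal version of the service pattern: a unique name, a variables
# dict, and a generate() implementation registered under that name.
class ServiceBase(ABC):
    name = "base"
    variables = {}

    def __init__(self, **kwargs):
        # Keyword arguments override the variable defaults
        self.config = {**self.variables, **kwargs}

    @abstractmethod
    def generate(self, prompt, context=None):
        raise NotImplementedError

class EchoService(ServiceBase):
    name = "echo"
    variables = {"prefix": "# "}

    def generate(self, prompt, context=None):
        # "Generate" code by echoing the prompt back as a comment
        return [{"generated_text": self.config["prefix"] + prompt}]

AVAILABLE_SERVICES = {EchoService.name: EchoService}

svc = AVAILABLE_SERVICES["echo"](prefix="## ")
print(svc.generate("say hi")[0]["generated_text"])  # ## say hi
```

The list-of-dicts return shape mirrors the ``t.List[t.Dict[t.Any, t.Any]]`` annotation added to ``generate`` in this patch series.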
@@ -126,8 +180,8 @@ def __init__(self, config: t.Dict):
for key, var in self.variables.items():
# If user has specified a value for the variable, use that
# Otherwise, the default value will be used
- if key in config:
- var.set_default(config[key])
+ if key in kwargs:
+ var.set_default(kwargs[key])
# Omit secret arguments from the parser, but still read them
if var.secret == False and len(var.argparse_args) > 0:
@@ -160,7 +214,22 @@ def cache_response(
return self._write_cache(cache, cache_path)
@abstractmethod
- def generate(self, prompt: str, context: t.Dict[str, t.Any] = {}):
+ def generate(
+ self,
+ prompt: str,
+ context: t.Dict[str, t.Any] = {},
+ ) -> t.List[t.Dict[t.Any, t.Any]]:
+ """Implement the logic that generates code from user prompts here.
+
+ Args:
+ prompt (str): The prompt that describes what the generated code should perform
+ context (t.Dict[str, t.Any], optional): A dict containing the current notebook
+ context, in the Jupyter notebook format.
+ See :class:`icortex.context.ICortexHistory` for more details.
+
+ Returns:
+ List[Dict[Any, Any]]: A list that contains code generation results. Should ideally be valid Python code.
+ """
raise NotImplementedError
def config_dialog(self, skip_defaults=False):
@@ -188,13 +257,26 @@ def config_dialog(self, skip_defaults=False):
return_dict[key] = user_val
return return_dict
- def get_variable(self, var_name: str):
+ def get_variable(self, var_name: str) -> ServiceVariable:
+ """Get a variable by its name
+
+ Args:
+ var_name (str): Name of the variable
+
+ Returns:
+ ServiceVariable: Requested variable
+ """
for key, var in self.variables.items():
if key == var_name:
return var
return None
- def get_variable_names(self):
+ def get_variable_names(self) -> t.List[str]:
+ """Get a list of variable names.
+
+ Returns:
+ List[str]: List of variable names
+ """
return [var.name for var in self.variables]
def _read_cache(self, cache_path):
diff --git a/icortex/services/textcortex.py b/icortex/services/textcortex.py
index bd12aff..6ca7b91 100644
--- a/icortex/services/textcortex.py
+++ b/icortex/services/textcortex.py
@@ -24,6 +24,7 @@
class TextCortexService(ServiceBase):
+ """Interface to TextCortex's code generation service"""
name = "textcortex"
description = "TextCortex Python code generator"
variables = {
@@ -58,11 +59,11 @@ class TextCortexService(ServiceBase):
),
}
- def __init__(self, config: t.Dict):
- super(TextCortexService, self).__init__(config)
+ def __init__(self, **kwargs: t.Dict):
+ super(TextCortexService, self).__init__(**kwargs)
try:
- self.api_key = config["api_key"]
+ self.api_key = kwargs["api_key"]
except KeyError:
print(MISSING_API_KEY_MSG)
raise Exception("Missing API key")
@@ -71,7 +72,7 @@ def generate(
self,
prompt: str,
context: t.Dict[str, t.Any] = {},
- ):
+ ) -> t.List[t.Dict[t.Any, t.Any]]:
argv = shlex.split(prompt)
# Remove the module name flag from the prompt
@@ -129,7 +130,6 @@ def generate(
self.cache_response(
cached_request_dict, response_dict, cache_path=DEFAULT_CACHE_PATH
)
-
return response_dict["generated_text"]
elif response_dict["status"] == "fail":
raise Exception(
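The patch above reworks the service interface: `__init__` now receives its configuration as `**kwargs`, and `generate` is annotated to return `t.List[t.Dict[t.Any, t.Any]]`. As a rough illustration of that contract, here is a minimal self-contained sketch. The `EchoService` class and its `"text"` result key are hypothetical stand-ins for illustration only, not the real TextCortex client:

```python
import typing as t


class ServiceBase:
    """Simplified stand-in for icortex's ServiceBase (a sketch, not the real class)."""

    name = "base"

    def __init__(self, **kwargs: t.Any):
        # Mirror the patched pattern: configuration arrives as keyword args
        self.config = kwargs

    def generate(
        self,
        prompt: str,
        context: t.Dict[str, t.Any] = {},
    ) -> t.List[t.Dict[t.Any, t.Any]]:
        raise NotImplementedError


class EchoService(ServiceBase):
    """Hypothetical service that 'generates' a print statement from the prompt."""

    name = "echo"

    def __init__(self, **kwargs: t.Any):
        super().__init__(**kwargs)
        try:
            # Same required-key pattern as TextCortexService.__init__
            self.api_key = kwargs["api_key"]
        except KeyError:
            raise Exception("Missing API key")

    def generate(
        self,
        prompt: str,
        context: t.Dict[str, t.Any] = {},
    ) -> t.List[t.Dict[t.Any, t.Any]]:
        # Return a list of dicts, matching the annotated return type
        return [{"text": f"print({prompt!r})"}]


service = EchoService(api_key="dummy-key")
print(service.generate("hello")[0]["text"])  # → print('hello')
```

The point of the `**kwargs` change is that a subclass can forward arbitrary service-specific settings to the base class without the base needing to know about them.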
From 80ac9302d676635d46a25e3d1cf6549a68891b79 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:19:02 +0100
Subject: [PATCH 15/21] Minor
---
docs/source/api.rst | 1 -
icortex/services/service_base.py | 10 +++++-----
icortex/services/textcortex.py | 1 +
3 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/docs/source/api.rst b/docs/source/api.rst
index 307e1a3..78885f8 100644
--- a/docs/source/api.rst
+++ b/docs/source/api.rst
@@ -26,7 +26,6 @@ This sections explains how to add or extend new code generation services.
.. automodule:: icortex.services.textcortex
:members:
- :show-inheritance:
Context
~~~~~~~
diff --git a/icortex/services/service_base.py b/icortex/services/service_base.py
index a66f865..aef2a6c 100644
--- a/icortex/services/service_base.py
+++ b/icortex/services/service_base.py
@@ -17,14 +17,14 @@ class ServiceVariable:
"""A variable for a code generation service
Args:
- type_ (t.Any): Variable type.
- default (t.Any, optional): Default value, should match :data:`type_`.
+ type_ (Any): Variable type.
+ default (Any, optional): Default value, should match :data:`type_`.
help (str, optional): Help string for the variable. Defaults to "".
secret (bool, optional): When set to
True, the variable is omitted from caches and the context. Defaults to False.
- argparse_args (t.List, optional): Args to
+ argparse_args (List, optional): Args to
be given to :func:`ArgumentParser.add_argument`. Defaults to [].
- argparse_kwargs (t.Dict, optional): Keywords args to
+ argparse_kwargs (Dict, optional): Keywords args to
be given to :func:`ArgumentParser.add_argument`. Defaults to {}.
require_arg (bool, optional): When set to True,
the prompt parser will raise an error if the variable is not specified.
@@ -223,7 +223,7 @@ def generate(
Args:
prompt (str): The prompt that describes what the generated code should perform
- context (t.Dict[str, t.Any], optional): A dict containing the current notebook
+ context (Dict[str, Any], optional): A dict containing the current notebook
context, that is in the Jupyter notebook format.
See :class:`icortex.context.ICortexHistory` for more details.
diff --git a/icortex/services/textcortex.py b/icortex/services/textcortex.py
index 6ca7b91..278ef6c 100644
--- a/icortex/services/textcortex.py
+++ b/icortex/services/textcortex.py
@@ -73,6 +73,7 @@ def generate(
prompt: str,
context: t.Dict[str, t.Any] = {},
) -> t.List[t.Dict[t.Any, t.Any]]:
+ """"""
argv = shlex.split(prompt)
# Remove the module name flag from the prompt
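Patch 15 swaps `t.Dict`/`t.Any` for bare `Dict`/`Any` inside the Google-style docstrings, since Sphinx's `sphinx.ext.napoleon` renders those type names as written and the `t.` alias prefix only adds noise in the built docs. A small self-contained sketch of the convention (the `ServiceVariable` dataclass here is a simplified stand-in, not the real class):

```python
import typing as t
from dataclasses import dataclass


@dataclass
class ServiceVariable:
    """Simplified stand-in for icortex's ServiceVariable."""

    name: str
    default: t.Any = None


def get_variable_names(variables: t.List[ServiceVariable]) -> t.List[str]:
    """Get a list of variable names.

    Args:
        variables (List[ServiceVariable]): Variables to read names from.
            Note the bare ``List[...]`` here versus ``t.List`` in the
            annotation: napoleon prints docstring types verbatim.

    Returns:
        List[str]: List of variable names
    """
    return [var.name for var in variables]


print(get_variable_names([ServiceVariable("api_key"), ServiceVariable("temperature")]))
```

The annotations keep the `t.` alias for the type checker, while the docstrings stay clean for readers of the rendered documentation.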
From 98262114fc6bf104bbe83977694b3328ec62eb2d Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:28:59 +0100
Subject: [PATCH 16/21] Added rtd config
---
.readthedocs.yml | 11 +++++++++++
docs/requirements.txt | 4 ++++
2 files changed, 15 insertions(+)
create mode 100644 .readthedocs.yml
create mode 100644 docs/requirements.txt
diff --git a/.readthedocs.yml b/.readthedocs.yml
new file mode 100644
index 0000000..3f105f8
--- /dev/null
+++ b/.readthedocs.yml
@@ -0,0 +1,11 @@
+version: 2
+build:
+ image: ubuntu-22.04
+sphinx:
+ configuration: docs/conf.py
+python:
+ version: 3.10
+ install:
+ - requirements: docs/requirements.txt
+ - method: pip
+ path: .
\ No newline at end of file
diff --git a/docs/requirements.txt b/docs/requirements.txt
new file mode 100644
index 0000000..e62d64a
--- /dev/null
+++ b/docs/requirements.txt
@@ -0,0 +1,4 @@
+Sphinx==5.3.0
+sphinxcontrib-video==0.0.1.dev3
+sphinx-copybutton==0.5.0
+furo==2022.9.29
\ No newline at end of file
From a69a2ebeb61db9ebb180fba4d3b949ac3f5c00f3 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:30:44 +0100
Subject: [PATCH 17/21] Added badge
---
README.md | 1 +
1 file changed, 1 insertion(+)
diff --git a/README.md b/README.md
index f22d9de..1b11965 100644
--- a/README.md
+++ b/README.md
@@ -5,6 +5,7 @@
+
From 6286a123d2948cc80df49caf1df2e3d789934334 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:37:02 +0100
Subject: [PATCH 18/21] Updated README
---
README.md | 93 ++-----------------------------------------------------
1 file changed, 2 insertions(+), 91 deletions(-)
diff --git a/README.md b/README.md
index 1b11965..7e16785 100644
--- a/README.md
+++ b/README.md
@@ -57,100 +57,11 @@ Install directly from PyPI:
pip install icortex
# This line is needed to install the kernel spec to Jupyter:
python -m icortex.kernel.install
-# Alternatively, running icortex directly also installs the kernel spec:
-icortex
```
-## Using ICortex
+## Quickstart
-Before you can use ICortex in Jupyter, you need to configure it for your current project.
-
-If you are using the terminal, go to your project directory and run:
-
-```bash
-icortex init
-```
-
-Alternatively, you can initialize directly in a Jupyter Notebook ([instructions on how to start JupyterLab](https://jupyterlab.readthedocs.io/en/stable/getting_started/starting.html)):
-
-```
-%icortex init
-```
-
-The shell will then instruct you step by step and create a configuration file `icortex.toml` in the current directory.
-
-### Choosing a code generation service
-
-ICortex supports different code generation services such as the TextCortex API, OpenAI Codex API, local HuggingFace transformers, and so on.
-
-To use the TextCortex code generation API,
-
-1. [sign up on the website](https://app.textcortex.com/user/signup),
-2. [generate an API key on the dashboard](https://app.textcortex.com/user/dashboard/settings/api-key),
-3. and proceed to configure `icortex` for your current project:
-
-[](https://asciinema.org/a/sTU1EaGFfi3jdSV8Ih7vulsfT)
-
-If you use up the starter credits and would like to continue testing out ICortex, [hit us up on our Discord on #icortex channel](https://discord.textcortex.com) and we will charge your account with more free credits.
-
-You can also try out different services e.g. OpenAI's Codex API, if you have access. You can also run code generation models from HuggingFace locally, which we have optimized to run on the CPU—though these produce lower quality outputs due to being smaller.
-
-### Executing prompts
-
-To execute a prompt with ICortex, use the `%prompt` [magic command](https://ipython.readthedocs.io/en/stable/interactive/magics.html) (or `%p` for short) as a prefix. Copy and paste the following prompt into a cell and try to run it:
-
-```
-%p print Hello World. Then print the Fibonacci numbers till 100
-```
-
-Depending on the response, you should see an output similar to the following:
-
-```
-print('Hello World.', end=' ')
-a, b = 0, 1
-while b < 100:
- print(b, end=' ')
- a, b = b, a+b
-
-Hello World.
-1 1 2 3 5 8 13 21 34 55 89
-```
-
-You can also specify variables or options with command line flags, e.g. to auto-install packages, auto-execute the returned code and so on. To see the complete list of variables for your chosen service, run:
-
-```
-%help
-```
-
-### Using ICortex CLI
-
-ICortex comes with a full-fledged CLI similar to git or Docker CLI, which you can use to configure how you generate code in your project. To see all the commands you can invoke, run
-
-```sh
-icortex help
-```
-
-For example the command `icortex service` lets you configure the code generation service you would like to use. To see how to use each command, call them with `help`
-
-```
-icortex service help
-```
-
-### Accessing ICortex CLI inside Jupyter
-
-You can still access the `icortex` CLI in a Jupyter Notebook or shell by using the magic command `%icortex`. For example running the following in the terminal switches to a local HuggingFace model:
-
-```
-icortex service set huggingface
-```
-
-To do the same in a Jupyter Notebook, you can run
-
-```
-%icortex service set huggingface
-```
-
-in a cell, which initializes and switches to the new service directly in your Jupyter session.
+[Click here to visit the docs and get started using ICortex](https://icortex.readthedocs.io/en/latest/quickstart.html).
## Getting help
From 938b877b23ee897f783cc7f66698ed420743e4e3 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:41:49 +0100
Subject: [PATCH 19/21] Minor
---
.readthedocs.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.readthedocs.yml b/.readthedocs.yml
index 3f105f8..0a37f50 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -1,6 +1,6 @@
version: 2
build:
- image: ubuntu-22.04
+ image: latest
sphinx:
configuration: docs/conf.py
python:
From 1d713590a0d32e85cb5f3b6ac642608fb9782b87 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:42:30 +0100
Subject: [PATCH 20/21] Minor
---
.readthedocs.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.readthedocs.yml b/.readthedocs.yml
index 0a37f50..ba1ceb3 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -4,7 +4,7 @@ build:
sphinx:
configuration: docs/conf.py
python:
- version: 3.10
+ version: 3.8
install:
- requirements: docs/requirements.txt
- method: pip
From 5ae22a4c15456c8223b878cd8d6baa3e50014a76 Mon Sep 17 00:00:00 2001
From: osolmaz
Date: Mon, 31 Oct 2022 17:45:03 +0100
Subject: [PATCH 21/21] Minor
---
.readthedocs.yml | 2 +-
docs/source/conf.py | 1 -
2 files changed, 1 insertion(+), 2 deletions(-)
diff --git a/.readthedocs.yml b/.readthedocs.yml
index ba1ceb3..1159271 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -2,7 +2,7 @@ version: 2
build:
image: latest
sphinx:
- configuration: docs/conf.py
+ configuration: docs/source/conf.py
python:
version: 3.8
install:
diff --git a/docs/source/conf.py b/docs/source/conf.py
index f73f3e5..abd8bd0 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -22,7 +22,6 @@
"sphinxcontrib.video",
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
- "sphinx.ext.autodoc",
"sphinx.ext.coverage",
"sphinx.ext.napoleon",
"sphinx_copybutton",