diff --git a/.github/workflows/documentation.yml b/.github/workflows/documentation.yml
index 29092570..c3b0cd6c 100644
--- a/.github/workflows/documentation.yml
+++ b/.github/workflows/documentation.yml
@@ -23,6 +23,6 @@ jobs:
with:
key: ${{ github.ref }}
path: .cache
- - run: pip install mkdocs-material mkdocs-static-i18n mdx-include mkdocs-macros-plugin
+ - run: pip install mkdocs-material mkdocs-static-i18n mdx-include mkdocs-macros-plugin mkdocs-glightbox
- working-directory: ./docs
run: mkdocs gh-deploy --force
diff --git a/README.md b/README.md
index b7235799..ca0f7aa4 100644
--- a/README.md
+++ b/README.md
@@ -57,6 +57,7 @@ It is a modern, high-level framework on top of popular specific Python brokers l
* framework-independent way to manage the project environment
* application code *hot reload*
* robust application templates
+* [**Documentation**](#project-documentation): **Propan** automatically generates and presents interactive **AsyncAPI** documentation for your project
* **Testability** : **Propan** allows you to test your app without external dependencies: you do not have to set up a Message Broker, you can use a virtual one!
### Supported MQ brokers
@@ -238,6 +239,16 @@ async def base_handler(body: dict,
---
+## Project Documentation
+
+**Propan** automatically generates documentation for your project according to the **AsyncAPI** specification. You can work with the generated artifacts directly or host a web view of your documentation on resources available to related teams.
+
+Having such documentation significantly simplifies the integration of services: you can immediately see which channels and message formats the application works with. And most importantly, it doesn't cost you anything - **Propan** has already done everything for you!
+
+![HTML-page](docs/docs/assets/img/docs-html-short.png)
+
+---
+
## CLI power
**Propan** has its own CLI tool that provided the following features:
diff --git a/docs/docs/assets/img/docs-html-short.png b/docs/docs/assets/img/docs-html-short.png
new file mode 100644
index 00000000..464e501a
Binary files /dev/null and b/docs/docs/assets/img/docs-html-short.png differ
diff --git a/docs/docs/assets/img/docs-html.png b/docs/docs/assets/img/docs-html.png
new file mode 100644
index 00000000..b505e660
Binary files /dev/null and b/docs/docs/assets/img/docs-html.png differ
diff --git a/docs/docs/en/CHANGELOG.md b/docs/docs/en/CHANGELOG.md
index cc9caf80..82dec10f 100644
--- a/docs/docs/en/CHANGELOG.md
+++ b/docs/docs/en/CHANGELOG.md
@@ -1,5 +1,29 @@
# CHANGELOG
+## 2023-06-14 **0.1.3.0** AsyncAPI
+
+The current update adds functionality that I've been working hard on for the last month:
+Now **Propan** can automatically generate and host documentation for your application
+according to the [**AsyncAPI**]({{ urls.asyncapi }}){.external-link target="_blank"} specification.
+
+You can simply provide related teams with a link to your documentation page, where they can get acquainted with all the parameters of the server used, channels, and the format of messages consumed by your service.
+
+![HTML-page](../../assets/img/docs-html-short.png)
+
+You can learn more about this functionality in the corresponding [documentation section](getting_started/9_documentation.md).
+
+Also, the ability to define broker-level and consumer-level dependencies has been added:
+
+```python
+from propan import RabbitBroker, Depends
+
+broker = RabbitBroker(dependencies=[Depends(...)])
+
+@broker.handler(..., dependencies=[Depends(...)])
+async def handler():
+ ...
+```
+
## 2023-06-13 **0.1.2.17**
The current update is a sum of several changes and improvements released from the previous release.
diff --git a/docs/docs/en/contributing/1_todo.md b/docs/docs/en/contributing/1_todo.md
index 13b6c63c..83f33887 100644
--- a/docs/docs/en/contributing/1_todo.md
+++ b/docs/docs/en/contributing/1_todo.md
@@ -16,15 +16,7 @@ To participate in the development of documentation, go to the following [section
## Code
-If you want to work a little with the code, then you have even more opportunities to prove yourself. At the moment, the priority of the project is the following tasks:
-
-* `PushBackWatcher` should store information about the number of message processing in the header:
-this will allow you to keep a common counter for all consumers.
-* Merge the arguments of the methods `__init__` and `connect` brokers for more flexible management of default values
-* Coverage with `NatsBroker` tests
-* Implementation of `NatsJSBroker`
-* Implementation of the synchronous version of the application and brokers
-* Broker implementation for `Apache Kafka' and other brokers from [plan](../../#supported-mq-brokers)
+You can find all current tasks in the project [Issues](https://github.com/Lancetnik/Propan/issues){.external-link target="_blank"}.
To start developing the project, go to the following [section](../2_contributing-index/).
diff --git a/docs/docs/en/getting_started/1_quick-start.md b/docs/docs/en/getting_started/1_quick-start.md
index 1cf0b848..8248aeed 100644
--- a/docs/docs/en/getting_started/1_quick-start.md
+++ b/docs/docs/en/getting_started/1_quick-start.md
@@ -56,6 +56,16 @@ and [more](../5_dependency/1_di-index).
---
+## Project Documentation
+
+**Propan** automatically generates documentation for your project according to the [**AsyncAPI**]({{ urls.asyncapi }}){target="_blank"} specification. You can work with the generated artifacts directly or host a web view of your documentation on resources available to related teams.
+
+Having such documentation significantly simplifies the integration of services: you can immediately see which channels and message formats the application works with. And most importantly, it doesn't cost you anything - **Propan** has already done everything for you!
+
+![HTML-page](../../assets/img/docs-html-short.png)
+
+---
+
## Project template
Also, **Propan CLI** is able to generate a production-ready application template:
diff --git a/docs/docs/en/getting_started/4_broker/3_type-casting.md b/docs/docs/en/getting_started/4_broker/3_type-casting.md
index a91100fa..051fbaa2 100644
--- a/docs/docs/en/getting_started/4_broker/3_type-casting.md
+++ b/docs/docs/en/getting_started/4_broker/3_type-casting.md
@@ -2,13 +2,13 @@
The first argument of the function decorated by `@broker.hanle` is the decrypted body of the incoming message.
-It can be of three types:
+The incoming message body can be one of three types:
* `str` - if the message has the header `content-type: text/plain`
* `dict` - if the message has the header `content-type: application/json`
* `bytes` - if the message has any other header
-All incoming messages will be automatically brought to this view.
+You can annotate the body either with these types or with any primitive type that **pydantic** can cast the incoming arguments to (for example, `str -> float`).
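+
+For instance, a handler annotated with a primitive type receives the already-cast value (a minimal sketch; the `"test"` queue name is arbitrary):
+
+```python
+@broker.handle("test")
+async def cast_handler(body: float):
+    # a text/plain message "1.5" arrives here as the float 1.5
+    ...
+```
+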
A few examples:
@@ -19,7 +19,7 @@ A few examples:
async def base_handler(body: str):
'''
We are expecting a text/plain message
- Messages of a different kind will trigger an error
+ Messages of a different kind will raise an error
'''
```
@@ -31,7 +31,7 @@ async def base_handler(body: str):
async def base_handler(body: dict):
'''
We are expecting an application/json message
- Messages of a different kind will trigger an error
+ Messages of a different kind will raise an error
'''
```
@@ -42,7 +42,7 @@ async def base_handler(body: dict):
async def base_handler(body: bytes):
'''
We are expecting a 'raw' message
- Messages of a different kind will trigger an error
+ Messages of a different kind will raise an error
'''
```
@@ -62,6 +62,27 @@ async def base_handler(body: Message):
'''
We are expecting an application/json message
Type { key: 1.0 }
- Messages of a different kind will trigger an error
+ Messages of a different kind will raise an error
'''
-```
\ No newline at end of file
+```
+
+### Multiple arguments
+
+When annotating multiple incoming arguments, the result is equivalent to using the corresponding `pydantic` model.
+
+```python
+from pydantic import BaseModel
+
+class Message(BaseModel):
+ a: int
+ b: float
+
+@broker.handle("test")
+async def base_handler(a: int, b: float):
+# async def base_handler(body: Message): - the same
+ '''
+ We are expecting an application/json message
+ Type { a: 1, b: 1.0 }
+ Messages of a different kind will raise an error
+ '''
+```
diff --git a/docs/docs/en/getting_started/5_dependency/1_di-index.md b/docs/docs/en/getting_started/5_dependency/1_di-index.md
index 0b48d064..102c8a3a 100644
--- a/docs/docs/en/getting_started/5_dependency/1_di-index.md
+++ b/docs/docs/en/getting_started/5_dependency/1_di-index.md
@@ -50,6 +50,28 @@ It's easy, isn't it?
In the code above, we didn't use this decorator for our dependencies. However, it still applies
to all functions used as dependencies. Keep this in your mind.
+## Top-level dependencies
+
+If you don't need a dependency's result, you can use the following code:
+
+```python
+@broker.handle("test")
+def method(_ = Depends(...)): ...
+```
+
+However, using the dedicated `handle` parameter is more convenient:
+
+```python
+@broker.handle("test", dependencies=[Depends(...)])
+def method(): ...
+```
+
+Also, you can declare broker-level dependencies: they will be applied to all of that broker's handlers.
+
+```python
+broker = RabbitBroker(dependencies=[Depends(...)])
+```
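+
+A minimal sketch tying both options together (the `check_auth` dependency function is hypothetical):
+
+```python
+from propan import RabbitBroker, Depends
+
+async def check_auth():
+    ...  # hypothetical dependency: runs for every handler, its result is discarded
+
+broker = RabbitBroker(dependencies=[Depends(check_auth)])
+
+@broker.handle("test", dependencies=[Depends(check_auth)])
+async def handler():
+    ...
+```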
+
## Nested dependencies
Dependencies can also contain other dependencies. This works in a very predictable way: just declare
diff --git a/docs/docs/en/getting_started/6_lifespans.md b/docs/docs/en/getting_started/6_lifespans.md
index 06dd0316..f776ebd5 100644
--- a/docs/docs/en/getting_started/6_lifespans.md
+++ b/docs/docs/en/getting_started/6_lifespans.md
@@ -112,3 +112,5 @@ Command line arguments are available in all `@app.on_startup` hooks. To use them
### Broker initialization
The `@app.on_startup` hooks are called **BEFORE** the broker is launched by the application. The `@app.after_shutdown` hooks are triggered **AFTER** stopping the broker.
+
+If you want to perform some actions **AFTER** the broker has been initialized (send messages, initialize objects, etc.), use the `@app.after_startup` hook.
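+
+A minimal sketch (assuming a `RabbitBroker`-based application; adapt the connection string and queue name to your setup):
+
+```python
+from propan import PropanApp, RabbitBroker
+
+broker = RabbitBroker("amqp://guest:guest@localhost:5672")
+app = PropanApp(broker)
+
+@app.after_startup
+async def send_hello():
+    # the broker is already connected here, so publishing is safe
+    await broker.publish("service started", "greetings")
+```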
diff --git a/docs/docs/en/getting_started/9_documentation.md b/docs/docs/en/getting_started/9_documentation.md
new file mode 100644
index 00000000..f33d505d
--- /dev/null
+++ b/docs/docs/en/getting_started/9_documentation.md
@@ -0,0 +1,94 @@
+---
+hide:
+ - toc
+---
+
+# Documenting
+
+**Propan** allows you not to think about your project's documentation - it is generated automatically according to the [**AsyncAPI**]({{ urls.asyncapi }}){.external-link target="_blank"} specification!
+
+!!! note ""
+    To work with the documentation, you should install the extra requirements:
+
+ ```console
+ pip install "propan[doc]"
+ ```
+
+## Example
+
+Let's look at an example.
+
+To begin with, we will write a small application with the following content:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/example.py !}
+```
+
+## YAML schema
+
+To generate the **AsyncAPI** specification of your project in the `.yaml` format use the following command:
+
+
+```console
+$ propan docs gen example:app
+
+Your project AsyncAPI scheme was placed to `./asyncapi.yaml`
+```
+
+
+Now you have a schema of your project: you can use it to generate various clients in any language with the [**AsyncAPI** tools]({{ urls.asyncapi }}/tools/generator){.external-link target="_blank"}.
+
+???- example "Asyncapi.yaml"
+ ```yaml
+ {!> docs_src/quickstart/documentation/example.yaml !}
+ ```
+
+## Online documentation
+
+Also, **Propan** allows you to host an HTML representation of your documentation with the following command:
+
+!!! warning ""
+    The online representation of the documentation does not work without an internet connection, since **CDN** dependencies are used to render it.
+
+
+```console
+$ propan docs serve example:app
+```
+
+
+This way you can provide all external consumers with access to your project documentation without additional development costs.
+
+???- example "HTML page"
+ ![HTML-page](../../assets/img/docs-html.png)
+
+!!! tip
+ **Propan** can also host `asyncapi.yaml` files.
+
+```console
+propan docs serve asyncapi.yaml
+```
+
+This can be useful if you want to extend the automatically generated **AsyncAPI** documentation: you just generate a file, modify and host it!
+
+When using online documentation, you can also download it using the following paths:
+
+* `/asyncapi.json` - **JSON** schema (available when hosting an application)
+* `/asyncapi.yaml` - **YAML** schema (available both for an application and for a file)
+
+### FastAPI Plugin
+
+When using **Propan** as a router for **FastAPI**, the framework automatically registers endpoints for hosting **AsyncAPI** documentation in your application with the following default values:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/fastapi.py !}
+```
+
+## Own hosting
+
+To host the documentation, **Propan** uses **FastAPI** + **uvicorn**.
+You may want to implement the documentation-serving logic yourself: restrict access rights, customize the content depending on access rights, embed the documentation into your frontend application, and so on.
+To do this, you can generate a `json`/`yaml`/`html` document yourself and use it in your own service.
+
+```python linenums='1' hl_lines="9-12"
+{!> docs_src/quickstart/documentation/custom_schema.py !}
+```
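+
+For example, here is a minimal sketch of serving the generated page from your own **FastAPI** application (the route path is arbitrary, and any access checks are left to you):
+
+```python
+from fastapi import FastAPI
+from fastapi.responses import HTMLResponse
+
+api = FastAPI()
+
+@api.get("/docs/asyncapi", response_class=HTMLResponse)
+def asyncapi_page() -> str:
+    # `html` is the page generated by `get_asyncapi_html` in the snippet above
+    return html
+```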
diff --git a/docs/docs/en/index.md b/docs/docs/en/index.md
index 7f234b37..c63e6249 100644
--- a/docs/docs/en/index.md
+++ b/docs/docs/en/index.md
@@ -51,6 +51,7 @@ It is a modern, high-level framework on top of popular Python libraries for vari
* framework-independent way to manage the project environment
* application code *hot reload*
* robust application templates
+* [**Documentation**](getting_started/9_documentation/): **Propan** automatically generates and presents interactive [**AsyncAPI**]({{ urls.asyncapi }}){target="_blank"} documentation for your project
* [**Testability**](getting_started/7_testing): **Propan** allows you to test your app without external dependencies: you do not have to set up a Message Broker, you can use a virtual one!
---
diff --git a/docs/docs/en/integrations/2_fastapi-plugin.md b/docs/docs/en/integrations/2_fastapi-plugin.md
index 520d9e1c..d4230df4 100644
--- a/docs/docs/en/integrations/2_fastapi-plugin.md
+++ b/docs/docs/en/integrations/2_fastapi-plugin.md
@@ -1,5 +1,7 @@
# **FastAPI** Plugin
+## Handle messages
+
**Propan** can be used as a part of **FastAPI**.
Just import a **PropanRouter** you need and declare the message handler
@@ -19,7 +21,7 @@ in any way convenient for you. The message header is placed in `headers`.
Also, this router can be fully used as an `HttpRouter` (of which it is the inheritor). So you can
use it to declare any `get`, `post`, `put` and other HTTP methods. For example, this is done at **19** line.
-### Sending messages
+## Sending messages
Inside each router there is a broker. You can easily access it if you need to send a message to MQ.
@@ -28,3 +30,23 @@ Inside each router there is a broker. You can easily access it if you need to se
You can use the following `Depends` to access the broker if you want to use it at different parts of your program.
{! includes/integrations/fastapi/fastapi_plugin_depends.md !}
+
+Or you can access the broker from the **FastAPI** application state:
+
+```python
+{! docs_src/integrations/fastapi/request.py !}
+```
+
+## @after_startup
+
+The `PropanApp` application has an `after_startup` hook that allows you to perform operations with your message broker after the connection is established. This can be extremely convenient for managing your broker's objects and/or sending messages. This hook is also available for your **FastAPI PropanRouter**:
+
+{! includes/integrations/fastapi/after_startup.md !}
+
+## Documentation
+
+When using **Propan** as a router for **FastAPI**, the framework automatically registers endpoints for hosting the **AsyncAPI** documentation in your application with the following default values:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/fastapi.py !}
+```
diff --git a/docs/docs/en/nats/direct.py b/docs/docs/en/nats/direct.py
deleted file mode 100644
index 4dd5f692..00000000
--- a/docs/docs/en/nats/direct.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from propan import PropanApp, NatsBroker
-from propan.annotations import Logger
-
-broker = NatsBroker()
-app = PropanApp(broker)
-
-@broker.handle("test-subj-1", "workers")
-async def base_handler1(logger: Logger):
- logger.info("base_handler1")
-
-@broker.handle("test-subj-1", "workers")
-async def base_handler2(logger: Logger):
- logger.info("base_handler2")
-
-@broker.handle("test-subj-2", "workers")
-async def base_handler3(logger: Logger):
- logger.info("base_handler3")
-
-@app.after_startup
-async def send_messages():
- await broker.publish("", "test-subj-1") # handlers: 1 or 2
- await broker.publish("", "test-subj-1") # handlers: 1 or 2
- await broker.publish("", "test-subj-2") # handlers: 3
diff --git a/docs/docs/en/nats/pattern.py b/docs/docs/en/nats/pattern.py
deleted file mode 100644
index e00c4b02..00000000
--- a/docs/docs/en/nats/pattern.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from propan import PropanApp, NatsBroker
-from propan.annotations import Logger
-
-broker = NatsBroker()
-app = PropanApp(broker)
-
-@broker.handle("*.info", "workers")
-async def base_handler1(logger: Logger):
- logger.info("base_handler1")
-
-@broker.handle("*.info", "workers")
-async def base_handler2(logger: Logger):
- logger.info("base_handler2")
-
-@broker.handle("*.error", "workers")
-async def base_handler3(logger: Logger):
- logger.info("base_handler3")
-
-@app.after_startup
-async def send_messages():
- await broker.publish("", "logs.info") # handlers: 1 or 2
- await broker.publish("", "logs.info") # handlers: 1 or 2
- await broker.publish("", "logs.error") # handlers: 3
diff --git a/docs/docs/en/rabbit/2_exchanges.md b/docs/docs/en/rabbit/2_exchanges.md
index b2525422..236ca3a1 100644
--- a/docs/docs/en/rabbit/2_exchanges.md
+++ b/docs/docs/en/rabbit/2_exchanges.md
@@ -10,7 +10,7 @@ from propan.brokers.rabbit import RabbitBroker, RabbitExchange
broker = RabbitBroker()
@broker.handler("test", exchange=RabbitExchange("test"))
-asynchronous definition handler():
+async def handler():
...
...
diff --git a/docs/docs/ru/CHANGELOG.md b/docs/docs/ru/CHANGELOG.md
index bcafa626..c1d48ff4 100644
--- a/docs/docs/ru/CHANGELOG.md
+++ b/docs/docs/ru/CHANGELOG.md
@@ -1,5 +1,29 @@
# CHANGELOG
+## 2023-06-14 **0.1.3.0** AsyncAPI
+
+Текущее обновление добавляет функционал, над которым я усердно работал последний месяц:
+теперь **Propan** может автоматически генерировать и хостить документацию для вашего приложения в
+соответствии со спецификацией [**AsyncAPI**](https://www.asyncapi.com/){.external-link target="_blank"}.
+
+Вы можете просто предоставить смежным командам ссылку на страницу с вашей документацией, где они смогут ознакомиться со всеми параметрами используемого сервера, каналов и форматом сообщений, потребляемых вашим сервисом.
+
+![HTML-page](../../assets/img/docs-html-short.png)
+
+Подробнее с этим функционалом вы можете ознакомиться в соответствующем [разделе документации](getting_started/9_documentation.md).
+
+Также, добавлена возможность определения зависимостей уровня брокера и потребителей:
+
+```python
+from propan import RabbitBroker, Depends
+
+broker = RabbitBroker(dependencies=[Depends(...)])
+
+@broker.handler(..., dependencies=[Depends(...)])
+async def handler():
+ ...
+```
+
## 2023-06-13 **0.1.2.17**
В этом обовлении стоит обобщить несколько изменения и улучшений, выпущенных с предыдущего релиза.
diff --git a/docs/docs/ru/contributing/1_todo.md b/docs/docs/ru/contributing/1_todo.md
index 60a8fd24..093e2761 100644
--- a/docs/docs/ru/contributing/1_todo.md
+++ b/docs/docs/ru/contributing/1_todo.md
@@ -16,16 +16,7 @@
## Код
-Если же вы хотите немного поработать с кодом, то тут у вас еще больше возможностей проявить себя. На текущий момент приоритет перед проектом стоят следующие задачи:
-
-* `PushBackWatcher` должен хранить информацию о количестве обработок сообщения в заголовке:
- это позволит вести общий счетчик для всех потребителей.
-* Сделать слияние аргументов методов `__init__` и `connect` брокеров для более гибкого управления дефолтными значениями
-* Добавить генерацию документации приложения в соответствии с [Async API](https://www.asyncapi.com){target="_blank"}
-* Покрытие тестами `NatsBroker`
-* Реализация `NatsJSBroker`
-* Реализация синхронной версии приложения и брокеров
-* Реализация брокера для `Apache Kafka` и других брокеров из [плана](../../#_3)
+Все актуальные задачи вы можете найти в [Issues](https://github.com/Lancetnik/Propan/issues){.external-link target="_blank"}.
Для того, чтобы приступить к разработке проекта, перейдите в следующий [раздел](../2_contributing-index/).
diff --git a/docs/docs/ru/getting_started/1_quick-start.md b/docs/docs/ru/getting_started/1_quick-start.md
index 2242aa4a..4ba93523 100644
--- a/docs/docs/ru/getting_started/1_quick-start.md
+++ b/docs/docs/ru/getting_started/1_quick-start.md
@@ -59,7 +59,17 @@ Propan имеет систему управления зависимостями
---
-## Готовый шаблон
+## Документация Проекта
+
+**Propan** автоматически генерирует документацию для вашего проекта в соответствии со спецификацией [**AsyncAPI**]({{ urls.asyncapi }}){target="_blank"}. Вы можете работать как со сгенерированными артефактами, так и разместить Web-представление вашей документации на ресурсах, доступных для смежных команд.
+
+Наличие такой документации существенно упрощает интеграцию сервисов: вы сразу видите, с какими каналами и каким форматом сообщений работает приложение. А самое главное, это не стоит вам ничего - **Propan** уже сделал все за вас!
+
+![HTML-page](../../assets/img/docs-html-short.png)
+
+---
+
+## Готовый шаблон проекта
Вы можете сгенерировать готовый к использованию шаблон проекта с помощью **Propan CLI**:
diff --git a/docs/docs/ru/getting_started/4_broker/3_type-casting.md b/docs/docs/ru/getting_started/4_broker/3_type-casting.md
index 57ad8190..793387d6 100644
--- a/docs/docs/ru/getting_started/4_broker/3_type-casting.md
+++ b/docs/docs/ru/getting_started/4_broker/3_type-casting.md
@@ -2,13 +2,13 @@
Первый аргумент функции, обрамленной в `@broker.hanle` - это расшифрованное тело входящего сообщения.
-Оно может быть трех типов:
+Тело входящих сообщений в **Propan** может быть одним из трех типов:
* `str` - если сообщение имеет заголовок `content-type: text/plain`
* `dict` - если сообщение имеет заголовок `content-type: application/json`
* `bytes` - если сообщение имеет любой другой заголовок
-Все входящие сообщения будут автоматически приводится к этому виду.
+В качестве аннотации могут использоваться либо эти типы, либо любые примитивные типы, к которым **pydantic** сможет привести входящие аргументы (например `str -> float`).
Несколько примеров:
@@ -65,3 +65,24 @@ async def base_handler(body: Message):
Сообщения другого вида спровоцируют ошибку
'''
```
+
+### Несколько аргументов
+
+При аннотировании нескольких входящих аргументов результат будет равносилен использованию аналогичной `pydantic` модели.
+
+```python
+from pydantic import BaseModel
+
+class Message(BaseModel):
+ a: int
+ b: float
+
+@broker.handle("test")
+async def base_handler(a: int, b: float):
+# async def base_handler(body: Message): - аналогично
+ '''
+ Мы ожидаем application/json сообщение
+ Вида { a: 1, b: 1.0 }
+ Сообщения другого вида спровоцируют ошибку
+ '''
+```
diff --git a/docs/docs/ru/getting_started/5_dependency/1_di-index.md b/docs/docs/ru/getting_started/5_dependency/1_di-index.md
index 16a54761..fa8b28df 100644
--- a/docs/docs/ru/getting_started/5_dependency/1_di-index.md
+++ b/docs/docs/ru/getting_started/5_dependency/1_di-index.md
@@ -48,7 +48,29 @@ nested: Здесь вызывается вложенная зависимост
!!! tip "Автоматическое применений @apply_types"
В коде выше мы не использовали этот декоратор для наших зависимостей. Однако, он все равно применяется
- ко всем функциям, используемым в качестве зависимостей. Держите это в уме.
+ ко всем функциям, используемым в качестве зависимостей.
+
+## Зависимости верхнего уровня
+
+Если вам не нужен результат выполнения зависимостей, вы, конечно, можете использовать следующую конструкцию:
+
+```python
+@broker.handle("test")
+def method(_ = Depends(...)): ...
+```
+
+Однако гораздо удобнее использовать для этого специальный параметр метода `handle`:
+
+```python
+@broker.handle("test", dependencies=[Depends(...)])
+def method(): ...
+```
+
+Также вы можете объявить такие зависимости на уровне брокера: в таком случае, они будут применяться ко всем обработчикам этого брокера.
+
+```python
+broker = RabbitBroker(dependencies=[Depends(...)])
+```
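+
+Небольшой сквозной набросок, объединяющий оба варианта (функция-зависимость `check_auth` здесь гипотетическая):
+
+```python
+from propan import RabbitBroker, Depends
+
+async def check_auth():
+    ...  # гипотетическая зависимость: выполняется для каждого обработчика, результат не используется
+
+broker = RabbitBroker(dependencies=[Depends(check_auth)])
+
+@broker.handle("test", dependencies=[Depends(check_auth)])
+async def handler():
+    ...
+```
+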
## Вложенные зависимости
@@ -88,8 +110,7 @@ nested: Здесь вызывается вложенная зависимост
## Приведение типов зависимостей
**FastDepends**, используемый **Propan**, также приводит тип `return`. Это означает, что значение, возвращаемое зависимостью будет
-дважды приводиться к типу: как `return` это зависимости и как входной аргумент основной функции. Это не несет дополнительных расходов, если
-эти типы имеют одну и ту же аннотацию. Просто держите это в голове. Или нет... В любом случае, я вас предупредил.
+дважды приводиться к типу: как `return` этой зависимости и как входной аргумент основной функции. Это не несет дополнительных расходов, если эти типы имеют одну и ту же аннотацию. Просто держите это в голове. Или нет... В любом случае, я вас предупредил.
```python linenums="1"
from propan import Depends, apply_types
@@ -108,4 +129,4 @@ assert method("1") == 5
Также, результат выполнения зависимости кешируется. Если вы используете эту зависимости в `N` функциях,
этот закешированный результат будет приводится к типу `N` раз (на входе в используемую функцию).
-Для избежания проблем с этим, используйте [mypy](https://www.mypy-lang.org){target="_blank"} или просто будьте аккуратны с аннотацией типов в вашем проекте.
+Чтобы избежать потенциальных проблем, используйте [mypy](https://www.mypy-lang.org){target="_blank"} или просто будьте аккуратны с аннотацией типов в вашем проекте.
diff --git a/docs/docs/ru/getting_started/6_lifespans.md b/docs/docs/ru/getting_started/6_lifespans.md
index 8f4e0125..4a313023 100644
--- a/docs/docs/ru/getting_started/6_lifespans.md
+++ b/docs/docs/ru/getting_started/6_lifespans.md
@@ -113,3 +113,5 @@ propan run serve:app --env .env.test
### Инициализация брокера
Хуки `@app.on_startup` вызываются **ДО** запуска брокера приложением. Хуки `@app.after_shutdown` запускаются **ПОСЛЕ** остановки брокера.
+
+Если же вы хотите совершить какие-то действия **ПОСЛЕ** инициализации брокера: отправить сообщения, инициализировать объекты и т.д., вам стоит использовать хук `@app.after_startup`.
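+
+Небольшой набросок (предполагается приложение на основе `RabbitBroker`; адаптируйте строку подключения и имя очереди под свой проект):
+
+```python
+from propan import PropanApp, RabbitBroker
+
+broker = RabbitBroker("amqp://guest:guest@localhost:5672")
+app = PropanApp(broker)
+
+@app.after_startup
+async def send_hello():
+    # здесь брокер уже подключен, поэтому публиковать сообщения безопасно
+    await broker.publish("service started", "greetings")
+```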
diff --git a/docs/docs/ru/getting_started/9_documentation.md b/docs/docs/ru/getting_started/9_documentation.md
new file mode 100644
index 00000000..8f1d37f9
--- /dev/null
+++ b/docs/docs/ru/getting_started/9_documentation.md
@@ -0,0 +1,93 @@
+---
+hide:
+ - toc
+---
+
+# Документирование
+
+**Propan** позволяет вам не думать о документации своего проекта - она уже сгенерирована автоматически в соответствии со спецификацией [**AsyncAPI**]({{ urls.asyncapi }}){.external-link target="_blank"}!
+
+!!! note ""
+ Для работы с документацией вам необходимо установить дополнительные зависимости:
+
+ ```console
+ pip install "propan[doc]"
+ ```
+
+## Пример
+
+Давайте разберемся на примере, как это работает.
+
+Для начала напишем небольшое приложение примерно следующего содержания:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/example.py !}
+```
+
+## YAML схема
+
+Для того, чтобы сгенерировать **AsyncAPI** спецификацию вашего проекта в формате `.yaml` используйте следующую команду:
+
+
+```console
+$ propan docs gen example:app
+
+Your project AsyncAPI scheme was placed to `./asyncapi.yaml`
+```
+
+
+Теперь у вас есть схема вашего проекта: вы можете использовать ее для генерации различных клиентов на любом языке с помощью соответствующих инструментов [**AsyncAPI**]({{ urls.asyncapi }}/tools/generator){.external-link target="_blank"}.
+
+???- example "Asyncapi.yaml"
+ ```yaml
+ {!> docs_src/quickstart/documentation/example.yaml !}
+ ```
+
+## Онлайн документация
+
+Также **Propan** позволяет вам развернуть HTML-представление вашей документации следующей командой:
+
+!!! warning ""
+    Онлайн-представление документации не работает без интернет-соединения, так как для ее отображения используются **CDN** зависимости.
+
+
+```console
+$ propan docs serve example:app
+```
+
+
+Так вы можете предоставить всем внешним потребителям доступ к документации вашего проекта без дополнительных затрат на разработку.
+
+???- example "HTML page"
+ ![HTML-page](../../assets/img/docs-html.png)
+
+!!! tip
+    **Propan** также может хостить `asyncapi.yaml` файлы.
+
+ ```console
+ propan docs serve asyncapi.yaml
+ ```
+    Это может быть полезно, если вы хотите расширить автоматически сгенерированную **AsyncAPI** документацию: вы просто генерируете файл, дорабатываете его и хостите!
+
+При использовании онлайн документации вы также можете скачать ее по соответствующим путям:
+
+* `/asyncapi.json` - **JSON** схема (доступна при хостинге приложения)
+* `/asyncapi.yaml` - **YAML** схема (доступна как для приложения, так и для файла)
+
+### FastAPI Plugin
+
+При использовании **Propan** в качестве роутера для **FastAPI**, фреймворк автоматически регистрирует эндпоинты для хостинга **AsyncAPI** документации в ваше приложение со следующими значениями по умолчанию:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/fastapi.py !}
+```
+
+## Собственный хостинг
+
+Для хостинга документации онлайн **Propan** использует **FastAPI** + **uvicorn**.
+Возможно, вы захотите самостоятельно реализовать логику показа документации: ограничить права доступа, кастомизировать контент в зависимости от прав доступа, встроить документацию в свое frontend-приложение и т.д.
+Для этого вы можете самостоятельно сгенерировать `json`/`yaml`/`html` документ и использовать его в собственном сервисе.
+
+```python linenums='1' hl_lines="9-12"
+{!> docs_src/quickstart/documentation/custom_schema.py !}
+```
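+
+Например, вот минимальный набросок отдачи сгенерированной страницы из собственного **FastAPI** приложения (путь эндпоинта выбран произвольно, проверки доступа остаются на ваше усмотрение):
+
+```python
+from fastapi import FastAPI
+from fastapi.responses import HTMLResponse
+
+api = FastAPI()
+
+@api.get("/docs/asyncapi", response_class=HTMLResponse)
+def asyncapi_page() -> str:
+    # `html` - страница, сгенерированная функцией `get_asyncapi_html` из примера выше
+    return html
+```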
diff --git a/docs/docs/ru/index.md b/docs/docs/ru/index.md
index 86cadbf2..770055d2 100644
--- a/docs/docs/ru/index.md
+++ b/docs/docs/ru/index.md
@@ -51,6 +51,7 @@
* Полностью совместимый с любым фреймворком способ управлять окружением проекта
* *hot reloading* при изменениях в коде
* Готовые шаблоны проекта
+* [**Документация**](getting_started/9_documentation/): **Propan** автоматически генерирует и представляет интерактивную [**AsyncAPI**]({{ urls.asyncapi }}){target="_blank"} документацию для вашего проекта
* [**Тестируемость**](getting_started/7_testing): **Propan** позволяет тестировать ваше приложение без внешних зависимостей: вам не нужно поднимать брокер сообщений, используйте виртуального!
---
diff --git a/docs/docs/ru/integrations/2_fastapi-plugin.md b/docs/docs/ru/integrations/2_fastapi-plugin.md
index a0418c67..b52bbc72 100644
--- a/docs/docs/ru/integrations/2_fastapi-plugin.md
+++ b/docs/docs/ru/integrations/2_fastapi-plugin.md
@@ -1,6 +1,6 @@
# **FastAPI** Plugin
-### Прием сообщений
+## Прием сообщений
**Propan** может использоваться как полноценная часть **FastAPI**.
@@ -20,7 +20,7 @@
Также этот роутер может полноценно использоваться как `HttpRouter` (наследником которого он и является). Поэтому вы можете
объявлять с его помощью любые `get`, `post`, `put` и прочие HTTP методы. Как например, это сделано в строке **19**.
-### Отправка сообщений
+## Отправка сообщений
Внутри каждого роутера есть соответсвующий брокер. Вы можете легко получить к нему доступ, если вам необходимо отправить сообщение в MQ.
@@ -29,3 +29,23 @@
Вы можете оформить доступ к брокеру в виде `Depends`, если хотите использовать его в разных частях вашей программы.
{! includes/integrations/fastapi/fastapi_plugin_depends.md !}
+
+Либо вы можете получить доступ к брокеру из контекста приложения **FastAPI**:
+
+```python
+{! docs_src/integrations/fastapi/request.py !}
+```
+
+## @after_startup
+
+Приложение `PropanApp` имеет хук `after_startup`, который позволяет вам осуществлять операции с вашим брокером сообщений после того, как соединение с ним будет установлено. Это может быть крайне удобно для управления объектами вашего брокера и/или отправки сообщений. Этот хук также доступен и для ваших **FastAPI PropanRouter**:
+
+{! includes/integrations/fastapi/after_startup.md !}
+
+## Документация
+
+При использовании **Propan** в качестве роутера для **FastAPI**, фреймворк автоматически регистрирует эндпоинты для хостинга **AsyncAPI** документации в ваше приложение со следующими значениями по умолчанию:
+
+```python linenums='1'
+{!> docs_src/quickstart/documentation/fastapi.py !}
+```
diff --git a/docs/docs/ru/nats/direct.py b/docs/docs/ru/nats/direct.py
deleted file mode 100644
index 4dd5f692..00000000
--- a/docs/docs/ru/nats/direct.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from propan import PropanApp, NatsBroker
-from propan.annotations import Logger
-
-broker = NatsBroker()
-app = PropanApp(broker)
-
-@broker.handle("test-subj-1", "workers")
-async def base_handler1(logger: Logger):
- logger.info("base_handler1")
-
-@broker.handle("test-subj-1", "workers")
-async def base_handler2(logger: Logger):
- logger.info("base_handler2")
-
-@broker.handle("test-subj-2", "workers")
-async def base_handler3(logger: Logger):
- logger.info("base_handler3")
-
-@app.after_startup
-async def send_messages():
- await broker.publish("", "test-subj-1") # handlers: 1 or 2
- await broker.publish("", "test-subj-1") # handlers: 1 or 2
- await broker.publish("", "test-subj-2") # handlers: 3
diff --git a/docs/docs/ru/nats/pattern.py b/docs/docs/ru/nats/pattern.py
deleted file mode 100644
index e00c4b02..00000000
--- a/docs/docs/ru/nats/pattern.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from propan import PropanApp, NatsBroker
-from propan.annotations import Logger
-
-broker = NatsBroker()
-app = PropanApp(broker)
-
-@broker.handle("*.info", "workers")
-async def base_handler1(logger: Logger):
- logger.info("base_handler1")
-
-@broker.handle("*.info", "workers")
-async def base_handler2(logger: Logger):
- logger.info("base_handler2")
-
-@broker.handle("*.error", "workers")
-async def base_handler3(logger: Logger):
- logger.info("base_handler3")
-
-@app.after_startup
-async def send_messages():
- await broker.publish("", "logs.info") # handlers: 1 or 2
- await broker.publish("", "logs.info") # handlers: 1 or 2
- await broker.publish("", "logs.error") # handlers: 3
diff --git a/docs/docs_src/contributing/adapter/parent.py b/docs/docs_src/contributing/adapter/parent.py
index 8a93d15e..ef63f016 100644
--- a/docs/docs_src/contributing/adapter/parent.py
+++ b/docs/docs_src/contributing/adapter/parent.py
@@ -1,7 +1,7 @@
from typing import Any, Callable, Optional, TypeVar
-from propan.brokers.model import BrokerUsecase
-from propan.brokers.model.schemas import PropanMessage
+from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher
from propan.types import HandlerWrapper, SendableMessage
diff --git a/docs/docs_src/contributing/adapter/rabbit_connect.py b/docs/docs_src/contributing/adapter/rabbit_connect.py
index 6e7668e7..3c824381 100644
--- a/docs/docs_src/contributing/adapter/rabbit_connect.py
+++ b/docs/docs_src/contributing/adapter/rabbit_connect.py
@@ -3,7 +3,7 @@
import aio_pika
-from propan.brokers.model import BrokerUsecase
+from propan.brokers._model import BrokerUsecase
class RabbitBroker(BrokerUsecase):
_connection: Optional[aio_pika.RobustConnection]
diff --git a/docs/docs_src/contributing/adapter/rabbit_handle.py b/docs/docs_src/contributing/adapter/rabbit_handle.py
index 2da77d4d..14e95c7d 100644
--- a/docs/docs_src/contributing/adapter/rabbit_handle.py
+++ b/docs/docs_src/contributing/adapter/rabbit_handle.py
@@ -2,8 +2,8 @@
from typing import List, Union, Optional
from propan.types import HandlerWrapper, HandlerCallable
-from propan.brokers.model import BrokerUsecase
-from propan.brokers.model.schemas import BaseHandler
+from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.schemas import BaseHandler
from propan.brokers.rabbit import RabbitExchange, RabbitQueue
diff --git a/docs/docs_src/contributing/adapter/rabbit_init.py b/docs/docs_src/contributing/adapter/rabbit_init.py
index 1b8edc34..8682b539 100644
--- a/docs/docs_src/contributing/adapter/rabbit_init.py
+++ b/docs/docs_src/contributing/adapter/rabbit_init.py
@@ -1,6 +1,6 @@
from typing import Any, Optional
-from propan.brokers.model import BrokerUsecase
+from propan.brokers._model import BrokerUsecase
class RabbitBroker(BrokerUsecase):
diff --git a/docs/docs_src/contributing/adapter/rabbit_parse.py b/docs/docs_src/contributing/adapter/rabbit_parse.py
index 79a815fa..6ef4c0b8 100644
--- a/docs/docs_src/contributing/adapter/rabbit_parse.py
+++ b/docs/docs_src/contributing/adapter/rabbit_parse.py
@@ -1,7 +1,7 @@
import aio_pika
-from propan.brokers.model import BrokerUsecase
-from propan.brokers.model.schemas import PropanMessage
+from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.schemas import PropanMessage
class RabbitBroker(BrokerUsecase):
diff --git a/docs/docs_src/contributing/adapter/rabbit_process.py b/docs/docs_src/contributing/adapter/rabbit_process.py
index 964e99cb..92355a58 100644
--- a/docs/docs_src/contributing/adapter/rabbit_process.py
+++ b/docs/docs_src/contributing/adapter/rabbit_process.py
@@ -1,8 +1,8 @@
from functools import wraps
from typing import Optional, TypeVar, Callable
-from propan.brokers.model import BrokerUsecase
-from propan.brokers.model.schemas import PropanMessage
+from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher, WatcherContext
T = TypeVar("T")
diff --git a/docs/docs_src/contributing/adapter/redis_process.py b/docs/docs_src/contributing/adapter/redis_process.py
index 5f0d3c9d..def64ebb 100644
--- a/docs/docs_src/contributing/adapter/redis_process.py
+++ b/docs/docs_src/contributing/adapter/redis_process.py
@@ -1,8 +1,8 @@
from functools import wraps
from typing import Optional, TypeVar, Callable
-from propan.brokers.model import BrokerUsecase
-from propan.brokers.model.schemas import PropanMessage
+from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher
T = TypeVar("T")
diff --git a/docs/docs_src/contributing/adapter/redis_publish.py b/docs/docs_src/contributing/adapter/redis_publish.py
index 11e25f2f..3f166cfd 100644
--- a/docs/docs_src/contributing/adapter/redis_publish.py
+++ b/docs/docs_src/contributing/adapter/redis_publish.py
@@ -1,7 +1,7 @@
from typing import Optional, Dict, Any
from propan.types import SendableMessage
-from propan.brokers.model import BrokerUsecase
+from propan.brokers._model import BrokerUsecase
from propan.brokers.redis.schemas import RedisMessage
diff --git a/docs/docs_src/contributing/adapter/redis_start.py b/docs/docs_src/contributing/adapter/redis_start.py
index 2f528cce..f3d84d28 100644
--- a/docs/docs_src/contributing/adapter/redis_start.py
+++ b/docs/docs_src/contributing/adapter/redis_start.py
@@ -4,7 +4,7 @@
from redis.asyncio.client import PubSub, Redis
-from propan.brokers.model import BrokerUsecase
+from propan.brokers._model import BrokerUsecase
@dataclass
diff --git a/docs/docs_src/integrations/fastapi/request.py b/docs/docs_src/integrations/fastapi/request.py
new file mode 100644
index 00000000..773473ed
--- /dev/null
+++ b/docs/docs_src/integrations/fastapi/request.py
@@ -0,0 +1,3 @@
+from fastapi import Request
+
+@app.get("/")
+def main(request: Request):
+    broker = request.state.broker
\ No newline at end of file
diff --git a/docs/docs_src/integrations/kafka/fastapi_after_startup.py b/docs/docs_src/integrations/kafka/fastapi_after_startup.py
new file mode 100644
index 00000000..6d739a06
--- /dev/null
+++ b/docs/docs_src/integrations/kafka/fastapi_after_startup.py
@@ -0,0 +1,16 @@
+from fastapi import FastAPI
+from propan.fastapi import KafkaRouter
+
+router = KafkaRouter("localhost:9092")
+
+app = FastAPI(lifespan=router.lifespan_context)
+
+@router.after_startup
+def do_smth(app: FastAPI):
+ ...
+
+@router.after_startup
+async def publish_smth(app: FastAPI):
+ await router.broker.publish(...)
+
+app.include_router(router)
\ No newline at end of file
diff --git a/docs/docs_src/integrations/nats/fastapi_after_startup.py b/docs/docs_src/integrations/nats/fastapi_after_startup.py
new file mode 100644
index 00000000..6f88576c
--- /dev/null
+++ b/docs/docs_src/integrations/nats/fastapi_after_startup.py
@@ -0,0 +1,16 @@
+from fastapi import FastAPI
+from propan.fastapi import NatsRouter
+
+router = NatsRouter("nats://localhost:4222")
+
+app = FastAPI(lifespan=router.lifespan_context)
+
+@router.after_startup
+def do_smth(app: FastAPI):
+ ...
+
+@router.after_startup
+async def publish_smth(app: FastAPI):
+ await router.broker.publish(...)
+
+app.include_router(router)
\ No newline at end of file
diff --git a/docs/docs_src/integrations/rabbit/fastapi_after_startup.py b/docs/docs_src/integrations/rabbit/fastapi_after_startup.py
new file mode 100644
index 00000000..d0692823
--- /dev/null
+++ b/docs/docs_src/integrations/rabbit/fastapi_after_startup.py
@@ -0,0 +1,16 @@
+from fastapi import FastAPI
+from propan.fastapi import RabbitRouter
+
+router = RabbitRouter("amqp://guest:guest@localhost:5672")
+
+app = FastAPI(lifespan=router.lifespan_context)
+
+@router.after_startup
+def do_smth(app: FastAPI):
+ ...
+
+@router.after_startup
+async def publish_smth(app: FastAPI):
+ await router.broker.publish(...)
+
+app.include_router(router)
\ No newline at end of file
diff --git a/docs/docs_src/integrations/redis/fastapi_after_startup.py b/docs/docs_src/integrations/redis/fastapi_after_startup.py
new file mode 100644
index 00000000..faa37b46
--- /dev/null
+++ b/docs/docs_src/integrations/redis/fastapi_after_startup.py
@@ -0,0 +1,16 @@
+from fastapi import FastAPI
+from propan.fastapi import RedisRouter
+
+router = RedisRouter("redis://localhost:6379")
+
+app = FastAPI(lifespan=router.lifespan_context)
+
+@router.after_startup
+def do_smth(app: FastAPI):
+ ...
+
+@router.after_startup
+async def publish_smth(app: FastAPI):
+ await router.broker.publish(...)
+
+app.include_router(router)
\ No newline at end of file
diff --git a/docs/docs_src/integrations/sqs/fastapi_after_startup.py b/docs/docs_src/integrations/sqs/fastapi_after_startup.py
new file mode 100644
index 00000000..b3ee675d
--- /dev/null
+++ b/docs/docs_src/integrations/sqs/fastapi_after_startup.py
@@ -0,0 +1,16 @@
+from fastapi import FastAPI
+from propan.fastapi import SQSRouter
+
+router = SQSRouter("http://localhost:9324")
+
+app = FastAPI(lifespan=router.lifespan_context)
+
+@router.after_startup
+def do_smth(app: FastAPI):
+ ...
+
+@router.after_startup
+async def publish_smth(app: FastAPI):
+ await router.broker.publish(...)
+
+app.include_router(router)
\ No newline at end of file
diff --git a/docs/docs_src/quickstart/documentation/custom_schema.py b/docs/docs_src/quickstart/documentation/custom_schema.py
new file mode 100644
index 00000000..bddb93a9
--- /dev/null
+++ b/docs/docs_src/quickstart/documentation/custom_schema.py
@@ -0,0 +1,12 @@
+from propan import PropanApp, RabbitBroker
+from propan.asyncapi.main import AsyncAPISchema
+from propan.cli.docs.gen import gen_app_schema_json, gen_app_schema_yaml, get_app_schema
+from propan.cli.docs.serve import get_asyncapi_html
+
+broker = RabbitBroker()
+app = PropanApp(broker)
+
+schema: AsyncAPISchema = get_app_schema(app)
+json_schema = gen_app_schema_json(app)
+yaml_schema = gen_app_schema_yaml(app)
+html = get_asyncapi_html(yaml_schema)
\ No newline at end of file
diff --git a/docs/docs_src/quickstart/documentation/example.py b/docs/docs_src/quickstart/documentation/example.py
new file mode 100644
index 00000000..77acde85
--- /dev/null
+++ b/docs/docs_src/quickstart/documentation/example.py
@@ -0,0 +1,23 @@
+from propan import PropanApp, RabbitBroker
+from propan.brokers.rabbit import RabbitQueue, RabbitExchange, ExchangeType
+
+broker = RabbitBroker()
+app = PropanApp(
+ broker=broker,
+ title="Smartylighting Streetlights Propan API",
+ version="1.0.0",
+ description="""
+ The Smartylighting Streetlights API.
+ ### Check out its awesome features:
+ * Turn a specific streetlight on/off 🌃
+ * Receive real-time information about environmental 📈
+ """
+)
+
+@broker.handle(
+ queue=RabbitQueue("*.info", durable=True),
+ exchange=RabbitExchange("logs", durable=True, type=ExchangeType.TOPIC)
+)
+async def handle_logs(level: int, message: str = ""):
+ """Handle all environmental events"""
+ ...
diff --git a/docs/docs_src/quickstart/documentation/example.yaml b/docs/docs_src/quickstart/documentation/example.yaml
new file mode 100644
index 00000000..1034044b
--- /dev/null
+++ b/docs/docs_src/quickstart/documentation/example.yaml
@@ -0,0 +1,67 @@
+asyncapi: 2.6.0
+defaultContentType: application/json
+info:
+ title: Smartylighting Streetlights Propan API
+ version: 1.0.0
+ description: "\n The Smartylighting Streetlights API.\n ### Check out its\
+ \ awesome features:\n * Turn a specific streetlight on/off \U0001F303\n \
+ \ * Receive real-time information about environmental \U0001F4C8\n "
+servers:
+ dev:
+ url: amqp://guest:guest@localhost:5672/
+ protocol: amqp
+ protocolVersion: 0.9.1
+channels:
+ HandleLogs:
+ servers:
+ - dev
+ bindings:
+ amqp:
+ is: routingKey
+ bindingVersion: 0.2.0
+ queue:
+ name: '*.info'
+ durable: true
+ exclusive: false
+ autoDelete: false
+ vhost: /
+ exchange:
+ name: logs
+ type: topic
+ durable: true
+ autoDelete: false
+ vhost: /
+ subscribe:
+ description: Handle all environmental events
+ bindings:
+ amqp:
+ cc: '*.info'
+ ack: true
+ bindingVersion: 0.2.0
+ message:
+ $ref: '#/components/messages/HandleLogsMessage'
+components:
+ messages:
+ HandleLogsMessage:
+ title: HandleLogsMessage
+ correlationId:
+ location: $message.header#/correlation_id
+ payload:
+ $ref: '#/components/schemas/HandleLogsPayload'
+ schemas:
+ HandleLogsPayload:
+ title: HandleLogsPayload
+ type: object
+ properties:
+ level:
+ title: Level
+ type: integer
+ message:
+ title: Message
+ default: ''
+ type: string
+ required:
+ - level
+ example:
+ level: 4015
+ message: evwWheCeRIGhHEHYxKSJ
diff --git a/docs/docs_src/quickstart/documentation/fastapi.py b/docs/docs_src/quickstart/documentation/fastapi.py
new file mode 100644
index 00000000..98c1cf49
--- /dev/null
+++ b/docs/docs_src/quickstart/documentation/fastapi.py
@@ -0,0 +1,6 @@
+from propan.fastapi import RabbitRouter
+
+router = RabbitRouter(
+ schema_url="/asyncapi",
+ include_in_schema=True,
+)
\ No newline at end of file
diff --git a/docs/includes/integrations/fastapi/after_startup.md b/docs/includes/integrations/fastapi/after_startup.md
new file mode 100644
index 00000000..56d1b829
--- /dev/null
+++ b/docs/includes/integrations/fastapi/after_startup.md
@@ -0,0 +1,24 @@
+=== "Redis"
+ ```python linenums="1" hl_lines="8 12"
+ {!> docs_src/integrations/redis/fastapi_after_startup.py!}
+ ```
+
+=== "RabbitMQ"
+ ```python linenums="1" hl_lines="8 12"
+ {!> docs_src/integrations/rabbit/fastapi_after_startup.py!}
+ ```
+
+=== "Kafka"
+ ```python linenums="1" hl_lines="8 12"
+ {!> docs_src/integrations/kafka/fastapi_after_startup.py!}
+ ```
+
+=== "SQS"
+ ```python linenums="1" hl_lines="8 12"
+ {!> docs_src/integrations/sqs/fastapi_after_startup.py!}
+ ```
+
+=== "NATS"
+ ```python linenums="1" hl_lines="8 12"
+ {!> docs_src/integrations/nats/fastapi_after_startup.py!}
+ ```
\ No newline at end of file
diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml
index e798912b..176982b3 100644
--- a/docs/mkdocs.yml
+++ b/docs/mkdocs.yml
@@ -54,6 +54,7 @@ theme:
plugins:
- search
+ - glightbox
- macros:
include_dir: includes
- i18n:
@@ -104,6 +105,7 @@ extra:
nats_py: https://github.com/nats-io/nats.py
pydantic: https://docs.pydantic.dev/
pytest: https://docs.pytest.org/en/latest/
+ asyncapi: https://www.asyncapi.com/
social:
- icon: fontawesome/brands/github-alt
link: https://github.com/lancetnik/propan
@@ -128,6 +130,7 @@ nav:
- Lifespan: getting_started/6_lifespans.md
- Testing: getting_started/7_testing.md
- Logging: getting_started/8_logging.md
+ - Documentation: getting_started/9_documentation.md
- RabbitMQ:
- Routing: rabbit/1_routing.md
- Exchanges: rabbit/2_exchanges.md
diff --git a/examples/grpc/gen_py_code.sh b/examples/grpc/gen_py_code.sh
new file mode 100644
index 00000000..ab5b2fc8
--- /dev/null
+++ b/examples/grpc/gen_py_code.sh
@@ -0,0 +1 @@
+python -m grpc_tools.protoc --python_out=. --pyi_out=. -I . message.proto
\ No newline at end of file
diff --git a/examples/grpc/grpc_encoding.py b/examples/grpc/grpc_encoding.py
new file mode 100644
index 00000000..1542319a
--- /dev/null
+++ b/examples/grpc/grpc_encoding.py
@@ -0,0 +1,25 @@
+from message_pb2 import Person
+
+from propan import PropanApp, RabbitBroker
+from propan.annotations import Logger, NoCast
+from propan.brokers.rabbit import RabbitMessage
+
+broker = RabbitBroker()
+app = PropanApp(broker)
+
+
+async def decode_message(msg: RabbitMessage, *args) -> Person:
+ decoded = Person()
+ decoded.ParseFromString(msg.body)
+ return decoded
+
+
+@broker.handle("test", decode_message=decode_message)
+async def consume(body: NoCast[Person], logger: Logger):
+ logger.info(body)
+
+
+@app.after_startup
+async def publish():
+ body = Person(name="john", age=25).SerializeToString()
+ await broker.publish(body, "test")
diff --git a/examples/grpc/message.proto b/examples/grpc/message.proto
new file mode 100644
index 00000000..91552f70
--- /dev/null
+++ b/examples/grpc/message.proto
@@ -0,0 +1,6 @@
+syntax = "proto3";
+
+message Person {
+ string name = 1;
+ float age = 2;
+}
\ No newline at end of file
diff --git a/examples/grpc/requirements.txt b/examples/grpc/requirements.txt
new file mode 100644
index 00000000..4a2666eb
--- /dev/null
+++ b/examples/grpc/requirements.txt
@@ -0,0 +1 @@
+grpcio-tools
\ No newline at end of file
diff --git a/examples/http_frameworks_integrations/native_fastapi.py b/examples/http_frameworks_integrations/native_fastapi.py
index 3bfebdcd..366cfaa1 100644
--- a/examples/http_frameworks_integrations/native_fastapi.py
+++ b/examples/http_frameworks_integrations/native_fastapi.py
@@ -3,10 +3,10 @@
from propan.fastapi import RabbitRouter
-app = FastAPI()
-
router = RabbitRouter("amqp://guest:guest@localhost:5672")
+app = FastAPI(lifespan=router.lifespan_context)
+
class Incoming(BaseModel):
m: dict
diff --git a/propan/__about__.py b/propan/__about__.py
index 35b4f3b4..99fbe0ff 100644
--- a/propan/__about__.py
+++ b/propan/__about__.py
@@ -2,7 +2,7 @@
from unittest.mock import Mock
-__version__ = "0.1.2.17"
+__version__ = "0.1.3.0"
INSTALL_MESSAGE = (
diff --git a/propan/__init__.py b/propan/__init__.py
index 742cd578..393ccc9b 100644
--- a/propan/__init__.py
+++ b/propan/__init__.py
@@ -45,9 +45,10 @@
## context
"context",
"Context",
- "ContextRepo" "Alias",
+ "ContextRepo",
"Depends",
# brokers
+ "PropanMessage",
"NatsBroker",
"RabbitBroker",
"RedisBroker",
diff --git a/propan/annotations.py b/propan/annotations.py
index 50533ddd..198c4ea4 100644
--- a/propan/annotations.py
+++ b/propan/annotations.py
@@ -1,16 +1,19 @@
import logging
-from typing_extensions import Annotated
+from typing_extensions import Annotated, TypeVar
from propan import __about__ as about
from propan.cli.app import PropanApp
from propan.utils.context import Context as ContextField
from propan.utils.context import ContextRepo as CR
+from propan.utils.no_cast import NoCast as NC
Logger = Annotated[logging.Logger, ContextField("logger")]
App = Annotated[PropanApp, ContextField("app")]
ContextRepo = Annotated[CR, ContextField("context")]
+NoCastType = TypeVar("NoCastType")
+NoCast = Annotated[NoCastType, NC()]
try:
import aio_pika
diff --git a/propan/asyncapi/__init__.py b/propan/asyncapi/__init__.py
new file mode 100644
index 00000000..cff29e20
--- /dev/null
+++ b/propan/asyncapi/__init__.py
@@ -0,0 +1,32 @@
+from propan.asyncapi.bindings import AsyncAPIChannelBinding
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.info import AsyncAPIContact, AsyncAPIInfo, AsyncAPILicense
+from propan.asyncapi.main import ASYNC_API_VERSION, AsyncAPIComponents, AsyncAPISchema
+from propan.asyncapi.message import AsyncAPIMessage
+from propan.asyncapi.security import AsyncAPISecuritySchemeComponent
+from propan.asyncapi.servers import AsyncAPIServer
+from propan.asyncapi.utils import AsyncAPIExternalDocs, AsyncAPITag
+
+__all__ = (
+ # main
+ "ASYNC_API_VERSION",
+ "AsyncAPISchema",
+ "AsyncAPIComponents",
+ # info
+ "AsyncAPIInfo",
+ "AsyncAPIContact",
+ "AsyncAPILicense",
+ # servers
+ "AsyncAPIServer",
+ # channels
+ "AsyncAPIChannel",
+ # utils
+ "AsyncAPITag",
+ "AsyncAPIExternalDocs",
+ # bindings
+ "AsyncAPIChannelBinding",
+ # messages
+ "AsyncAPIMessage",
+ # security
+ "AsyncAPISecuritySchemeComponent",
+)
diff --git a/propan/asyncapi/bindings/__init__.py b/propan/asyncapi/bindings/__init__.py
new file mode 100644
index 00000000..42f4abb2
--- /dev/null
+++ b/propan/asyncapi/bindings/__init__.py
@@ -0,0 +1,9 @@
+from propan.asyncapi.bindings.main import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+)
+
+__all__ = (
+ "AsyncAPIChannelBinding",
+ "AsyncAPIOperationBinding",
+)
diff --git a/propan/asyncapi/bindings/amqp.py b/propan/asyncapi/bindings/amqp.py
new file mode 100644
index 00000000..8ea90340
--- /dev/null
+++ b/propan/asyncapi/bindings/amqp.py
@@ -0,0 +1,57 @@
+from typing import Optional
+
+from pydantic import BaseModel, Field
+from typing_extensions import Literal
+
+from propan.types import AnyDict
+
+
+class AsyncAPIAmqpQueue(BaseModel):
+ name: str
+ durable: bool
+ exclusive: bool
+ auto_delete: bool = Field(alias="autoDelete")
+ vhost: str = "/"
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPIAmqpExchange(BaseModel):
+ name: Optional[str] = None
+ type: Literal["default", "direct", "topic", "fanout", "headers"]
+ durable: Optional[bool] = None
+ auto_delete: Optional[bool] = Field(
+ default=None,
+ alias="autoDelete",
+ )
+ vhost: str = "/"
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPIAmqpChannelBinding(BaseModel):
+ is_: Literal["queue", "routingKey"] = Field(..., alias="is")
+ version: str = Field(
+ default="0.2.0",
+ alias="bindingVersion",
+ )
+ queue: Optional[AsyncAPIAmqpQueue] = None
+ exchange: Optional[AsyncAPIAmqpExchange] = None
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPIAmqpOperationBinding(BaseModel):
+ cc: Optional[str] = None
+ ack: bool = True
+ reply_to: Optional[AnyDict] = Field(default=None, alias="replyTo")
+ version: str = Field(
+ default="0.2.0",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/bindings/kafka.py b/propan/asyncapi/bindings/kafka.py
new file mode 100644
index 00000000..b6ff2f98
--- /dev/null
+++ b/propan/asyncapi/bindings/kafka.py
@@ -0,0 +1,41 @@
+from typing import List, Optional
+
+from pydantic import BaseModel, Field
+
+from propan.types import AnyDict
+
+
+class AsyncAPIKafkaChannelBinding(BaseModel):
+ topic: List[str]
+ partitions: Optional[int] = None
+ replicas: Optional[int] = None
+ # TODO:
+ # topicConfiguration
+ version: str = Field(
+ default="0.4.0",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPIKafkaOperationBinding(BaseModel):
+ group_id: Optional[AnyDict] = Field(
+ default=None,
+ alias="groupId",
+ )
+ client_id: Optional[AnyDict] = Field(
+ default=None,
+ alias="clientId",
+ )
+
+ reply_to: Optional[AnyDict] = Field(default=None, alias="replyTo")
+
+ version: str = Field(
+ default="0.4.0",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/bindings/main.py b/propan/asyncapi/bindings/main.py
new file mode 100644
index 00000000..577c90d0
--- /dev/null
+++ b/propan/asyncapi/bindings/main.py
@@ -0,0 +1,40 @@
+from typing import Optional
+
+from pydantic import BaseModel
+
+from propan.asyncapi.bindings.amqp import (
+ AsyncAPIAmqpChannelBinding,
+ AsyncAPIAmqpOperationBinding,
+)
+from propan.asyncapi.bindings.kafka import (
+ AsyncAPIKafkaChannelBinding,
+ AsyncAPIKafkaOperationBinding,
+)
+from propan.asyncapi.bindings.nats import (
+ AsyncAPINatsChannelBinding,
+ AsyncAPINatsOperationBinding,
+)
+from propan.asyncapi.bindings.redis import (
+ AsyncAPIRedisChannelBinding,
+ AsyncAPIRedisOperationBinding,
+)
+from propan.asyncapi.bindings.sqs import (
+ AsyncAPISQSChannelBinding,
+ AsyncAPISQSOperationBinding,
+)
+
+
+class AsyncAPIChannelBinding(BaseModel):
+ amqp: Optional[AsyncAPIAmqpChannelBinding] = None
+ kafka: Optional[AsyncAPIKafkaChannelBinding] = None
+ sqs: Optional[AsyncAPISQSChannelBinding] = None
+ nats: Optional[AsyncAPINatsChannelBinding] = None
+ redis: Optional[AsyncAPIRedisChannelBinding] = None
+
+
+class AsyncAPIOperationBinding(BaseModel):
+ amqp: Optional[AsyncAPIAmqpOperationBinding] = None
+ kafka: Optional[AsyncAPIKafkaOperationBinding] = None
+ sqs: Optional[AsyncAPISQSOperationBinding] = None
+ nats: Optional[AsyncAPINatsOperationBinding] = None
+ redis: Optional[AsyncAPIRedisOperationBinding] = None
diff --git a/propan/asyncapi/bindings/nats.py b/propan/asyncapi/bindings/nats.py
new file mode 100644
index 00000000..5a57e54a
--- /dev/null
+++ b/propan/asyncapi/bindings/nats.py
@@ -0,0 +1,28 @@
+from typing import Optional
+
+from pydantic import BaseModel, Field
+
+from propan.types import AnyDict
+
+
+class AsyncAPINatsChannelBinding(BaseModel):
+ subject: str
+ queue: Optional[str] = None
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPINatsOperationBinding(BaseModel):
+ reply_to: Optional[AnyDict] = Field(default=None, alias="replyTo")
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/bindings/redis.py b/propan/asyncapi/bindings/redis.py
new file mode 100644
index 00000000..41e5c39f
--- /dev/null
+++ b/propan/asyncapi/bindings/redis.py
@@ -0,0 +1,29 @@
+from typing import Optional
+
+from pydantic import BaseModel, Field
+from typing_extensions import Literal
+
+from propan.types import AnyDict
+
+
+class AsyncAPIRedisChannelBinding(BaseModel):
+ channel: str
+ method: Literal["ssubscribe", "psubscribe", "subscribe"] = "subscribe"
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPIRedisOperationBinding(BaseModel):
+ reply_to: Optional[AnyDict] = Field(default=None, alias="replyTo")
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/bindings/sqs.py b/propan/asyncapi/bindings/sqs.py
new file mode 100644
index 00000000..cdd51a5c
--- /dev/null
+++ b/propan/asyncapi/bindings/sqs.py
@@ -0,0 +1,28 @@
+from typing import Optional
+
+from pydantic import BaseModel, Field
+
+from propan.types import AnyDict
+
+
+class AsyncAPISQSChannelBinding(BaseModel):
+ queue: AnyDict
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPISQSOperationBinding(BaseModel):
+ reply_to: Optional[AnyDict] = Field(default=None, alias="replyTo")
+
+ version: str = Field(
+ default="custom",
+ alias="bindingVersion",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/channels.py b/propan/asyncapi/channels.py
new file mode 100644
index 00000000..7d1d1c55
--- /dev/null
+++ b/propan/asyncapi/channels.py
@@ -0,0 +1,24 @@
+from typing import List, Optional
+
+from pydantic import BaseModel
+
+from propan.asyncapi.bindings import AsyncAPIChannelBinding, AsyncAPIOperationBinding
+from propan.asyncapi.subscription import AsyncAPISubscription
+
+
+class AsyncAPIPublish(BaseModel):
+ bindings: Optional[AsyncAPIOperationBinding] = None
+
+
+class AsyncAPIChannelParameters(BaseModel):
+ # TODO
+ ...
+
+
+class AsyncAPIChannel(BaseModel):
+ description: Optional[str] = None
+ servers: Optional[List[str]] = None
+ bindings: Optional[AsyncAPIChannelBinding] = None
+ subscribe: Optional[AsyncAPISubscription] = None
+ publish: Optional[AsyncAPIPublish] = None
+ parameters: Optional[AsyncAPIChannelParameters] = None
diff --git a/propan/asyncapi/info.py b/propan/asyncapi/info.py
new file mode 100644
index 00000000..22a05807
--- /dev/null
+++ b/propan/asyncapi/info.py
@@ -0,0 +1,35 @@
+import importlib.util
+from typing import Optional
+
+from pydantic import BaseModel, Field, HttpUrl
+
+if importlib.util.find_spec("email_validator"):
+ from pydantic import EmailStr
+else:
+ EmailStr = str
+
+
+class AsyncAPIContact(BaseModel):
+ name: str
+ url: HttpUrl
+ email: Optional[EmailStr] = None
+
+
+class AsyncAPILicense(BaseModel):
+ name: str
+ url: HttpUrl
+
+
+class AsyncAPIInfo(BaseModel):
+ title: str
+ version: str = "1.0.0"
+ description: str = ""
+ terms: Optional[HttpUrl] = Field(
+ default=None,
+ alias="termsOfService",
+ )
+ contact: Optional[AsyncAPIContact] = None
+ license: Optional[AsyncAPILicense] = None
+
+ class Config:
+ allow_population_by_field_name = True
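
A quick sketch of filling the info block with these models (all values below are placeholders):

```python
# Illustrative only: the info section of a generated AsyncAPI document.
from propan.asyncapi.info import AsyncAPIContact, AsyncAPIInfo, AsyncAPILicense

info = AsyncAPIInfo(
    title="ExampleService",
    version="0.1.0",
    description="Demo service",
    contact=AsyncAPIContact(name="maintainer", url="https://example.com"),
    license=AsyncAPILicense(name="MIT", url="https://opensource.org/licenses/MIT"),
)
print(info.json(by_alias=True, exclude_none=True))
```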
diff --git a/propan/asyncapi/main.py b/propan/asyncapi/main.py
new file mode 100644
index 00000000..c88f644f
--- /dev/null
+++ b/propan/asyncapi/main.py
@@ -0,0 +1,57 @@
+from typing import Dict, List, Optional
+
+from pydantic import BaseModel, Field
+
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.info import AsyncAPIInfo
+from propan.asyncapi.message import AsyncAPIMessage
+from propan.asyncapi.servers import AsyncAPIServer
+from propan.asyncapi.utils import AsyncAPIExternalDocs, AsyncAPITag
+from propan.brokers._model.schemas import ContentTypes
+from propan.types import AnyDict
+
+ASYNC_API_VERSION = "2.6.0"
+
+
+class AsyncAPIComponents(BaseModel):
+ # TODO
+ # servers
+ # serverVariables
+ # channels
+ messages: Optional[Dict[str, AsyncAPIMessage]] = None
+ schemas: Optional[Dict[str, AnyDict]] = None
+
+ # securitySchemes
+ # parameters
+ # correlationIds
+ # operationTraits
+ # messageTraits
+ # serverBindings
+ # channelBindings
+ # operationBindings
+ # messageBindings
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPISchema(BaseModel):
+ asyncapi: str = ASYNC_API_VERSION
+ default_content_type: str = Field(
+ default=ContentTypes.json.value,
+ alias="defaultContentType",
+ )
+ info: AsyncAPIInfo
+ servers: Optional[Dict[str, AsyncAPIServer]] = None
+ channels: Dict[str, AsyncAPIChannel]
+ tags: Optional[List[AsyncAPITag]] = None
+ external_docs: Optional[AsyncAPIExternalDocs] = Field(
+ default=None,
+ alias="externalDocs",
+ )
+
+ # TODO:
+ # id
+ components: Optional[AsyncAPIComponents] = None
+
+ class Config:
+ allow_population_by_field_name = True
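
A sketch of assembling a complete schema object from the models above; the service title, server URL, and channel are placeholders:

```python
# Illustrative only: a top-level AsyncAPI document built from these pydantic models.
from propan.asyncapi.channels import AsyncAPIChannel
from propan.asyncapi.info import AsyncAPIInfo
from propan.asyncapi.main import AsyncAPISchema
from propan.asyncapi.servers import AsyncAPIServer

schema = AsyncAPISchema(
    info=AsyncAPIInfo(title="ExampleService"),
    servers={"dev": AsyncAPIServer(url="amqp://guest:guest@localhost:5672/", protocol="amqp")},
    channels={"HandleMsg": AsyncAPIChannel(description="hypothetical channel")},
)

# defaultContentType, externalDocs, etc. are serialized under their spec names via by_alias=True
print(schema.json(by_alias=True, exclude_none=True))
```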
diff --git a/propan/asyncapi/message.py b/propan/asyncapi/message.py
new file mode 100644
index 00000000..cb40908a
--- /dev/null
+++ b/propan/asyncapi/message.py
@@ -0,0 +1,46 @@
+from typing import Any, Dict, List, Optional
+
+from pydantic import BaseModel, Field
+
+from propan.asyncapi.utils import AsyncAPIExternalDocs, AsyncAPITag
+
+
+class AsyncAPICorrelationId(BaseModel):
+ description: Optional[str] = None
+ location: str
+
+
+class AsyncAPIMessage(BaseModel):
+ title: Optional[str] = None
+ name: Optional[str] = None
+ summary: Optional[str] = None
+ description: Optional[str] = None
+ message_id: Optional[str] = Field(
+ default=None,
+ alias="messageId",
+ )
+ correlation_id: Optional[AsyncAPICorrelationId] = Field(
+ default=None,
+ alias="correlationId",
+ )
+ content_type: Optional[str] = Field(
+ default=None,
+ alias="contentType",
+ )
+
+ payload: Dict[str, Any]
+ # TODO:
+ # headers
+ # schemaFormat
+ # bindings
+ # examples
+ # traits
+
+ tags: Optional[List[AsyncAPITag]] = None
+ external_docs: Optional[AsyncAPIExternalDocs] = Field(
+ default=None,
+ alias="externalDocs",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/security.py b/propan/asyncapi/security.py
new file mode 100644
index 00000000..b469befb
--- /dev/null
+++ b/propan/asyncapi/security.py
@@ -0,0 +1,73 @@
+from typing import Dict, Optional
+
+from pydantic import BaseModel, Field, HttpUrl
+from typing_extensions import Literal
+
+
+class AsyncAPIOauthFlowObj(BaseModel):
+ authorization_url: Optional[HttpUrl] = Field(
+ default=None,
+ alias="authorizationUrl",
+ )
+ token_url: Optional[HttpUrl] = Field(
+ default=None,
+ alias="tokenUrl",
+ )
+ refresh_url: Optional[HttpUrl] = Field(
+ default=None,
+ alias="refreshUrl",
+ )
+ scopes: Dict[str, str]
+
+
+class AsyncAPIOauthFlows(BaseModel):
+ implicit: Optional[AsyncAPIOauthFlowObj] = None
+ password: Optional[AsyncAPIOauthFlowObj] = None
+ client_credentials: Optional[AsyncAPIOauthFlowObj] = Field(
+ default=None,
+ alias="clientCredentials",
+ )
+ authorization_code: Optional[AsyncAPIOauthFlowObj] = Field(
+ default=None,
+ alias="authorizationCode",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+class AsyncAPISecuritySchemeComponent(BaseModel):
+ type: Literal[
+ "userPassword",
+ "apikey",
+ "X509",
+ "symmetricEncryption",
+ "asymmetricEncryption",
+ "httpApiKey",
+ "http",
+ "oauth2",
+ "openIdConnect",
+ "plain",
+ "scramSha256",
+ "scramSha512",
+ "gssapi",
+ ]
+ name: Optional[str] = None
+ description: Optional[str] = None
+ in_: Optional[str] = Field(
+ default=None,
+ alias="in",
+ )
+ scheme: Optional[str] = None
+ bearer_format: Optional[str] = Field(
+ default=None,
+ alias="bearerFormat",
+ )
+ openid_connect_url: Optional[str] = Field(
+ default=None,
+ alias="openIdConnectUrl",
+ )
+ flows: Optional[AsyncAPIOauthFlows] = None
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/servers.py b/propan/asyncapi/servers.py
new file mode 100644
index 00000000..9842750d
--- /dev/null
+++ b/propan/asyncapi/servers.py
@@ -0,0 +1,24 @@
+from typing import Dict, List, Optional
+
+from pydantic import BaseModel, Field
+
+from propan.asyncapi.utils import AsyncAPITag
+
+
+class AsyncAPIServer(BaseModel):
+ url: str
+ protocol: str
+ description: Optional[str] = None
+ protocol_version: Optional[str] = Field(
+ default=None,
+ alias="protocolVersion",
+ )
+ tags: Optional[List[AsyncAPITag]] = None
+ security: Optional[Dict[str, List[str]]] = None
+
+ # TODO:
+ # variables
+ # bindings
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/subscription.py b/propan/asyncapi/subscription.py
new file mode 100644
index 00000000..b72b314e
--- /dev/null
+++ b/propan/asyncapi/subscription.py
@@ -0,0 +1,35 @@
+from typing import Dict, List, Optional
+
+from pydantic import BaseModel, Field
+
+from propan.asyncapi.bindings import AsyncAPIOperationBinding
+from propan.asyncapi.message import AsyncAPIMessage
+from propan.asyncapi.utils import AsyncAPIExternalDocs, AsyncAPITag
+
+
+class AsyncAPISubscription(BaseModel):
+ operation_id: Optional[str] = Field(
+ default=None,
+ alias="operationId",
+ )
+
+ summary: Optional[str] = None
+ description: Optional[str] = None
+
+ bindings: Optional[AsyncAPIOperationBinding] = None
+
+ message: AsyncAPIMessage
+
+ security: Optional[Dict[str, List[str]]] = None
+
+ # TODO
+ # traits
+
+ tags: Optional[List[AsyncAPITag]] = None
+ external_docs: Optional[AsyncAPIExternalDocs] = Field(
+ default=None,
+ alias="externalDocs",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
diff --git a/propan/asyncapi/utils.py b/propan/asyncapi/utils.py
new file mode 100644
index 00000000..9596352c
--- /dev/null
+++ b/propan/asyncapi/utils.py
@@ -0,0 +1,49 @@
+import json
+import sys
+from typing import Optional, Type
+
+from pydantic import BaseModel, Field
+
+
+class AsyncAPIExternalDocs(BaseModel):
+ url: str = ""
+ description: str = ""
+
+
+class AsyncAPITag(BaseModel):
+ name: str
+ description: str = ""
+ external_docs: Optional[AsyncAPIExternalDocs] = Field(
+ default=None,
+ alias="externalDocs",
+ )
+
+ class Config:
+ allow_population_by_field_name = True
+
+
+def add_example_to_model(model: Type[BaseModel]) -> Type[BaseModel]:
+ if sys.version_info >= (3, 8):
+ from polyfactory.factories.pydantic_factory import ModelFactory
+
+ factory = type(
+ f"{model.__name__}_factory", (ModelFactory,), {"__model__": model}
+ )
+
+ return type(
+ model.__name__,
+ (model,),
+ {
+ "Config": type(
+ "Config",
+ (model.Config,),
+ {
+ "schema_extra": {
+ "example": json.loads(factory.build().json()),
+ },
+ },
+ )
+ },
+ )
+ else: # pragma: no cover
+ return model
diff --git a/propan/brokers/__init__.py b/propan/brokers/__init__.py
index e69de29b..ad9239a1 100644
--- a/propan/brokers/__init__.py
+++ b/propan/brokers/__init__.py
@@ -0,0 +1,3 @@
+from propan.brokers._model.schemas import PropanMessage
+
+__all__ = ("PropanMessage",)
diff --git a/propan/brokers/_model/broker_usecase.py b/propan/brokers/_model/broker_usecase.py
index 4527cf5c..bda310a2 100644
--- a/propan/brokers/_model/broker_usecase.py
+++ b/propan/brokers/_model/broker_usecase.py
@@ -2,11 +2,13 @@
import logging
from abc import ABC, abstractmethod
from functools import wraps
+from itertools import chain
from typing import (
Any,
Awaitable,
Callable,
Dict,
+ Generic,
List,
Mapping,
Optional,
@@ -14,15 +16,16 @@
Tuple,
TypeVar,
Union,
+ cast,
)
from fast_depends.construct import get_dependant
-from fast_depends.model import Dependant
+from fast_depends.model import Dependant, Depends
from fast_depends.utils import args_to_kwargs
-from pydantic.fields import ModelField
-from typing_extensions import Self
+from typing_extensions import Self, TypeAlias
from propan.brokers._model.schemas import (
+ BaseHandler,
ContentType,
ContentTypes,
PropanMessage,
@@ -37,26 +40,40 @@
from propan.brokers.exceptions import SkipMessage
from propan.brokers.push_back_watcher import BaseWatcher
from propan.log import access_logger
-from propan.types import (
- AnyCallable,
- AnyDict,
- DecodedMessage,
- DecoratedAsync,
- HandlerWrapper,
- SendableMessage,
- Wrapper,
-)
+from propan.types import AnyDict, DecodedMessage, HandlerWrapper, SendableMessage
from propan.utils import apply_types, context
-from propan.utils.functions import get_function_arguments, to_async
-
-T = TypeVar("T")
-
-
-class BrokerUsecase(ABC):
+from propan.utils.functions import get_function_positional_arguments, to_async
+
+T = TypeVar("T", bound=DecodedMessage)
+
+MsgType = TypeVar("MsgType")
+ConnectionType = TypeVar("ConnectionType")
+
+CustomParser: TypeAlias = Optional[
+ Callable[
+ [MsgType, Callable[[MsgType], Awaitable[PropanMessage[MsgType]]]],
+ Awaitable[PropanMessage[MsgType]],
+ ]
+]
+CustomDecoder: TypeAlias = Optional[
+ Callable[
+ [
+ PropanMessage[MsgType],
+ Callable[[PropanMessage[MsgType]], Awaitable[DecodedMessage]],
+ ],
+ Awaitable[DecodedMessage],
+ ]
+]
+
+
+class BrokerUsecase(ABC, Generic[MsgType, ConnectionType]):
logger: Optional[logging.Logger]
log_level: int
- handlers: List[Any]
- _connection: Any
+ handlers: Sequence[BaseHandler]
+ dependencies: Sequence[Depends]
+ _global_parser: CustomParser[MsgType]
+ _global_decoder: CustomDecoder[MsgType]
+ _connection: Optional[ConnectionType]
_fmt: Optional[str]
def __init__(
@@ -66,6 +83,13 @@ def __init__(
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = "%(asctime)s %(levelname)s - %(message)s",
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[MsgType] = None,
+ parse_message: CustomParser[MsgType] = None,
+ # AsyncAPI
+ protocol: str = "",
+ protocol_version: Optional[str] = None,
+ url_: Union[str, List[str]] = "",
**kwargs: Any,
) -> None:
self.logger = logger
@@ -75,16 +99,24 @@ def __init__(
self._connection = None
self._is_apply_types = apply_types
self.handlers = []
+ self.dependencies = dependencies
self._connection_args = args
self._connection_kwargs = kwargs
+ self._global_parser = parse_message
+ self._global_decoder = decode_message
+
context.set_global("logger", logger)
context.set_global("broker", self)
- async def connect(self, *args: Any, **kwargs: Any) -> Any:
+ self.protocol = protocol
+ self.protocol_version = protocol_version
+ self.url = url_
+
+ async def connect(self, *args: Any, **kwargs: Any) -> ConnectionType:
if self._connection is None:
- arguments = get_function_arguments(self.__init__) # type: ignore
+ arguments = get_function_positional_arguments(self.__init__) # type: ignore
init_kwargs = args_to_kwargs(
arguments,
*self._connection_args,
@@ -96,7 +128,7 @@ async def connect(self, *args: Any, **kwargs: Any) -> Any:
return self._connection
@abstractmethod
- async def _connect(self, **kwargs: Any) -> Any:
+ async def _connect(self, **kwargs: Any) -> ConnectionType:
raise NotImplementedError()
@abstractmethod
@@ -117,20 +149,20 @@ async def close(self) -> None:
raise NotImplementedError()
@abstractmethod
- async def _parse_message(self, message: Any) -> PropanMessage:
+ async def _parse_message(self, message: MsgType) -> PropanMessage[MsgType]:
raise NotImplementedError()
@abstractmethod
def _process_message(
self,
- func: Callable[[PropanMessage], T],
+ func: Callable[[PropanMessage[MsgType]], Awaitable[T]],
watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]:
+ ) -> Callable[[PropanMessage[MsgType]], Awaitable[T]]:
raise NotImplementedError()
def _get_log_context(
self,
- message: Optional[PropanMessage],
+ message: Optional[PropanMessage[MsgType]],
**kwargs: Dict[str, str],
) -> Dict[str, Any]:
return {
@@ -142,13 +174,18 @@ def handle(
self,
*broker_args: Any,
retry: Union[bool, int] = False,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[MsgType] = None,
+ parse_message: CustomParser[MsgType] = None,
+ description: str = "",
_raw: bool = False,
+ _get_dependant: Callable[[Callable[..., Any]], Dependant] = get_dependant,
**broker_kwargs: Any,
) -> HandlerWrapper:
raise NotImplementedError()
@staticmethod
- async def _decode_message(message: PropanMessage) -> DecodedMessage:
+ async def _decode_message(message: PropanMessage[MsgType]) -> DecodedMessage:
body = message.body
m: DecodedMessage = body
if message.content_type is not None:
@@ -181,48 +218,83 @@ async def __aexit__(self, *args: Any, **kwargs: Any) -> None:
def _wrap_handler(
self,
- func: AnyCallable,
+ func: Union[Callable[..., T], Callable[..., Awaitable[T]]],
retry: Union[bool, int] = False,
+ extra_dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[MsgType] = None,
+ parse_message: CustomParser[MsgType] = None,
_raw: bool = False,
- **broker_args: Any,
- ) -> DecoratedAsync:
- dependant: Dependant = get_dependant(path="", call=func)
+ _get_dependant: Callable[..., Dependant] = get_dependant,
+ **broker_log_context_kwargs: Any,
+ ) -> Tuple[Callable[[MsgType, bool], Awaitable[Optional[T]]], Dependant]:
+ dependant = _get_dependant(path="", call=func)
+ extra = [
+ _get_dependant(path="", call=d.dependency)
+ for d in chain(extra_dependencies, self.dependencies)
+ ]
+ dependant.dependencies.extend(extra)
+
+ if getattr(dependant, "flat_params", None) is None: # handle FastAPI Dependant
+ params = dependant.path_params + dependant.body_params
+
+ for d in dependant.dependencies:
+ params.extend(d.path_params + d.body_params)
- f = to_async(func)
+ params_unique = []
+ for p in params:
+ if p not in params_unique:
+ params_unique.append(p)
+
+ dependant.flat_params = params_unique
+
+ f = cast(Callable[..., Awaitable[T]], to_async(func))
if self._is_apply_types is True:
- f = apply_types(f)
+ f = apply_types(
+ func=f,
+ wrap_dependant=extend_dependencies(extra),
+ )
f = self._wrap_decode_message(
- f,
+ func=f,
_raw=_raw,
- params=dependant.real_params,
+ params=tuple(chain(dependant.flat_params, extra)),
+ decoder=decode_message or self._global_decoder,
)
if self.logger is not None:
- f = self._log_execution(**broker_args)(f)
+ f = self._log_execution(**broker_log_context_kwargs)(f)
- f = self._process_message(f, get_watcher(self.logger, retry))
+ f = self._process_message(
+ func=f,
+ watcher=get_watcher(self.logger, retry),
+ )
- f = self._wrap_parse_message(f)
+ f = self._wrap_parse_message(
+ func=f,
+ parser=parse_message or self._global_parser,
+ )
f = set_message_context(f)
- f = suppress_decor(f)
-
- return f
+ return suppress_decor(f), dependant
def _wrap_decode_message(
self,
func: Callable[..., Awaitable[T]],
- params: Sequence[ModelField] = (),
+ decoder: CustomDecoder[MsgType],
+ params: Sequence[Any] = (),
_raw: bool = False,
- ) -> Callable[[PropanMessage], Awaitable[T]]:
+ ) -> Callable[[PropanMessage[MsgType]], Awaitable[T]]:
is_unwrap = len(params) > 1
@wraps(func)
- async def wrapper(message: PropanMessage) -> T:
- msg = await self._decode_message(message)
+ async def wrapper(message: PropanMessage[MsgType]) -> T:
+ if decoder is not None:
+ msg = await decoder(message, self._decode_message)
+ else:
+ msg = await self._decode_message(message)
+
message.decoded_body = msg
if _raw is True:
@@ -237,23 +309,32 @@ async def wrapper(message: PropanMessage) -> T:
return wrapper
def _wrap_parse_message(
- self, func: Callable[[PropanMessage], Awaitable[T]]
- ) -> Callable[[Any], Awaitable[T]]:
+ self,
+ func: Callable[..., Awaitable[T]],
+ parser: CustomParser[MsgType],
+ ) -> Callable[[MsgType], Awaitable[T]]:
@wraps(func)
async def parse_wrapper(message: Any) -> T:
- return await func(await self._parse_message(message))
+ if parser is not None:
+ m = await parser(message, self._parse_message)
+ else:
+ m = await self._parse_message(message)
+ return await func(m)
return parse_wrapper
def _log_execution(
self,
**broker_args: Any,
- ) -> Wrapper:
+ ) -> Callable[
+ [Callable[[PropanMessage[MsgType]], Awaitable[T]]],
+ Callable[[PropanMessage[MsgType]], Awaitable[T]],
+ ]:
def decor(
- func: Callable[[PropanMessage], Awaitable[T]]
- ) -> Callable[[PropanMessage], Awaitable[T]]:
+ func: Callable[[PropanMessage[MsgType]], Awaitable[T]]
+ ) -> Callable[[PropanMessage[MsgType]], Awaitable[T]]:
@wraps(func)
- async def log_wrapper(message: PropanMessage) -> T:
+ async def log_wrapper(message: PropanMessage[MsgType]) -> T:
log_context = self._get_log_context(message=message, **broker_args)
with context.scope("log_context", log_context):
@@ -285,3 +366,11 @@ def _log(
self.logger.log(
level=(log_level or self.log_level), msg=message, extra=extra
)
+
+
+def extend_dependencies(extra: Sequence[Dependant]) -> Callable[[Dependant], Dependant]:
+ def dependant_wrapper(dependant: Dependant) -> Dependant:
+ dependant.dependencies.extend(extra)
+ return dependant
+
+ return dependant_wrapper
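
A sketch of the hook shapes implied by the `CustomParser` and `CustomDecoder` aliases above; the post-processing here is arbitrary:

```python
# Illustrative only: custom parse/decode hooks that delegate to the broker defaults.
from propan.brokers import PropanMessage


async def parse(msg, default_parser):
    # `msg` is the raw broker message; use the default parser, then post-process the result
    propan_msg: PropanMessage = await default_parser(msg)
    propan_msg.headers.setdefault("parsed-by", "custom-parser")
    return propan_msg


async def decode(msg: PropanMessage, default_decoder):
    # fall back to the content-type based decoding implemented in BrokerUsecase._decode_message
    return await default_decoder(msg)
```

Either hook can be passed to the broker constructor (`parse_message=`, `decode_message=`) or to an individual `handle()` call, as the signatures in this patch show.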
diff --git a/propan/brokers/_model/schemas.py b/propan/brokers/_model/schemas.py
index de677b46..9a60f64a 100644
--- a/propan/brokers/_model/schemas.py
+++ b/propan/brokers/_model/schemas.py
@@ -1,21 +1,148 @@
import json
-from dataclasses import dataclass
+from abc import abstractmethod
+from dataclasses import dataclass, field
from enum import Enum
-from typing import Any, Dict, Optional, Sequence, Tuple, Union
+from typing import Any, Dict, Generic, Optional, Sequence, Tuple, TypeVar, Union
from uuid import uuid4
-from pydantic import BaseModel, Field, Json
+from fast_depends.model import Dependant
+from pydantic import BaseModel, Field, Json, create_model
from pydantic.dataclasses import dataclass as pydantic_dataclass
from typing_extensions import TypeAlias, assert_never
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.utils import add_example_to_model
from propan.types import AnyDict, DecodedMessage, DecoratedCallable, SendableMessage
ContentType: TypeAlias = str
+Msg = TypeVar("Msg")
+
@dataclass
class BaseHandler:
callback: DecoratedCallable
+ dependant: Dependant
+ _description: str = field(default="", kw_only=True) # type: ignore
+
+ @abstractmethod
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ raise NotImplementedError()
+
+ @property
+ def title(self) -> str:
+ return self.callback.__name__.replace("_", " ").title().replace(" ", "")
+
+ @property
+ def description(self) -> Optional[str]:
+ return self._description or self.callback.__doc__
+
+ def get_message_object(self) -> Tuple[str, AnyDict, Optional[AnyDict]]:
+ import jsonref # imported lazily to keep the docs dependencies out of the main package
+
+ dependant = self.dependant
+
+ if getattr(dependant, "return_field", None) is not None:
+ return_field = dependant.return_field
+
+ if issubclass(return_field.type_, BaseModel):
+ return_model = return_field.type_
+ if not return_model.Config.schema_extra.get("example"):
+ return_model = add_example_to_model(return_model)
+ return_info = jsonref.replace_refs(
+ return_model.schema(), jsonschema=True, proxies=False
+ )
+ return_info["examples"] = [return_info.pop("example")]
+
+ else:
+ return_model = create_model( # type: ignore
+ f"{self.title}Reply",
+ **{return_field.name: (return_field.annotation, ...)},
+ )
+ return_model = add_example_to_model(return_model)
+ return_info = jsonref.replace_refs(
+ return_model.schema(), jsonschema=True, proxies=False
+ )
+ return_info.pop("required")
+ return_info.update(
+ {
+ "type": return_info.pop("properties", {})
+ .get(return_field.name, {})
+ .get("type"),
+ "examples": [
+ return_info.pop("example", {}).get(return_field.name)
+ ],
+ }
+ )
+
+ else:
+ return_info = None
+
+ payload_title = f"{self.title}Payload"
+ params = dependant.flat_params
+ params_number = len(params)
+
+ gen_examples: bool
+ use_original_model = False
+ if params_number == 0:
+ model = None
+
+ elif params_number == 1:
+ param = params[0]
+
+ if issubclass(param.annotation, BaseModel):
+ model = param.annotation
+ gen_examples = model.Config.schema_extra.get("example") is None
+ use_original_model = True
+
+ else:
+ model = create_model( # type: ignore
+ payload_title,
+ **{
+ param.name: (
+ param.annotation,
+ ... if param.required else param.default,
+ )
+ },
+ )
+ gen_examples = True
+
+ else:
+ model = create_model( # type: ignore
+ payload_title,
+ **{
+ p.name: (p.annotation, ... if p.required else p.default)
+ for p in params
+ },
+ )
+ gen_examples = True
+
+ body: AnyDict
+ if model is None:
+ body = {"title": payload_title, "type": "null"}
+ else:
+ if gen_examples is True:
+ model = add_example_to_model(model)
+ body = jsonref.replace_refs(model.schema(), jsonschema=True, proxies=False)
+
+ body.pop("definitions", None)
+ if return_info is not None:
+ return_info.pop("definitions", None)
+
+ if params_number == 1 and not use_original_model:
+ param_body: AnyDict = body.get("properties", {})
+ key = list(param_body.keys())[0]
+ param_body = param_body[key]
+ param_body.update(
+ {
+ "example": body.get("example", {}).get(key),
+ "title": body.get("title", param_body.get("title")),
+ }
+ )
+ param_body["example"] = body.get("example", {}).get(key)
+ body = param_body
+
+ return f"{self.title}Message", body, return_info
class ContentTypes(str, Enum):
@@ -64,9 +191,9 @@ class RawDecoced(BaseModel):
@pydantic_dataclass
-class PropanMessage:
- body: bytes
- raw_message: Any
+class PropanMessage(Generic[Msg]):
+ body: Union[bytes, Any]
+ raw_message: Msg
content_type: Optional[str] = None
reply_to: str = ""
headers: AnyDict = Field(default_factory=dict)
diff --git a/propan/brokers/_model/utils.py b/propan/brokers/_model/utils.py
index 7de4a40b..047a6f61 100644
--- a/propan/brokers/_model/utils.py
+++ b/propan/brokers/_model/utils.py
@@ -1,6 +1,6 @@
import logging
from functools import wraps
-from typing import Any, Awaitable, Callable, Optional, TypeVar, Union
+from typing import Awaitable, Callable, Optional, TypeVar, Union
from propan.brokers.push_back_watcher import (
BaseWatcher,
@@ -40,10 +40,10 @@ def get_watcher(
def suppress_decor(
- func: Callable[[Any], Awaitable[T]]
-) -> Callable[[Any, bool], Awaitable[Optional[T]]]:
+ func: Callable[[T], Awaitable[P]]
+) -> Callable[[T, bool], Awaitable[Optional[P]]]:
@wraps(func)
- async def wrapper(message: Any, reraise_exc: bool = False) -> Optional[T]:
+ async def wrapper(message: T, reraise_exc: bool = False) -> Optional[P]:
try:
return await func(message)
except Exception as e:
@@ -55,10 +55,10 @@ async def wrapper(message: Any, reraise_exc: bool = False) -> Optional[T]:
def set_message_context(
- func: Callable[[Any], Awaitable[T]]
-) -> Callable[[Any], Awaitable[T]]:
+ func: Callable[[T], Awaitable[P]]
+) -> Callable[[T], Awaitable[P]]:
@wraps(func)
- async def wrapper(message: Any) -> T:
+ async def wrapper(message: T) -> P:
with context.scope("message", message):
return await func(message)
diff --git a/propan/brokers/kafka/__init__.py b/propan/brokers/kafka/__init__.py
index 03601b41..3b727957 100644
--- a/propan/brokers/kafka/__init__.py
+++ b/propan/brokers/kafka/__init__.py
@@ -1,3 +1,3 @@
-from propan.brokers.kafka.kafka_broker import KafkaBroker
+from propan.brokers.kafka.kafka_broker import KafkaBroker, KafkaMessage
-__all__ = ("KafkaBroker",)
+__all__ = ("KafkaBroker", "KafkaMessage")
diff --git a/propan/brokers/kafka/kafka_broker.py b/propan/brokers/kafka/kafka_broker.py
index eed6e504..1edcd84f 100644
--- a/propan/brokers/kafka/kafka_broker.py
+++ b/propan/brokers/kafka/kafka_broker.py
@@ -1,12 +1,26 @@
import asyncio
import logging
from functools import partial, wraps
-from typing import Any, Callable, Dict, List, NoReturn, Optional, Sequence, Tuple, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ NoReturn,
+ Optional,
+ Sequence,
+ Tuple,
+ Union,
+)
from uuid import uuid4
from aiokafka import AIOKafkaConsumer, AIOKafkaProducer
from aiokafka.structs import ConsumerRecord
-from typing_extensions import TypeAlias, TypeVar
+from fast_depends.model import Depends
+from kafka.coordinator.assignors.abstract import AbstractPartitionAssignor
+from kafka.coordinator.assignors.roundrobin import RoundRobinPartitionAssignor
+from typing_extensions import Literal, TypeAlias, TypeVar
from propan.__about__ import __version__
from propan.brokers._model.broker_usecase import BrokerUsecase
@@ -26,11 +40,13 @@
T = TypeVar("T")
CorrelationId: TypeAlias = str
+KafkaMessage: TypeAlias = PropanMessage[ConsumerRecord]
-class KafkaBroker(BrokerUsecase):
+class KafkaBroker(
+ BrokerUsecase[ConsumerRecord, Callable[[Tuple[str, ...]], AIOKafkaConsumer]]
+):
_publisher: Optional[AIOKafkaProducer]
- _connection: Callable[[Tuple[str, ...]], AIOKafkaConsumer]
__max_topic_len: int
response_topic: str
response_callbacks: Dict[CorrelationId, "asyncio.Future[DecodedMessage]"]
@@ -42,9 +58,18 @@ def __init__(
*,
response_topic: str = "",
log_fmt: Optional[str] = None,
+ protocol: str = "kafka",
+ api_version: str = "auto",
**kwargs: AnyDict,
) -> None:
- super().__init__(bootstrap_servers, log_fmt=log_fmt, **kwargs)
+ super().__init__(
+ bootstrap_servers,
+ log_fmt=log_fmt,
+ url_=bootstrap_servers,
+ protocol=protocol,
+ protocol_version=api_version,
+ **kwargs,
+ )
self.__max_topic_len = 4
self._publisher = None
self.response_topic = response_topic
@@ -106,18 +131,76 @@ async def close(self) -> None:
def handle(
self,
*topics: str,
- _raw: bool = False,
- **kwargs: AnyDict,
+ group_id: Optional[str] = None,
+ key_deserializer: Optional[Callable[[bytes], Any]] = None,
+ value_deserializer: Optional[Callable[[bytes], Any]] = None,
+ fetch_max_wait_ms: int = 500,
+ fetch_max_bytes: int = 52428800,
+ fetch_min_bytes: int = 1,
+ max_partition_fetch_bytes: int = 1 * 1024 * 1024,
+ auto_offset_reset: Literal[
+ "latest",
+ "earliest",
+ "none",
+ ] = "latest",
+ enable_auto_commit: bool = True,
+ auto_commit_interval_ms: int = 5000,
+ check_crcs: bool = True,
+ partition_assignment_strategy: Sequence[AbstractPartitionAssignor] = (
+ RoundRobinPartitionAssignor,
+ ),
+ max_poll_interval_ms: int = 300000,
+ rebalance_timeout_ms: Optional[int] = None,
+ session_timeout_ms: int = 10000,
+ heartbeat_interval_ms: int = 3000,
+ consumer_timeout_ms: int = 200,
+ max_poll_records: Optional[int] = None,
+ exclude_internal_topics: bool = True,
+ isolation_level: Literal[
+ "read_uncommitted",
+ "read_committed",
+ ] = "read_uncommitted",
+ # broker
+ dependencies: Sequence[Depends] = (),
+ description: str = "",
+ **original_kwargs: AnyDict,
) -> Wrapper:
def wrapper(func: AnyCallable) -> DecoratedCallable:
for t in topics:
self.__max_topic_len = max((self.__max_topic_len, len(t)))
- func = self._wrap_handler(func, _raw=_raw)
+ func, dependant = self._wrap_handler(
+ func,
+ extra_dependencies=dependencies,
+ **original_kwargs,
+ )
handler = Handler(
callback=func,
topics=topics,
- consumer_kwargs=kwargs,
+ _description=description,
+ group_id=group_id,
+ consumer_kwargs={
+ "key_deserializer": key_deserializer,
+ "value_deserializer": value_deserializer,
+ "fetch_max_wait_ms": fetch_max_wait_ms,
+ "fetch_max_bytes": fetch_max_bytes,
+ "fetch_min_bytes": fetch_min_bytes,
+ "max_partition_fetch_bytes": max_partition_fetch_bytes,
+ "auto_offset_reset": auto_offset_reset,
+ "enable_auto_commit": enable_auto_commit,
+ "auto_commit_interval_ms": auto_commit_interval_ms,
+ "check_crcs": check_crcs,
+ "partition_assignment_strategy": partition_assignment_strategy,
+ "max_poll_interval_ms": max_poll_interval_ms,
+ "rebalance_timeout_ms": rebalance_timeout_ms,
+ "session_timeout_ms": session_timeout_ms,
+ "heartbeat_interval_ms": heartbeat_interval_ms,
+ "consumer_timeout_ms": consumer_timeout_ms,
+ "max_poll_records": max_poll_records,
+ "exclude_internal_topics": exclude_internal_topics,
+ "isolation_level": isolation_level,
+ },
+ dependant=dependant,
)
self.handlers.append(handler)
@@ -144,13 +227,17 @@ async def start(self) -> None:
c = self._get_log_context(None, handler.topics)
self._log(f"`{handler.callback.__name__}` waiting for messages", extra=c)
- consumer = self._connection(*handler.topics, **handler.consumer_kwargs)
+ consumer = self._connection(
+ *handler.topics,
+ group_id=handler.group_id,
+ **handler.consumer_kwargs,
+ )
await consumer.start()
handler.consumer = consumer
handler.task = asyncio.create_task(self._consume(handler))
@staticmethod
- async def _parse_message(message: ConsumerRecord) -> PropanMessage:
+ async def _parse_message(message: ConsumerRecord) -> KafkaMessage:
headers = {i: j.decode() for i, j in message.headers}
return PropanMessage(
body=message.value,
@@ -162,10 +249,12 @@ async def _parse_message(message: ConsumerRecord) -> PropanMessage:
)
def _process_message(
- self, func: Callable[[PropanMessage], T], watcher: Optional[BaseWatcher]
- ) -> Callable[[PropanMessage], T]:
+ self,
+ func: Callable[[KafkaMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher],
+ ) -> Callable[[KafkaMessage], Awaitable[T]]:
@wraps(func)
- async def wrapper(message: PropanMessage) -> T:
+ async def wrapper(message: KafkaMessage) -> T:
r = await func(message)
if message.reply_to:
@@ -255,7 +344,7 @@ def fmt(self) -> str:
def _get_log_context(
self,
- message: Optional[PropanMessage],
+ message: Optional[KafkaMessage],
topics: Sequence[str] = (),
) -> Dict[str, Any]:
if topics:
@@ -273,15 +362,25 @@ def _get_log_context(
async def _consume(self, handler: Handler) -> NoReturn:
c = self._get_log_context(None, handler.topics)
+ connected = True
while True:
try:
msg = await handler.consumer.getone()
+
except Exception as e:
- self._log(e, logging.WATNING, c)
+ if connected is True:
+ self._log(e, logging.WARNING, c)
+ connected = False
+ await asyncio.sleep(5)
+
else:
+ if connected is False:
+ self._log("Connection established", logging.INFO, c)
+ connected = True
+
await handler.callback(msg)
- async def _consume_response(self, message: PropanMessage):
+ async def _consume_response(self, message: KafkaMessage):
correlation_id = message.headers.get("correlation_id")
if correlation_id is not None:
callback = self.response_callbacks.pop(correlation_id, None)
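
A sketch of a consumer using the expanded `handle()` signature; the bootstrap server, topic, and group id are placeholders:

```python
# Illustrative only: a Kafka subscriber with the new group_id and description parameters.
from propan.brokers.kafka import KafkaBroker

broker = KafkaBroker("localhost:9092")


@broker.handle("demo-topic", group_id="demo-group", description="hypothetical topic handler")
async def handler(body: dict):
    ...
```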
diff --git a/propan/brokers/kafka/kafka_broker.pyi b/propan/brokers/kafka/kafka_broker.pyi
index 4068aa44..0d8c37ce 100644
--- a/propan/brokers/kafka/kafka_broker.pyi
+++ b/propan/brokers/kafka/kafka_broker.pyi
@@ -1,32 +1,49 @@
import logging
from asyncio import AbstractEventLoop, Future
from ssl import SSLContext
-from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ Tuple,
+ Union,
+)
from aiokafka import AIOKafkaConsumer, AIOKafkaProducer
from aiokafka.abc import AbstractTokenProvider
from aiokafka.producer.producer import _missing
from aiokafka.structs import ConsumerRecord
+from fast_depends.model import Depends
from kafka.coordinator.assignors.abstract import AbstractPartitionAssignor
from kafka.coordinator.assignors.roundrobin import RoundRobinPartitionAssignor
from kafka.partitioner.default import DefaultPartitioner
from typing_extensions import Literal, TypeAlias, TypeVar
from propan.__about__ import __version__
-from propan.brokers._model.broker_usecase import BrokerUsecase
+from propan.brokers._model.broker_usecase import (
+ BrokerUsecase,
+ CustomDecoder,
+ CustomParser,
+)
from propan.brokers._model.schemas import PropanMessage
from propan.brokers.kafka.schemas import Handler
from propan.brokers.push_back_watcher import BaseWatcher
from propan.log import access_logger
-from propan.types import DecodedMessage, SendableMessage, Wrapper
+from propan.types import DecodedMessage, HandlerWrapper, SendableMessage
T = TypeVar("T")
Partition = TypeVar("Partition")
CorrelationId: TypeAlias = str
+KafkaMessage: TypeAlias = PropanMessage[ConsumerRecord]
-class KafkaBroker(BrokerUsecase):
+class KafkaBroker(
+ BrokerUsecase[ConsumerRecord, Callable[[Tuple[str, ...]], AIOKafkaConsumer]]
+):
_publisher: Optional[AIOKafkaProducer]
- _connection: Callable[[Tuple[str, ...]], AIOKafkaConsumer]
__max_topic_len: int
response_topic: str
response_callbacks: Dict[CorrelationId, "Future[DecodedMessage]"]
@@ -80,9 +97,13 @@ class KafkaBroker(BrokerUsecase):
loop: Optional[AbstractEventLoop] = None,
# broker
logger: Optional[logging.Logger] = access_logger,
+ decode_message: CustomDecoder[ConsumerRecord] = None,
+ parse_message: CustomParser[ConsumerRecord] = None,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ dependencies: Sequence[Depends] = (),
+ protocol: str = "kafka",
) -> None: ...
async def connect(
self,
@@ -165,13 +186,15 @@ class KafkaBroker(BrokerUsecase):
"read_committed",
] = "read_uncommitted",
retry: Union[bool, int] = False,
- ) -> Wrapper: ...
- async def start(self) -> None: ...
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[ConsumerRecord] = None,
+ parse_message: CustomParser[ConsumerRecord] = None,
+ description: str = "",
+ ) -> HandlerWrapper: ...
@staticmethod
- async def _parse_message(message: ConsumerRecord) -> PropanMessage: ...
- def _process_message(
- self, func: Callable[[PropanMessage], T], watcher: Optional[BaseWatcher]
- ) -> Callable[[PropanMessage], T]: ...
+ async def _parse_message(
+ message: ConsumerRecord,
+ ) -> KafkaMessage: ...
async def publish( # type: ignore[override]
self,
message: SendableMessage,
@@ -186,10 +209,13 @@ class KafkaBroker(BrokerUsecase):
callback_timeout: Optional[float] = None,
raise_timeout: bool = False,
) -> Optional[DecodedMessage]: ...
- @property
- def fmt(self) -> str: ...
def _get_log_context( # type: ignore[override]
self,
- message: Optional[PropanMessage],
+ message: Optional[KafkaMessage],
topics: Sequence[str] = (),
) -> Dict[str, Any]: ...
+ def _process_message(
+ self,
+ func: Callable[[KafkaMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher],
+ ) -> Callable[[KafkaMessage], Awaitable[T]]: ...
diff --git a/propan/brokers/kafka/schemas.py b/propan/brokers/kafka/schemas.py
index 3ba1c215..31149e35 100644
--- a/propan/brokers/kafka/schemas.py
+++ b/propan/brokers/kafka/schemas.py
@@ -1,9 +1,17 @@
import asyncio
from dataclasses import dataclass, field
-from typing import Any, List, Optional
+from typing import Any, Dict, List, Optional
from aiokafka import AIOKafkaConsumer
+from propan.asyncapi.bindings import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+ kafka,
+)
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.message import AsyncAPICorrelationId, AsyncAPIMessage
+from propan.asyncapi.subscription import AsyncAPISubscription
from propan.brokers._model.schemas import BaseHandler
from propan.types import AnyDict
@@ -11,7 +19,47 @@
@dataclass
class Handler(BaseHandler):
topics: List[str]
+ group_id: Optional[str] = None
consumer: Optional[AIOKafkaConsumer] = None
task: Optional["asyncio.Task[Any]"] = None
consumer_kwargs: AnyDict = field(default_factory=dict)
+
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ message_title, body, reply_to = self.get_message_object()
+
+ if reply_to:
+ kafka_kwargs = {"replyTo": reply_to}
+ else:
+ kafka_kwargs = {}
+
+ if self.group_id is not None:
+ kafka_kwargs["groupId"] = {"type": "string", "enum": [self.group_id]}
+
+ if kafka_kwargs:
+ operation_binding = AsyncAPIOperationBinding(
+ kafka=kafka.AsyncAPIKafkaOperationBinding(**kafka_kwargs) # type: ignore
+ )
+ else:
+ operation_binding = None
+
+ return {
+ self.title: AsyncAPIChannel(
+ subscribe=AsyncAPISubscription(
+ description=self.description,
+ bindings=operation_binding,
+ message=AsyncAPIMessage(
+ title=message_title,
+ payload=body,
+ correlationId=AsyncAPICorrelationId(
+ location="$message.header#/correlation_id"
+ ),
+ ),
+ ),
+ bindings=AsyncAPIChannelBinding(
+ kafka=kafka.AsyncAPIKafkaChannelBinding(
+ topic=self.topics,
+ )
+ ),
+ ),
+ }
diff --git a/propan/brokers/nats/__init__.py b/propan/brokers/nats/__init__.py
index ce45a20f..c07c8a7b 100644
--- a/propan/brokers/nats/__init__.py
+++ b/propan/brokers/nats/__init__.py
@@ -1,3 +1,3 @@
-from propan.brokers.nats.nats_broker import NatsBroker
+from propan.brokers.nats.nats_broker import NatsBroker, NatsMessage
-__all__ = ("NatsBroker",)
+__all__ = ("NatsBroker", "NatsMessage")
diff --git a/propan/brokers/nats/nats_broker.py b/propan/brokers/nats/nats_broker.py
index ed65b413..a12306ca 100644
--- a/propan/brokers/nats/nats_broker.py
+++ b/propan/brokers/nats/nats_broker.py
@@ -2,11 +2,23 @@
import logging
from functools import wraps
from secrets import token_hex
-from typing import Any, Callable, Dict, List, Optional, TypeVar, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ TypeVar,
+ Union,
+)
import nats
+from fast_depends.model import Depends
from nats.aio.client import Callback, Client, ErrorCallback
from nats.aio.msg import Msg
+from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
from propan.brokers._model.schemas import PropanMessage
@@ -16,11 +28,11 @@
from propan.utils import context
T = TypeVar("T")
+NatsMessage: TypeAlias = PropanMessage[Msg]
-class NatsBroker(BrokerUsecase):
+class NatsBroker(BrokerUsecase[Msg, Client]):
handlers: List[Handler]
- _connection: Optional[Client]
__max_queue_len: int
__max_subject_len: int
@@ -31,9 +43,12 @@ def __init__(
servers: Union[str, List[str]] = ["nats://localhost:4222"], # noqa: B006
*,
log_fmt: Optional[str] = None,
+ protocol: str = "nats",
**kwargs: AnyDict,
) -> None:
- super().__init__(servers, log_fmt=log_fmt, **kwargs)
+ super().__init__(
+ servers, log_fmt=log_fmt, url_=servers, protocol=protocol, **kwargs
+ )
self._connection = None
@@ -62,21 +77,28 @@ def handle(
subject: str,
queue: str = "",
*,
- retry: Union[bool, int] = False,
- _raw: bool = False,
+ dependencies: Sequence[Depends] = (),
+ description: str = "",
+ **original_kwargs: AnyDict,
) -> Callable[[DecoratedCallable], None]:
self.__max_subject_len = max((self.__max_subject_len, len(subject)))
self.__max_queue_len = max((self.__max_queue_len, len(queue)))
def wrapper(func: DecoratedCallable) -> None:
- func = self._wrap_handler(
+ func, dependant = self._wrap_handler(
func,
queue=queue,
subject=subject,
- retry=retry,
- _raw=_raw,
+ extra_dependencies=dependencies,
+ **original_kwargs,
+ )
+ handler = Handler(
+ callback=func,
+ subject=subject,
+ queue=queue,
+ _description=description,
+ dependant=dependant,
)
- handler = Handler(callback=func, subject=subject, queue=queue)
self.handlers.append(handler)
return func
@@ -172,7 +194,7 @@ async def close(self) -> None:
def _get_log_context(
self,
- message: Optional[PropanMessage],
+ message: Optional[NatsMessage],
subject: str,
queue: str = "",
) -> Dict[str, Any]:
@@ -193,7 +215,7 @@ def fmt(self) -> str:
"- %(message)s"
)
- async def _parse_message(self, message: Msg) -> PropanMessage:
+ async def _parse_message(self, message: Msg) -> NatsMessage:
return PropanMessage(
body=message.data,
content_type=message.header.get("content-type", ""),
@@ -203,10 +225,12 @@ async def _parse_message(self, message: Msg) -> PropanMessage:
)
def _process_message(
- self, func: Callable[[PropanMessage], T], watcher: Optional[BaseWatcher] = None
- ) -> Callable[[PropanMessage], T]:
+ self,
+ func: Callable[[NatsMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher] = None,
+ ) -> Callable[[NatsMessage], Awaitable[T]]:
@wraps(func)
- async def wrapper(message: PropanMessage) -> T:
+ async def wrapper(message: NatsMessage) -> T:
r = await func(message)
if message.reply_to:
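
A sketch of a NATS subscriber using the new `description` argument; the subject and queue names are placeholders:

```python
# Illustrative only: a NATS subscriber whose description ends up in the generated AsyncAPI channel.
from propan.brokers.nats import NatsBroker

broker = NatsBroker("nats://localhost:4222")


@broker.handle("demo.subject", queue="demo-workers", description="hypothetical subject handler")
async def handler(body: str):
    ...
```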
diff --git a/propan/brokers/nats/nats_broker.pyi b/propan/brokers/nats/nats_broker.pyi
index 318e0c08..3a729910 100644
--- a/propan/brokers/nats/nats_broker.pyi
+++ b/propan/brokers/nats/nats_broker.pyi
@@ -1,7 +1,18 @@
import logging
import ssl
-from typing import Any, Callable, Dict, List, Optional, TypeVar, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ TypeVar,
+ Union,
+)
+from fast_depends.model import Depends
from nats.aio.client import (
DEFAULT_CONNECT_TIMEOUT,
DEFAULT_DRAIN_TIMEOUT,
@@ -20,8 +31,10 @@ from nats.aio.client import (
SignatureCallback,
)
from nats.aio.msg import Msg
+from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers._model.schemas import PropanMessage
from propan.brokers.nats.schemas import Handler
from propan.brokers.push_back_watcher import BaseWatcher
@@ -29,11 +42,11 @@ from propan.log import access_logger
from propan.types import DecodedMessage, HandlerWrapper, SendableMessage
T = TypeVar("T")
+NatsMessage: TypeAlias = PropanMessage[Msg]
-class NatsBroker(BrokerUsecase):
+class NatsBroker(BrokerUsecase[Msg, Client]):
logger: logging.Logger
handlers: List[Handler]
- _connection: Optional[Client]
def __init__(
self,
@@ -73,6 +86,10 @@ class NatsBroker(BrokerUsecase):
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[Msg] = None,
+ parse_message: CustomParser[Msg] = None,
+ protocol: str = "nats",
) -> None: ...
async def connect(
self,
@@ -126,19 +143,29 @@ class NatsBroker(BrokerUsecase):
queue: str = "",
*,
retry: Union[bool, int] = False,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[Msg] = None,
+ parse_message: CustomParser[Msg] = None,
+ description: str = "",
) -> HandlerWrapper: ...
- async def _connect(self, *args: Any, **kwargs: Any) -> Client: ...
- async def close(self) -> None: ...
def _get_log_context( # type: ignore[override]
self,
- message: Optional[PropanMessage],
+ message: Optional[NatsMessage],
subject: str,
queue: str = "",
) -> Dict[str, Any]: ...
+ async def _connect(
+ self,
+ *,
+ url: Optional[str] = None,
+ error_cb: Optional[ErrorCallback] = None,
+ reconnected_cb: Optional[Callback] = None,
+ **kwargs: Any,
+ ) -> Client: ...
+ async def close(self) -> None: ...
+ async def _parse_message(self, message: Msg) -> NatsMessage: ...
def _process_message(
self,
- func: Callable[[PropanMessage], T],
- watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]: ...
- @staticmethod
- async def _parse_message(message: Msg) -> PropanMessage: ...
+ func: Callable[[NatsMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher] = None,
+ ) -> Callable[[NatsMessage], Awaitable[T]]: ...
diff --git a/propan/brokers/nats/nats_js_broker.py b/propan/brokers/nats/nats_js_broker.py
index 23b7efa6..7c261ce1 100644
--- a/propan/brokers/nats/nats_js_broker.py
+++ b/propan/brokers/nats/nats_js_broker.py
@@ -1,16 +1,18 @@
# TODO: remove mypy ignore at complete
# type: ignore
from functools import wraps
-from typing import Any, Optional
+from typing import Any, Awaitable, Callable, Optional, TypeVar
import nats
-from nats.aio.msg import Msg
from nats.js.client import JetStreamContext
+from propan.brokers._model.schemas import PropanMessage
from propan.brokers.nats.nats_broker import NatsBroker
from propan.brokers.nats.schemas import JetStream
from propan.brokers.push_back_watcher import BaseWatcher, WatcherContext
-from propan.types import AnyDict, DecoratedCallable
+from propan.types import AnyDict
+
+T = TypeVar("T")
class NatsJSBroker(NatsBroker):
@@ -34,19 +36,20 @@ async def _connect(self, *args: Any, **kwargs: AnyDict) -> JetStreamContext:
@staticmethod
def _process_message(
- func: DecoratedCallable, watcher: Optional[BaseWatcher] = None
- ) -> DecoratedCallable:
+ func: Callable[[PropanMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher] = None,
+ ) -> Callable[[PropanMessage], Awaitable[T]]:
@wraps(func)
- async def wrapper(message: Msg) -> Any:
+ async def wrapper(message: PropanMessage) -> T:
if watcher is None:
return await func(message)
else:
async with WatcherContext(
watcher,
message.message_id,
- on_success=message.ack,
- on_error=message.nak,
- on_max=message.term,
+ on_success=message.raw_message.ack,
+ on_error=message.raw_message.nak,
+ on_max=message.raw_message.term,
):
await message.in_progress()
return await func(message)
diff --git a/propan/brokers/nats/schemas.py b/propan/brokers/nats/schemas.py
index 3c3441e6..71175080 100644
--- a/propan/brokers/nats/schemas.py
+++ b/propan/brokers/nats/schemas.py
@@ -1,10 +1,18 @@
from dataclasses import dataclass
-from typing import Optional, Sequence
+from typing import Dict, Optional, Sequence
from nats.aio.subscription import Subscription
from nats.js.api import DEFAULT_PREFIX
from pydantic import BaseModel
+from propan.asyncapi.bindings import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+ nats,
+)
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.message import AsyncAPIMessage
+from propan.asyncapi.subscription import AsyncAPISubscription
from propan.brokers._model.schemas import BaseHandler
@@ -15,6 +23,32 @@ class Handler(BaseHandler):
subscription: Optional[Subscription] = None
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ message_title, body, reply_to = self.get_message_object()
+
+ return {
+ self.title: AsyncAPIChannel(
+ subscribe=AsyncAPISubscription(
+ description=self.description,
+ bindings=AsyncAPIOperationBinding(
+ nats=nats.AsyncAPINatsOperationBinding(
+ replyTo=reply_to,
+ ),
+ ),
+ message=AsyncAPIMessage(
+ title=message_title,
+ payload=body,
+ ),
+ ),
+ bindings=AsyncAPIChannelBinding(
+ nats=nats.AsyncAPINatsChannelBinding(
+ subject=self.subject,
+ queue=self.queue or None,
+ )
+ ),
+ ),
+ }
+
class JetStream(BaseModel):
prefix: str = DEFAULT_PREFIX
diff --git a/propan/brokers/push_back_watcher.py b/propan/brokers/push_back_watcher.py
index f65bdf90..b5021a26 100644
--- a/propan/brokers/push_back_watcher.py
+++ b/propan/brokers/push_back_watcher.py
@@ -112,7 +112,7 @@ async def __aexit__(
await call_or_await(self.on_success)
self.watcher.remove(self._message_id)
- elif isinstance(exc_val, SkipMessage) is True:
+ elif isinstance(exc_val, SkipMessage):
self.watcher.remove(self._message_id)
elif self.watcher.is_max(self._message_id):
diff --git a/propan/brokers/rabbit/__init__.py b/propan/brokers/rabbit/__init__.py
index e6219987..aaa2a391 100644
--- a/propan/brokers/rabbit/__init__.py
+++ b/propan/brokers/rabbit/__init__.py
@@ -1,4 +1,4 @@
-from propan.brokers.rabbit.rabbit_broker import RabbitBroker
+from propan.brokers.rabbit.rabbit_broker import RabbitBroker, RabbitMessage
from propan.brokers.rabbit.schemas import ExchangeType, RabbitExchange, RabbitQueue
__all__ = (
@@ -6,4 +6,5 @@
"RabbitQueue",
"RabbitExchange",
"ExchangeType",
+ "RabbitMessage",
)
diff --git a/propan/brokers/rabbit/rabbit_broker.py b/propan/brokers/rabbit/rabbit_broker.py
index 7857b2dd..7146e81c 100644
--- a/propan/brokers/rabbit/rabbit_broker.py
+++ b/propan/brokers/rabbit/rabbit_broker.py
@@ -1,12 +1,25 @@
import asyncio
import warnings
from functools import wraps
-from typing import Any, Callable, Dict, List, Optional, Type, TypeVar, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ Type,
+ TypeVar,
+ Union,
+)
from uuid import uuid4
import aio_pika
import aiormq
from aio_pika.abc import DeliveryMode
+from fast_depends.model import Depends
+from typing_extensions import TypeAlias
from yarl import URL
from propan.brokers._model import BrokerUsecase
@@ -17,7 +30,8 @@
from propan.utils import context
TimeoutType = Optional[Union[int, float]]
-PikaSendableMessage = Union[aio_pika.message.Message, SendableMessage]
+PikaSendableMessage: TypeAlias = Union[aio_pika.message.Message, SendableMessage]
+RabbitMessage: TypeAlias = PropanMessage[aio_pika.message.IncomingMessage]
T = TypeVar("T")
@@ -37,9 +51,18 @@ def __init__(
*,
log_fmt: Optional[str] = None,
consumers: Optional[int] = None,
+ protocol: str = "amqp",
+ protocol_version: str = "0.9.1",
**kwargs: AnyDict,
) -> None:
- super().__init__(url, log_fmt=log_fmt, **kwargs)
+ super().__init__(
+ url,
+ log_fmt=log_fmt,
+ url_=url or "amqp://guest:guest@localhost:5672/",
+ protocol=protocol,
+ protocol_version=protocol_version,
+ **kwargs,
+ )
self._max_consumers = consumers
self._channel = None
@@ -85,22 +108,29 @@ def handle(
queue: Union[str, RabbitQueue],
exchange: Union[str, RabbitExchange, None] = None,
*,
- retry: Union[bool, int] = False,
- _raw: bool = False,
+ dependencies: Sequence[Depends] = (),
+ description: str = "",
+ **original_kwargs: AnyDict,
) -> HandlerWrapper:
queue, exchange = _validate_queue(queue), _validate_exchange(exchange)
self.__setup_log_context(queue, exchange)
def wrapper(func: DecoratedCallable) -> Any:
- func = self._wrap_handler(
+ func, dependant = self._wrap_handler(
func,
queue=queue,
exchange=exchange,
- retry=retry,
- _raw=_raw,
+ extra_dependencies=dependencies,
+ **original_kwargs,
+ )
+ handler = Handler(
+ callback=func,
+ queue=queue,
+ exchange=exchange,
+ _description=description,
+ dependant=dependant,
)
- handler = Handler(callback=func, queue=queue, exchange=exchange)
self.handlers.append(handler)
return func
@@ -232,7 +262,7 @@ async def declare_exchange(
def _get_log_context(
self,
- message: Optional[PropanMessage],
+ message: Optional[RabbitMessage],
queue: RabbitQueue,
exchange: Optional[RabbitExchange] = None,
) -> Dict[str, Any]:
@@ -256,8 +286,8 @@ def fmt(self) -> str:
@staticmethod
async def _parse_message(
message: aio_pika.message.IncomingMessage,
- ) -> PropanMessage:
- return PropanMessage(
+ ) -> RabbitMessage:
+ return RabbitMessage(
body=message.body,
headers=message.headers,
reply_to=message.reply_to or "",
@@ -267,10 +297,12 @@ async def _parse_message(
)
def _process_message(
- self, func: Callable[[PropanMessage], T], watcher: Optional[BaseWatcher]
- ) -> Callable[[PropanMessage], T]:
+ self,
+ func: Callable[[RabbitMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher],
+ ) -> Callable[[RabbitMessage], Awaitable[T]]:
@wraps(func)
- async def wrapper(message: PropanMessage) -> T:
+ async def wrapper(message: RabbitMessage) -> T:
pika_message = message.raw_message
if watcher is None:
context = pika_message.process()
@@ -345,7 +377,7 @@ async def _init_queue(
) -> aio_pika.abc.AbstractRobustQueue:
warnings.warn(
"The `_init_queue` method is deprecated, " # noqa: E501
- "and will be removed in version 1.3.0. " # noqa: E501
+ "and will be removed in version 1.4.0. " # noqa: E501
"Use `declare_queue` instead.", # noqa: E501
category=DeprecationWarning,
stacklevel=1,
@@ -362,7 +394,7 @@ async def _init_exchange(
) -> aio_pika.abc.AbstractRobustExchange:
warnings.warn(
"The `_init_exchange` method is deprecated, " # noqa: E501
- "and will be removed in version 1.3.0. " # noqa: E501
+ "and will be removed in version 1.4.0. " # noqa: E501
"Use `declare_exchange` instead.", # noqa: E501
category=DeprecationWarning,
stacklevel=1,
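
A sketch of a RabbitMQ subscriber passing a per-handler decoder and an AsyncAPI description; the queue name and decoder body are placeholders:

```python
# Illustrative only: per-handler decode_message hook plus an AsyncAPI channel description.
from propan.brokers.rabbit import RabbitBroker, RabbitMessage

broker = RabbitBroker("amqp://guest:guest@localhost:5672/")


async def decode(msg: RabbitMessage, default_decoder):
    # hypothetical decoder: fall back to the built-in content-type based decoding
    return await default_decoder(msg)


@broker.handle("demo-queue", description="hypothetical queue handler", decode_message=decode)
async def handler(body: dict):
    ...
```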
diff --git a/propan/brokers/rabbit/rabbit_broker.pyi b/propan/brokers/rabbit/rabbit_broker.pyi
index 8140ba7b..ea20dfe5 100644
--- a/propan/brokers/rabbit/rabbit_broker.pyi
+++ b/propan/brokers/rabbit/rabbit_broker.pyi
@@ -1,14 +1,28 @@
import logging
from ssl import SSLContext
-from typing import Any, Callable, Coroutine, Dict, List, Optional, Type, TypeVar, Union
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ Type,
+ TypeVar,
+ Union,
+)
import aio_pika
import aiormq
+from aio_pika.message import IncomingMessage
+from fast_depends.model import Depends
from pamqp.common import FieldTable
-from typing_extensions import ParamSpec
+from typing_extensions import ParamSpec, TypeAlias
from yarl import URL
from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher
from propan.brokers.rabbit.schemas import Handler, RabbitExchange, RabbitQueue
@@ -17,11 +31,11 @@ from propan.types import DecodedMessage, SendableMessage
P = ParamSpec("P")
T = TypeVar("T")
-PikaSendableMessage = Union[aio_pika.message.Message, SendableMessage]
+PikaSendableMessage: TypeAlias = Union[aio_pika.message.Message, SendableMessage]
+RabbitMessage: TypeAlias = PropanMessage[IncomingMessage]
-class RabbitBroker(BrokerUsecase):
+class RabbitBroker(BrokerUsecase[IncomingMessage, aio_pika.RobustConnection]):
handlers: List[Handler]
- _connection: Optional[aio_pika.RobustConnection]
_channel: Optional[aio_pika.RobustChannel]
__max_queue_len: int
@@ -41,11 +55,18 @@ class RabbitBroker(BrokerUsecase):
ssl_context: Optional[SSLContext] = None,
timeout: aio_pika.abc.TimeoutType = None,
client_properties: Optional[FieldTable] = None,
+ # broker
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
consumers: Optional[int] = None,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[IncomingMessage] = None,
+ parse_message: CustomParser[IncomingMessage] = None,
+ # AsyncAPI
+ protocol: str = "amqp",
+ protocol_version: str = "0.9.1",
) -> None:
"""RabbitMQ Propan broker
@@ -70,6 +91,9 @@ class RabbitBroker(BrokerUsecase):
log_fmt: custom log formatting string
            apply_types: wrap broker handlers with the FastDepends decorator
            consumers: max messages to process at the same time
+            dependencies: dependencies applied to all broker handlers
+ decode_message: custom RabbitMessage decoder
+ parse_message: custom IncomingMessage to RabbitMessage parser
.. _RFC3986: https://goo.gl/MzgYAs
.. _official Python documentation: https://goo.gl/pty9xA
@@ -88,7 +112,7 @@ class RabbitBroker(BrokerUsecase):
ssl_context: Optional[SSLContext] = None,
timeout: aio_pika.abc.TimeoutType = None,
client_properties: Optional[FieldTable] = None,
- ) -> aio_pika.Connection:
+ ) -> aio_pika.RobustConnection:
"""Connect to RabbitMQ
        URL string might contain ssl parameters, e.g.
@@ -185,10 +209,16 @@ class RabbitBroker(BrokerUsecase):
exchange: Union[str, RabbitExchange, None] = None,
*,
retry: Union[bool, int] = False,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[IncomingMessage] = None,
+ parse_message: CustomParser[IncomingMessage] = None,
+ # AsyncAPI
+ description: str = "",
) -> Callable[
[
- Callable[
- P, Union[PikaSendableMessage, Coroutine[Any, Any, PikaSendableMessage]]
+ Union[
+ Callable[P, PikaSendableMessage],
+ Callable[P, Awaitable[PikaSendableMessage]],
]
],
Callable[P, PikaSendableMessage],
@@ -199,6 +229,10 @@ class RabbitBroker(BrokerUsecase):
queue: queue to consume messages
exchange: exchange to bind queue
            retry: on a handler exception, the message will be returned to the queue `int` times, or endlessly if `True`
+ dependencies: wrap handler dependencies
+ decode_message: custom RabbitMessage decoder
+ parse_message: custom IncomingMessage to RabbitMessage parser
+ description: AsyncAPI channel object description
Returns:
Async or sync function decorator
@@ -217,11 +251,13 @@ class RabbitBroker(BrokerUsecase):
def channel(self) -> aio_pika.RobustChannel:
"""Access to brokers' aio-pika channel object"""
def _process_message(
- self, func: Callable[[PropanMessage], T], watcher: Optional[BaseWatcher]
- ) -> Callable[[PropanMessage], T]: ...
+ self,
+ func: Callable[[RabbitMessage], Awaitable[T]],
+ watcher: Optional[BaseWatcher],
+ ) -> Callable[[RabbitMessage], Awaitable[T]]: ...
def _get_log_context( # type: ignore[override]
self,
- message: Optional[PropanMessage],
+ message: Optional[RabbitMessage],
queue: RabbitQueue,
exchange: Optional[RabbitExchange] = None,
) -> Dict[str, Any]: ...
@@ -238,8 +274,8 @@ class RabbitBroker(BrokerUsecase):
) -> aio_pika.Message: ...
@staticmethod
async def _parse_message(
- message: aio_pika.message.IncomingMessage,
- ) -> PropanMessage: ...
+ message: IncomingMessage,
+ ) -> RabbitMessage: ...
async def _connect(
self,
*args: Any,
diff --git a/propan/brokers/rabbit/schemas.py b/propan/brokers/rabbit/schemas.py
index fd0d6d87..9d2ce948 100644
--- a/propan/brokers/rabbit/schemas.py
+++ b/propan/brokers/rabbit/schemas.py
@@ -1,9 +1,17 @@
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from typing import Any, Dict, Optional
from aio_pika.abc import ExchangeType, TimeoutType
from pydantic import Field
+from propan.asyncapi.bindings import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+ amqp,
+)
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.message import AsyncAPICorrelationId, AsyncAPIMessage
+from propan.asyncapi.subscription import AsyncAPISubscription
from propan.brokers._model.schemas import BaseHandler, NameRequired, Queue
__all__ = (
@@ -28,12 +36,13 @@ class RabbitQueue(Queue):
routing_key: str = Field(default="", exclude=True)
def __hash__(self) -> int:
- return (
- hash(self.name)
- + int(self.durable)
- + int(self.passive)
- + int(self.exclusive)
- + int(self.auto_delete)
+ return sum(
+ (
+ hash(self.name),
+ int(self.durable),
+ int(self.exclusive),
+ int(self.auto_delete),
+ )
)
@property
@@ -82,12 +91,13 @@ class RabbitExchange(NameRequired):
routing_key: str = Field(default="", exclude=True)
def __hash__(self) -> int:
- return (
- hash(self.name)
- + hash(self.type.value)
- + int(self.durable)
- + int(self.passive)
- + int(self.auto_delete)
+ return sum(
+ (
+ hash(self.name),
+ hash(self.type.value),
+ int(self.durable),
+ int(self.auto_delete),
+ )
)
def __init__(
@@ -124,4 +134,61 @@ def __init__(
@dataclass
class Handler(BaseHandler):
queue: RabbitQueue
- exchange: Optional[RabbitExchange] = None
+ exchange: Optional[RabbitExchange] = field(default=None, kw_only=True) # type: ignore
+
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ message_title, body, reply_to = self.get_message_object()
+
+ return {
+ self.title: AsyncAPIChannel(
+ subscribe=AsyncAPISubscription(
+ description=self.description,
+ bindings=AsyncAPIOperationBinding(
+ amqp=amqp.AsyncAPIAmqpOperationBinding(
+ cc=None
+ if (
+ self.exchange
+ and self.exchange.type
+ in (ExchangeType.FANOUT, ExchangeType.HEADERS)
+ )
+ else self.queue.name,
+ replyTo=reply_to,
+ ),
+ ),
+ message=AsyncAPIMessage(
+ title=message_title,
+ payload=body,
+ correlationId=AsyncAPICorrelationId(
+ location="$message.header#/correlation_id"
+ ),
+ ),
+ ),
+ bindings=AsyncAPIChannelBinding(
+ amqp=amqp.AsyncAPIAmqpChannelBinding(
+ is_="routingKey", # type: ignore
+ queue=None
+ if (
+ self.exchange
+ and self.exchange.type
+ in (ExchangeType.FANOUT, ExchangeType.HEADERS)
+ )
+ else amqp.AsyncAPIAmqpQueue(
+ name=self.queue.name,
+ durable=self.queue.durable,
+ exclusive=self.queue.exclusive,
+ autoDelete=self.queue.auto_delete,
+ ),
+ exchange=(
+ amqp.AsyncAPIAmqpExchange(type="default")
+ if self.exchange is None
+ else amqp.AsyncAPIAmqpExchange(
+ type=self.exchange.type.value, # type: ignore
+ name=self.exchange.name,
+ durable=self.exchange.durable,
+ autoDelete=self.exchange.auto_delete,
+ )
+ ),
+ )
+ ),
+ ),
+ }
diff --git a/propan/brokers/redis/__init__.py b/propan/brokers/redis/__init__.py
index 519811df..afc01254 100644
--- a/propan/brokers/redis/__init__.py
+++ b/propan/brokers/redis/__init__.py
@@ -1,3 +1,3 @@
-from propan.brokers.redis.redis_broker import RedisBroker
+from propan.brokers.redis.redis_broker import RedisBroker, RedisMessage
-__all__ = ("RedisBroker",)
+__all__ = ("RedisBroker", "RedisMessage")
diff --git a/propan/brokers/redis/redis_broker.py b/propan/brokers/redis/redis_broker.py
index 2b2859a2..4ea4c005 100644
--- a/propan/brokers/redis/redis_broker.py
+++ b/propan/brokers/redis/redis_broker.py
@@ -1,18 +1,32 @@
import asyncio
import logging
from functools import wraps
-from typing import Any, Callable, Dict, List, NoReturn, Optional, TypeVar
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ NoReturn,
+ Optional,
+ Sequence,
+ TypeVar,
+)
from uuid import uuid4
+from fast_depends.model import Depends
from redis.asyncio.client import PubSub, Redis
from redis.asyncio.connection import ConnectionPool, parse_url
+from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
from propan.brokers._model.schemas import PropanMessage, RawDecoced
from propan.brokers.push_back_watcher import BaseWatcher
-from propan.brokers.redis.schemas import Handler, RedisMessage
+from propan.brokers.redis.schemas import Handler
+from propan.brokers.redis.schemas import RedisMessage as RM
from propan.types import (
AnyCallable,
+ AnyDict,
DecodedMessage,
DecoratedCallable,
HandlerWrapper,
@@ -21,6 +35,7 @@
from propan.utils import context
T = TypeVar("T")
+RedisMessage: TypeAlias = PropanMessage[AnyDict]
class RedisBroker(BrokerUsecase):
@@ -35,9 +50,16 @@ def __init__(
*,
polling_interval: float = 1.0,
log_fmt: Optional[str] = None,
+ protocol: str = "redis",
**kwargs: Any,
) -> None:
- super().__init__(url=url, log_fmt=log_fmt, **kwargs)
+ super().__init__(
+ url,
+ log_fmt=log_fmt,
+ url_=url,
+ protocol=protocol,
+ **kwargs,
+ )
self.__max_channel_len = 0
self._polling_interval = polling_interval
@@ -68,15 +90,15 @@ async def close(self) -> None:
def _process_message(
self,
- func: Callable[[PropanMessage], T],
+ func: Callable[[RedisMessage], Awaitable[T]],
watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]:
+ ) -> Callable[[RedisMessage], Awaitable[T]]:
@wraps(func)
- async def wrapper(message: PropanMessage) -> T:
+ async def wrapper(message: RedisMessage) -> T:
r = await func(message)
msg = message.raw_message
- if isinstance(msg, RedisMessage) and message.reply_to:
+ if isinstance(msg, RM) and message.reply_to:
await self.publish(r or "", message.reply_to)
return r
@@ -88,17 +110,26 @@ def handle(
channel: str = "",
*,
pattern: bool = False,
- _raw: bool = False,
+ dependencies: Sequence[Depends] = (),
+ description: str = "",
+ **original_kwargs: AnyDict,
) -> HandlerWrapper:
self.__max_channel_len = max(self.__max_channel_len, len(channel))
def wrapper(func: AnyCallable) -> DecoratedCallable:
- func = self._wrap_handler(
+ func, dependant = self._wrap_handler(
func,
channel=channel,
- _raw=_raw,
+ extra_dependencies=dependencies,
+ **original_kwargs,
+ )
+ handler = Handler(
+ callback=func,
+ channel=channel,
+ pattern=pattern,
+ _description=description,
+ dependant=dependant,
)
- handler = Handler(callback=func, channel=channel, pattern=pattern)
self.handlers.append(handler)
return func
@@ -156,7 +187,7 @@ async def publish(
await self._connection.publish(
channel,
- RedisMessage(
+ RM(
data=msg,
headers={
"content-type": content_type or "",
@@ -183,18 +214,18 @@ async def publish(
task.cancel()
@staticmethod
- async def _parse_message(message: Any) -> PropanMessage:
+    async def _parse_message(message: AnyDict) -> RedisMessage:
data = message.get("data", b"")
try:
- obj = RedisMessage.parse_raw(data)
+ obj = RM.parse_raw(data)
except Exception:
- msg = PropanMessage(
+ msg = RedisMessage(
body=data,
raw_message=message,
)
else:
- msg = PropanMessage(
+ msg = RedisMessage(
body=obj.data,
content_type=obj.headers.get("content-type", ""),
reply_to=obj.reply_to,
@@ -204,14 +235,14 @@ async def _parse_message(message: Any) -> PropanMessage:
return msg
- async def _decode_message(self, message: PropanMessage) -> DecodedMessage:
+ async def _decode_message(self, message: RedisMessage) -> DecodedMessage:
if message.headers.get("content-type") is not None:
return await super()._decode_message(message)
else:
return RawDecoced(message=message.body).message
def _get_log_context(
- self, message: Optional[PropanMessage], channel: str
+ self, message: Optional[RedisMessage], channel: str
) -> Dict[str, Any]:
context = {
"channel": channel,
diff --git a/propan/brokers/redis/redis_broker.pyi b/propan/brokers/redis/redis_broker.pyi
index 19c04854..a11ca4ca 100644
--- a/propan/brokers/redis/redis_broker.pyi
+++ b/propan/brokers/redis/redis_broker.pyi
@@ -1,21 +1,36 @@
import logging
-from typing import Any, Callable, Dict, List, Mapping, Optional, Type, TypeVar, Union
-
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ Dict,
+ List,
+ Mapping,
+ Optional,
+ Sequence,
+ Type,
+ TypeVar,
+ Union,
+)
+
+from fast_depends.model import Depends
from redis.asyncio.client import Redis
from redis.asyncio.connection import BaseParser, Connection, DefaultParser, Encoder
+from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher
from propan.brokers.redis.schemas import Handler
from propan.log import access_logger
-from propan.types import DecodedMessage, HandlerWrapper, SendableMessage
+from propan.types import AnyDict, DecodedMessage, HandlerWrapper, SendableMessage
T = TypeVar("T")
+RedisMessage: TypeAlias = PropanMessage[AnyDict]
-class RedisBroker(BrokerUsecase):
+class RedisBroker(BrokerUsecase[AnyDict, Redis[bytes]]):
handlers: List[Handler]
- _connection: Redis[bytes]
__max_channel_len: int
def __init__(
@@ -49,6 +64,10 @@ class RedisBroker(BrokerUsecase):
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ protocol: str = "redis",
) -> None:
"""Redis Pub/sub Propan broker
@@ -122,12 +141,20 @@ class RedisBroker(BrokerUsecase):
channel: str,
*,
pattern: bool = False,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> HandlerWrapper:
"""Register channel consumer method
Args:
channel: channel to consume messages
pattern: use psubscribe or subscribe method
+ dependencies: wrap handler dependencies
+ decode_message: custom PropanMessage[AnyDict] decoder
+ parse_message: custom redis message to PropanMessage[AnyDict] parser
+ description: AsyncAPI channel object description
Returns:
Async or sync function decorator
@@ -161,16 +188,14 @@ class RedisBroker(BrokerUsecase):
`DecodedMessage` | `None` if response is expected
"""
def _get_log_context( # type: ignore[override]
- self, message: Optional[PropanMessage], channel: str
+ self, message: Optional[RedisMessage], channel: str
) -> Dict[str, Any]: ...
@staticmethod
- async def _decode_message(message: PropanMessage) -> DecodedMessage: ...
+ async def _decode_message(message: RedisMessage) -> DecodedMessage: ...
@staticmethod
- async def _parse_message(message: Any) -> PropanMessage: ...
+ async def _parse_message(message: AnyDict) -> RedisMessage: ...
def _process_message(
self,
- func: Callable[[PropanMessage], T],
+ func: Callable[[RedisMessage], Awaitable[T]],
watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]: ...
- @property
- def fmt(self) -> str: ...
+ ) -> Callable[[RedisMessage], Awaitable[T]]: ...
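The Redis broker stub above documents the same additions. A sketch of a pattern subscription using them (channel pattern and dependency are illustrative):

```python
from propan import Depends, RedisBroker

broker = RedisBroker("redis://localhost:6379")


async def request_id() -> str:
    # hypothetical dependency, for illustration only
    return "req-1"


@broker.handle(
    "logs.*",
    pattern=True,  # psubscribe instead of subscribe
    dependencies=[Depends(request_id)],
    description="All `logs.*` channels",  # AsyncAPI channel description
)
async def logs_handler(body: str):
    ...
```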
diff --git a/propan/brokers/redis/schemas.py b/propan/brokers/redis/schemas.py
index 50c83554..33262bb9 100644
--- a/propan/brokers/redis/schemas.py
+++ b/propan/brokers/redis/schemas.py
@@ -5,6 +5,14 @@
from pydantic import BaseModel, Field
from redis.asyncio.client import PubSub
+from propan.asyncapi.bindings import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+ redis,
+)
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.message import AsyncAPIMessage
+from propan.asyncapi.subscription import AsyncAPISubscription
from propan.brokers._model.schemas import BaseHandler
@@ -16,6 +24,32 @@ class Handler(BaseHandler):
task: Optional["asyncio.Task[Any]"] = None
subscription: Optional[PubSub] = None
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ message_title, body, reply_to = self.get_message_object()
+
+ return {
+ self.title: AsyncAPIChannel(
+ subscribe=AsyncAPISubscription(
+ description=self.description,
+ bindings=AsyncAPIOperationBinding(
+ redis=redis.AsyncAPIRedisOperationBinding(
+ replyTo=reply_to,
+ ),
+ ),
+ message=AsyncAPIMessage(
+ title=message_title,
+ payload=body,
+ ),
+ ),
+ bindings=AsyncAPIChannelBinding(
+ redis=redis.AsyncAPIRedisChannelBinding(
+ channel=self.channel,
+ method="psubscribe" if self.pattern else "subscribe",
+ )
+ ),
+ ),
+ }
+
class RedisMessage(BaseModel):
data: bytes
diff --git a/propan/brokers/sqs/__init__.py b/propan/brokers/sqs/__init__.py
index f1b256c4..98e74af7 100644
--- a/propan/brokers/sqs/__init__.py
+++ b/propan/brokers/sqs/__init__.py
@@ -2,10 +2,9 @@
FifoQueue,
RedriveAllowPolicy,
RedrivePolicy,
- SQSMessage,
SQSQueue,
)
-from propan.brokers.sqs.sqs_broker import SQSBroker
+from propan.brokers.sqs.sqs_broker import SQSBroker, SQSMessage
__all__ = (
"SQSBroker",
diff --git a/propan/brokers/sqs/schema.py b/propan/brokers/sqs/schema.py
index d34da571..b6d16807 100644
--- a/propan/brokers/sqs/schema.py
+++ b/propan/brokers/sqs/schema.py
@@ -6,6 +6,14 @@
from pydantic import BaseModel, Field, PositiveInt
from typing_extensions import Literal
+from propan.asyncapi.bindings import (
+ AsyncAPIChannelBinding,
+ AsyncAPIOperationBinding,
+ sqs,
+)
+from propan.asyncapi.channels import AsyncAPIChannel
+from propan.asyncapi.message import AsyncAPICorrelationId, AsyncAPIMessage
+from propan.asyncapi.subscription import AsyncAPISubscription
from propan.brokers._model import BrokerUsecase
from propan.brokers._model.schemas import BaseHandler, Queue
from propan.types import SendableMessage
@@ -205,6 +213,34 @@ class Handler(BaseHandler):
task: Optional["asyncio.Task[Any]"] = None
+ def get_schema(self) -> Dict[str, AsyncAPIChannel]:
+ message_title, body, reply_to = self.get_message_object()
+
+ return {
+ self.title: AsyncAPIChannel(
+ subscribe=AsyncAPISubscription(
+ description=self.description,
+ bindings=AsyncAPIOperationBinding(
+ sqs=sqs.AsyncAPISQSOperationBinding(
+ replyTo=reply_to,
+ ),
+ ),
+ message=AsyncAPIMessage(
+ title=message_title,
+ correlationId=AsyncAPICorrelationId(
+ location="$message.header#/correlation_id"
+ ),
+ payload=body,
+ ),
+ ),
+ bindings=AsyncAPIChannelBinding(
+ sqs=sqs.AsyncAPISQSChannelBinding(
+ queue=self.queue.dict(include={"name", "fifo"}),
+ )
+ ),
+ ),
+ }
+
@dataclass
class SQSMessage:
diff --git a/propan/brokers/sqs/sqs_broker.py b/propan/brokers/sqs/sqs_broker.py
index a934fd98..0b274a9a 100644
--- a/propan/brokers/sqs/sqs_broker.py
+++ b/propan/brokers/sqs/sqs_broker.py
@@ -3,6 +3,7 @@
from functools import wraps
from typing import (
Any,
+ Awaitable,
Callable,
Dict,
List,
@@ -16,6 +17,7 @@
from aiobotocore.client import AioBaseClient
from aiobotocore.session import get_session
+from fast_depends.model import Depends
from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
@@ -26,9 +28,12 @@
NotPushBackWatcher,
WatcherContext,
)
-from propan.brokers.sqs.schema import Handler, SQSMessage, SQSQueue
+from propan.brokers.sqs.schema import Handler
+from propan.brokers.sqs.schema import SQSMessage as SM
+from propan.brokers.sqs.schema import SQSQueue
from propan.types import (
AnyCallable,
+ AnyDict,
DecodedMessage,
DecoratedCallable,
HandlerWrapper,
@@ -39,6 +44,7 @@
T = TypeVar("T")
QueueUrl: TypeAlias = str
CorrelationId: TypeAlias = str
+SQSMessage: TypeAlias = PropanMessage[AnyDict]
class SQSBroker(BrokerUsecase):
@@ -55,9 +61,16 @@ def __init__(
*,
log_fmt: Optional[str] = None,
response_queue: str = "",
+ protocol: str = "sqs",
**kwargs: Any,
) -> None:
- super().__init__(url, log_fmt=log_fmt, **kwargs)
+ super().__init__(
+ url,
+ log_fmt=log_fmt,
+ url_=url,
+ protocol=protocol,
+ **kwargs,
+ )
self._queues = {}
self.__max_queue_len = 4
self.response_queue = response_queue
@@ -86,12 +99,12 @@ async def close(self) -> None:
await self._connection.__aexit__(None, None, None)
self._connection = None
- async def _parse_message(self, message: Dict[str, Any]) -> PropanMessage:
+ async def _parse_message(self, message: Dict[str, Any]) -> SQSMessage:
attributes = message.get("MessageAttributes", {})
headers = {i: j.get("StringValue") for i, j in attributes.items()}
- return PropanMessage(
+ return SQSMessage(
body=message.get("Body", "").encode(),
message_id=message.get("MessageId"),
content_type=headers.pop("content-type", None),
@@ -102,14 +115,14 @@ async def _parse_message(self, message: Dict[str, Any]) -> PropanMessage:
def _process_message(
self,
- func: Callable[[PropanMessage], T],
+ func: Callable[[SQSMessage], Awaitable[T]],
watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]:
+ ) -> Callable[[SQSMessage], Awaitable[T]]:
if watcher is None:
watcher = NotPushBackWatcher()
@wraps(func)
- async def process_wrapper(message: PropanMessage) -> T:
+ async def process_wrapper(message: SQSMessage) -> T:
context = WatcherContext(
watcher,
message.message_id,
@@ -143,8 +156,9 @@ def handle(
message_attributes: Sequence[str] = (),
request_attempt_id: Optional[str] = None,
visibility_timeout: int = 0,
- retry: Union[bool, int] = False,
- _raw: bool = False,
+ dependencies: Sequence[Depends] = (),
+ description: str = "",
+ **original_kwargs: AnyDict,
) -> HandlerWrapper:
if isinstance(queue, str):
queue = SQSQueue(queue)
@@ -167,13 +181,19 @@ def handle(
params["ReceiveRequestAttemptId"] = request_attempt_id
def wrapper(func: AnyCallable) -> DecoratedCallable:
- func = self._wrap_handler(
+ func, dependant = self._wrap_handler(
func,
queue=queue.name,
- retry=retry,
- _raw=_raw,
+ extra_dependencies=dependencies,
+ **original_kwargs,
+ )
+ handler = Handler(
+ callback=func,
+ queue=queue,
+ consumer_params=params,
+ _description=description,
+ dependant=dependant,
)
- handler = Handler(callback=func, queue=queue, consumer_params=params)
self.handlers.append(handler)
return func
@@ -239,7 +259,7 @@ async def publish(
else:
response_future = None
- params = SQSMessage(
+ params = SM(
message=message,
headers=headers or {},
delay_seconds=delay_seconds,
@@ -347,7 +367,7 @@ async def _consume(self, queue_url: str, handler: Handler) -> NoReturn:
handler.consumer_params.get("WaitTimeSeconds", 1.0)
)
- async def _consume_response(self, message: PropanMessage):
+ async def _consume_response(self, message: SQSMessage):
correlation_id = message.headers.get("correlation_id")
if correlation_id is not None:
callback = self.response_callbacks.pop(correlation_id, None)
@@ -367,7 +387,7 @@ def fmt(self) -> str:
)
def _get_log_context(
- self, message: Optional[PropanMessage], queue: str
+ self, message: Optional[SQSMessage], queue: str
) -> Dict[str, Any]:
context = {
"queue": queue,
diff --git a/propan/brokers/sqs/sqs_broker.pyi b/propan/brokers/sqs/sqs_broker.pyi
index 2a1cf078..33036076 100644
--- a/propan/brokers/sqs/sqs_broker.pyi
+++ b/propan/brokers/sqs/sqs_broker.pyi
@@ -14,20 +14,22 @@ from typing import (
from aiobotocore.client import AioBaseClient
from aiobotocore.config import AioConfig
+from fast_depends.model import Depends
from typing_extensions import TypeAlias
from propan.brokers._model import BrokerUsecase
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers._model.schemas import PropanMessage
from propan.brokers.push_back_watcher import BaseWatcher
from propan.brokers.sqs.schema import Handler, SQSQueue
from propan.log import access_logger
-from propan.types import DecodedMessage, HandlerWrapper, SendableMessage
+from propan.types import AnyDict, DecodedMessage, HandlerWrapper, SendableMessage
T = TypeVar("T")
QueueUrl: TypeAlias = str
+SQSMessage: TypeAlias = PropanMessage[AnyDict]
-class SQSBroker(BrokerUsecase):
- _connection: AioBaseClient
+class SQSBroker(BrokerUsecase[AnyDict, AioBaseClient]):
_queues: Dict[str, QueueUrl]
response_queue: str
response_callbacks: Dict[str, "asyncio.Future[DecodedMessage]"]
@@ -51,6 +53,10 @@ class SQSBroker(BrokerUsecase):
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ protocol: str = "sqs",
) -> None:
""""""
async def connect(
@@ -97,6 +103,10 @@ class SQSBroker(BrokerUsecase):
request_attempt_id: Optional[str] = None,
visibility_timeout: int = 0,
retry: Union[bool, int] = False,
+ dependencies: Sequence[Depends] = (),
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> HandlerWrapper:
""""""
async def start(self) -> None:
@@ -113,7 +123,7 @@ class SQSBroker(BrokerUsecase):
@property
def fmt(self) -> str: ...
def _get_log_context( # type: ignore[override]
- self, message: Optional[PropanMessage], queue: str
+ self, message: Optional[SQSMessage], queue: str
) -> Dict[str, Any]: ...
@classmethod
def _build_message(
@@ -131,10 +141,10 @@ class SQSBroker(BrokerUsecase):
# broker
reply_to: str = "",
) -> Dict[str, Any]: ...
- async def _parse_message(self, message: Dict[str, Any]) -> PropanMessage: ...
+ async def _parse_message(self, message: Dict[str, Any]) -> SQSMessage: ...
def _process_message(
self,
- func: Callable[[PropanMessage], T],
+ func: Callable[[SQSMessage], T],
watcher: Optional[BaseWatcher],
- ) -> Callable[[PropanMessage], T]: ...
+ ) -> Callable[[SQSMessage], T]: ...
async def _connect(self, *args: Any, **kwargs: Any) -> AioBaseClient: ...
diff --git a/propan/cli/app.py b/propan/cli/app.py
index 0059d8e4..cccd550c 100644
--- a/propan/cli/app.py
+++ b/propan/cli/app.py
@@ -6,6 +6,7 @@
from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
from typing_extensions import Protocol
+from propan.asyncapi.info import AsyncAPIContact, AsyncAPILicense
from propan.cli.supervisors.utils import set_exit
from propan.cli.utils.parser import SettingField
from propan.log import logger
@@ -30,11 +31,19 @@ class PropanApp:
_stop_stream: Optional[MemoryObjectSendStream[bool]]
_receive_stream: Optional[MemoryObjectReceiveStream[bool]]
+ license: Optional[AsyncAPILicense]
+ contact: Optional[AsyncAPIContact]
def __init__(
self,
broker: Optional[Runnable] = None,
logger: Optional[logging.Logger] = logger,
+ # AsyncAPI args,
+ title: str = "Propan",
+ version: str = "0.1.0",
+ description: str = "",
+ license: Optional[AsyncAPILicense] = None,
+ contact: Optional[AsyncAPIContact] = None,
):
self.broker = broker
self.logger = logger
@@ -49,6 +58,12 @@ def __init__(
self._receive_stream = None
self._command_line_options: Dict[str, SettingField] = {}
+ self.title = title
+ self.version = version
+ self.description = description
+ self.license = license
+ self.contact = contact
+
def set_broker(self, broker: Runnable) -> None:
self.broker = broker
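With these constructor arguments, application-level AsyncAPI metadata is set directly on `PropanApp`. A sketch, assuming the `AsyncAPILicense`/`AsyncAPIContact` models mirror the AsyncAPI Info object fields:

```python
from propan import PropanApp, RabbitBroker
from propan.asyncapi.info import AsyncAPIContact, AsyncAPILicense

broker = RabbitBroker("amqp://guest:guest@localhost:5672/")

app = PropanApp(
    broker,
    title="Orders Service",  # illustrative values
    version="1.2.0",
    description="Consumes and routes order events",
    license=AsyncAPILicense(name="MIT"),
    contact=AsyncAPIContact(name="Dev Team", email="dev@example.com"),
)
```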
diff --git a/propan/cli/docs/__init__.py b/propan/cli/docs/__init__.py
new file mode 100644
index 00000000..e179343d
--- /dev/null
+++ b/propan/cli/docs/__init__.py
@@ -0,0 +1,3 @@
+from propan.cli.docs.app import docs_app
+
+__all__ = ("docs_app",)
diff --git a/propan/cli/docs/app.py b/propan/cli/docs/app.py
new file mode 100644
index 00000000..2fa471de
--- /dev/null
+++ b/propan/cli/docs/app.py
@@ -0,0 +1,79 @@
+import sys
+from pathlib import Path
+
+import typer
+
+from propan.cli.docs.gen import (
+ generate_doc_file,
+ get_app_schema,
+ json_schema_to_yaml,
+ schema_to_json,
+)
+from propan.cli.docs.serve import serve_docs
+from propan.cli.utils.imports import get_app_path, try_import_propan
+
+docs_app = typer.Typer(pretty_exceptions_short=True)
+
+
+@docs_app.command(name="gen")
+def gen(
+ app: str = typer.Argument(
+ ...,
+ help="[python_module:PropanApp] - path to your application",
+ ),
+ filename: str = typer.Option(
+ "asyncapi.yaml",
+ "-f",
+ "--f",
+ case_sensitive=False,
+ show_default=True,
+ help="generated document filename",
+ ),
+) -> None:
+ """Generate an AsyncAPI scheme.yaml for your project"""
+ current_dir = Path.cwd()
+ generated_filepath = current_dir / filename
+
+ module, app = get_app_path(app)
+ app_dir = module.parent
+ sys.path.insert(0, str(app_dir))
+ propan_app = try_import_propan(module, app)
+
+ generate_doc_file(propan_app, generated_filepath)
+
+
+@docs_app.command(name="serve")
+def serve(
+ app: str = typer.Argument(
+ ...,
+ help="[python_module:PropanApp] or [asyncapi.yaml] - path to your application documentation",
+ ),
+ host: str = typer.Option(
+ "localhost",
+ help="documentation hosting address",
+ ),
+ port: int = typer.Option(
+ 8000,
+ help="documentation hosting port",
+ ),
+) -> None:
+ """Serve project AsyncAPI scheme"""
+ if ":" in app:
+ module, app = get_app_path(app)
+ app_dir = module.parent
+ sys.path.insert(0, str(app_dir))
+ propan_app = try_import_propan(module, app)
+ raw_schema = get_app_schema(propan_app)
+ schema = json_schema_to_yaml(schema_to_json(raw_schema))
+
+ else:
+ schema_filepath = Path.cwd() / app
+ schema = schema_filepath.read_text()
+ raw_schema = None
+
+ serve_docs(
+ schema=schema,
+ host=host,
+ port=port,
+ raw_schema=raw_schema,
+ )
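Once this Typer app is registered under the `docs` group (see `propan/cli/main.py` below), the commands are invoked as `propan docs gen serve:app -f asyncapi.yaml` to write the schema file and `propan docs serve asyncapi.yaml --host 0.0.0.0 --port 8000` (or with a `module:PropanApp` path) to host it; the module path and filename here are placeholders.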
diff --git a/propan/cli/docs/gen.py b/propan/cli/docs/gen.py
new file mode 100644
index 00000000..ae6c1fb0
--- /dev/null
+++ b/propan/cli/docs/gen.py
@@ -0,0 +1,136 @@
+import json
+from io import StringIO
+from pathlib import Path
+from typing import Any, Dict, cast
+
+import typer
+
+from propan.asyncapi import (
+ AsyncAPIChannel,
+ AsyncAPIComponents,
+ AsyncAPIInfo,
+ AsyncAPIMessage,
+ AsyncAPISchema,
+ AsyncAPIServer,
+)
+from propan.brokers._model import BrokerUsecase
+from propan.cli.app import PropanApp
+from propan.types import AnyDict
+
+
+def generate_doc_file(app: PropanApp, filename: Path) -> None:
+ schema = get_app_schema(app)
+ json_schema = schema_to_json(schema)
+ yaml_schema = json_schema_to_yaml(json_schema)
+ filename.write_text(yaml_schema)
+ typer.echo(f"Your project AsyncAPI scheme was placed to `{filename}`")
+
+
+def gen_app_schema_yaml(app: PropanApp) -> str:
+ json_schema = gen_app_schema_json(app)
+ return json_schema_to_yaml(json_schema)
+
+
+def gen_app_schema_json(app: PropanApp) -> AnyDict:
+ schema = get_app_schema(app)
+ return schema_to_json(schema)
+
+
+def json_schema_to_yaml(schema: AnyDict) -> str:
+ try:
+ import yaml
+ except ImportError as e: # pragma: no cover
+ typer.echo(
+ "To generate documentation, please install the dependencies\n"
+ 'pip install "propan[doc]"'
+ )
+ raise typer.Exit(1) from e
+
+ io = StringIO(initial_value="", newline="\n")
+ yaml.dump(schema, io, sort_keys=False)
+ return io.getvalue()
+
+
+def schema_to_json(schema: AsyncAPISchema) -> AnyDict:
+ return cast(
+ AnyDict,
+ json.loads(
+ schema.json(
+ by_alias=True,
+ exclude_none=True,
+ )
+ ),
+ )
+
+
+def get_app_schema(app: PropanApp) -> AsyncAPISchema:
+ if not isinstance(app.broker, BrokerUsecase):
+ raise typer.BadParameter("Your PropanApp broker is invalid")
+
+ servers = _get_broker_servers(app.broker)
+
+ messages: Dict[str, AsyncAPIMessage] = {}
+ payloads: Dict[str, AnyDict] = {}
+
+ channels = _get_broker_channels(app.broker)
+ for ch in channels.values():
+ ch.servers = list(servers.keys())
+
+ if ch.subscribe is not None: # pragma: no branch
+ m = ch.subscribe.message
+ m_title = m.title or "Message"
+
+ p = m.payload
+ p_title = p.get("title", m_title)
+ payloads[p_title] = p
+
+ m.payload = {"$ref": f"#/components/schemas/{p_title}"}
+
+ messages[m_title] = m
+ ch.subscribe.message = {
+ "$ref": f"#/components/messages/{m_title}"
+ } # type: ignore
+
+ schema = AsyncAPISchema(
+ info=_get_app_info(app),
+ servers=servers,
+ channels=channels,
+ components=AsyncAPIComponents(
+ messages=messages,
+ schemas=payloads,
+ ),
+ )
+ return schema
+
+
+def _get_app_info(app: PropanApp) -> AsyncAPIInfo:
+ return AsyncAPIInfo(
+ title=app.title,
+ version=app.version,
+ description=app.description,
+ license=getattr(app, "license", None),
+ contact=getattr(app, "contact", None),
+ )
+
+
+def _get_broker_servers(broker: BrokerUsecase[Any, Any]) -> Dict[str, AsyncAPIServer]:
+ if isinstance(broker.url, str):
+ url = broker.url
+ else:
+ url = broker.url[0]
+
+ return {
+ "dev": AsyncAPIServer(
+ url=url,
+ protocol=broker.protocol,
+ protocolVersion=broker.protocol_version,
+ )
+ }
+
+
+def _get_broker_channels(broker: BrokerUsecase[Any, Any]) -> Dict[str, AsyncAPIChannel]:
+ channels = {}
+ for handler in broker.handlers:
+ channels.update(handler.get_schema())
+
+ return channels
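The same helpers can be used programmatically. A minimal sketch (broker URL and app title are illustrative; generation requires the `propan[doc]` extra):

```python
from pathlib import Path

from propan import PropanApp, RabbitBroker
from propan.cli.docs.gen import gen_app_schema_yaml, generate_doc_file

app = PropanApp(RabbitBroker("amqp://guest:guest@localhost:5672/"), title="Orders Service")

yaml_schema = gen_app_schema_yaml(app)                # YAML text of the AsyncAPI document
generate_doc_file(app, Path.cwd() / "asyncapi.yaml")  # writes the same document to disk
```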
diff --git a/propan/cli/docs/serve.py b/propan/cli/docs/serve.py
new file mode 100644
index 00000000..d2e472e7
--- /dev/null
+++ b/propan/cli/docs/serve.py
@@ -0,0 +1,167 @@
+import json
+from typing import Any, Callable, Dict, Optional, Union
+
+from propan.asyncapi import AsyncAPISchema
+from propan.cli.docs.gen import schema_to_json
+
+
+def serve_docs(
+ schema: str = "",
+ host: str = "0.0.0.0",
+ port: int = 8000,
+ raw_schema: Optional[AsyncAPISchema] = None,
+) -> None:
+ if not any((schema, raw_schema)):
+ raise ValueError("You should set `shema` or `raw_schema`")
+
+ import uvicorn
+ from fastapi import FastAPI
+
+ app = FastAPI()
+
+ app.get("/")(asyncapi_html_endpoint(schema, raw_schema))
+
+ if raw_schema is not None:
+ app.get("/asyncapi.json")(download_json_endpoint(raw_schema))
+
+ app.get("/asyncapi.yaml")(download_yaml_endpoint(schema))
+ uvicorn.run(app, host=host, port=port)
+
+
+def download_yaml_endpoint(schema: str) -> Callable[[], Any]:
+ from fastapi.responses import Response
+
+ def download_yaml() -> Response:
+ return Response(
+ content=schema,
+ headers={
+ "Content-Type": "application/octet-stream",
+ },
+ )
+
+ return download_yaml
+
+
+def download_json_endpoint(raw_schema: AsyncAPISchema) -> Callable[[], Any]:
+ from fastapi.responses import Response
+
+ def download_json() -> Response:
+ return Response(
+ content=json.dumps(
+ schema_to_json(raw_schema),
+ indent=4,
+ ),
+ headers={
+ "Content-Type": "application/octet-stream",
+ },
+ )
+
+ return download_json
+
+
+def asyncapi_html_endpoint(
+ schema: str, raw_schema: Optional[AsyncAPISchema] = None
+) -> Callable[[], Any]:
+ from fastapi.responses import HTMLResponse
+
+ def asyncapi(
+ sidebar: bool = True,
+ info: bool = True,
+ servers: bool = True,
+ operations: bool = True,
+ messages: bool = True,
+ schemas: bool = True,
+ errors: bool = True,
+ expandMessageExamples: bool = True,
+ ) -> HTMLResponse:
+ return HTMLResponse(
+ content=get_asyncapi_html(
+ schema,
+ sidebar=sidebar,
+ info=info,
+ servers=servers,
+ operations=operations,
+ messages=messages,
+ schemas=schemas,
+ errors=errors,
+ expand_message_examples=expandMessageExamples,
+ title=raw_schema.info.title if raw_schema else "Propan",
+ )
+ )
+
+ return asyncapi
+
+
+def get_asyncapi_html(
+ schema: Union[str, Dict[str, Any]],
+ sidebar: bool = True,
+ info: bool = True,
+ servers: bool = True,
+ operations: bool = True,
+ messages: bool = True,
+ schemas: bool = True,
+ errors: bool = True,
+ expand_message_examples: bool = True,
+ title: str = "Propan",
+) -> str:
+ config = {
+ "schema": schema,
+ "config": {
+ "show": {
+ "sidebar": sidebar,
+ "info": info,
+ "servers": servers,
+ "operations": operations,
+ "messages": messages,
+ "schemas": schemas,
+ "errors": errors,
+ },
+ "expand": {
+ "messageExamples": expand_message_examples,
+ },
+ "sidebar": {
+ "showServers": "byDefault",
+ "showOperations": "byDefault",
+ },
+ },
+ }
+
+    # NOTE: the HTML template markup (document head, styles, and the AsyncAPI
+    # standalone component script that renders `config`) is elided here;
+    # only the dynamic page title is kept.
+    return (
+        f"""
+        {title} AsyncAPI
+        """
+    )
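`serve_docs` can also be driven from code, building on the generator helpers above (host and port are illustrative; serving requires the `fastapi` and `uvicorn` dependencies from the `propan[doc]` extra):

```python
from propan import PropanApp, RabbitBroker
from propan.cli.docs.gen import gen_app_schema_yaml, get_app_schema
from propan.cli.docs.serve import serve_docs

app = PropanApp(RabbitBroker("amqp://guest:guest@localhost:5672/"), title="Orders Service")

serve_docs(
    schema=gen_app_schema_yaml(app),  # YAML rendered at `/` and served at /asyncapi.yaml
    raw_schema=get_app_schema(app),   # also enables the /asyncapi.json download route
    host="127.0.0.1",
    port=8000,
)
```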
diff --git a/propan/cli/main.py b/propan/cli/main.py
index 56dd524a..acd9fa58 100644
--- a/propan/cli/main.py
+++ b/propan/cli/main.py
@@ -7,9 +7,9 @@
import typer
from propan.__about__ import __version__
-from propan.cli.app import PropanApp
+from propan.cli.docs import docs_app
from propan.cli.startproject import create_app
-from propan.cli.utils.imports import get_app_path, import_object
+from propan.cli.utils.imports import get_app_path, try_import_propan
from propan.cli.utils.logs import LogLevels, get_log_level, set_log_level
from propan.cli.utils.parser import SettingField, parse_cli_args
from propan.log import logger
@@ -18,6 +18,7 @@
cli.add_typer(
create_app, name="create", help="Create a new Propan project at [APPNAME] directory"
)
+cli.add_typer(docs_app, name="docs", help="AsyncAPI schema commands")
def version_callback(version: bool) -> None:
@@ -106,28 +107,17 @@ def _run(
log_level: int = logging.INFO,
app_level: int = logging.INFO,
) -> None:
- try:
- propan_app = import_object(module, app)
+ propan_app = try_import_propan(module, app)
+ set_log_level(log_level, propan_app)
- if not isinstance(propan_app, PropanApp):
- raise FileNotFoundError(f"{propan_app} is not a PropanApp")
+ propan_app._command_line_options = extra_options
- except (FileNotFoundError, AttributeError) as e:
- logger.error(e)
- logger.error("Please, input module like [python_file:propan_app_name]")
- exit()
+ if sys.platform not in ("win32", "cygwin", "cli"):
+ try:
+ import uvloop
+ except Exception:
+ logger.warning("You have no installed `uvloop`")
+ else:
+ uvloop.install()
- else:
- set_log_level(log_level, propan_app)
-
- propan_app._command_line_options = extra_options
-
- if sys.platform not in ("win32", "cygwin", "cli"):
- try:
- import uvloop
- except Exception:
- logger.warning("You have no installed `uvloop`")
- else:
- uvloop.install()
-
- asyncio.run(propan_app.run(log_level=app_level))
+ asyncio.run(propan_app.run(log_level=app_level))
diff --git a/propan/cli/startproject/async_app/core.py b/propan/cli/startproject/async_app/core.py
index 581fe077..1f60305e 100644
--- a/propan/cli/startproject/async_app/core.py
+++ b/propan/cli/startproject/async_app/core.py
@@ -10,6 +10,8 @@ def create_app_file(
imports: Sequence[str] = (),
broker_init: Sequence[str] = (" await broker.connect(settings.broker.url)",),
) -> None:
+ write_file(app_dir / "__init__.py")
+
write_file(
app_dir / "serve.py",
"import logging",
diff --git a/propan/cli/startproject/core.py b/propan/cli/startproject/core.py
index 95b6a067..105cbe9d 100644
--- a/propan/cli/startproject/core.py
+++ b/propan/cli/startproject/core.py
@@ -142,7 +142,7 @@ def create_apps_dir(apps: Path) -> Path:
write_file(
apps_dir / "__init__.py",
- "from .handlers import base_handler",
+ "from .handlers import *",
)
return apps_dir
diff --git a/propan/cli/startproject/utils.py b/propan/cli/startproject/utils.py
index 7d41c4f4..60bcf2a3 100644
--- a/propan/cli/startproject/utils.py
+++ b/propan/cli/startproject/utils.py
@@ -1,13 +1,8 @@
from pathlib import Path
-from typing import Union, cast
-def touch_dir(dir: Union[Path, str]) -> Path:
- if isinstance(dir, str) is True:
- dir = Path(dir).resolve()
-
- dir = cast(Path, dir)
- if dir.exists() is False:
+def touch_dir(dir: Path) -> Path:
+ if not dir.exists(): # pragma: no branch
dir.mkdir()
return dir
diff --git a/propan/cli/utils/imports.py b/propan/cli/utils/imports.py
index 2d0f7b64..c7abaded 100644
--- a/propan/cli/utils/imports.py
+++ b/propan/cli/utils/imports.py
@@ -2,9 +2,31 @@
from pathlib import Path
from typing import Any, Tuple
+import typer
+
+from propan.cli.app import PropanApp
+
+
+def try_import_propan(module: Path, app: str) -> PropanApp:
+ try:
+ propan_app = import_object(module, app)
+
+ except (ValueError, FileNotFoundError, AttributeError) as e:
+ typer.echo(e, err=True)
+ raise typer.BadParameter(
+ "Please, input module like [python_file:propan_app_name]"
+ ) from e
+
+ else:
+ return propan_app # type: ignore
+
def import_object(module: Path, app: str) -> Any:
- spec = spec_from_file_location("mode", f"{module}.py")
+ spec = spec_from_file_location(
+ "mode",
+ f"{module}.py",
+ submodule_search_locations=[str(module.parent.absolute())],
+ )
if spec is None: # pragma: no cover
raise FileNotFoundError(module)
@@ -17,7 +39,6 @@ def import_object(module: Path, app: str) -> Any:
loader.exec_module(mod)
obj = getattr(mod, app)
-
return obj
diff --git a/propan/fastapi/core/route.py b/propan/fastapi/core/route.py
index 8de5426b..3d5f6021 100644
--- a/propan/fastapi/core/route.py
+++ b/propan/fastapi/core/route.py
@@ -1,5 +1,6 @@
import asyncio
import inspect
+from functools import wraps
from itertools import dropwhile
from typing import Any, Callable, Coroutine, Optional, Union
@@ -22,7 +23,7 @@ def __init__(
path: Union[Queue, str],
*extra: Union[Queue, str],
endpoint: Callable[..., Any],
- broker: BrokerUsecase,
+ broker: BrokerUsecase[Any, Any],
dependency_overrides_provider: Optional[Any] = None,
**handle_kwargs: AnyDict,
) -> None:
@@ -33,15 +34,18 @@ def __init__(
call=endpoint,
)
- handler = PropanMessage.get_session(
- self.dependant,
- dependency_overrides_provider,
+ handler = wraps(endpoint)(
+ PropanMessage.get_session(
+ self.dependant,
+ dependency_overrides_provider,
+ )
)
broker.handle(
path,
*extra,
_raw=True,
+ _get_dependant=get_dependant, # type: ignore
**handle_kwargs, # type: ignore
)(handler)
@@ -69,7 +73,7 @@ def get_session(
cls,
dependant: Dependant,
dependency_overrides_provider: Optional[Any] = None,
- ) -> Callable[[NativeMessage], Any]:
+ ) -> Callable[[NativeMessage[Any]], Any]:
assert dependant.call
func = get_app(dependant, dependency_overrides_provider)
@@ -83,7 +87,7 @@ def get_session(
None,
)
- async def app(message: NativeMessage) -> Any:
+ async def app(message: NativeMessage[Any]) -> Any:
body = message.decoded_body
if first_arg is not None:
if not isinstance(body, dict): # pragma: no branch
diff --git a/propan/fastapi/core/router.py b/propan/fastapi/core/router.py
index 3d345e85..cbb6e118 100644
--- a/propan/fastapi/core/router.py
+++ b/propan/fastapi/core/router.py
@@ -1,3 +1,4 @@
+import json
from contextlib import asynccontextmanager
from enum import Enum
from typing import (
@@ -15,24 +16,32 @@
Union,
)
-from fastapi import APIRouter, FastAPI, params
+from fastapi import APIRouter, FastAPI, Request, params
from fastapi.datastructures import Default
from fastapi.routing import APIRoute
from fastapi.types import DecoratedCallable
from fastapi.utils import generate_unique_id
from starlette import routing
-from starlette.responses import JSONResponse, Response
+from starlette.responses import HTMLResponse, JSONResponse, Response
from starlette.routing import _DefaultLifespan
from starlette.types import AppType, ASGIApp, Lifespan
from typing_extensions import AsyncIterator, TypeVar
from propan.brokers._model import BrokerUsecase
from propan.brokers._model.schemas import Queue
+from propan.cli.docs.gen import (
+ gen_app_schema_json,
+ gen_app_schema_yaml,
+ get_app_schema,
+ json_schema_to_yaml,
+ schema_to_json,
+)
+from propan.cli.docs.serve import get_asyncapi_html
from propan.fastapi.core.route import PropanRoute
from propan.types import AnyDict
from propan.utils.functions import to_async
-Broker = TypeVar("Broker", bound=BrokerUsecase)
+Broker = TypeVar("Broker", bound=BrokerUsecase[Any, Any])
class PropanRouter(APIRouter, Generic[Broker]):
@@ -60,6 +69,7 @@ def __init__(
on_shutdown: Optional[Sequence[Callable[[], Any]]] = None,
deprecated: Optional[bool] = None,
include_in_schema: bool = True,
+ schema_url: str = "/asyncapi",
lifespan: Optional[Lifespan[Any]] = None,
generate_unique_id_function: Callable[[APIRoute], str] = Default(
generate_unique_id
@@ -96,6 +106,11 @@ def __init__(
on_shutdown=on_shutdown,
)
+ if self.include_in_schema is True: # pragma: no branch
+ self.get(schema_url)(serve_asyncapi_schema)
+ self.get(f"{schema_url}.json")(download_app_json_schema)
+ self.get(f"{schema_url}.yaml")(download_app_yaml_schema)
+
self._after_startup_hooks = []
def add_api_mq_route(
@@ -156,7 +171,7 @@ async def start_broker_lifespan(
for h in self._after_startup_hooks:
h_context = await h(app)
- if h_context:
+ if h_context: # pragma: no branch
context.update(h_context)
yield context
@@ -174,3 +189,54 @@ def after_startup(
],
) -> None:
self._after_startup_hooks.append(to_async(func)) # type: ignore
+
+
+def download_app_json_schema(r: Request) -> Response:
+ return Response(
+ content=json.dumps(
+ gen_app_schema_json(r.app),
+ indent=4,
+ ),
+ headers={
+ "Content-Type": "application/octet-stream",
+ },
+ )
+
+
+def download_app_yaml_schema(r: Request) -> Response:
+ return Response(
+ content=gen_app_schema_yaml(r.app),
+ headers={
+ "Content-Type": "application/octet-stream",
+ },
+ )
+
+
+def serve_asyncapi_schema(
+ r: Request,
+ sidebar: bool = True,
+ info: bool = True,
+ servers: bool = True,
+ operations: bool = True,
+ messages: bool = True,
+ schemas: bool = True,
+ errors: bool = True,
+ expandMessageExamples: bool = True,
+) -> HTMLResponse:
+ raw_schema = get_app_schema(r.app)
+ json_schema = schema_to_json(raw_schema)
+ schema = json_schema_to_yaml(json_schema)
+ return HTMLResponse(
+ content=get_asyncapi_html(
+ schema,
+ sidebar=sidebar,
+ info=info,
+ servers=servers,
+ operations=operations,
+ messages=messages,
+ schemas=schemas,
+ errors=errors,
+ expand_message_examples=expandMessageExamples,
+ title=raw_schema.info.title if raw_schema else "Propan",
+ )
+ )
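With `schema_url` wired into `PropanRouter.__init__`, including the router in a FastAPI application exposes the documentation endpoints. A sketch, assuming the usual Propan FastAPI plugin wiring (`RabbitRouter` import path and `lifespan_context` attribute):

```python
from fastapi import FastAPI

from propan.fastapi import RabbitRouter

router = RabbitRouter("amqp://guest:guest@localhost:5672/", schema_url="/asyncapi")


@router.event("orders")  # queue name is illustrative
async def order_handler(body: dict) -> None:
    ...


app = FastAPI(lifespan=router.lifespan_context)
app.include_router(router)
# GET /asyncapi       -> HTML view of the generated AsyncAPI schema
# GET /asyncapi.json  -> schema download as JSON
# GET /asyncapi.yaml  -> schema download as YAML
```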
diff --git a/propan/fastapi/kafka/router.pyi b/propan/fastapi/kafka/router.pyi
index 626f1092..8d7d60c1 100644
--- a/propan/fastapi/kafka/router.pyi
+++ b/propan/fastapi/kafka/router.pyi
@@ -6,6 +6,7 @@ from typing import Any, Callable, Dict, List, Optional, Sequence, Type, Union
from aiokafka.abc import AbstractTokenProvider
from aiokafka.producer.producer import _missing
+from aiokafka.structs import ConsumerRecord
from fastapi import params
from fastapi.datastructures import Default
from fastapi.routing import APIRoute
@@ -20,6 +21,7 @@ from typing_extensions import Literal, TypeVar
from propan import KafkaBroker
from propan.__about__ import __version__
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.fastapi.core import PropanRouter
from propan.log import access_logger
from propan.types import AnyCallable
@@ -92,10 +94,14 @@ class KafkaRouter(PropanRouter[KafkaBroker]):
),
loop: Optional[AbstractEventLoop] = None,
# Broker kwargs
+ schema_url: str = "/asyncapi",
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ decode_message: CustomDecoder[ConsumerRecord] = None,
+ parse_message: CustomParser[ConsumerRecord] = None,
+ protocol: str = "kafka",
) -> None:
pass
def add_api_mq_route( # type: ignore[override]
@@ -131,7 +137,11 @@ class KafkaRouter(PropanRouter[KafkaBroker]):
"read_uncommitted",
"read_committed",
] = "read_uncommitted",
+ # broker kwargs
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[ConsumerRecord] = None,
+ parse_message: CustomParser[ConsumerRecord] = None,
+ description: str = "",
) -> None:
pass
def event( # type: ignore[override]
@@ -166,6 +176,10 @@ class KafkaRouter(PropanRouter[KafkaBroker]):
"read_uncommitted",
"read_committed",
] = "read_uncommitted",
+ # broker kwargs
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[ConsumerRecord] = None,
+ parse_message: CustomParser[ConsumerRecord] = None,
+ description: str = "",
) -> None:
pass
diff --git a/propan/fastapi/nats/router.pyi b/propan/fastapi/nats/router.pyi
index b52a5dca..73701451 100644
--- a/propan/fastapi/nats/router.pyi
+++ b/propan/fastapi/nats/router.pyi
@@ -23,11 +23,13 @@ from nats.aio.client import (
JWTCallback,
SignatureCallback,
)
+from nats.aio.msg import Msg
from starlette import routing
from starlette.responses import JSONResponse, Response
from starlette.types import ASGIApp
from propan import NatsBroker
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.fastapi.core.router import PropanRouter
from propan.log import access_logger
from propan.types import AnyCallable
@@ -86,10 +88,14 @@ class NatsRouter(PropanRouter[NatsBroker]):
generate_unique_id
),
# Broker kwargs
+ schema_url: str = "/asyncapi",
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ decode_message: CustomDecoder[Msg] = None,
+ parse_message: CustomParser[Msg] = None,
+ protocol: str = "nats",
) -> None:
pass
def add_api_mq_route( # type: ignore[override]
@@ -99,6 +105,9 @@ class NatsRouter(PropanRouter[NatsBroker]):
queue: str = "",
endpoint: AnyCallable,
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[Msg] = None,
+ parse_message: CustomParser[Msg] = None,
+ description: str = "",
) -> None:
pass
def event( # type: ignore[override]
@@ -107,5 +116,8 @@ class NatsRouter(PropanRouter[NatsBroker]):
*,
queue: str = "",
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[Msg] = None,
+ parse_message: CustomParser[Msg] = None,
+ description: str = "",
) -> None:
pass
diff --git a/propan/fastapi/rabbit/router.pyi b/propan/fastapi/rabbit/router.pyi
index df667cc6..893ff8a5 100644
--- a/propan/fastapi/rabbit/router.pyi
+++ b/propan/fastapi/rabbit/router.pyi
@@ -4,6 +4,7 @@ from ssl import SSLContext
from typing import Any, Callable, Dict, List, Optional, Sequence, Type, Union
import aio_pika
+from aio_pika.message import IncomingMessage
from fastapi import params
from fastapi.datastructures import Default
from fastapi.routing import APIRoute
@@ -14,6 +15,7 @@ from starlette.responses import JSONResponse, Response
from starlette.types import ASGIApp
from propan import RabbitBroker
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers.rabbit import RabbitExchange, RabbitQueue
from propan.fastapi.core import PropanRouter
from propan.log import access_logger
@@ -57,6 +59,11 @@ class RabbitRouter(PropanRouter[RabbitBroker]):
log_fmt: Optional[str] = None,
apply_types: bool = True,
consumers: Optional[int] = None,
+ decode_message: CustomDecoder[IncomingMessage] = None,
+ parse_message: CustomParser[IncomingMessage] = None,
+ schema_url: str = "/asyncapi",
+ protocol: str = "amqp",
+ protocol_version: str = "0.9.1",
) -> None:
pass
def add_api_mq_route( # type: ignore[override]
@@ -66,6 +73,9 @@ class RabbitRouter(PropanRouter[RabbitBroker]):
endpoint: AnyCallable,
exchange: Union[str, RabbitExchange, None] = None,
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[IncomingMessage] = None,
+ parse_message: CustomParser[IncomingMessage] = None,
+ description: str = "",
) -> None:
pass
def event( # type: ignore[override]
@@ -74,5 +84,8 @@ class RabbitRouter(PropanRouter[RabbitBroker]):
*,
exchange: Union[str, RabbitExchange, None] = None,
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[IncomingMessage] = None,
+ parse_message: CustomParser[IncomingMessage] = None,
+ description: str = "",
) -> None:
pass
diff --git a/propan/fastapi/redis/router.pyi b/propan/fastapi/redis/router.pyi
index 835b27a7..7bbb16be 100644
--- a/propan/fastapi/redis/router.pyi
+++ b/propan/fastapi/redis/router.pyi
@@ -12,9 +12,10 @@ from starlette.responses import JSONResponse, Response
from starlette.types import ASGIApp
from propan import RedisBroker
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.fastapi.core.router import PropanRouter
from propan.log import access_logger
-from propan.types import AnyCallable
+from propan.types import AnyCallable, AnyDict
class RedisRouter(PropanRouter[RedisBroker]):
def __init__(
@@ -60,10 +61,14 @@ class RedisRouter(PropanRouter[RedisBroker]):
generate_unique_id
),
# Broker kwargs
+ schema_url: str = "/asyncapi",
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ protocol: str = "redis",
) -> None:
pass
def add_api_mq_route( # type: ignore[override]
@@ -72,6 +77,9 @@ class RedisRouter(PropanRouter[RedisBroker]):
*,
endpoint: AnyCallable,
pattern: bool = False,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> None:
pass
def event( # type: ignore[override]
@@ -79,5 +87,8 @@ class RedisRouter(PropanRouter[RedisBroker]):
channel: str,
*,
pattern: bool = False,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> None:
pass
diff --git a/propan/fastapi/sqs/router.pyi b/propan/fastapi/sqs/router.pyi
index a9d3e17a..f289e541 100644
--- a/propan/fastapi/sqs/router.pyi
+++ b/propan/fastapi/sqs/router.pyi
@@ -12,10 +12,11 @@ from starlette.responses import JSONResponse, Response
from starlette.types import ASGIApp
from propan import SQSBroker
+from propan.brokers._model.broker_usecase import CustomDecoder, CustomParser
from propan.brokers.sqs.schema import SQSQueue
from propan.fastapi.core.router import PropanRouter
from propan.log import access_logger
-from propan.types import AnyCallable
+from propan.types import AnyCallable, AnyDict
class SQSRouter(PropanRouter[SQSBroker]):
def __init__(
@@ -50,10 +51,14 @@ class SQSRouter(PropanRouter[SQSBroker]):
generate_unique_id
),
# Broker kwargs
+ schema_url: str = "/asyncapi",
logger: Optional[logging.Logger] = access_logger,
log_level: int = logging.INFO,
log_fmt: Optional[str] = None,
apply_types: bool = True,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ protocol: str = "sqs",
) -> None:
pass
def add_api_mq_route( # type: ignore[override]
@@ -68,6 +73,9 @@ class SQSRouter(PropanRouter[SQSBroker]):
visibility_timeout: int = 0,
retry: Union[bool, int] = False,
endpoint: AnyCallable,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> None:
pass
def event( # type: ignore[override]
@@ -81,5 +89,8 @@ class SQSRouter(PropanRouter[SQSBroker]):
request_attempt_id: Optional[str] = None,
visibility_timeout: int = 0,
retry: Union[bool, int] = False,
+ decode_message: CustomDecoder[AnyDict] = None,
+ parse_message: CustomParser[AnyDict] = None,
+ description: str = "",
) -> None:
pass
diff --git a/propan/test/sqs.py b/propan/test/sqs.py
index 893a62f5..b17cb19f 100644
--- a/propan/test/sqs.py
+++ b/propan/test/sqs.py
@@ -11,7 +11,7 @@
from unittest.mock import AsyncMock
from propan import SQSBroker
-from propan.brokers.sqs import SQSMessage
+from propan.brokers.sqs.schema import SQSMessage
from propan.test.utils import call_handler
from propan.types import SendableMessage
diff --git a/propan/utils/__init__.py b/propan/utils/__init__.py
index b2889b7c..cb43ae93 100644
--- a/propan/utils/__init__.py
+++ b/propan/utils/__init__.py
@@ -1,7 +1,8 @@
from fast_depends import Depends
from fast_depends import inject as apply_types
-from .context import Context, ContextRepo, context
+from propan.utils.context import Context, ContextRepo, context
+from propan.utils.no_cast import NoCast
__all__ = (
"apply_types",
@@ -9,4 +10,5 @@
"Context",
"ContextRepo",
"Depends",
+ "NoCast",
)
diff --git a/propan/utils/functions.py b/propan/utils/functions.py
index d5f9718f..3d809707 100644
--- a/propan/utils/functions.py
+++ b/propan/utils/functions.py
@@ -25,13 +25,13 @@ async def wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
return wrapper
-def get_function_arguments(func: Callable[P, T]) -> List[str]:
+def get_function_positional_arguments(func: Callable[P, T]) -> List[str]:
signature = inspect.signature(func)
- arg_kinds = [
+ arg_kinds = (
inspect.Parameter.POSITIONAL_ONLY,
inspect.Parameter.POSITIONAL_OR_KEYWORD,
- ]
+ )
return [
param.name for param in signature.parameters.values() if param.kind in arg_kinds
diff --git a/propan/utils/no_cast.py b/propan/utils/no_cast.py
new file mode 100644
index 00000000..158fa25b
--- /dev/null
+++ b/propan/utils/no_cast.py
@@ -0,0 +1,11 @@
+from fast_depends.library import CustomField
+
+from propan.types import AnyDict
+
+
+class NoCast(CustomField): # type: ignore
+ def __init__(self) -> None:
+ super().__init__(cast=False)
+
+ def use(self, **kwargs: AnyDict) -> AnyDict:
+ return kwargs
diff --git a/pyproject.toml b/pyproject.toml
index 70e99116..b18ab5e3 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -70,18 +70,28 @@ async-nats = [
]
async-redis = [
- "redis>=4.2.0rc1"
+ "redis>=4.2.0rc1",
]
async-kafka = [
- "aiokafka>=0.8"
+ "aiokafka>=0.8",
+]
+
+doc = [
+ "PyYAML",
+ "pytest[email]",
+ "polyfactory",
+ "jsonref",
+ "fastapi",
+ "uvicorn",
]
async-sqs = [
- "aiobotocore"
+ "aiobotocore",
]
test = [
+ "propan[doc]",
"propan[async-rabbit]",
"propan[async-nats]",
"propan[async-redis]",
@@ -93,24 +103,27 @@ test = [
"pytest-asyncio>=0.21",
"fastapi",
+ "python-dotenv",
"asyncmock; python_version < '3.8'",
]
-doc = [
+dev-doc = [
"mkdocs-material >=8.1.4,<9.0.0",
"mkdocs-static-i18n",
"mdx-include >=1.4.1,<2.0.0",
"mkdocs-macros-plugin",
+ "mkdocs-glightbox",
"typer[all]",
]
dev = [
"propan[test]",
- "propan[doc]",
+ "propan[dev-doc]",
"types-redis",
+ "types-PyYAML",
"mypy==1.1.1",
"black==23.3.0",
diff --git a/tests/asyncapi/__init__.py b/tests/asyncapi/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/handler/__init__.py b/tests/asyncapi/handler/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/handler/test_base_arguments.py b/tests/asyncapi/handler/test_base_arguments.py
new file mode 100644
index 00000000..fc31bef4
--- /dev/null
+++ b/tests/asyncapi/handler/test_base_arguments.py
@@ -0,0 +1,188 @@
+from fast_depends.construct import get_dependant
+from pydantic import BaseModel
+
+from propan.brokers._model.schemas import BaseHandler
+
+
+def test_base():
+ def func(a: int):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+
+ assert isinstance(result.pop("example"), int)
+ assert result == {
+ "title": "FuncPayload",
+ "type": "integer",
+ }
+
+ assert response is None
+
+
+def test_multi_args():
+ def func(a: int, b: float):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+
+ example = result.pop("example")
+ assert isinstance(example["a"], int)
+ assert isinstance(example["b"], float)
+ assert result == {
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ "b": {"title": "B", "type": "number"},
+ },
+ "required": ["a", "b"],
+ "title": "FuncPayload",
+ "type": "object",
+ }
+
+ assert response is None
+
+
+def test_pydantic_args():
+ class Message(BaseModel):
+ a: int
+ b: float
+
+ def func(a: Message):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+
+ example = result.pop("example")
+ assert isinstance(example["a"], int)
+ assert isinstance(example["b"], float)
+ assert result == {
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ "b": {"title": "B", "type": "number"},
+ },
+ "required": ["a", "b"],
+ "title": "Message",
+ "type": "object",
+ }
+
+ assert response is None
+
+
+def test_pydantic_example():
+ class Message(BaseModel):
+ a: int
+
+ class Config:
+ schema_extra = {"example": {"a": 1}}
+
+ def func(a: Message):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+ assert result == {
+ "example": {"a": 1},
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ },
+ "required": ["a"],
+ "title": "Message",
+ "type": "object",
+ }
+
+ assert response is None
+
+
+def test_response_base():
+ def func() -> str:
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+ assert result == {
+ "title": "FuncPayload",
+ "type": "null",
+ }
+
+ for r in response.pop("examples"):
+ assert isinstance(r, str)
+
+ assert response == {"title": "FuncReply", "type": "string"}
+
+
+def test_pydantic_response():
+ class Message(BaseModel):
+ a: int
+
+ class Config:
+ schema_extra = {"example": {"a": 1}}
+
+ def func() -> Message:
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+ assert result == {
+ "title": "FuncPayload",
+ "type": "null",
+ }
+
+ assert response == {
+ "examples": [{"a": 1}],
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ },
+ "required": ["a"],
+ "title": "Message",
+ "type": "object",
+ }
+
+
+def test_pydantic_gen_response_examples():
+ class Message(BaseModel):
+ a: int
+
+ def func() -> Message:
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+ assert result == {
+ "title": "FuncPayload",
+ "type": "null",
+ }
+
+ for r in response.pop("examples"):
+ assert isinstance(r["a"], int)
+
+ assert response == {
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ },
+ "required": ["a"],
+ "title": "Message",
+ "type": "object",
+ }
diff --git a/tests/asyncapi/handler/test_dependencies_arguments.py b/tests/asyncapi/handler/test_dependencies_arguments.py
new file mode 100644
index 00000000..de29362e
--- /dev/null
+++ b/tests/asyncapi/handler/test_dependencies_arguments.py
@@ -0,0 +1,60 @@
+from fast_depends.construct import get_dependant
+
+from propan import Depends
+from propan.brokers._model.schemas import BaseHandler
+
+
+def test_base():
+ def dep(a: int):
+ ...
+
+ def func(a=Depends(dep)):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+
+ assert isinstance(result.pop("example"), int)
+ assert result == {
+ "title": "FuncPayload",
+ "type": "integer",
+ }
+
+ assert response is None
+
+
+def test_multi_args():
+ def dep2(c: int):
+ ...
+
+ def dep(a: int, _=Depends(dep2)):
+ ...
+
+ def func(b: float, _=Depends(dep)):
+ ...
+
+ handler = BaseHandler(func, get_dependant(path="", call=func))
+
+ message_title, result, response = handler.get_message_object()
+
+ assert message_title == "FuncMessage"
+
+ example = result.pop("example")
+ assert isinstance(example["a"], int)
+ assert isinstance(example["c"], int)
+ assert isinstance(example["b"], float)
+ assert result == {
+ "properties": {
+ "a": {"title": "A", "type": "integer"},
+ "b": {"title": "B", "type": "number"},
+ "c": {"title": "C", "type": "integer"},
+ },
+ "required": ["b", "a", "c"],
+ "title": "FuncPayload",
+ "type": "object",
+ }
+
+ assert response is None
diff --git a/tests/asyncapi/kafka/__init__.py b/tests/asyncapi/kafka/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/kafka/test_handler.py b/tests/asyncapi/kafka/test_handler.py
new file mode 100644
index 00000000..635b314e
--- /dev/null
+++ b/tests/asyncapi/kafka/test_handler.py
@@ -0,0 +1,56 @@
+from propan import KafkaBroker, PropanApp
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_base_handler():
+ broker = KafkaBroker()
+
+ @broker.handle("test")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {"kafka": {"bindingVersion": "0.4.0", "topic": ["test"]}},
+ "servers": ["dev"],
+ "subscribe": {"message": {"$ref": "#/components/messages/HandlerMessage"}},
+ }
+ }
+
+
+def test_group_handler():
+ broker = KafkaBroker()
+
+ @broker.handle("test", group_id="workers")
+ async def handler(a: int) -> str:
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert isinstance(
+ schema["channels"]["Handler"]["subscribe"]["bindings"]["kafka"]["replyTo"].pop(
+ "examples"
+ )[0],
+ str,
+ )
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {"kafka": {"bindingVersion": "0.4.0", "topic": ["test"]}},
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "kafka": {
+ "bindingVersion": "0.4.0",
+ "groupId": {"enum": ["workers"], "type": "string"},
+ "replyTo": {
+ "title": "HandlerReply",
+ "type": "string",
+ },
+ }
+ },
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
diff --git a/tests/asyncapi/kafka/test_server.py b/tests/asyncapi/kafka/test_server.py
new file mode 100644
index 00000000..9602243e
--- /dev/null
+++ b/tests/asyncapi/kafka/test_server.py
@@ -0,0 +1,28 @@
+from propan import KafkaBroker, PropanApp
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_server_info():
+ schema = gen_app_schema_json(PropanApp(KafkaBroker()))
+ assert schema["servers"]["dev"] == {
+ "protocol": "kafka",
+ "url": "localhost",
+ "protocolVersion": "auto",
+ }
+
+
+def test_server_custom_info():
+ schema = gen_app_schema_json(
+ PropanApp(
+ KafkaBroker(
+ "kafka:9092",
+ protocol="kafka-secury",
+ api_version="1.0.0",
+ )
+ )
+ )
+ assert schema["servers"]["dev"] == {
+ "protocol": "kafka-secury",
+ "url": "kafka:9092",
+ "protocolVersion": "1.0.0",
+ }
diff --git a/tests/asyncapi/nats/__init__.py b/tests/asyncapi/nats/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/nats/test_handler.py b/tests/asyncapi/nats/test_handler.py
new file mode 100644
index 00000000..ef091e51
--- /dev/null
+++ b/tests/asyncapi/nats/test_handler.py
@@ -0,0 +1,61 @@
+from propan import NatsBroker, PropanApp
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_base_handler():
+ broker = NatsBroker()
+
+ @broker.handle("test")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {"nats": {"bindingVersion": "custom", "subject": "test"}},
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {"nats": {"bindingVersion": "custom"}},
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
+
+
+def test_group_handler():
+ broker = NatsBroker()
+
+ @broker.handle("*test", queue="workers")
+ async def handler(a: int) -> str:
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert isinstance(
+ schema["channels"]["Handler"]["subscribe"]["bindings"]["nats"]["replyTo"].pop(
+ "examples"
+ )[0],
+ str,
+ )
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "nats": {
+ "bindingVersion": "custom",
+ "queue": "workers",
+ "subject": "*test",
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "nats": {
+ "bindingVersion": "custom",
+ "replyTo": {"title": "HandlerReply", "type": "string"},
+ }
+ },
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
diff --git a/tests/asyncapi/nats/test_server.py b/tests/asyncapi/nats/test_server.py
new file mode 100644
index 00000000..845b39e1
--- /dev/null
+++ b/tests/asyncapi/nats/test_server.py
@@ -0,0 +1,10 @@
+from propan import NatsBroker, PropanApp
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_server_info():
+ schema = gen_app_schema_json(PropanApp(NatsBroker()))
+ assert schema["servers"]["dev"] == {
+ "protocol": "nats",
+ "url": "nats://localhost:4222",
+ }
diff --git a/tests/asyncapi/rabbit/__init__.py b/tests/asyncapi/rabbit/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/rabbit/test_handler.py b/tests/asyncapi/rabbit/test_handler.py
new file mode 100644
index 00000000..2f9b5d0d
--- /dev/null
+++ b/tests/asyncapi/rabbit/test_handler.py
@@ -0,0 +1,115 @@
+from propan import PropanApp, RabbitBroker
+from propan.brokers.rabbit import ExchangeType, RabbitExchange
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_base_handler():
+ broker = RabbitBroker()
+
+ @broker.handle("test")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "amqp": {
+ "bindingVersion": "0.2.0",
+ "exchange": {"type": "default", "vhost": "/"},
+ "is": "routingKey",
+ "queue": {
+ "autoDelete": False,
+ "durable": False,
+ "exclusive": False,
+ "name": "test",
+ "vhost": "/",
+ },
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "amqp": {"ack": True, "bindingVersion": "0.2.0", "cc": "test"}
+ },
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
+
+
+def test_fanout_exchange_handler():
+ broker = RabbitBroker()
+
+ @broker.handle("test", RabbitExchange("test", type=ExchangeType.FANOUT))
+ async def handler(a: int):
+ """Test description"""
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "amqp": {
+ "bindingVersion": "0.2.0",
+ "exchange": {
+ "autoDelete": False,
+ "durable": False,
+ "name": "test",
+ "type": "fanout",
+ "vhost": "/",
+ },
+ "is": "routingKey",
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {"amqp": {"ack": True, "bindingVersion": "0.2.0"}},
+ "description": "Test description",
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
+
+
+def test_direct_exchange_handler():
+ broker = RabbitBroker()
+
+ @broker.handle("test", RabbitExchange("test"), description="Test description")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "amqp": {
+ "bindingVersion": "0.2.0",
+ "exchange": {
+ "autoDelete": False,
+ "durable": False,
+ "name": "test",
+ "type": "direct",
+ "vhost": "/",
+ },
+ "is": "routingKey",
+ "queue": {
+ "autoDelete": False,
+ "durable": False,
+ "exclusive": False,
+ "name": "test",
+ "vhost": "/",
+ },
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "amqp": {"ack": True, "bindingVersion": "0.2.0", "cc": "test"}
+ },
+ "description": "Test description",
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
diff --git a/tests/asyncapi/rabbit/test_server.py b/tests/asyncapi/rabbit/test_server.py
new file mode 100644
index 00000000..64a0d4d3
--- /dev/null
+++ b/tests/asyncapi/rabbit/test_server.py
@@ -0,0 +1,26 @@
+from propan import PropanApp, RabbitBroker
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_server_info():
+ schema = gen_app_schema_json(PropanApp(RabbitBroker()))
+ assert schema["servers"]["dev"] == {
+ "protocol": "amqp",
+ "url": "amqp://guest:guest@localhost:5672/",
+ "protocolVersion": "0.9.1",
+ }
+
+
+def test_server_custom_info():
+ schema = gen_app_schema_json(
+ PropanApp(
+ RabbitBroker(
+ "amqps://rabbithost.com", protocol="amqps", protocol_version="0.8.0"
+ )
+ )
+ )
+ assert schema["servers"]["dev"] == {
+ "protocol": "amqps",
+ "url": "amqps://rabbithost.com",
+ "protocolVersion": "0.8.0",
+ }
diff --git a/tests/asyncapi/redis/__init__.py b/tests/asyncapi/redis/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/redis/test_handler.py b/tests/asyncapi/redis/test_handler.py
new file mode 100644
index 00000000..f12a8380
--- /dev/null
+++ b/tests/asyncapi/redis/test_handler.py
@@ -0,0 +1,68 @@
+from propan import PropanApp, RedisBroker
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_base_handler():
+ broker = RedisBroker()
+
+ @broker.handle("test")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "redis": {
+ "bindingVersion": "custom",
+ "channel": "test",
+ "method": "subscribe",
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {"redis": {"bindingVersion": "custom"}},
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
+
+
+def test_group_handler():
+ broker = RedisBroker()
+
+ @broker.handle("*test", pattern=True)
+ async def handler(a: int) -> str:
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert isinstance(
+ schema["channels"]["Handler"]["subscribe"]["bindings"]["redis"]["replyTo"].pop(
+ "examples"
+ )[0],
+ str,
+ )
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "redis": {
+ "bindingVersion": "custom",
+ "channel": "*test",
+ "method": "psubscribe",
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "redis": {
+ "bindingVersion": "custom",
+ "replyTo": {"title": "HandlerReply", "type": "string"},
+ }
+ },
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
diff --git a/tests/asyncapi/redis/test_server.py b/tests/asyncapi/redis/test_server.py
new file mode 100644
index 00000000..0b80fc95
--- /dev/null
+++ b/tests/asyncapi/redis/test_server.py
@@ -0,0 +1,10 @@
+from propan import PropanApp, RedisBroker
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_server_info():
+ schema = gen_app_schema_json(PropanApp(RedisBroker()))
+ assert schema["servers"]["dev"] == {
+ "protocol": "redis",
+ "url": "redis://localhost:6379",
+ }
diff --git a/tests/asyncapi/sqs/__init__.py b/tests/asyncapi/sqs/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/tests/asyncapi/sqs/test_handler.py b/tests/asyncapi/sqs/test_handler.py
new file mode 100644
index 00000000..317fd790
--- /dev/null
+++ b/tests/asyncapi/sqs/test_handler.py
@@ -0,0 +1,66 @@
+from propan import PropanApp, SQSBroker
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_base_handler():
+ broker = SQSBroker()
+
+ @broker.handle("test")
+ async def handler(a: int):
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "sqs": {
+ "bindingVersion": "custom",
+ "queue": {"fifo": False, "name": "test"},
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {"sqs": {"bindingVersion": "custom"}},
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
+
+
+def test_group_handler():
+ broker = SQSBroker()
+
+ @broker.handle("test")
+ async def handler(a: int) -> str:
+ ...
+
+ schema = gen_app_schema_json(PropanApp(broker))
+
+ assert isinstance(
+ schema["channels"]["Handler"]["subscribe"]["bindings"]["sqs"]["replyTo"].pop(
+ "examples"
+ )[0],
+ str,
+ )
+
+ assert schema["channels"] == {
+ "Handler": {
+ "bindings": {
+ "sqs": {
+ "bindingVersion": "custom",
+ "queue": {"fifo": False, "name": "test"},
+ }
+ },
+ "servers": ["dev"],
+ "subscribe": {
+ "bindings": {
+ "sqs": {
+ "bindingVersion": "custom",
+ "replyTo": {"title": "HandlerReply", "type": "string"},
+ }
+ },
+ "message": {"$ref": "#/components/messages/HandlerMessage"},
+ },
+ }
+ }
diff --git a/tests/asyncapi/sqs/test_server.py b/tests/asyncapi/sqs/test_server.py
new file mode 100644
index 00000000..d4467da8
--- /dev/null
+++ b/tests/asyncapi/sqs/test_server.py
@@ -0,0 +1,10 @@
+from propan import PropanApp, SQSBroker
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_server_info():
+ schema = gen_app_schema_json(PropanApp(SQSBroker()))
+ assert schema["servers"]["dev"] == {
+ "protocol": "sqs",
+ "url": "http://localhost:9324/",
+ }
diff --git a/tests/asyncapi/test_app_info.py b/tests/asyncapi/test_app_info.py
new file mode 100644
index 00000000..ab93fb0e
--- /dev/null
+++ b/tests/asyncapi/test_app_info.py
@@ -0,0 +1,59 @@
+from propan import PropanApp, RabbitBroker
+from propan.asyncapi import AsyncAPIContact, AsyncAPILicense
+from propan.cli.docs.gen import gen_app_schema_json
+
+
+def test_app_default_info():
+ schema = gen_app_schema_json(PropanApp(RabbitBroker()))
+ assert schema["info"] == {
+ "description": "",
+ "title": "Propan",
+ "version": "0.1.0",
+ }
+
+
+def test_app_base_info():
+ schema = gen_app_schema_json(
+ PropanApp(
+ RabbitBroker(),
+ title="My App",
+ description="description",
+ version="1.0.0",
+ )
+ )
+ assert schema["info"] == {
+ "description": "description",
+ "title": "My App",
+ "version": "1.0.0",
+ }
+
+
+def test_app_detail_info():
+ schema = gen_app_schema_json(
+ PropanApp(
+ RabbitBroker(),
+ title="My App",
+ description="description",
+ version="1.0.0",
+ license=AsyncAPILicense(name="MIT", url="http://mit.com"),
+ contact=AsyncAPIContact(
+ name="Developer",
+ url="http://my-domain.com",
+ email="my-domain@gmail.com",
+ ),
+ )
+ )
+ assert schema["info"] == {
+ "contact": {
+ "email": "my-domain@gmail.com",
+ "name": "Developer",
+ "url": "http://my-domain.com",
+ },
+ "description": "description",
+ "license": {
+ "name": "MIT",
+ "url": "http://mit.com",
+ },
+ "title": "My App",
+ "version": "1.0.0",
+ }
diff --git a/tests/brokers/base/rpc.py b/tests/brokers/base/rpc.py
index a1d606c4..6ca9558c 100644
--- a/tests/brokers/base/rpc.py
+++ b/tests/brokers/base/rpc.py
@@ -80,3 +80,25 @@ async def m(): # pragma: no cover
await asyncio.wait_for(consume.wait(), 3)
mock.assert_called_with("1")
+
+ @pytest.mark.asyncio
+ async def test_unwrap(self, queue: str, full_broker: BrokerUsecase):
+ @full_broker.handle(queue)
+ async def m(a: int, b: int): # pragma: no cover
+ assert a == 1
+ assert b == 1
+ return "1"
+
+ async with full_broker:
+ await full_broker.start()
+
+ r = await full_broker.publish(
+ {
+ "a": 1,
+ "b": 1,
+ },
+ queue,
+ callback_timeout=3,
+ callback=True,
+ )
+ assert r == "1"
diff --git a/tests/brokers/base/test_pushback.py b/tests/brokers/base/test_pushback.py
index 0bfd74a0..13e857c0 100644
--- a/tests/brokers/base/test_pushback.py
+++ b/tests/brokers/base/test_pushback.py
@@ -65,10 +65,10 @@ async def test_push_back_watcher(async_mock):
on_max=async_mock.on_max,
)
- async_mock.side_effect = Exception("Ooops!")
+ async_mock.side_effect = ValueError("Ooops!")
while not async_mock.on_max.called:
- with pytest.raises(Exception):
+ with pytest.raises(ValueError):
async with context:
await async_mock()
@@ -90,10 +90,10 @@ async def test_push_endless_back_watcher(async_mock):
on_max=async_mock.on_max,
)
- async_mock.side_effect = Exception("Ooops!")
+ async_mock.side_effect = ValueError("Ooops!")
while async_mock.on_error.call_count < 10:
- with pytest.raises(Exception):
+ with pytest.raises(ValueError):
async with context:
await async_mock()
diff --git a/tests/brokers/kafka/test_connect.py b/tests/brokers/kafka/test_connect.py
index 480d9f8a..22b8e56d 100644
--- a/tests/brokers/kafka/test_connect.py
+++ b/tests/brokers/kafka/test_connect.py
@@ -9,7 +9,7 @@ class TestKafkaConnect(BrokerConnectionTestcase):
broker = KafkaBroker
@pytest.mark.asyncio
- async def test_connect_merge_args_and_kwargs(self, settings):
+ async def test_connect_merge_args_and_kwargs_native(self, settings):
broker = self.broker("fake-url") # will be ignored
assert await broker.connect(bootstrap_servers=settings.url)
await broker.close()
diff --git a/tests/brokers/nats/test_connect.py b/tests/brokers/nats/test_connect.py
index 1cfa4ab0..02874474 100644
--- a/tests/brokers/nats/test_connect.py
+++ b/tests/brokers/nats/test_connect.py
@@ -9,7 +9,7 @@ class TestNatsConnect(BrokerConnectionTestcase):
broker = NatsBroker
@pytest.mark.asyncio
- async def test_connect_merge_args_and_kwargs(self, settings):
+ async def test_connect_merge_args_and_kwargs_native(self, settings):
broker = self.broker("fake-url") # will be ignored
assert await broker.connect(servers=settings.url)
await broker.close()
diff --git a/tests/brokers/rabbit/test_connect.py b/tests/brokers/rabbit/test_connect.py
index 0613bd53..adb4d743 100644
--- a/tests/brokers/rabbit/test_connect.py
+++ b/tests/brokers/rabbit/test_connect.py
@@ -42,7 +42,7 @@ async def test_connect_merge_kwargs_with_priority(self, settings):
await broker.close()
@pytest.mark.asyncio
- async def test_connect_merge_args_and_kwargs(self, settings):
+ async def test_connect_merge_args_and_kwargs_native(self, settings):
broker = self.broker("fake-url") # will be ignored
assert await broker.connect(url=settings.url)
await broker.close()
diff --git a/tests/brokers/redis/test_connect.py b/tests/brokers/redis/test_connect.py
index bb1fbc09..edc1899b 100644
--- a/tests/brokers/redis/test_connect.py
+++ b/tests/brokers/redis/test_connect.py
@@ -31,7 +31,7 @@ async def test_connect_merge_kwargs_with_priority(self, settings):
await broker.close()
@pytest.mark.asyncio
- async def test_connect_merge_args_and_kwargs(self, settings):
+ async def test_connect_merge_args_and_kwargs_native(self, settings):
broker = self.broker("fake-url") # will be ignored
assert await broker.connect(url=settings.url)
await broker.close()
diff --git a/tests/brokers/sqs/test_connect.py b/tests/brokers/sqs/test_connect.py
index f8cf02eb..10658d38 100644
--- a/tests/brokers/sqs/test_connect.py
+++ b/tests/brokers/sqs/test_connect.py
@@ -17,7 +17,7 @@ def get_broker_args(self, settings):
}
@pytest.mark.asyncio
- async def test_connect_merge_args_and_kwargs(self, settings):
+ async def test_connect_merge_args_and_kwargs_native(self, settings):
args, kwargs = self.get_broker_args(settings)
broker = self.broker("fake-url") # will be ignored
assert await broker.connect(url=settings.url, **kwargs)
diff --git a/tests/cli/conftest.py b/tests/cli/conftest.py
index d0de9468..5b08c368 100644
--- a/tests/cli/conftest.py
+++ b/tests/cli/conftest.py
@@ -1,67 +1,70 @@
-from tempfile import TemporaryDirectory
+from pathlib import Path
import pytest
from typer.testing import CliRunner
from propan import PropanApp
from propan.brokers.rabbit import RabbitBroker
-from propan.cli.startproject.async_app.kafka import create_kafka
-from propan.cli.startproject.async_app.nats import create_nats
-from propan.cli.startproject.async_app.rabbit import create_rabbit
-from propan.cli.startproject.async_app.redis import create_redis
-from propan.cli.startproject.async_app.sqs import create_sqs
+from propan.cli import cli
-@pytest.fixture
+@pytest.fixture()
def broker():
yield RabbitBroker()
-@pytest.fixture
+@pytest.fixture()
def app_without_logger(broker):
return PropanApp(broker, None)
-@pytest.fixture
+@pytest.fixture()
def app_without_broker():
return PropanApp()
-@pytest.fixture
+@pytest.fixture()
def app(broker):
return PropanApp(broker)
-@pytest.fixture
-def runner():
- return CliRunner()
+@pytest.fixture(scope="session")
+def runner() -> CliRunner:
+ runner = CliRunner()
+ with runner.isolated_filesystem():
+ yield runner
-@pytest.fixture(scope="module")
-def rabbit_async_project():
- with TemporaryDirectory() as dir:
- yield create_rabbit(dir)
+@pytest.fixture(scope="session")
+def rabbit_async_project(runner: CliRunner) -> Path:
+ project_name = "rabbit"
+ runner.invoke(cli, ["create", "async", "rabbit", project_name])
+ yield Path.cwd() / Path(project_name)
-@pytest.fixture(scope="module")
-def redis_async_project():
- with TemporaryDirectory() as dir:
- yield create_redis(dir)
+@pytest.fixture(scope="session")
+def redis_async_project(runner: CliRunner) -> Path:
+ project_name = "redis"
+ runner.invoke(cli, ["create", "async", "redis", project_name])
+ yield Path.cwd() / Path(project_name)
-@pytest.fixture(scope="module")
-def nats_async_project():
- with TemporaryDirectory() as dir:
- yield create_nats(dir)
+@pytest.fixture(scope="session")
+def nats_async_project(runner: CliRunner) -> Path:
+ project_name = "nats"
+ runner.invoke(cli, ["create", "async", "nats", project_name])
+ yield Path.cwd() / Path(project_name)
-@pytest.fixture(scope="module")
-def kafka_async_project():
- with TemporaryDirectory() as dir:
- yield create_kafka(dir)
+@pytest.fixture(scope="session")
+def kafka_async_project(runner: CliRunner) -> Path:
+ project_name = "kafka"
+ runner.invoke(cli, ["create", "async", "kafka", project_name])
+ yield Path.cwd() / Path(project_name)
-@pytest.fixture(scope="module")
-def sqs_async_project():
- with TemporaryDirectory() as dir:
- yield create_sqs(dir)
+@pytest.fixture(scope="session")
+def sqs_async_project(runner: CliRunner) -> Path:
+ project_name = "sqs"
+ runner.invoke(cli, ["create", "async", "sqs", project_name])
+ yield Path.cwd() / Path(project_name)
diff --git a/tests/cli/test_doc.py b/tests/cli/test_doc.py
new file mode 100644
index 00000000..c8949b05
--- /dev/null
+++ b/tests/cli/test_doc.py
@@ -0,0 +1,59 @@
+from pathlib import Path
+from unittest.mock import Mock
+
+import uvicorn
+import yaml
+from typer.testing import CliRunner
+
+from propan.cli.main import cli
+
+
+def test_gen_rabbit_docs(runner: CliRunner, rabbit_async_project: Path):
+ app_path = f'{rabbit_async_project / "app" / "serve"}:app'
+ r = runner.invoke(cli, ["docs", "gen", app_path])
+ assert r.exit_code == 0
+
+ schema_path = rabbit_async_project.parent / "asyncapi.yaml"
+ assert schema_path.exists()
+
+ with schema_path.open("r") as f:
+ schema = yaml.load(f, Loader=yaml.BaseLoader)
+
+ assert schema
+
+
+def test_gen_wrong_path(runner: CliRunner, rabbit_async_project: Path):
+ app_path = f'{rabbit_async_project / "app" / "serve"}:app1'
+ r = runner.invoke(cli, ["docs", "gen", app_path])
+ assert r.exit_code == 2
+ assert "Please, input module like [python_file:propan_app_name]" in r.stdout
+
+
+def test_serve_rabbit_docs(
+ runner: CliRunner,
+ rabbit_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f'{rabbit_async_project / "app" / "serve"}:app'
+
+ with monkeypatch.context() as m:
+ m.setattr(uvicorn, "run", mock)
+ r = runner.invoke(cli, ["docs", "serve", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
+
+
+def test_serve_rabbit_schema(
+ runner: CliRunner,
+ rabbit_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ with monkeypatch.context() as m:
+ m.setattr(uvicorn, "run", mock)
+ r = runner.invoke(cli, ["docs", "serve", "asyncapi.yaml"])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
diff --git a/tests/cli/test_run.py b/tests/cli/test_run.py
index a33aee42..211f4faa 100644
--- a/tests/cli/test_run.py
+++ b/tests/cli/test_run.py
@@ -1,68 +1,122 @@
-import sys
-import time
-from multiprocessing import Process
+from pathlib import Path
+from unittest.mock import Mock
import pytest
+from typer.testing import CliRunner
-from propan.cli.main import _run
-from propan.cli.utils.imports import get_app_path
+from propan import PropanApp
+from propan.cli import cli
@pytest.mark.rabbit
-@pytest.mark.slow
-def test_run_rabbit_correct(rabbit_async_project):
- module, app = get_app_path(f'{rabbit_async_project / "app" / "serve"}:app')
- sys.path.insert(0, str(module.parent))
- p = Process(target=_run, args=(module, app, {}))
- p.start()
- time.sleep(0.1)
- p.terminate()
- p.join()
+def test_run_rabbit_correct(
+ runner: CliRunner,
+ rabbit_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f"{rabbit_async_project.name}.app.serve:app"
+
+ async def patched_run(self: PropanApp, *args, **kwargs):
+ await self._startup()
+ await self._shutdown()
+ mock()
+
+ with monkeypatch.context() as m:
+ m.setattr(PropanApp, "run", patched_run)
+ r = runner.invoke(cli, ["run", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
@pytest.mark.redis
-@pytest.mark.slow
-def test_run_redis_correct(redis_async_project):
- module, app = get_app_path(f'{redis_async_project / "app" / "serve"}:app')
- sys.path.insert(0, str(module.parent))
- p = Process(target=_run, args=(module, app, {}))
- p.start()
- time.sleep(0.1)
- p.terminate()
- p.join()
+@pytest.mark.xfail
+def test_run_redis_correct(
+ runner: CliRunner,
+ redis_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f"{redis_async_project.name}.app.serve:app"
+
+ async def patched_run(self: PropanApp, *args, **kwargs):
+ await self._startup()
+ await self._shutdown()
+ mock()
+
+ with monkeypatch.context() as m:
+ m.setattr(PropanApp, "run", patched_run)
+ r = runner.invoke(cli, ["run", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
@pytest.mark.nats
-@pytest.mark.slow
-def test_run_nats_correct(nats_async_project):
- module, app = get_app_path(f'{nats_async_project / "app" / "serve"}:app')
- sys.path.insert(0, str(module.parent))
- p = Process(target=_run, args=(module, app, {}))
- p.start()
- time.sleep(0.1)
- p.terminate()
- p.join()
+@pytest.mark.xfail
+def test_run_nats_correct(
+ runner: CliRunner,
+ nats_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f"{nats_async_project.name}.app.serve:app"
+
+ async def patched_run(self: PropanApp, *args, **kwargs):
+ await self._startup()
+ await self._shutdown()
+ mock()
+
+ with monkeypatch.context() as m:
+ m.setattr(PropanApp, "run", patched_run)
+ r = runner.invoke(cli, ["run", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
@pytest.mark.kafka
-@pytest.mark.slow
-def test_run_kafka_correct(kafka_async_project):
- module, app = get_app_path(f'{kafka_async_project / "app" / "serve"}:app')
- sys.path.insert(0, str(module.parent))
- p = Process(target=_run, args=(module, app, {}))
- p.start()
- time.sleep(0.1)
- p.terminate()
- p.join()
+@pytest.mark.xfail
+def test_run_kafka_correct(
+ runner: CliRunner,
+ kafka_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f"{kafka_async_project.name}.app.serve:app"
+
+ async def patched_run(self: PropanApp, *args, **kwargs):
+ await self._startup()
+ await self._shutdown()
+ mock()
+
+ with monkeypatch.context() as m:
+ m.setattr(PropanApp, "run", patched_run)
+ r = runner.invoke(cli, ["run", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
@pytest.mark.sqs
-@pytest.mark.slow
-def test_run_sqs_correct(sqs_async_project):
- module, app = get_app_path(f'{sqs_async_project / "app" / "serve"}:app')
- sys.path.insert(0, str(module.parent))
- p = Process(target=_run, args=(module, app, {}))
- p.start()
- time.sleep(0.1)
- p.terminate()
- p.join()
+@pytest.mark.xfail
+def test_run_sqs_correct(
+ runner: CliRunner,
+ sqs_async_project: Path,
+ monkeypatch,
+ mock: Mock,
+):
+ app_path = f"{sqs_async_project.name}.app.serve:app"
+
+ async def patched_run(self: PropanApp, *args, **kwargs):
+ await self._startup()
+ await self._shutdown()
+ mock()
+
+ with monkeypatch.context() as m:
+ m.setattr(PropanApp, "run", patched_run)
+ r = runner.invoke(cli, ["run", app_path])
+
+ assert r.exit_code == 0
+ mock.assert_called_once()
diff --git a/tests/cli/utils/test_imports.py b/tests/cli/utils/test_imports.py
index 47e5d53d..4bcfb314 100644
--- a/tests/cli/utils/test_imports.py
+++ b/tests/cli/utils/test_imports.py
@@ -4,14 +4,6 @@
from propan.cli.utils.imports import get_app_path, import_object
-test_object = Path(__file__)
-
-
-def test_import():
- dir, app = get_app_path("tests.cli.utils.test_imports:test_object")
- obj = import_object(dir, app)
- assert obj == test_object
-
def test_import_wrong():
dir, app = get_app_path("tests:test_object")