
Create Warning, Info modals #4518

Merged
merged 40 commits on Jul 3, 2023
Changes from 10 commits (40 commits total)
ad98cb4
changes
aliabid94 Jun 14, 2023
fe5dee0
Merge branch 'main' into info_modal
abidlabs Jun 14, 2023
56afa6f
changes
aliabid94 Jun 15, 2023
b196c38
changes
aliabid94 Jun 15, 2023
53d24f8
Merge branch 'info_modal' of https://github.com/gradio-app/gradio int…
aliabid94 Jun 15, 2023
256ce30
changes
aliabid94 Jun 15, 2023
e1a7242
Merge branch 'main' into info_modal
aliabid94 Jun 15, 2023
eeffeec
Merge branch 'main' into info_modal
abidlabs Jun 15, 2023
dfed2e6
changes
aliabid94 Jun 15, 2023
e7c0007
Merge branch 'info_modal' of https://github.com/gradio-app/gradio int…
aliabid94 Jun 15, 2023
07321dc
Styling for error, warning and info toasts (#4603)
hannahblair Jun 22, 2023
0a2e0d6
chagnes
aliabid94 Jun 27, 2023
8bb18a3
changes
aliabid94 Jun 27, 2023
69c05ec
changes
aliabid94 Jun 27, 2023
b96f303
changes
aliabid94 Jun 27, 2023
1f04c52
changes
aliabid94 Jun 27, 2023
76b33fc
changes
aliabid94 Jun 27, 2023
2442d4e
changes
aliabid94 Jun 27, 2023
35a1f4f
changes
aliabid94 Jun 27, 2023
4a6a00c
chanegs
aliabid94 Jun 27, 2023
350698a
changes
aliabid94 Jun 27, 2023
b9ce360
changes
aliabid94 Jun 29, 2023
2120b43
changes
aliabid94 Jun 30, 2023
2422d1d
chagnes
aliabid94 Jun 30, 2023
4832121
changes
aliabid94 Jun 30, 2023
ffd04ca
changes
aliabid94 Jun 30, 2023
29d257b
changes
aliabid94 Jun 30, 2023
434546a
changes
aliabid94 Jun 30, 2023
7600b69
changes
aliabid94 Jun 30, 2023
d905aa0
changes
aliabid94 Jun 30, 2023
c9f55d2
changes
aliabid94 Jun 30, 2023
3bf4044
changes
aliabid94 Jun 30, 2023
6530910
changes
aliabid94 Jun 30, 2023
96fcb1a
Merge branch 'main' into info_modal
abidlabs Jun 30, 2023
0b8e671
Merge branch 'main' into info_modal
abidlabs Jun 30, 2023
a48f747
changes
aliabid94 Jul 3, 2023
56b8561
changes
aliabid94 Jul 3, 2023
77043d8
changes
aliabid94 Jul 3, 2023
c103ed5
Merge remote-tracking branch 'origin' into info_modal
aliabid94 Jul 3, 2023
9a5bb6d
changes
aliabid94 Jul 3, 2023
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -43,6 +43,7 @@ demo.launch()
- Add `autoplay` kwarg to `Video` and `Audio` components by [@pngwn](https://github.com/pngwn) in [PR 4453](https://github.com/gradio-app/gradio/pull/4453)
- Add `allow_preview` parameter to `Gallery` to control whether a detailed preview is displayed on click by
[@freddyaboulton](https://github.com/freddyaboulton) in [PR 4470](https://github.com/gradio-app/gradio/pull/4470)
- You can now issue `gr.Warning` and `gr.Info` modals. Simply add `gr.Warning("Your warning message")` or `gr.Info("Your info message")` as a standalone line in your function. By [@aliabid94](https://github.com/aliabid94) in [PR 4518](https://github.com/gradio-app/gradio/pull/4518).
- Add `latex_delimiters` parameter to `Chatbot` to control the delimiters used for LaTeX and to disable LaTeX in the `Chatbot` by [@dawoodkhan82](https://github.com/dawoodkhan82) in [PR 4516](https://github.com/gradio-app/gradio/pull/4516)

## Bug Fixes:
16 changes: 13 additions & 3 deletions client/js/src/client.ts
@@ -503,6 +503,14 @@ export async function client(
endpoint: _endpoint,
fn_index
});
} else if (type === "log") {
fire_event({
type: "log",
log: data.log,
level: data.level,
endpoint: _endpoint,
fn_index
});
}
if (data) {
fire_event({
@@ -612,8 +620,8 @@

function destroy() {
for (const event_type in listener_map) {
listener_map[event_type as "data" | "status"].forEach((fn) => {
off(event_type as "data" | "status", fn);
listener_map[event_type as EventType].forEach((fn) => {
off(event_type as EventType, fn);
});
}
}
@@ -1180,7 +1188,7 @@ function handle_message(
data: any,
last_status: Status["stage"]
): {
type: "hash" | "data" | "update" | "complete" | "generating" | "none";
type: "hash" | "data" | "update" | "complete" | "generating" | "log" | "none";
data?: any;
status?: Status;
} {
@@ -1225,6 +1233,8 @@
success: data.success
}
};
case "log":
return { type: "log", data: data };
case "process_generating":
return {
type: "generating",
8 changes: 7 additions & 1 deletion client/js/src/types.ts
@@ -55,6 +55,11 @@ export interface Status {
time?: Date;
}

export interface LogMessage {
log: string;
level: "warning" | "info";
}

export interface SpaceStatusNormal {
status: "sleeping" | "running" | "building" | "error" | "stopped";
detail:
@@ -83,11 +88,12 @@ export type SpaceStatus = SpaceStatusNormal | SpaceStatusError;
export type status_callback_function = (a: Status) => void;
export type SpaceStatusCallback = (a: SpaceStatus) => void;

export type EventType = "data" | "status";
export type EventType = "data" | "status" | "log";

export interface EventMap {
data: Payload;
status: Status;
log: LogMessage;
}

export type Event<K extends EventType> = {
2 changes: 1 addition & 1 deletion gradio/__init__.py
@@ -57,7 +57,7 @@
)
from gradio.deploy_space import deploy
from gradio.events import SelectData
from gradio.exceptions import Error
from gradio.exceptions import Error, Info, Warning
from gradio.external import load
from gradio.flagging import (
CSVLogger,
6 changes: 3 additions & 3 deletions gradio/blocks.py
@@ -1071,6 +1071,8 @@ async def call_function(
else:
fn = block_fn.fn

fn = utils.get_function_with_locals(fn, self, event_id)

if inspect.iscoroutinefunction(fn):
prediction = await fn(*processed_input)
else:
@@ -2091,9 +2093,7 @@ def startup_events(self):
"""Events that should be run when the app containing this block starts up."""

if self.enable_queue:
utils.run_coro_in_background(
self._queue.start, self.progress_tracking, self.ssl_verify
)
utils.run_coro_in_background(self._queue.start, self.ssl_verify)
# So that processing can resume in case the queue was stopped
self._queue.stopped = False
utils.run_coro_in_background(self.create_limiter)
7 changes: 7 additions & 0 deletions gradio/data_classes.py
@@ -4,6 +4,7 @@
from typing import Any, Dict, List, Optional, Union

from pydantic import BaseModel
from typing_extensions import Literal


class PredictBody(BaseModel):
@@ -53,3 +54,9 @@ class ProgressUnit(BaseModel):
class Progress(BaseModel):
msg: str = "progress"
progress_data: List[ProgressUnit] = []


class LogMessage(BaseModel):
msg: str = "log"
log: str
level: Literal["info", "warning"]
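The `LogMessage` model above can be exercised in isolation. The sketch below (requires `pydantic` and `typing_extensions`, the same dependencies the diff uses) shows the dict the queue ultimately sends over the websocket, and that `level` is validated at construction time:

```python
from pydantic import BaseModel
from typing_extensions import Literal


class LogMessage(BaseModel):
    msg: str = "log"
    log: str
    level: Literal["info", "warning"]


message = LogMessage(log="Model loaded", level="info")
payload = message.dict()  # the dict the queue sends to the client

# A level outside "info"/"warning" is rejected by pydantic:
try:
    LogMessage(log="x", level="debug")
    rejected = False
except Exception:
    rejected = True
```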
53 changes: 52 additions & 1 deletion gradio/exceptions.py
@@ -1,4 +1,7 @@
import warnings

from gradio_client.documentation import document, set_documentation_group
from typing_extensions import Literal

set_documentation_group("helpers")

@@ -30,7 +33,7 @@ class Error(Exception):
Demos: calculator
"""

def __init__(self, message: str):
def __init__(self, message: str = "Error raised."):
"""
Parameters:
message: The error message to be displayed to the user.
@@ -40,3 +43,51 @@ def __init__(self, message: str):

def __str__(self):
return repr(self.message)


def log_message(message: str, level: Literal["info", "warning"] = "info"):
from gradio import queueing

if not hasattr(queueing.thread_data, "blocks"): # Function called outside of Gradio
if level == "info":
print(message)
elif level == "warning":
warnings.warn(message)
return
if not queueing.thread_data.blocks.enable_queue:
warnings.warn(
f"Queueing must be enabled to issue {level.capitalize()}: '{message}'."
)
return
queueing.thread_data.blocks._queue.log_message(
event_id=queueing.thread_data.event_id, log=message, level=level
)


@document()
class Warning:
"""
This class allows you to pass custom warning messages to the user. You can do so simply with `gr.Warning('message here')`, and when that line is executed the custom message will appear in a modal on the demo.
"""

def __init__(self, message: str = "Warning issued."):
"""
Parameters:
message: The warning message to be displayed to the user.
"""
log_message(message, level="warning")


@document()
class Info:
"""
This class allows you to pass custom info messages to the user. You can do so simply with `gr.Info('message here')`, and when that line is executed the custom message will appear in a modal on the demo.
"""

def __init__(self, message: str = "Info issued."):
"""
Parameters:
message: The info message to be displayed to the user.
"""
log_message(message, level="info")
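The dispatch in `log_message` above can be mimicked with the standard library alone. This standalone sketch (no Gradio import; the `queue` attribute is an illustrative stand-in for the real queue plumbing) reproduces the fallback behavior: outside a Gradio event, info messages print and warnings go through `warnings.warn`; inside an event, messages are queued for the client:

```python
import threading
import warnings

thread_data = threading.local()  # stands in for gradio.queueing.thread_data


def log_message(message: str, level: str = "info") -> None:
    # No queue attached means we are outside a Gradio event handler,
    # so fall back to plain printing (info) or a Python warning (warning):
    if not hasattr(thread_data, "queue"):
        if level == "info":
            print(message)
        elif level == "warning":
            warnings.warn(message)
        return
    # Inside an event, the message is queued for delivery to the client.
    thread_data.queue.append({"log": message, "level": level})


log_message("loading model")                      # falls back to print()
thread_data.queue = []                            # simulate an active event
log_message("deprecated input", level="warning")  # routed to the queue
```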
74 changes: 57 additions & 17 deletions gradio/queueing.py
@@ -2,18 +2,28 @@

import asyncio
import copy
import threading
import time
from asyncio import TimeoutError as AsyncTimeOutError
from collections import deque
from typing import Any

import fastapi
import httpx

from gradio.data_classes import Estimation, PredictBody, Progress, ProgressUnit
from typing_extensions import Literal

from gradio.data_classes import (
Estimation,
LogMessage,
PredictBody,
Progress,
ProgressUnit,
)
from gradio.helpers import TrackedIterable
from gradio.utils import AsyncRequest, run_coro_in_background, set_task_name

thread_data = threading.local()


class Event:
def __init__(
@@ -31,6 +41,7 @@ def __init__(
self.token: str | None = None
self.progress: Progress | None = None
self.progress_pending: bool = False
self.log_messages: list[LogMessage] = []

async def disconnect(self, code: int = 1000):
await self.websocket.close(code=code)
@@ -65,14 +76,14 @@ def __init__(
self.blocks_dependencies = blocks_dependencies
self.access_token = ""
self.queue_client = None
self.continuous_tasks: list[Event] = []

async def start(self, progress_tracking=False, ssl_verify=True):
async def start(self, ssl_verify=True):
# So that the client is attached to the running event loop
self.queue_client = httpx.AsyncClient(verify=ssl_verify)

run_coro_in_background(self.start_processing)
if progress_tracking:
run_coro_in_background(self.start_progress_tracking)
run_coro_in_background(self.start_log_and_progress_updates)
if not self.live_updates:
run_coro_in_background(self.notify_clients)

@@ -134,26 +145,35 @@ async def start_processing(self) -> None:
run_coro_in_background(self.broadcast_live_estimations)
set_task_name(task, events[0].session_hash, events[0].fn_index, batch)

async def start_progress_tracking(self) -> None:
async def start_log_and_progress_updates(self) -> None:
while not self.stopped:
if not any(self.active_jobs):
await asyncio.sleep(self.progress_update_sleep_when_free)
continue

for job in self.active_jobs:
if job is None:
continue
for event in job:
if event.progress_pending and event.progress:
event.progress_pending = False
client_awake = await self.send_message(
event, event.progress.dict()
)
if not client_awake:
await self.clean_event(event)
events = [
evt for job in self.active_jobs if job is not None for evt in job
] + self.continuous_tasks

for event in events:
if event.progress_pending and event.progress:
event.progress_pending = False
client_awake = await self.send_message(
event, event.progress.dict()
)
if not client_awake:
await self.clean_event(event)
await self.send_log_updates_for_event(event)

await asyncio.sleep(self.progress_update_sleep_when_free)

async def send_log_updates_for_event(self, event: Event) -> None:
while len(event.log_messages) > 0:
message = event.log_messages.pop(0)
client_awake = await self.send_message(event, message.dict())
if not client_awake:
await self.clean_event(event)
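The `send_log_updates_for_event` drain loop above can be sketched on its own. This minimal, self-contained version (illustrative class names; the real queue calls `clean_event` rather than just stopping) shows the FIFO draining and the disconnect check:

```python
import asyncio


class Event:
    """Stand-in for the queue's Event: pending messages plus a fake socket."""

    def __init__(self, alive: bool = True):
        self.log_messages = []
        self.sent = []
        self.alive = alive


async def send_message(event: Event, payload: dict) -> bool:
    # Returns False when the client has disconnected, as in the real queue.
    if event.alive:
        event.sent.append(payload)
    return event.alive


async def send_log_updates_for_event(event: Event) -> None:
    # Drain in FIFO order so modals appear in the order they were issued.
    while event.log_messages:
        message = event.log_messages.pop(0)
        client_awake = await send_message(event, message)
        if not client_awake:
            break  # stop sending; the real queue cleans the event up here


evt = Event()
evt.log_messages = [
    {"msg": "log", "log": "step 1 done", "level": "info"},
    {"msg": "log", "log": "low disk space", "level": "warning"},
]
asyncio.run(send_log_updates_for_event(evt))
```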

def set_progress(
self,
event_id: str,
@@ -179,6 +199,23 @@ def set_progress(
evt.progress = Progress(progress_data=progress_data)
evt.progress_pending = True

def log_message(
self,
event_id: str,
log: str,
level: Literal["info", "warning"],
):
for job in self.active_jobs:
if job is None:
continue
for evt in job:
if evt._id == event_id:
log_message = LogMessage(
log=log,
level=level,
)
evt.log_messages.append(log_message)

def push(self, event: Event) -> int | None:
"""
Add event to queue, or return None if Queue is full
@@ -404,6 +441,9 @@ async def process_events(self, events: list[Event], batch: bool) -> None:
for e, event in enumerate(awake_events):
if batch and "data" in output:
output["data"] = list(zip(*response.json.get("data")))[e]
await self.send_log_updates_for_event(
event
) # clean out pending log updates first
await self.send_message(
event,
{
1 change: 1 addition & 0 deletions gradio/routes.py
@@ -561,6 +561,7 @@ async def join_queue(
if blocks.dependencies[event.fn_index].get("every", 0):
await cancel_tasks({f"{event.session_hash}_{event.fn_index}"})
await blocks._queue.reset_iterators(event.session_hash, event.fn_index)
blocks._queue.continuous_tasks.append(event)
task = run_coro_in_background(
blocks._queue.process_events, [event], False
)
10 changes: 9 additions & 1 deletion gradio/utils.py
@@ -46,7 +46,7 @@
from gradio.strings import en

if TYPE_CHECKING: # Only import for type checking (is False at runtime).
from gradio.blocks import Block, BlockContext
from gradio.blocks import Block, BlockContext, Blocks
from gradio.components import Component

JSON_PATH = os.path.join(os.path.dirname(gradio.__file__), "launches.json")
@@ -645,6 +645,14 @@ def continuous_fn(*args):

return continuous_fn

def get_function_with_locals(fn: Callable, blocks: Blocks, event_id: str):
from gradio.queueing import thread_data
def fn_wrap(*args, **kwargs):
thread_data.blocks = blocks
thread_data.event_id = event_id
return fn(*args, **kwargs)
return fn_wrap
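`get_function_with_locals` above attaches per-event context via a thread-local before the user's function runs. Here is a minimal stdlib-only sketch of that pattern (the names mirror the diff but this is an illustration, not the Gradio implementation):

```python
import threading

thread_data = threading.local()


def get_function_with_locals(fn, blocks, event_id):
    def fn_wrap(*args, **kwargs):
        # Stash per-event context before the user function runs, so helpers
        # invoked inside fn (e.g. a Warning/Info call) can find it.
        thread_data.blocks = blocks
        thread_data.event_id = event_id
        return fn(*args, **kwargs)

    return fn_wrap


def user_fn(x):
    # Any helper called in here can read the ambient event context:
    return x * 2, thread_data.event_id


wrapped = get_function_with_locals(user_fn, blocks="demo-blocks", event_id="evt-42")
result = wrapped(21)
```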


async def cancel_tasks(task_ids: set[str]):
if sys.version_info < (3, 8):
6 changes: 4 additions & 2 deletions guides/01_getting-started/02_key-features.md
@@ -25,9 +25,11 @@ You can load a large dataset into the examples to browse and interact with the d

Continue learning about examples in the [More On Examples](https://gradio.app/more-on-examples) guide.

## Errors
## Errors and Warnings

If you wish to pass custom error messages to the user, raise a `gr.Error("custom message")` to display the message in a popup modal. If you try to divide by zero in the calculator demo above, a modal will display the custom error message. Learn more about Error in the [docs](https://gradio.app/docs#error).

You can also issue `gr.Warning("message")` and `gr.Info("message")` as standalone lines in your function; each immediately displays a modal while your function continues executing. Queueing must be enabled for this to work.

## Descriptive Content
