Merged
4 changes: 4 additions & 0 deletions doc/changes/unreleased.md
@@ -2,6 +2,10 @@

## Changes

## Features

* #146: Add interface for text_ai_extension_wrapper

## Security Issues

* #101: Updated bucketfs-python dependency to the version 1.0.0+
79 changes: 79 additions & 0 deletions exasol/nb_connector/text_ai_extension_wrapper.py
@@ -0,0 +1,79 @@
from pathlib import Path
from typing import Optional

from exasol.nb_connector.secret_store import Secrets

LANGUAGE_ALIAS = "PYTHON3_TXAIE"

LATEST_KNOWN_VERSION = "???"

def deploy_licence(conf: Secrets,
licence_file: Optional[Path] = None,
licence_content: Optional[str] = None) -> None:
"""
Deploys the given licence and saves its identifier to the secret store. The licence can be
provided either as a path to a licence file or as the licence content given as a string.

Parameters:
conf:
The secret store.
licence_file:
Optional. Path of a licence file.
licence_content:
Optional. Content of a licence given as a string.

"""
pass
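Not part of the diff: a minimal usage sketch for the interface above. It assumes the secret store is opened with a database file and master password, as in the AI-Lab notebooks; the file names and licence values are placeholders, and the function is still a stub, so the calls only illustrate the intended shape.

```python
from pathlib import Path

from exasol.nb_connector.secret_store import Secrets
from exasol.nb_connector.text_ai_extension_wrapper import deploy_licence

# Open the AI-Lab secret store (file name and password are placeholders).
conf = Secrets(Path("ai_lab.sqlite"), "my_master_password")

# Either point to a licence file on disk ...
deploy_licence(conf, licence_file=Path("/path/to/text_ai.licence"))

# ... or pass the licence content directly as a string.
deploy_licence(conf, licence_content="<licence text>")
```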

def initialize_text_ai_extension(conf: Secrets,
container_file: Optional[Path] = None,
version: Optional[str] = LATEST_KNOWN_VERSION,
language_alias: str = LANGUAGE_ALIAS,
run_deploy_container: bool = True,
run_deploy_scripts: bool = True,
run_upload_models: bool = True,
Collaborator
We don't have scripts to deploy. Neither do we have any models to upload upfront.
But we probably need the BucketFS credentials and HF token, just like the TE.

Contributor Author
Yes, but we plan on having scripts and default models in the future, and want to avoid having to change the interface too often.

I am not sure about the HF token. That one is for private models, and I doubt we will have private models as default models for Text AI, which are the only ones this installer will use. @tkilias what do you think?

Regarding the BucketFS credentials, you might be right. I thought those would be given through the secret store? Is this not the case?

Collaborator
With regard to the private token, as far as I am concerned, there is no difference between the Text AI and the TE.

Collaborator
Users might use private models, so the token is needed.

Collaborator
Regarding BucketFS credentials, those come from the AI Lab config.

Collaborator
We will have scripts and default models. Scripts will be added in Q2 in the form of span functions, and default models we will actually need already this quarter.

Contributor Author
Having the HF token here still seems kind of iffy to me. This call does not need it, unless we allow users to set the default model, which then would not really be a default.
Otherwise, if people want to install additional private models, should that not happen via a different call, which could then also be used to set the token?

Collaborator
We can also remove the HF token; I have no strong feelings about it.

run_encapsulate_bfs_credentials: bool = True,
allow_override: bool = True) -> None:
"""
Depending on which flags are set, runs different steps to install the Text-AI Extension in the DB.
Possible steps:

* Call the Text-AI Extension's language container deployment API.
If given a version, downloads the specified released version of the extension from ???
and uploads it to the BucketFS.

If given a container_file path instead, installs the given container in the BucketFS.

If neither is given, attempts to install the latest version from ???.

This function doesn't activate the language container. Instead, it gets the
activation SQL using the same API and writes it to the secret store. The name
of the key is defined in the ACTIVATION_KEY constant.

* Install default Transformers models into the BucketFS using the Transformers Extension's model upload functionality.

* Install Text-AI specific scripts.

Parameters:
conf:
The secret store. The store must contain the DB connection parameters
and the parameters of the BucketFS service.
container_file:
Optional. Path pointing to the locally stored Script Language Container file for the Text-AI Extension.
version:
Optional. Text-AI extension version.
language_alias:
The language alias of the extension's language container.
run_deploy_container:
If True, runs deployment of the locally stored Script Language Container file for the Text-AI Extension.
run_deploy_scripts:
If True, runs deployment of the Text-AI Extension scripts.
run_upload_models:
If True, uploads the default Transformers models to the BucketFS.
run_encapsulate_bfs_credentials:
If set to False, skips the creation of the Text-AI specific database connection
object encapsulating the BucketFS credentials.
allow_override:
If True, allows overriding the language definition.
"""
pass
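Again not part of the diff: a sketch of how the installer is expected to be called once implemented. The version string, container path, and flag combination are placeholders chosen to mirror the defaults documented above.

```python
from pathlib import Path

from exasol.nb_connector.secret_store import Secrets
from exasol.nb_connector.text_ai_extension_wrapper import initialize_text_ai_extension

# Open the AI-Lab secret store (file name and password are placeholders).
conf = Secrets(Path("ai_lab.sqlite"), "my_master_password")

# Default flow: download a released version of the extension and upload it to the BucketFS.
initialize_text_ai_extension(conf, version="1.2.3")

# Offline flow: install a locally stored Script Language Container file instead,
# skipping the default model upload.
initialize_text_ai_extension(
    conf,
    container_file=Path("/path/to/text_ai_container.tar.gz"),
    run_upload_models=False,
)
```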