This FastAPI server provides a simple way to connect LLMs, AI tools, or frontends to any database. It includes multiple endpoints tailored for various use cases. For detailed explanations of each endpoint, visit the 'Build' section at rex.tigzig.com.
To install the required dependencies, run:

    pip install -r requirements.txt

To start the server, execute:

    uvicorn app:app --host 0.0.0.0 --port $PORT
`FLOWISE_API_ENDPOINT`
- This is the endpoint for the Flowise API and is required for the file upload functionality in the REX app.
- It is the URL of your backend LLM agent. While Flowise is the default, you can use any LLM agent platform, OpenAI, or other chat-completion endpoint.
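As a rough sketch of how a backend might call the agent behind `FLOWISE_API_ENDPOINT`: Flowise prediction endpoints typically accept a JSON body with a `question` field, but the exact payload shape depends on your agent platform, so treat this as an illustration rather than the app's actual implementation.

```python
import json

def build_flowise_payload(question: str) -> str:
    """Build a JSON body for a Flowise-style prediction call.

    The {"question": ...} shape is the common Flowise convention; adapt it
    if your LLM agent platform expects a different payload.
    """
    return json.dumps({"question": question})

# The payload would be POSTed to the URL in FLOWISE_API_ENDPOINT, e.g.:
# requests.post(os.environ["FLOWISE_API_ENDPOINT"], data=payload,
#               headers={"Content-Type": "application/json"})
payload = build_flowise_payload("Infer a table schema for this file")
```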
Custom GPT Connections
- Use the `connect-db` endpoint for custom GPTs to interact with any database. This does not require any environment variables. Simply deploy the app and point to this endpoint.
Connecting Frontends, User Interfaces, or LLM Agents to Databases
- If your app requires file uploads, the `FLOWISE_API_ENDPOINT` must be provided.
- For apps that only need database connections to execute SQL queries, no environment variables are necessary.
Other Environment Variables
- Additional variables (e.g., AWS, Azure, Neon, or Filessio credentials) may be used to hardcode database connections. These are optional and not needed for basic REX app functionality.
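One way such optional provider credentials might be resolved is a simple environment-variable lookup. The `NEON_HOST`/`NEON_USER`-style naming below is an assumption for illustration only; match it to whatever variables your deployment actually sets.

```python
import os

def provider_credentials(cloud: str) -> dict:
    """Resolve hardcoded DB credentials for a provider from environment
    variables. The <PROVIDER>_HOST / _USER / ... naming is an assumption,
    not necessarily what the REX server itself uses."""
    prefix = cloud.upper()
    keys = ("HOST", "DATABASE", "USER", "PASSWORD", "PORT")
    return {k.lower(): os.getenv(f"{prefix}_{k}") for k in keys}

os.environ["NEON_HOST"] = "db.example.com"  # demo value only
creds = provider_credentials("neon")
```

Unset variables simply come back as `None`, so the caller can decide which providers are actually configured.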
This section provides a quick overview of the available endpoints, what they do, and the parameters they require. These endpoints allow connecting, uploading, and querying databases using FastAPI.
GET /sqlquery/
- Description: Executes a SQL query on a specified database. Supports `SELECT` and non-`SELECT` queries. Results from `SELECT` queries are returned as a text file.
- Parameters:
  - `sqlquery` (string): The SQL query to execute.
  - `cloud` (string): The database provider (`azure`, `aws`, `neon`, `filessio`).
- Authentication: Uses credentials from environment variables for the specified database provider. Useful if you want to offer a set of hardcoded databases.
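A minimal sketch of composing a `/sqlquery/` request URL from these two parameters. The base URL is a placeholder for your own deployment:

```python
from urllib.parse import urlencode

BASE_URL = "https://your-rex-server.example.com"  # placeholder deployment URL

def sqlquery_url(sqlquery: str, cloud: str) -> str:
    """Compose the GET /sqlquery/ request URL with its two query parameters,
    URL-encoding the SQL text."""
    return f"{BASE_URL}/sqlquery/?" + urlencode(
        {"sqlquery": sqlquery, "cloud": cloud}
    )

url = sqlquery_url("SELECT * FROM sales LIMIT 10", "neon")
# The URL can then be fetched with any HTTP client, e.g.:
# requests.get(url, timeout=180)
```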
GET /connect-db/
- Description: Connects to a custom MySQL or PostgreSQL database using provided credentials and optionally executes a SQL query.
- Parameters:
  - `host` (string): Database host address.
  - `database` (string): Name of the database to connect to.
  - `user` (string): Database user name.
  - `password` (string): Database password.
  - `port` (integer, default=3306): Port number for the database.
  - `db_type` (string, default=`mysql`): Database type (`mysql` or `postgresql`).
  - `sqlquery` (string, optional): SQL query to execute.
- Authentication: Accepts credentials as parameters, allowing a connection to any database with just the credentials.
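A small helper, mirroring the endpoint's defaults (port 3306, MySQL), that assembles the `/connect-db/` query parameters; the host and credential values are made up for the example:

```python
def connect_db_params(host, database, user, password,
                      port=3306, db_type="mysql", sqlquery=None):
    """Assemble the query parameters for GET /connect-db/, using the same
    defaults the endpoint documents (port 3306, db_type "mysql")."""
    params = {"host": host, "database": database, "user": user,
              "password": password, "port": port, "db_type": db_type}
    if sqlquery is not None:  # sqlquery is optional
        params["sqlquery"] = sqlquery
    return params

params = connect_db_params("db.example.com", "shop", "reader", "s3cret",
                           port=5432, db_type="postgresql",
                           sqlquery="SELECT count(*) FROM public.orders")
# requests.get(f"{BASE_URL}/connect-db/", params=params, timeout=180)
```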
POST /upload-file-llm-pg/
- Description: Uploads a file to generate a PostgreSQL table using an LLM for schema inference.
- Parameters:
  - `file` (file): The file to be uploaded.
- Additional Notes:
  - Uses the LLM for schema inference and creates a table in the `public` schema.
  - Requires `FLOWISE_API_ENDPOINT` to be configured.
  - For connecting to hardcoded database connections.
POST /upload-file-llm-mysql/
- Description: Uploads a file to generate a MySQL table using an LLM for schema inference.
- Parameters:
  - `file` (file): The file to be uploaded.
- Additional Notes:
  - Uses the LLM for schema inference and creates a table.
  - Requires `FLOWISE_API_ENDPOINT` to be configured.
  - For connecting to hardcoded database connections.
POST /upload-file-custom-db-pg/
- Description: Uploads a file to create a PostgreSQL table in a custom database using an LLM for schema inference.
- Parameters:
  - `host`, `database`, `user`, `password`, `port` (default=5432): Database connection details.
  - `schema` (string, default=`public`): PostgreSQL schema for the table.
  - `file` (file): The file to upload.
- Authentication: Accepts database credentials as parameters, allowing a file upload to any PostgreSQL database with just the credentials.
POST /upload-file-custom-db-mysql/
- Description: Uploads a file to create a MySQL table in a custom database using an LLM for schema inference.
- Parameters:
  - `host`, `database`, `user`, `password`, `port` (default=3306): Database connection details.
  - `sslmode` (string, optional): SSL mode for secure connections.
  - `file` (file): The file to upload.
- Authentication: Accepts database credentials as parameters, allowing a file upload to any MySQL database with just the credentials.
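As a sketch of how a client might call one of the custom-DB upload endpoints, the helper below builds the `(params, files)` pair in the shape `requests.post` expects for a multipart upload. The filename and CSV content are invented demo values:

```python
import io

def custom_mysql_upload_request(host, database, user, password,
                                port=3306, sslmode=None):
    """Build the (params, files) pair for POST /upload-file-custom-db-mysql/,
    ready to pass to requests.post(..., params=..., files=...)."""
    params = {"host": host, "database": database, "user": user,
              "password": password, "port": port}
    if sslmode is not None:  # sslmode is optional
        params["sslmode"] = sslmode
    # In a real client this would be an open file handle instead of BytesIO.
    files = {"file": ("data.csv", io.BytesIO(b"id,name\n1,Alice\n"), "text/csv")}
    return params, files

params, files = custom_mysql_upload_request("db.example.com", "shop",
                                            "writer", "s3cret",
                                            sslmode="REQUIRED")
# requests.post(f"{BASE_URL}/upload-file-custom-db-mysql/",
#               params=params, files=files, timeout=180)
```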
Below are the instructions and JSON schema for the ChatGPT configuration. This will allow your custom GPT to connect to any MySQL or PostgreSQL database.
Use this tool to connect to a database. The user will provide host, database, username, password, and port as separate details or as a URI (extract if needed). Use the default ports (5432 for PostgreSQL, 3306 for MySQL) if unspecified. Use the database connection details shared by the user for all tool calls.
If any required information is missing, tell the user what's missing. If all information is present, go ahead, try to connect, and check for available schemas.
IMPORTANT
- Convert user questions into SQL queries (Postgres- or MySQL-compliant, per the database type specified by the user) and pass them as parameters in API calls.
- Ensure NO SCHEMA IS USED for MySQL queries, as MySQL databases do not have schemas. FOR MYSQL QUERIES DO NOT USE 'PUBLIC' OR ANY SCHEMA NAME. Queries must be MySQL compliant.
- PostgreSQL has schemas, so Postgres queries must always include a schema (use 'public' if unspecified).
For errors, always share the query and connection details for debugging. Allow up to 180 seconds for query responses due to possible server delays. Always execute the query and share actual results, not fabricated data.
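The MySQL-versus-PostgreSQL schema rules above can be sketched as a small helper, purely for illustration (this function is not part of the REX server itself):

```python
def qualify_table(table: str, db_type: str, schema: str = "public") -> str:
    """Qualify a table name per the rules above: MySQL uses the bare table
    name (no schema), while PostgreSQL queries are schema-qualified,
    defaulting to 'public'."""
    if db_type == "mysql":
        return table
    return f"{schema}.{table}"

qualify_table("orders", "mysql")       # -> "orders"
qualify_table("orders", "postgresql")  # -> "public.orders"
```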
The schema is shared in the file gptJson. Currently it points to the REX server; feel free to use it for testing. You will need to replace the URL with your own once you deploy your own FastAPI server.
A detailed guide on how to implement a CustomGPT Analytics Assistant, including how to create the OpenAPI JSON schema. It uses a different endpoint as its example, but the same logic can be applied to connect a custom GPT to your own FastAPI server.
Analytics Assistant CustomGPT Implementation Guide
For a detailed description of all the FastAPI endpoints and how they are implemented in the REX app, visit the 'Build' section at rex.tigzig.com.