2 changes: 1 addition & 1 deletion README.md
@@ -42,7 +42,7 @@ Configure the claude_desktop_config.json (Settings>Developer>Edit Config) by add
"mcpServers": {
"teradata": {
"command": "uvx",
"args": ["teradata-mcp-server", "--profile", "all"],
"args": ["teradata-mcp-server"],
"env": {
"DATABASE_URI": "teradata://<USERNAME>:<PASSWORD>@<HOST_URL>:1025/<USERNAME>"
}
14 changes: 8 additions & 6 deletions docs/developer_guide/DEVELOPER_GUIDE.md
@@ -355,6 +355,7 @@ mcp dev teradata-mcp-server
## Build, Test, and Publish

We build with **uv**, test locally (wheel), then push to **TestPyPI** before PyPI.
The examples below run the Twine utility through `uvx`.

### Versions
- The CLI reads its version from package metadata (`importlib.metadata`).
@@ -370,8 +371,7 @@ uv build --no-cache
### 2) Test the wheel locally (no install)
```bash
# Run the installed console entry point from the wheel
uvx ./dist/teradata_mcp_server-<ver>-py3-none-any.whl \
python -m teradata_mcp_server --version
uvx ./dist/teradata_mcp_server-<ver>-py3-none-any.whl teradata_mcp_server --version

# Or install as a persistent tool and run
uv tool install --reinstall ./dist/teradata_mcp_server-<ver>-py3-none-any.whl
@@ -380,13 +380,13 @@ uv tool install --reinstall ./dist/teradata_mcp_server-<ver>-py3-none-any.whl

### 3) Verify metadata/README
```bash
twine check dist/*
uvx twine check dist/*
```

### 4) Publish to **TestPyPI** (dress rehearsal)
```bash
# Upload
python -m twine upload --repository testpypi dist/*
uvx twine upload --repository testpypi dist/*

# Try installing the just-published version with uvx
uvx --no-cache \
@@ -401,9 +401,11 @@ Notes:

### 5) Publish to **PyPI**
```bash
python -m twine upload dist/*
uvx twine upload dist/*
```
If you see `File already exists`, bump the version in `pyproject.toml`, rebuild, and upload again.
If you see `File already exists`, one of the following applies:
- You haven't bumped the version in `pyproject.toml`. Do so, rebuild, and upload again.
- You have prior builds in the `./dist` directory. Remove them, or specify the exact version (e.g. `uvx twine upload dist/*1.4.0*`).

### 6) Post‑publish smoke test
```bash
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "teradata-mcp-server"
version = "0.1.4"
version = "0.1.5"
description = "Model Context Protocol (MCP) server for Teradata, Community edition"
readme = {file = "README.md", content-type = "text/markdown"}
license = {text = "MIT"}
33 changes: 25 additions & 8 deletions src/teradata_mcp_server/app.py
@@ -400,19 +400,25 @@ def _cube_query_tool(dimensions: str, measures: str, dim_filters: str, meas_filt
expr = mdef["expression"]
mes_lines.append(f"{expr} AS {measure}")
meas_list = ",\n ".join(mes_lines)
top_clause = f"TOP {top}" if top else ""
dim_comma = ",\n " if dim_list.strip() else ""
where_dim_clause = f"WHERE {dim_filters}" if dim_filters else ""
where_meas_clause = f"WHERE {meas_filters}" if meas_filters else ""
order_clause = f"ORDER BY {order_by}" if order_by else ""

sql = (
f"SELECT {'TOP ' + str(top) if top else ''} * from\n"
f"SELECT {top_clause} * from\n"
"(SELECT\n"
f" {dim_list}{',\n ' if dim_list.strip() else ''}"
f" {dim_list}{dim_comma}"
f" {meas_list}\n"
"FROM (\n"
f"{cube['sql'].strip()}\n"
f"{'WHERE '+dim_filters if dim_filters else ''}"
f"{where_dim_clause}"
") AS c\n"
f"GROUP BY {', '.join(dim_list_raw)}"
") AS a\n"
f"{'WHERE '+meas_filters if meas_filters else ''}"
f"{'ORDER BY '+order_by if order_by else ''}"
f"{where_meas_clause}"
f"{order_clause}"
";"
)
return sql
@@ -439,6 +445,17 @@ async def _dynamic_tool(dimensions, measures, dim_filters="", meas_filters="", o
measure_lines = []
for n, m in cube.get('measures', {}).items():
measure_lines.append(f" - {n}: {m.get('description', '')}")

# Create example strings for documentation
dim_examples = [f"{d} {e}" for d, e in zip(list(cube.get('dimensions', {}))[:2], ["= 'value'", "in ('X', 'Y', 'Z')"])]
dim_example = ' AND '.join(dim_examples)

meas_examples = [f"{m} {e}" for m, e in zip(list(cube.get('measures', {}))[:2], ["> 1000", "= 100"])]
meas_example = ' AND '.join(meas_examples)

order_examples = [f"{d} {e}" for d, e in zip(list(cube.get('dimensions', {}))[:2], [" ASC", " DESC"])]
order_example = ' , '.join(order_examples)

_dynamic_tool.__doc__ = f"""
Tool to query the cube '{name}'.
{cube.get('description', '')}
@@ -451,11 +468,11 @@ {chr(10).join(measure_lines)}
{chr(10).join(measure_lines)}

* dim_filters (str): Filter expression to apply to dimensions. Valid dimension names are: [{', '.join(cube.get('dimensions', {}).keys())}], use valid SQL expressions, for example:
\"{' AND '.join([f"{d} {e}" for d, e in zip(list(cube.get('dimensions', {}))[:2], ["= 'value'", "in ('X', 'Y', 'Z')"])])}\"
"{dim_example}"
* meas_filters (str): Filter expression to apply to computed measures. Valid measure names are: [{', '.join(cube.get('measures', {}).keys())}], use valid SQL expressions, for example:
\"{' AND '.join([f"{m} {e}" for m, e in zip(list(cube.get('measures', {}))[:2], ["> 1000", "= 100"])])}\"
"{meas_example}"
* order_by (str): Order expression on any selected dimensions and measures. Use SQL syntax, for example:
\"{' , '.join([f"{d} {e}" for d, e in zip(list(cube.get('dimensions', {}))[:2], [" ASC", " DESC"])])}\"
"{order_example}"
top (int): Limit the number of rows returned, use a positive integer.

Returns:
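The app.py changes hoist each optional SQL clause (`top_clause`, `dim_comma`, `where_dim_clause`, and friends) into variables before the f-string is assembled; prior to Python 3.12, f-string replacement fields could not contain backslashes, so inline conditionals holding `'\n'` would fail to parse. A minimal, self-contained sketch of the same pattern (the function name and sample inputs here are illustrative, not the server's actual API):

```python
def build_cube_sql(dim_list, meas_list, inner_sql, dim_list_raw,
                   dim_filters="", meas_filters="", order_by="", top=None):
    # Hoist every optional clause into a plain variable first: this keeps
    # backslashes ("\n") out of f-string replacement fields, which Python
    # only permits from 3.12 onward.
    top_clause = f"TOP {top} " if top else ""
    dim_comma = ",\n  " if dim_list.strip() else ""
    where_dim_clause = f"WHERE {dim_filters}\n" if dim_filters else ""
    where_meas_clause = f"WHERE {meas_filters}\n" if meas_filters else ""
    order_clause = f"ORDER BY {order_by}\n" if order_by else ""
    return (
        f"SELECT {top_clause}* FROM\n"
        "(SELECT\n"
        f"  {dim_list}{dim_comma}"
        f"  {meas_list}\n"
        "FROM (\n"
        f"{inner_sql.strip()}\n"
        f"{where_dim_clause}"
        ") AS c\n"
        f"GROUP BY {', '.join(dim_list_raw)}\n"
        ") AS a\n"
        f"{where_meas_clause}"
        f"{order_clause}"
        ";"
    )

# Illustrative usage with made-up table and column names:
print(build_cube_sql(
    dim_list="region",
    meas_list="SUM(amount) AS total_sales",
    inner_sql="SELECT region, amount FROM sales",
    dim_list_raw=["region"],
    meas_filters="total_sales > 1000",
    top=10,
))
```

The same hoisting also makes each clause individually inspectable in a debugger, which the original inline conditionals did not allow.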
2 changes: 1 addition & 1 deletion uv.lock

Some generated files are not rendered by default.