This repository was archived by the owner on Dec 5, 2025. It is now read-only.

Add custom node metadata tags with typed values#4

Merged
jinglemansweep merged 1 commit into main from feature/additional-node-metadata
Nov 25, 2025

Conversation

@jinglemansweep
Contributor

This commit implements a flexible tag system for storing custom metadata on nodes beyond what's captured in MeshCore events.

Database Schema:

  • Added NodeTag model with typed value storage (string, number, boolean, coordinate)
  • Separate columns for each type with proper validation
  • Unique constraint on (node_public_key, key) ensures one value per tag per node
  • Indexed on node_public_key and key for efficient queries
  • Auto-timestamps (created_at, updated_at) for tracking changes

API Endpoints (7 new routes under /api/v1):

  • GET /nodes/{public_key}/tags - Get all tags for a node
  • GET /nodes/{public_key}/tags/{key} - Get specific tag value
  • PUT /nodes/{public_key}/tags/{key} - Set/update tag (upsert, key only in URL)
  • POST /nodes/{public_key}/tags/bulk - Bulk update multiple tags atomically
  • DELETE /nodes/{public_key}/tags/{key} - Delete a tag
  • GET /tags - Query tags across all nodes with filters
  • GET /tags/keys - List all unique tag keys in use

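For illustration, the endpoint paths above can be composed with a small helper (a sketch only; `node_tags_url`, `bulk_tags_url`, and the payload shape are hypothetical, not taken from the implementation):

```python
from typing import Optional

BASE = "/api/v1"  # assumed mount point, per the route list above

def node_tags_url(public_key: str, key: Optional[str] = None) -> str:
    """URL for a node's tag collection, or a single tag when key is given."""
    url = f"{BASE}/nodes/{public_key}/tags"
    return f"{url}/{key}" if key else url

def bulk_tags_url(public_key: str) -> str:
    """URL for the atomic bulk-update endpoint."""
    return f"{BASE}/nodes/{public_key}/tags/bulk"

# Example PUT body for a coordinate tag: the key travels in the URL, not the body.
put_payload = {
    "value_type": "coordinate",
    "value": {"latitude": 51.5, "longitude": -0.12},
}
```
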
Features:

  • Typed values: string, number, boolean, coordinate (lat/long with validation)
  • Flexible key-value pairs (arbitrary tag keys supported)
  • Bulk operations with atomic transactions
  • Auto-create nodes when setting tags on non-existent nodes
  • Public key prefix support for all tag operations
  • Coordinate validation (lat: -90 to 90, long: -180 to 180)
  • Proper type validation (rejects booleans as numbers, etc.)

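The typed-value rules above can be sketched as a standalone checker (a hypothetical helper mirroring the described behaviour, not the project's actual validator):

```python
def validate_tag_value(value_type: str, value) -> bool:
    """Return True if value matches the declared type, per the feature list above."""
    if value_type == "string":
        return isinstance(value, str)
    if value_type == "number":
        # bool is a subclass of int in Python, so reject it explicitly
        return isinstance(value, (int, float)) and not isinstance(value, bool)
    if value_type == "boolean":
        return isinstance(value, bool)
    if value_type == "coordinate":
        try:
            lat, lon = value["latitude"], value["longitude"]
        except (TypeError, KeyError):
            return False
        return -90 <= lat <= 90 and -180 <= lon <= 180
    return False
```
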
API Improvements:

  • TagValueUpdateRequest schema for cleaner PUT endpoint (no key in body)
  • TagValueRequest schema for bulk operations (includes key)
  • Fixed validation error handler to properly serialize all error types
  • Added input validation for node_prefix query parameter (returns 400 instead of 500)

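The node_prefix fix above (returning 400 rather than 500) amounts to rejecting malformed input before it reaches the query layer; a minimal sketch (the helper name and length limits are assumptions):

```python
import string

def is_valid_node_prefix(prefix: str) -> bool:
    """Accept only plausible hex prefixes so the handler can return 400
    for bad input instead of failing deeper in the query (500)."""
    return 2 <= len(prefix) <= 64 and all(c in string.hexdigits for c in prefix)
```
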
Query Tool Updates:

  • Added node_tags to table statistics display
  • Node listings now show associated tags with proper formatting
  • Boolean values display as True/False
  • Coordinate values display as (latitude, longitude)
  • Fixed advertisements query (removed non-existent latitude/longitude columns)

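The display conventions above can be sketched as a small formatter (hypothetical helper; the real query tool's code may differ):

```python
def format_tag_value(value_type: str, value) -> str:
    """Render a tag value for CLI display per the conventions above."""
    if value_type == "boolean":
        return "True" if value else "False"
    if value_type == "coordinate":
        lat, lon = value
        return f"({lat}, {lon})"
    return str(value)
```
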
Documentation:

  • Updated README.md with comprehensive tag usage examples
  • Added Node Tags section with API examples and CLI output
  • Updated AGENTS.md with database schema and tag feature documentation
  • OpenAPI spec automatically includes all tag endpoints and schemas

Testing:

  • All endpoints verified working with mock MeshCore
  • Type validation enforced (booleans rejected as numbers)
  • Bulk operations tested (atomic transactions)
  • Tag queries tested (by key, value_type, node_prefix)
  • Error handling verified (400 for invalid input, 422 for validation errors)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Copilot AI review requested due to automatic review settings November 25, 2025 18:33
@jinglemansweep jinglemansweep merged commit b7b68c2 into main Nov 25, 2025
4 checks passed
@jinglemansweep jinglemansweep deleted the feature/additional-node-metadata branch November 25, 2025 18:34

Copilot AI left a comment


Pull request overview

This PR implements a flexible tag system for storing custom metadata on mesh network nodes with typed values (string, number, boolean, coordinate), complementing the core MeshCore event data with user-defined attributes.

Key Changes:

  • Added NodeTag database model with typed value storage columns and proper indexing for efficient queries
  • Implemented 7 new REST API endpoints for comprehensive tag management (get, set, bulk update, delete, query, list keys)
  • Enhanced query tool to automatically display tags when viewing nodes, improving CLI user experience

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 8 comments.

| File | Description |
| --- | --- |
| src/meshcore_sidekick/database/models.py | Added NodeTag model with typed columns (string, number, boolean, coordinate) and unique constraint on node+key |
| src/meshcore_sidekick/api/schemas.py | Added Pydantic schemas for tag operations including value type validation and coordinate value handling |
| src/meshcore_sidekick/api/routes/tags.py | Implemented 7 tag management endpoints with prefix resolution, upsert operations, and bulk updates |
| src/meshcore_sidekick/api/app.py | Registered tags router and improved validation error serialization to handle all error types |
| src/meshcore_sidekick/query.py | Updated node listing to fetch and display associated tags with proper type formatting |
| README.md | Added comprehensive Node Tags section with API examples and CLI output |
| AGENTS.md | Documented tag feature and database schema for AI agent context |

```python
elif value_type == 'number':
    value = value_number
elif value_type == 'boolean':
    value = bool(value_boolean)
```
Copilot AI Nov 25, 2025

The boolean value is converted using `bool(value_boolean)`, which is unnecessary and could be misleading. SQLite stores booleans as integers (0 or 1), and if `value_boolean` is 0 (False), `bool(0)` correctly returns False. However, if `value_boolean` is None (NULL in the database), `bool(None)` returns False, which incorrectly displays a NULL value as False instead of indicating missing data.

Consider checking for None explicitly:

```python
elif value_type == 'boolean':
    value = "True" if value_boolean else "False" if value_boolean is not None else "None"
```

Or simply use the value directly, since it is already a boolean type from SQLite:

```python
elif value_type == 'boolean':
    value = value_boolean
```

Suggested change:

```diff
-value = bool(value_boolean)
+value = "True" if value_boolean else "False" if value_boolean is not None else "None"
```

Comment on lines +21 to +75
```python
def resolve_public_key_or_prefix(prefix: str, db: Session, allow_create: bool = False) -> str:
    """
    Resolve a public key or prefix to a full public key.

    Args:
        prefix: Public key (64 chars) or prefix (2+ chars)
        db: Database session
        allow_create: If True, allows returning full keys even if node doesn't exist

    Returns:
        Full public key (64 hex characters)

    Raises:
        HTTPException: If prefix is invalid, no match found, or multiple matches
    """
    from ...utils.address import validate_public_key

    # Validate prefix length
    if len(prefix) < 2:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Public key or prefix must be at least 2 characters",
        )

    # Normalize prefix
    normalized_prefix = normalize_public_key(prefix)
    if not validate_public_key(normalized_prefix, allow_prefix=True):
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Public key or prefix must contain only hexadecimal characters",
        )

    # If it's already 64 characters, return it (node will be created if needed)
    if len(normalized_prefix) == 64:
        return normalized_prefix

    # Otherwise, resolve the prefix (must exist for prefixes)
    nodes = Node.find_by_prefix(db, normalized_prefix)

    if not nodes:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"No node found matching prefix '{prefix}'",
        )

    if len(nodes) > 1:
        matching_keys = [node.public_key[:8] + "..." for node in nodes[:5]]
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=f"Prefix '{prefix}' matches {len(nodes)} nodes: {', '.join(matching_keys)}. Please use a longer prefix.",
        )

    return nodes[0].public_key
```


Copilot AI Nov 25, 2025

The `resolve_public_key_or_prefix` function is duplicated between this file and nodes.py (lines 19-74). The implementations are nearly identical except for the `allow_create` parameter, which is unique to tags.py. This duplication makes maintenance more difficult and error-prone.

Consider extracting this function to a shared utility module (e.g., `api/utils.py`) with `allow_create` as an optional argument defaulting to False. The nodes.py version can then use this shared function without the parameter.

Suggested change (drop the duplicated function and import the shared helper instead):

```python
from ..utils import resolve_public_key_or_prefix
```

Comment on lines +432 to +433
```python
query = query.filter(NodeTag.value_type == value_type)
```

Copilot AI Nov 25, 2025

The `key` and `value_type` query parameters are used directly in SQL filters without any validation. While SQLAlchemy's query builder protects against SQL injection, there is no check that `value_type` contains a valid type: an invalid value (e.g. `value_type=invalid_type`) returns empty results instead of a clear error message.

Add validation for the `value_type` parameter:

```python
if value_type:
    valid_types = {"string", "number", "boolean", "coordinate"}
    if value_type not in valid_types:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=f"Invalid value_type. Must be one of: {', '.join(valid_types)}",
        )
    query = query.filter(NodeTag.value_type == value_type)
```

Suggested change:

```diff
-query = query.filter(NodeTag.value_type == value_type)
+valid_types = {"string", "number", "boolean", "coordinate"}
+if value_type not in valid_types:
+    raise HTTPException(
+        status_code=status.HTTP_400_BAD_REQUEST,
+        detail=f"Invalid value_type. Must be one of: {', '.join(valid_types)}",
+    )
+query = query.filter(NodeTag.value_type == value_type)
```

Comment on lines +448 to +492
```python
@field_validator('value')
@classmethod
def validate_value_type(cls, v, info: ValidationInfo):
    """Validate value matches declared type."""
    value_type = info.data.get('value_type')
    if value_type == 'string' and not isinstance(v, str):
        raise ValueError('Value must be a string for string type')
    elif value_type == 'number':
        # Check for boolean first since bool is subclass of int in Python
        if isinstance(v, bool):
            raise ValueError('Value must be a number for number type, not a boolean')
        if not isinstance(v, (int, float)):
            raise ValueError('Value must be a number for number type')
    elif value_type == 'boolean' and not isinstance(v, bool):
        raise ValueError('Value must be a boolean for boolean type')
    elif value_type == 'coordinate' and not isinstance(v, CoordinateValue):
        raise ValueError('Value must be a CoordinateValue for coordinate type')
    return v


class TagValueRequest(BaseModel):
    """Tagged value for setting/updating tags (includes key for bulk operations)."""

    key: str = Field(..., min_length=1, max_length=128, description="Tag key")
    value_type: Literal["string", "number", "boolean", "coordinate"] = Field(..., description="Value type")
    value: Union[str, float, int, bool, CoordinateValue] = Field(..., description="Tag value")

    @field_validator('value')
    @classmethod
    def validate_value_type(cls, v, info: ValidationInfo):
        """Validate value matches declared type."""
        value_type = info.data.get('value_type')
        if value_type == 'string' and not isinstance(v, str):
            raise ValueError('Value must be a string for string type')
        elif value_type == 'number':
            # Check for boolean first since bool is subclass of int in Python
            if isinstance(v, bool):
                raise ValueError('Value must be a number for number type, not a boolean')
            if not isinstance(v, (int, float)):
                raise ValueError('Value must be a number for number type')
        elif value_type == 'boolean' and not isinstance(v, bool):
            raise ValueError('Value must be a boolean for boolean type')
        elif value_type == 'coordinate' and not isinstance(v, CoordinateValue):
            raise ValueError('Value must be a CoordinateValue for coordinate type')
        return v
```
Copilot AI Nov 25, 2025

The `validate_value_type` validator is duplicated between the `TagValueUpdateRequest` and `TagValueRequest` classes. This identical validation logic (lines 448-465 and 475-492) should be extracted into a shared function or a common base class to improve maintainability and reduce code duplication.

Consider creating a base class with the shared validator:

```python
class TagValueBase(BaseModel):
    value_type: Literal["string", "number", "boolean", "coordinate"]
    value: Union[str, float, int, bool, CoordinateValue]

    @field_validator('value')
    @classmethod
    def validate_value_type(cls, v, info: ValidationInfo):
        # ... validation logic ...
        return v


class TagValueUpdateRequest(TagValueBase):
    pass


class TagValueRequest(TagValueBase):
    key: str = Field(...)
```
```python
existing.value_boolean = value_boolean
existing.latitude = latitude
existing.longitude = longitude
existing.updated_at = datetime.now()
```
Copilot AI Nov 25, 2025

The `updated_at` field is being manually set to `datetime.now()` instead of relying on the SQLAlchemy `onupdate` mechanism defined in the model (line 74 of models.py). This manual update is unnecessary and bypasses the database-level timestamp management.

Remove this line and let SQLAlchemy handle the update timestamp automatically through the `onupdate=func.now()` configuration.

Suggested change:

```diff
-existing.updated_at = datetime.now()
```

Comment on lines +65 to +79
```python
node_public_key: Mapped[str] = mapped_column(String(64), nullable=False, index=True)
key: Mapped[str] = mapped_column(String(128), nullable=False)
value_type: Mapped[str] = mapped_column(String(16), nullable=False)  # string/number/boolean/coordinate
value_string: Mapped[Optional[str]] = mapped_column(String(512))
value_number: Mapped[Optional[float]] = mapped_column(Float)
value_boolean: Mapped[Optional[bool]] = mapped_column(Boolean)
latitude: Mapped[Optional[float]] = mapped_column(Float)  # For coordinate type
longitude: Mapped[Optional[float]] = mapped_column(Float)  # For coordinate type
created_at: Mapped[datetime] = mapped_column(DateTime, default=func.now())
updated_at: Mapped[datetime] = mapped_column(DateTime, default=func.now(), onupdate=func.now())

__table_args__ = (
    Index("idx_node_tags_unique", "node_public_key", "key", unique=True),
    Index("idx_node_tags_node", "node_public_key"),
    Index("idx_node_tags_key", "key"),
```
Copilot AI Nov 25, 2025

The database model defines a redundant index. Line 78 creates `idx_node_tags_node` on `node_public_key`, but line 65 already defines an index on this column via `index=True`. This creates duplicate indexes that waste storage space and slow down write operations.

Remove either the `index=True` parameter from line 65 or the `Index("idx_node_tags_node", "node_public_key")` entry from line 78.

```python
    value = tag.value_number
elif tag.value_type == 'boolean':
    value = tag.value_boolean
elif tag.value_type == 'coordinate':
```
Copilot AI Nov 25, 2025

The `db_model_to_response` function has a potential bug when handling coordinate values with None latitude or longitude. If a coordinate tag is stored with null values (which is possible given the `Optional[float]` type in the model), this will create a `CoordinateValue` with None values that will fail Pydantic validation (which requires float values).

Add validation to ensure both latitude and longitude are not None before creating the `CoordinateValue`:

```python
elif tag.value_type == 'coordinate':
    if tag.latitude is None or tag.longitude is None:
        raise ValueError(f"Coordinate tag '{tag.key}' has incomplete data")
    value = CoordinateValue(latitude=tag.latitude, longitude=tag.longitude)
```

Suggested change:

```diff
 elif tag.value_type == 'coordinate':
+    if tag.latitude is None or tag.longitude is None:
+        raise ValueError(f"Coordinate tag '{tag.key}' has incomplete data")
```
```python
from datetime import datetime
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy import func
```
Copilot AI Nov 25, 2025

Import of `func` is not used.

Suggested change:

```diff
-from sqlalchemy import func
```