
DUAL AI Token Classifier


An AI-powered token classification tool for the DUAL platform. Automatically classifies tokens using OpenAI's GPT-4o-mini model, extracting category, tags, sentiment, and summary information.

Overview

The DUAL AI Token Classifier is a production-ready Node.js application that streamlines the classification and tagging of tokens on the DUAL platform. It leverages OpenAI's language models to intelligently categorize tokens and extract relevant metadata.

Key Features

  • Automatic Classification: Use AI to classify tokens into predefined categories
  • Batch Processing: Process multiple tokens efficiently in configurable batches
  • Dry-Run Mode: Preview classifications without writing anything to the DUAL API
  • Error Handling: Robust error handling with detailed logging
  • Rate Limiting: Built-in delays between batches to stay under API rate limits
  • Template Filtering: Optionally classify tokens from specific templates only
  • Progress Tracking: Real-time statistics on classification progress

Architecture

┌─────────────────┐
│  DUAL API       │
│  (Fetch Objects)│
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Token Objects  │
│  (Untagged)     │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Classifier     │
│  (LLM Engine)   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Classification │
│  Results        │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  DUAL API       │
│  (Tag Objects)  │
└─────────────────┘

Quick Start

Prerequisites

  • Node.js 18.0.0 or higher
  • DUAL API token
  • OpenAI API key

Installation

  1. Clone the repository:
git clone https://github.com/dual-network/dual-ai-classifier.git
cd dual-ai-classifier
  2. Install dependencies:
npm install
  3. Configure environment variables:
cp .env.example .env
# Edit .env with your API tokens
  4. Run the classifier:
npm run classify

Configuration

All configuration is managed through environment variables. Create a .env file based on .env.example:

Variable         Default                   Description
DUAL_TOKEN       (required)                DUAL API authentication token
OPENAI_API_KEY   (required)                OpenAI API key for GPT access
DUAL_API_BASE    https://blockv-labs.io    DUAL API base URL
OPENAI_MODEL     gpt-4o-mini               OpenAI model to use
BATCH_SIZE       10                        Number of objects per classification batch
DRY_RUN          false                     Enable dry-run mode (preview only)
TEMPLATE_ID      (optional)                Filter by a specific template ID
LIMIT            0                         Maximum objects to process (0 = no limit)
LOG_LEVEL        info                      Logging level (info, warn, or error)
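Putting these together, a .env for a dry-run pass over a single template might look like the following (token values are placeholders you must replace):

```shell
DUAL_TOKEN=your-dual-token
OPENAI_API_KEY=sk-your-openai-key
DUAL_API_BASE=https://blockv-labs.io
OPENAI_MODEL=gpt-4o-mini
BATCH_SIZE=10
DRY_RUN=true
TEMPLATE_ID=loyalty-tokens
LIMIT=50
LOG_LEVEL=info
```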

Usage

Basic Classification

Run the classifier on all untagged objects:

npm run classify

Dry-Run Mode

Preview classifications without writing to DUAL API:

npm run dry-run

Or with command-line flag:

npm run classify -- --dry-run

Classify Specific Template

Only classify tokens from a particular template:

npm run classify -- --template-id my-template-id

Batch Mode with Limit

Process a limited number of objects:

npm run classify -- --limit 50

Combine Arguments

npm run classify -- --template-id loyalty-tokens --limit 25 --dry-run

How Classification Works

The classifier uses OpenAI's GPT-4o-mini model to analyze token properties and generate classifications. Here's the process:

Token Categories

Tokens are classified into one of six categories:

  • loyalty: Loyalty programs, reward points, membership tokens
  • collectible: NFTs, digital collectibles, unique items
  • access-pass: Access tickets, passes, event entries
  • certificate: Certificates, credentials, achievements
  • coupon: Discount codes, promotional offers, redemption vouchers
  • identity: Identity documents, profiles, KYC tokens

Classification Output

For each token, the classifier produces:

  • category: The primary classification (one of the six above)
  • tags: Relevant tags extracted from the token properties
  • sentiment: The overall sentiment (positive, neutral, or negative)
  • summary: A brief one-line description of the token
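Because the output is LLM-generated JSON, it is worth validating against this contract before writing anything back. A minimal validation sketch (a hypothetical helper, not part of this repo; field and category names follow the lists above):

```javascript
// Hypothetical validator for the classifier's JSON output.
// Categories and sentiments follow the lists documented above.
const VALID_CATEGORIES = [
  'loyalty', 'collectible', 'access-pass',
  'certificate', 'coupon', 'identity',
];
const VALID_SENTIMENTS = ['positive', 'neutral', 'negative'];

function validateClassification(result) {
  return (
    VALID_CATEGORIES.includes(result.category) &&
    Array.isArray(result.tags) &&
    result.tags.every((t) => typeof t === 'string') &&
    VALID_SENTIMENTS.includes(result.sentiment) &&
    typeof result.summary === 'string' &&
    result.summary.length > 0
  );
}
```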

Example Flow

Input Token:
{
  "id": "token-123",
  "name": "VIP Access Pass",
  "description": "Exclusive event entry",
  "properties": {
    "event": "tech-conference-2026",
    "validity": "2026-04-01"
  }
}

Classification Result:
{
  "category": "access-pass",
  "tags": ["event", "exclusive", "conference"],
  "sentiment": "positive",
  "summary": "Exclusive VIP access ticket for tech conference"
}

API Reference

DualClient

DUAL API client for authentication and communication.

Methods:

  • fetchObjects(params) - Fetch objects from DUAL API
  • searchObjects(query, limit, offset) - Search objects
  • patchObject(id, updates) - Update a single object
  • batchPatchObjects(patches) - Update multiple objects

Classifier

AI-powered classification engine using OpenAI.

Methods:

  • classify(tokenObject) - Classify a single token
  • classifyBatch(tokenObjects) - Classify multiple tokens
  • getValidCategories() - Get list of valid categories

Pipeline

Main orchestration pipeline.

Methods:

  • fetchUntaggedObjects() - Retrieve untagged objects from API
  • classifyBatch(objects) - Run classification on a batch
  • tagObjects(classifications) - Write results back to DUAL API
  • run() - Execute complete pipeline
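The control flow behind these methods can be sketched as follows. This is a simplified illustration, not the repo source: the fetch and classify steps are stubbed with in-memory data, while the batching loop and the dry-run gate mirror the behaviour described above.

```javascript
// Simplified sketch of the pipeline's control flow (stubbed, illustrative).
class Pipeline {
  constructor({ dryRun = false, batchSize = 10 } = {}) {
    this.dryRun = dryRun;
    this.batchSize = batchSize;
    this.tagged = [];
  }

  async fetchUntaggedObjects() {
    // Stub: a real implementation would query the DUAL API.
    return [{ id: 'token-1' }, { id: 'token-2' }, { id: 'token-3' }];
  }

  async classifyBatch(objects) {
    // Stub: a real implementation would call the LLM.
    return objects.map((o) => ({ id: o.id, category: 'loyalty' }));
  }

  async tagObjects(classifications) {
    // Stub: a real implementation would PATCH objects via the DUAL API.
    this.tagged.push(...classifications);
  }

  async run() {
    const objects = await this.fetchUntaggedObjects();
    const results = [];
    for (let i = 0; i < objects.length; i += this.batchSize) {
      const batch = objects.slice(i, i + this.batchSize);
      const classified = await this.classifyBatch(batch);
      results.push(...classified);
      if (!this.dryRun) await this.tagObjects(classified); // dry-run skips writes
    }
    return results;
  }
}
```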

Extending

Custom Categories

To add new categories, modify the VALID_CATEGORIES array in src/classifier.js:

const VALID_CATEGORIES = [
  'loyalty',
  'collectible',
  'access-pass',
  'certificate',
  'coupon',
  'identity',
  'your-new-category', // Add here
];

Then update the system prompt in buildSystemPrompt() to include documentation for the new category.
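One way to keep the prompt and the category list in sync is to generate the prompt from the array itself. The sketch below is hypothetical (the repo's actual buildSystemPrompt() may be structured differently), but it shows the idea: add the category and its one-line description in a single place.

```javascript
// Hypothetical sketch: build the system prompt from the category list so
// newly added categories are documented automatically.
const CATEGORY_DOCS = {
  'loyalty': 'Loyalty programs, reward points, membership tokens',
  'collectible': 'NFTs, digital collectibles, unique items',
  'your-new-category': 'One-line description of the new category',
};

function buildSystemPrompt(categories) {
  const lines = categories.map(
    (c) => `- ${c}: ${CATEGORY_DOCS[c] ?? 'No description provided'}`
  );
  return [
    'You are a token classifier. Assign exactly one category per token.',
    'Valid categories:',
    ...lines,
    'Respond with JSON containing "category", "tags", "sentiment", "summary".',
  ].join('\n');
}
```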

Different LLM Providers

To use a different LLM provider:

  1. Replace the OpenAI client initialization in Classifier.constructor()
  2. Update the message format in the classify() and classifyBatch() methods
  3. Ensure output is valid JSON matching the expected format

Example template for Anthropic Claude (a sketch; the model name is illustrative):

// Replace the OpenAI import
import Anthropic from '@anthropic-ai/sdk';

// Initialize the Anthropic client instead
this.client = new Anthropic({ apiKey });

// Adapt the message format: Claude takes the system prompt as a
// top-level field and requires max_tokens
const response = await this.client.messages.create({
  model: 'claude-3-5-haiku-latest', // illustrative model name
  max_tokens: 1024,
  system: systemPrompt,
  messages: [{ role: 'user', content: userPrompt }],
});
const text = response.content[0].text; // parse this as JSON

Troubleshooting

"DUAL_TOKEN not set" Error

Make sure your .env file includes a valid DUAL token:

echo "DUAL_TOKEN=your-token-here" >> .env

"OPENAI_API_KEY not set" Error

Ensure your OpenAI API key is configured:

echo "OPENAI_API_KEY=sk-..." >> .env

API Rate Limiting

If you encounter rate limit errors:

  • Reduce BATCH_SIZE in your configuration
  • Use the --limit flag to process fewer objects per run
  • The pipeline automatically adds 500ms delays between batches
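If you script against the client yourself, the same batching-plus-delay behaviour is easy to reproduce. This is a sketch, not the repo's internals; the 500ms default matches the inter-batch delay mentioned above.

```javascript
// Split work into batches and pause between them to stay under rate limits.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function processWithDelay(items, batchSize, handler, delayMs = 500) {
  for (const batch of chunk(items, batchSize)) {
    await handler(batch);
    await sleep(delayMs); // matches the pipeline's 500ms inter-batch delay
  }
}
```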

Authentication Failures

  • Verify your DUAL_TOKEN is valid and has API access
  • Check that OPENAI_API_KEY is a valid OpenAI API key
  • Ensure neither token has expired

No Objects Found

  • Check that you have untagged objects in your DUAL instance
  • Verify the --template-id filter matches actual templates
  • Try running with --dry-run to see what would be processed

License

MIT License - Copyright (c) 2026 DUAL Network. See LICENSE file for details.

Support

For issues, questions, or contributions, please refer to the DUAL documentation or contact the DUAL Network team.
