
MyWebIntelligence/mywebapi


Component repository (backend at scale, optional).
Start here for the main tool and installation: https://github.com/MyWebIntelligence/mwi

MyWebIntelligence API scale project [In progress, June 2026]

Roadmap (stabilization)

Target: Q2 2026

  • End-to-end API aligned with mwi pipeline
  • Versioned endpoints + minimal compatibility policy
  • One-command demo + example dataset
  • CI tests (smoke tests + lint)

This repository contains two main applications that are currently in a state of transition:

  1. MyWebIntelligenceAPI: A modern backend API built with FastAPI, PostgreSQL, and Celery. It is designed for crawling and content analysis and is fully containerized. This is the recommended component to run.
  2. MyWebClient: A legacy web client (React + Node.js) that provides a UI for a SQLite database generated by a separate, older project (MyWebIntelligencePython). It does not connect to the MyWebIntelligenceAPI.

The long-term goal is to migrate MyWebClient to use the new MyWebIntelligenceAPI, but they currently operate independently.


1. MyWebIntelligenceAPI (Backend Service)

This is the main API for crawling, analysis, and data management. The simplest way to run it is with Docker.

Prerequisites

  • Docker and Docker Compose

Installation (Docker)

  1. Clone the project

    git clone <repository-url>
    cd MyWebIntelligenceProject
  2. Configure environment variables. The API requires its own environment file; copy the example file inside the MyWebIntelligenceAPI directory.

    cp MyWebIntelligenceAPI/.env.example MyWebIntelligenceAPI/.env

    Edit MyWebIntelligenceAPI/.env before starting: at minimum, set SECRET_KEY and the other required credentials (a sketch of a minimal file follows this list).

  3. Start the services. From the project root, run the following command. It builds the API image and starts all necessary backend services.

    docker-compose up -d

    Database migrations are applied automatically when the API container starts, so no manual steps are needed.
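
For reference, a minimal .env might look like the sketch below. Only SECRET_KEY is named in this guide; the other variable names are illustrative assumptions, so treat MyWebIntelligenceAPI/.env.example as the authoritative list of keys.

    # Hypothetical values for illustration; the real key names live in .env.example
    SECRET_KEY=replace-with-a-long-random-string   # required
    POSTGRES_USER=mwi                              # assumed key name
    POSTGRES_PASSWORD=change-me                    # assumed key name
    POSTGRES_DB=mwi                                # assumed key name

A quick way to generate a suitable SECRET_KEY value:

    python3 -c "import secrets; print(secrets.token_urlsafe(48))"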

Accessing the API
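
Once the containers are up, the API listens on whatever host port docker-compose.yml maps for the mywebintelligenceapi service (commonly 8000, but verify in the compose file; the exact port is an assumption here). Unless disabled in configuration, FastAPI exposes interactive documentation at /docs and the OpenAPI schema at /openapi.json:

    # Assuming the API is published on localhost:8000 (check docker-compose.yml)
    curl http://localhost:8000/openapi.json    # OpenAPI schema
    # Interactive Swagger UI in the browser:
    #   http://localhost:8000/docs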

Included Services

The docker-compose.yml at the root orchestrates the following backend services:

  • db: PostgreSQL 15 database.
  • redis: Redis server for caching and Celery message brokering.
  • mywebintelligenceapi: The FastAPI application.
  • celery_worker: Celery worker for handling asynchronous tasks like crawling.

Note: For local development without Docker, please refer to the instructions in MyWebIntelligenceAPI/README.md.


2. MyWebClient (Legacy Frontend)

This is the legacy web client. It runs independently of MyWebIntelligenceAPI and connects to a separate SQLite database. Integrating it with the new API is planned future work.

Installation

The installation for MyWebClient is documented in its own directory. It can be run either with Docker or directly from the source code.

For detailed instructions, please refer to the dedicated README file: ➡️ MyWebClient/README.md

The guide above explains how to:

  • Build and run the client using Docker.
  • Install and run it from the source code using yarn.
  • Correctly connect it to your legacy SQLite database file (mwi.db).
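
As a rough orientation only (every command below assumes a standard yarn workflow; the authoritative steps and script names are in MyWebClient/README.md):

    # Hypothetical from-source run of the legacy client
    cd MyWebClient
    yarn install   # install dependencies
    yarn start     # start the client, then point it at your mwi.db file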

🏗️ Project Structure

.
├── MyWebClient/              # Legacy Web App (React + Node.js) -> See dedicated README
│   ├── client/
│   ├── server/
│   └── README.md
├── MyWebIntelligenceAPI/       # Modern API (FastAPI)
│   ├── app/
│   ├── .env.example
│   └── README.md
├── docker-compose.yml        # Orchestrates the API services only
└── README.md                 # This file
