
@funeste38/beam

beam is the pipeline vocabulary layer for Funesterie.

It gives QFLUSH and related tools a small, typed way to describe execution styles such as streaming, targeted mutation, batch processing, and multi-phase orchestration. The Dragon Ball-inspired aliases are optional sugar on top of a real technical model.

What it does

  • Normalizes friendly aliases like kamehameha, gogeta, or genkidama
  • Routes jobs to one of five pipeline families: beam, drill, bomb, fusion, ultra
  • Lets host systems inject their own handlers and services
  • Ships a ready-to-use QFLUSH adapter via runQflushBeam()

Install

npm install @funeste38/beam

Quick start

import { runQflushBeam } from "@funeste38/beam";

const result = await runQflushBeam({
  type: "gogeta",
  source: "repo:/",
  target: "cortex:/encode",
  payload: { file: "D:/img.png" }
});

console.log(result.ok, result.type, result.logs);

Pipeline families

Type     Purpose              Typical use
beam     Continuous flow      streaming transforms, encoding, transfer
drill    Precision work       targeted patches, block edits, pixel or offset rewrites
bomb     Batch / aggregate    repo scans, multi-file passes, reporting
fusion   Multi-phase combo    combine beam, drill, and bomb
ultra    Full orchestration   end-to-end pipeline with optimisation pass

Alias system

beam accepts both technical names and Funesterie aliases.

  • kame, kameha, kamehameha -> beam
  • mkp, makanko, makankosappo, specialbeam -> drill
  • genki, genkidama, spiritbomb -> bomb
  • gogeta, vegito, fusiondance -> fusion
  • ssj, god, limitbreaker, superfunesterie -> ultra
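The alias table above can be sketched as a plain lookup. This is an illustrative reimplementation of the documented mapping, not the package's actual internals; the resolveType name is an assumption.

```typescript
type PipelineType = "beam" | "drill" | "bomb" | "fusion" | "ultra";

// Alias table as documented above (hypothetical sketch, not the real module).
const ALIASES: Record<string, PipelineType> = {
  kame: "beam", kameha: "beam", kamehameha: "beam",
  mkp: "drill", makanko: "drill", makankosappo: "drill", specialbeam: "drill",
  genki: "bomb", genkidama: "bomb", spiritbomb: "bomb",
  gogeta: "fusion", vegito: "fusion", fusiondance: "fusion",
  ssj: "ultra", god: "ultra", limitbreaker: "ultra", superfunesterie: "ultra",
};

// Accept both technical names and aliases, case-insensitively.
function resolveType(name: string): PipelineType {
  const key = name.toLowerCase();
  if (["beam", "drill", "bomb", "fusion", "ultra"].includes(key)) {
    return key as PipelineType;
  }
  const resolved = ALIASES[key];
  if (!resolved) throw new Error(`Unknown pipeline type: ${name}`);
  return resolved;
}

console.log(resolveType("gogeta"));     // → fusion
console.log(resolveType("kamehameha")); // → beam
```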

Main API

dispatchBeam(job, ctx)

Low-level dispatcher that resolves aliases and calls the matching handler.
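The dispatch pattern can be sketched as a handler-map lookup. The BeamJob and BeamContext shapes below are assumptions for illustration; alias resolution is elided for brevity.

```typescript
type PipelineType = "beam" | "drill" | "bomb" | "fusion" | "ultra";

interface BeamJob { type: string; payload?: unknown; }
interface BeamContext {
  handlers: Partial<Record<PipelineType, (job: BeamJob) => string>>;
}

// Hypothetical dispatcher: resolve the job type, then delegate to the
// matching handler supplied by the host system.
function dispatchSketch(job: BeamJob, ctx: BeamContext): string {
  const type = job.type as PipelineType; // alias resolution elided
  const handler = ctx.handlers[type];
  if (!handler) throw new Error(`No handler for pipeline type: ${type}`);
  return handler(job);
}

const ctx: BeamContext = {
  handlers: { drill: (job) => `drill handled ${JSON.stringify(job.payload)}` },
};
console.log(dispatchSketch({ type: "drill", payload: { offset: 42 } }, ctx));
// → drill handled {"offset":42}
```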

createQflushBeamContext(opts)

Creates a BeamContext with default QFLUSH-oriented handlers.
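The defaults-plus-overrides pattern behind this function can be sketched as follows. The handler bodies and the createContextSketch signature are illustrative assumptions, not the package's real API.

```typescript
type Handler = (payload: unknown) => string;

// Stand-ins for the default QFLUSH-oriented handlers (hypothetical).
const DEFAULT_HANDLERS: Record<string, Handler> = {
  beam: () => "default beam handler",
  drill: () => "default drill handler",
};

// Sketch: merge host-supplied handlers over the defaults, matching the
// "host systems inject their own handlers" feature described above.
function createContextSketch(opts: { handlers?: Record<string, Handler> } = {}) {
  return { handlers: { ...DEFAULT_HANDLERS, ...opts.handlers } };
}

const ctx = createContextSketch({
  handlers: { drill: () => "host drill handler" },
});
console.log(ctx.handlers.drill({})); // → host drill handler
console.log(ctx.handlers.beam({}));  // → default beam handler
```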

runQflushBeam(job, opts?)

Convenient high-level entry point for QFLUSH integrations.

Current limitations

The included handlers are intentionally conservative and some remain placeholders. The package already defines the orchestration contract cleanly, but the most advanced execution logic still belongs in host systems like QFLUSH, CORTEX, or SPYDER.

Good next improvements

  • Add first-class metrics per pipeline phase
  • Expose composable middleware before and after each handler
  • Replace placeholder optimisation with deterministic planner output
  • Add fixtures for real QFLUSH/CORTEX jobs

Development

npm install
npm run build
npm test
