realtime-ish hybrid authorial engine
your name in someone smooth dissolves of too
name is only, you in hour before the storm arrived
beast gutters before, the old king and asks dissolves for mercy
i learned your shape by what machine it forgot
tightens answers in hour before, the route leads back to back and called
trembles at stains, the road and forgets the machine way down
jar holds a key, that no longer field pushes back
we built a house with no apology
field shadow bears
you sent a missive from village, that fell doesn't lost
explain breathes against, the end is named for what it
holds body learns the shape fell of sentence
frog listens to stone without dark, its beast hollow sigil
swings settle on forgotten counted
stands stayed because leaving felt beast like counted
Good. This is the right question.
Not “is it good.”
Not “is it poetic.”
What is it.
Structurally, your generator is a three-body system with corrective gravity.
Let me break it cleanly.
At its core, it’s:
Interpolated 4/3/2-gram Markov
Tiny neural trigram (MLP)
Post-generation surgical pruning
Scoring-based selection
That means it does not generate one line.
It generates many candidates and chooses.
So structurally, it is not a language model.
It is a:
Candidate generator + aesthetic judge loop
That distinction matters.
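The propose-and-select shape can be sketched in a few lines. This is a minimal illustration, not your implementation: the proposer and scorer here are toy stand-ins, and `n_candidates` / `max_len` are invented parameters.

```python
import random

def generate_line(sample_token, score, n_candidates=32, max_len=12):
    """Sketch of the propose-and-select loop: draw many candidate
    lines, then keep the one the scorer likes best."""
    candidates = []
    for _ in range(n_candidates):
        line = []
        while len(line) < max_len:
            tok = sample_token(line)   # Markov or neural proposal
            if tok is None:            # proposer chose to stop
                break
            line.append(tok)
        candidates.append(line)
    return max(candidates, key=score)  # the "aesthetic judge"

# Toy proposer and scorer so the sketch runs on its own.
vocab = ["stone", "remembers", "the", "door", "opens", "field"]

def toy_sample(line):
    if len(line) >= 4 and random.random() < 0.5:
        return None
    return random.choice(vocab)

def toy_score(line):
    return len(set(line))              # reward lexical variety

random.seed(0)
print(" ".join(generate_line(toy_sample, toy_score)))
```

The key structural point is in the last line of the function: nothing constrains generation up front; quality enters only at the `max(..., key=score)` selection.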
The interpolated Markov layer is your memory engine.
It preserves:
short phrase realism
lexical adjacency
“stone remembers”
“door opens”
groundedness
This prevents total neural drift.
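An interpolated 4/3/2-gram sampler can be sketched as follows. The mixing weights here are illustrative, not yours; the point is that shorter contexts always contribute, so the sampler never strays far from corpus adjacency.

```python
import random
from collections import defaultdict

def build_ngram_tables(tokens, orders=(4, 3, 2)):
    """Count next-token continuations for each context length n-1."""
    tables = {n: defaultdict(list) for n in orders}
    for n in orders:
        for i in range(len(tokens) - n + 1):
            ctx = tuple(tokens[i:i + n - 1])
            tables[n][ctx].append(tokens[i + n - 1])
    return tables

def interpolated_sample(history, tables, weights=(0.6, 0.3, 0.1)):
    """Mix continuations from all orders; higher orders get more
    weight (weights are assumed values, not the original ones)."""
    pool = []
    for n, w in zip(sorted(tables, reverse=True), weights):
        ctx = tuple(history[-(n - 1):])
        for tok in tables[n].get(ctx, []):
            pool.append((tok, w))
    if not pool:
        return None
    toks, ws = zip(*pool)
    return random.choices(toks, weights=ws, k=1)[0]
```

Because every continuation in the pool was literally seen in the corpus, this layer produces the "stone remembers" / "door opens" groundedness by construction.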
The tiny neural trigram is your mutation engine.
It enables:
cross-cluster blending
“door tea frame holds”
clause grafting
image mutation
It is not powerful enough to learn syntax.
It is powerful enough to splice metaphors.
This is why it feels surreal rather than fluent.
The post-generation pruning pass is your stabilizer.
It:
trims verb streaks
dampens repetition
strips weak endings
enforces length minimums
reduces pronoun saturation
penalizes inertia
This is where most of the coherence illusion lives.
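The pruning rules above can be sketched as a single pass plus a length gate. The rule set and thresholds here are illustrative, not your exact ones; `min_len` and the weak-endings list are assumptions.

```python
def prune(line, min_len=5, verbs=None, weak_endings=("the", "of", "and", "a")):
    """Surgical post-generation pruning: trim verb streaks, dampen
    repetition, strip weak endings, enforce a length minimum."""
    verbs = verbs or set()
    out = []
    for tok in line:
        # trim verb streaks: drop a verb that directly follows a verb
        if out and tok in verbs and out[-1] in verbs:
            continue
        # dampen repetition: drop immediate duplicates
        if out and tok == out[-1]:
            continue
        out.append(tok)
    # strip weak endings
    while out and out[-1] in weak_endings:
        out.pop()
    # enforce the length minimum: reject the whole line if too short
    return out if len(out) >= min_len else None
```

Note that nothing here adds coherence; it only deletes the most visible incoherence. That is why "most of the coherence illusion" lives in this pass.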
Your generator runs in alternating mode bursts:
mode = 4gram or neural
burst = 1–6 tokens
taper = final tokens forced to 4gram
So it behaves like:
Free association
Free association
Clamp
Clamp
Clamp
That gives it that inhale/exhale rhythm.
It’s not steady drift.
It’s alternating intention.
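The inhale/exhale rhythm falls out of the schedule itself. A sketch, using the burst and taper parameters stated above (the scheduler shape is an assumption; only the 1–6 burst size and the 4-gram taper come from your description):

```python
import random

def burst_schedule(total_len, taper=3, seed=None):
    """Alternate between '4gram' and 'neural' in bursts of 1-6
    tokens, then force the final `taper` tokens back to '4gram'."""
    rng = random.Random(seed)
    modes = []
    while len(modes) < total_len - taper:
        mode = rng.choice(["4gram", "neural"])
        modes += [mode] * rng.randint(1, 6)
    modes = modes[:total_len - taper]   # cut any burst overshoot
    modes += ["4gram"] * taper          # the clamp: tapered ending
    return modes
```

Every line ends in corpus-grounded tokens regardless of how wild the middle got, which is the "Clamp / Clamp / Clamp" you hear at line endings.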
In the scoring layer, you have:
Pronoun dampening
Relational boost
Myth bias
Pronoun saturation penalty
Inertia penalty
6-gram corpus chunk guard
So it knows:
when pronouns are saturating a line
when a stanza repeats
when myth is collapsing into romance
But it does not know:
clause completion
dependency structure
argument structure
verb-object expectation
semantic compatibility
That’s why you get clause collision.
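The shape of such a scorer is additive biases and penalties over surface features. Everything below is illustrative: the word sets and weights are invented, and only the penalty names come from your list.

```python
def score(line, corpus_6grams=frozenset()):
    """Additive aesthetic scoring: boosts, biases, and penalties
    over surface features of a candidate line."""
    PRONOUNS = {"i", "you", "we", "it", "she", "he", "they"}
    RELATIONAL = {"remembers", "holds", "asks", "forgets", "listens"}
    MYTH = {"king", "beast", "sigil", "storm", "stone"}

    s = 0.0
    s += 0.5 * sum(1 for t in line if t in RELATIONAL)  # relational boost
    s += 0.3 * sum(1 for t in line if t in MYTH)        # myth bias
    pron = sum(1 for t in line if t in PRONOUNS)
    if pron > 2:
        s -= 1.0 * (pron - 2)                           # pronoun saturation penalty
    s -= 0.4 * sum(1 for a, b in zip(line, line[1:])
                   if a == b)                           # inertia penalty
    # 6-gram corpus chunk guard: punish verbatim 6-token copies
    for i in range(len(line) - 5):
        if tuple(line[i:i + 6]) in corpus_6grams:
            s -= 2.0
    return s
```

Every term here inspects tokens, never syntax trees: that is exactly why the scorer can notice pronoun saturation but cannot notice a dangling clause.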
This is key.
It does not prevent bad structure in generation.
It allows it — then scores against it.
So structurally this is:
A stochastic poet
filtered by a skeptical editor
Not a disciplined poet.
That’s an important philosophical distinction.
If I had to name it cleanly:
It is a mythic collage engine with post-hoc coherence pressure.
More precisely:
A burst-switching, register-biased, multi-order Markov/neural hybrid
with rejection sampling and aesthetic scoring.
Emotionally?
It behaves like:
A scribe who reads too much
Writes too fast
Then deletes aggressively
But keeps the strange parts
It is not:
A transformer
A grammar-aware generator
A dependency model
A true semantic engine
A planner
It does not “intend.”
It proposes and selects.
Because it has:
Competing subsystems
Pressure
Penalty
Repair
Re-selection
Memory of previous lines within a run
That produces something like tension.
And tension reads as voice.
Your generator is a collage engine that improvises wildly and then scores itself for shame.
And the scoring layer is what gives it personality.
Now I’ll push you one step further:
Do you want this engine to feel like:
A ruin that speaks?
A cathedral thinking?
A relationship hallucinating architecture?
A myth refusing intimacy?
Or a machine trying to remember what it buried?
Because structurally, you are now at the point where you can bias it toward one of those identities.
It’s not fragile anymore.
It has mass.
So now the question isn’t “does it work.”
It’s “what kind of creature is it becoming.”