brackishmeadows/poemer
poemer

realtime-ish hybrid authorial engine

example output:

your name in someone smooth dissolves of too
name is only, you in hour before the storm arrived
beast gutters before, the old king and asks dissolves for mercy

i learned your shape by what machine it forgot
tightens answers in hour before, the route leads back to back and called
trembles at stains, the road and forgets the machine way down

jar holds a key, that no longer field pushes back
we built a house with no apology
field shadow bears

you sent a missive from village, that fell doesn't lost
explain breathes against, the end is named for what it
holds body learns the shape fell of sentence

frog listens to stone without dark, its beast hollow sigil
swings settle on forgotten counted
stands stayed because leaving felt beast like counted

ok... how would you characterise this generator, like structurally

Good. This is the right question.

Not “is it good.”
Not “is it poetic.”
What is it.

Structurally, your generator is a three-body system with corrective gravity.

Let me break it cleanly.

1. It is a Hybrid Probabilistic Composer

At its core, it’s:

Interpolated 4/3/2-gram Markov
Tiny neural trigram (MLP)
Post-generation surgical pruning
Scoring-based selection

That means it does not generate one line.
It generates many candidates and chooses.

So structurally, it is not a language model.

It is a:

Candidate generator + aesthetic judge loop

That distinction matters.
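A minimal sketch of that candidate-generator + judge loop — a toy bigram walk and a toy variety scorer standing in for your real generator and your real aesthetic scoring, all names illustrative:

```python
import random

def generate_candidate(seed_words, length=8):
    # Stand-in for the real burst-based generator: a random walk
    # over a toy bigram table (not poemer's actual tables).
    bigrams = {
        "stone": ["remembers", "holds"],
        "door": ["opens", "listens"],
        "remembers": ["stone", "door"],
        "holds": ["door", "stone"],
        "opens": ["stone"],
        "listens": ["door"],
    }
    line = [random.choice(seed_words)]
    for _ in range(length - 1):
        line.append(random.choice(bigrams.get(line[-1], seed_words)))
    return line

def score(line):
    # Toy aesthetic judge: reward lexical variety, punish repetition.
    return len(set(line)) / len(line)

def best_of(n=50):
    # Generate many candidates, keep the one the judge likes best.
    candidates = [generate_candidate(["stone", "door"]) for _ in range(n)]
    return max(candidates, key=score)

print(" ".join(best_of()))
```

The shape is the point: generation is cheap and dumb, and all the taste lives in `score` and the `max` over candidates.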

2. It Has Three Competing Drives

A. Local Continuity (n-gram interpolation)

This is your memory engine.

It preserves:

short phrase realism
lexical adjacency
“stone remembers”
“door opens”
groundedness

This prevents total neural drift.
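The 4/3/2-gram interpolation can be sketched like this — a weighted blend of continuation distributions, where lower orders fill in when the long context is unseen (weights and corpus are illustrative, not yours):

```python
from collections import Counter, defaultdict

def ngram_counts(tokens, n):
    # Map each (n-1)-token context to a Counter of observed next words.
    counts = defaultdict(Counter)
    for i in range(len(tokens) - n + 1):
        context, nxt = tuple(tokens[i:i + n - 1]), tokens[i + n - 1]
        counts[context][nxt] += 1
    return counts

def interpolated_dist(context, models, weights):
    # Blend 4/3/2-gram predictions: higher orders get more weight,
    # lower orders contribute when the long context was never seen.
    dist = Counter()
    for n, w in zip((4, 3, 2), weights):
        ctx = tuple(context[-(n - 1):])
        cont = models[n].get(ctx)
        if cont:
            total = sum(cont.values())
            for word, c in cont.items():
                dist[word] += w * c / total
    return dist

corpus = "the stone remembers the door the door opens the stone".split()
models = {n: ngram_counts(corpus, n) for n in (4, 3, 2)}
print(interpolated_dist(["opens", "the", "stone"], models, (0.6, 0.3, 0.1)))
```

Here the 4-gram context is unseen, so the prediction falls through to the 3-gram and 2-gram counts — that graceful back-off is exactly what keeps short phrases grounded.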

B. Associative Abstraction (tiny neural model)

This is your mutation engine.

It enables:

cross-cluster blending
“door tea frame holds”
clause grafting
image mutation

It is not powerful enough to learn syntax.
It is powerful enough to splice metaphors.

This is why it feels surreal rather than fluent.
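Why a dense model splices where counts can't: a toy stand-in for the tiny neural trigram, using hand-set 2-d "embeddings" and a softmax instead of your actual MLP (all values illustrative):

```python
import math

# Toy embeddings placing words in two loose clusters (door-ish, tea-ish).
# A real tiny MLP learns these; here they are hand-set for illustration.
EMB = {
    "door":  (1.0, 0.1), "frame": (0.9, 0.2), "opens": (0.8, 0.0),
    "tea":   (0.1, 1.0), "steam": (0.2, 0.9), "holds": (0.5, 0.5),
}

def neural_dist(context, temperature=0.5):
    # Average the context embeddings, then softmax over dot products.
    # Dense representations let nearby clusters leak into each other --
    # the source of cross-cluster splices like "door tea frame holds".
    cx = sum(EMB[w][0] for w in context) / len(context)
    cy = sum(EMB[w][1] for w in context) / len(context)
    scores = {w: (x * cx + y * cy) / temperature for w, (x, y) in EMB.items()}
    z = sum(math.exp(s) for s in scores.values())
    return {w: math.exp(s) / z for w, s in scores.items()}

dist = neural_dist(["door", "tea"])
```

An n-gram table has literally zero probability for a transition it never counted; the dense model always has *some* mass everywhere, which is why it can graft but can't enforce syntax.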

C. Aftercare Pipeline (prune → hygiene → repair → punctuate)

This is your stabilizer.

It:

trims verb streaks
dampens repetition
strips weak endings
enforces length minimums
reduces pronoun saturation
penalizes inertia

This is where most of the coherence illusion lives.
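A compressed sketch of that prune → hygiene → repair pass — word lists and thresholds are invented for illustration, not lifted from your pipeline:

```python
def aftercare(tokens, min_len=4, max_verb_streak=2,
              verbs=("holds", "opens", "trembles", "dissolves"),
              weak_endings=("of", "the", "and", "that")):
    # Single pass: trim verb streaks and dampen immediate repetition,
    # then strip weak endings and enforce a length minimum.
    out, streak = [], 0
    for tok in tokens:
        streak = streak + 1 if tok in verbs else 0
        if streak > max_verb_streak:
            continue              # trim verb streaks
        if out and tok == out[-1]:
            continue              # dampen immediate repetition
        out.append(tok)
    while out and out[-1] in weak_endings:
        out.pop()                 # strip weak endings
    return out if len(out) >= min_len else None  # length minimum

print(aftercare("door door opens holds trembles the stone of".split()))
```

Note how much damage this one function hides: the doubled word, the third verb in a row, and the dangling "of" all vanish before the reader ever sees the line.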

3. It Is Burst-Based, Not Autoregressive

Your generator runs in mode bursts:

mode = 4gram or neural
burst = 1–6 tokens
taper = final tokens forced to 4gram

So it behaves like:

Free association
Free association
Clamp
Clamp
Clamp

That gives it that inhale/exhale rhythm.

It’s not steady drift.
It’s alternating intention.
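The burst scheduler is easy to sketch — plan a mode per token, alternating free bursts of 1–6 tokens, then clamp the tail to 4-gram (a minimal sketch, not your scheduler):

```python
import random

def burst_schedule(total_tokens, taper=3, seed=None):
    # Assign a mode to every token position: random-length bursts of
    # '4gram' or 'neural', with the final `taper` tokens forced to '4gram'.
    rng = random.Random(seed)
    modes = []
    while len(modes) < total_tokens - taper:
        mode = rng.choice(["4gram", "neural"])
        modes.extend([mode] * rng.randint(1, 6))
    return modes[:total_tokens - taper] + ["4gram"] * taper

print(burst_schedule(12, seed=0))
```

The taper is what produces the exhale: however wild the middle of the line gets, the ending is always pulled back toward the corpus.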

4. It Is Register-Aware but Not Syntax-Aware

You have:

Pronoun dampening
Relational boost
Myth bias
Pronoun saturation penalty
Inertia penalty
6-gram corpus chunk guard

So it knows:

when too many pronouns are happening
when a stanza repeats
when myth is collapsing into romance

But it does not know:

clause completion
dependency structure
argument structure
verb-object expectation
semantic compatibility

That’s why you get clause collision.
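Those register checks live naturally in the scorer. A toy version of two of them — pronoun saturation and the inertia penalty (pronoun list, threshold, and weights invented for illustration; the 6-gram chunk guard is omitted):

```python
PRONOUNS = {"i", "you", "we", "it", "your", "my"}

def register_penalty(line, prev_lines, max_pronoun_ratio=0.25):
    # Score-time penalties only: nothing here blocks generation,
    # it just makes saturated or inertial candidates lose the contest.
    tokens = line.split()
    penalty = 0.0
    pron = sum(t in PRONOUNS for t in tokens) / len(tokens)
    if pron > max_pronoun_ratio:
        penalty += pron - max_pronoun_ratio   # pronoun saturation
    seen = set(w for prev in prev_lines for w in prev.split())
    overlap = sum(t in seen for t in tokens) / len(tokens)
    penalty += 0.5 * overlap                  # inertia: reuse of earlier words
    return penalty

print(register_penalty("you sent your name to me",
                       ["i learned your shape"]))
```

Everything here is surface statistics — ratios over tokens. There is no parse anywhere, which is exactly why the engine can count pronouns but can't see an unfinished clause.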

5. It Is Scored, Not Constrained

This is key.

It does not prevent bad structure in generation.
It allows it — then scores against it.

So structurally this is:

A stochastic poet
filtered by a skeptical editor

Not a disciplined poet.

That’s an important philosophical distinction.

6. Its True Identity

If I had to name it cleanly:

It is a mythic collage engine with post-hoc coherence pressure.

More precisely:

A burst-switching, register-biased, multi-order Markov/neural hybrid
with rejection sampling and aesthetic scoring.

Emotionally?

It behaves like:

A scribe who reads too much
Writes too fast
Then deletes aggressively
But keeps the strange parts

7. What It Is Not

It is not:

A transformer
A grammar-aware generator
A dependency model
A true semantic engine
A planner

It does not “intend.”

It proposes and selects.

8. Why It Feels Alive

Because it has:

Competing subsystems
Pressure
Penalty
Repair
Re-selection
Memory of previous lines within a run

That produces something like tension.

And tension reads as voice.

If I had to summarize in one sentence:

Your generator is a collage engine that improvises wildly and then scores itself for shame.
And the scoring layer is what gives it personality.

Now I’ll push you one step further:

Do you want this engine to feel like:

A ruin that speaks?
A cathedral thinking?
A relationship hallucinating architecture?
A myth refusing intimacy?
Or a machine trying to remember what it buried?

Because structurally, you are now at the point where you can bias it toward one of those identities.

It’s not fragile anymore.

It has mass.

So now the question isn’t “does it work.”

It’s “what kind of creature is it becoming.”
