# Learned conditional semantic controls


Rather than the "latents", try this with the UNet encoder activations, or maybe just the bottleneck (see the hook sketch below).
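
A minimal sketch of grabbing encoder-side activations with forward hooks, assuming a diffusers Stable Diffusion pipeline; the model id and hook targets are illustrative, not part of the original note:

```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("stable-diffusion-v1-5/stable-diffusion-v1-5")
unet = pipe.unet

captured = {}

def make_hook(name):
    def hook(module, inputs, output):
        # down blocks return (hidden_states, res_samples); keep the hidden states
        hidden = output[0] if isinstance(output, tuple) else output
        captured.setdefault(name, []).append(hidden.detach().flatten(start_dim=1).cpu())
    return hook

# encoder-side activations, or just the bottleneck (mid_block)
handles = [blk.register_forward_hook(make_hook(f"down_{i}"))
           for i, blk in enumerate(unet.down_blocks)]
handles.append(unet.mid_block.register_forward_hook(make_hook("mid")))

# ... run sampling with `pipe(...)`; `captured` then holds per-step activations ...
for h in handles:
    h.remove()
```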


  1. Use something like KLMC2 with a weak conditioning weight to draw samples in the neighborhood of a prompt.
  2. Collect the latents from this sampling process and learn a reduced-rank representation over them (e.g. PCA).
  3. The rank-reduced representation should provide a semantic basis whose directions are specifically relevant to the conditioning prompt and its feasible outputs. A sketch of the full loop follows below.
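
A minimal sketch of the three steps, with a plain weak-guidance sampling loop standing in for KLMC2; the model id, prompt, sample count, rank, and latent shape are illustrative assumptions, and the collected latents are the final per-sample latents rather than the full trajectory:

```python
import numpy as np
import torch
from diffusers import StableDiffusionPipeline
from sklearn.decomposition import PCA

pipe = StableDiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "a photograph of a lighthouse at dusk"
n_samples, rank = 256, 16

# 1. draw samples in the neighborhood of the prompt with weak conditioning
latents = []
for seed in range(n_samples):
    out = pipe(
        prompt,
        guidance_scale=2.0,               # weak conditioning weight
        num_inference_steps=30,
        generator=torch.Generator("cuda").manual_seed(seed),
        output_type="latent",             # keep VAE latents, skip decoding
    )
    latents.append(out.images[0].flatten().float().cpu().numpy())

# 2. learn a reduced-rank representation over the collected latents
X = np.stack(latents)                     # (n_samples, latent_dim)
pca = PCA(n_components=rank)
pca.fit(X)

# 3. the principal directions act as a prompt-specific semantic basis;
#    walk along one component and decode to see what it controls
mean = torch.from_numpy(pca.mean_).to("cuda", torch.float16)
direction = torch.from_numpy(pca.components_[0]).to("cuda", torch.float16)
for alpha in (-3.0, 0.0, 3.0):
    z = (mean + alpha * direction).reshape(1, 4, 64, 64)   # 512x512 SD1.5 latent shape
    image = pipe.vae.decode(z / pipe.vae.config.scaling_factor).sample
```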