Description
termwiz currently defines `PaletteIndex` as `u8`, with the intention of covering the 256-color palette. But the `*-direct` terminal variants interpret the color index as a packed 3-byte true-color value, so a one-byte index only ever populates the blue byte. Users of `PaletteIndex` will therefore be surprised when rendering indexes in the 8..256 range on those terminals.
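A minimal sketch (not termwiz code) of the mismatch, using the same arithmetic as the `setaf` formula quoted below:

```rust
/// What a *-direct terminal does with a color index it treats as a packed
/// 3-byte RGB value: r = n / 65536, g = (n / 256) & 255, b = n & 255.
fn direct_rgb_from_index(n: u32) -> (u8, u8, u8) {
    (((n / 65536) & 255) as u8, ((n / 256) & 255) as u8, (n & 255) as u8)
}

fn main() {
    // Intended as palette slot 196 (bright red in the xterm 256-color cube).
    let palette_index: u8 = 196;
    let (r, g, b) = direct_rgb_from_index(palette_index as u32);
    // Only the blue byte is set: rgb(0, 0, 196), a dark blue rather than red.
    println!("rgb({r}, {g}, {b})");
}
```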
The main problem is that `setaf` is defined as:
```
TERM='tmux-direct' infocmp | grep setaf
	setaf=\E[%?%p1%{8}%<%t3%p1%d%e38:2::%p1%{65536}%/%d:%p1%{256}%/%{255}%&%d:%p1%{255}%&%d%;m,
```
which special-cases only the first 8 colors and treats every other index as a packed "direct" RGB color value.
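Transliterated into Rust (a hypothetical helper, just to make the branching in the parameterized string readable):

```rust
/// The tmux-direct setaf logic spelled out: indexes below 8 get a classic
/// SGR 30..37 sequence; anything else is split into R:G:B bytes.
fn tmux_direct_setaf(p1: u32) -> String {
    if p1 < 8 {
        // %?%p1%{8}%<%t 3%p1%d
        format!("\x1b[3{}m", p1)
    } else {
        // %e 38:2:: p1/65536 : (p1/256)&255 : p1&255
        let (r, g, b) = (p1 / 65536, (p1 / 256) & 255, p1 & 255);
        format!("\x1b[38:2::{}:{}:{}m", r, g, b)
    }
}
```

So `setaf 196` produces `\E[38:2::0:0:196m` instead of selecting palette entry 196.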
The ncurses terminfo comments mention a "color index" scheme, but it does not appear to be universally supported. There is an `RGB` capability for which `tput RGB` works, but `infocmp` does not show it. The reported max colors is now 16M.
Note that there is `force_terminfo_render_to_use_ansi_sgr`, but it might have unwanted side effects: @chadaustin reported `^O` appearing around some other color renderings, so it might not be a practical solution.
Describe the solution you'd like
Given that the terminfo `setaf` operates on a low-level "color index" (which is ambiguous: `rgb(0,0,0..8)` conflicts with the basic colors), while termwiz already provides high-level types for the different kinds of colors, I think the following might be a reasonable solution:
- When `MaxColors` is 16M (which should set `ColorLevel` to `TrueColor`, but that's a separate issue), do not use `setaf` to render 256-color or 16-color indexes.
- Maybe rename `TrueColorWithPaletteFallback` to `TrueColorWith256Fallback`, since "Palette" might depend on context while "256" is less ambiguous.
- Maybe `force_terminfo_render_to_use_ansi_sgr` can become a bitflag that controls more precisely when to fall back. Its initialization could be smarter: render test values through `setaf` into a memory buffer to figure out how the first 8, 8-16, and 16-256 colors are rendered, and then choose the SGR fallback accordingly (see the sketch below).
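A rough sketch of that probing idea, assuming the renderer can expand `setaf` for a test index into an in-memory byte buffer (the enum and function names below are made up for illustration, not termwiz API):

```rust
#[derive(Debug, PartialEq)]
enum SetafKind {
    BasicSgr,   // e.g. CSI 30..37 m
    Palette256, // e.g. CSI 38;5;N m or CSI 38:5:N m
    DirectRgb,  // e.g. CSI 38;2;R;G;B m or CSI 38:2::R:G:B m
    Unknown,
}

/// Inspect the bytes that `setaf` expanded to for a probe index and decide,
/// with a rough heuristic, which flavor of color sequence this terminal
/// description produces.
fn classify_setaf(expanded: &[u8]) -> SetafKind {
    let s = String::from_utf8_lossy(expanded);
    if s.contains("38;2") || s.contains("38:2") {
        SetafKind::DirectRgb
    } else if s.contains("38;5") || s.contains("38:5") {
        SetafKind::Palette256
    } else if s.starts_with("\x1b[3") || s.starts_with("\x1b[9") {
        SetafKind::BasicSgr
    } else {
        SetafKind::Unknown
    }
}
```

Probing a few representative indexes (say 1, 12, and 200) at startup would reveal, for example, that a tmux-direct description renders index 200 as `DirectRgb`, so the renderer knows it has to emit its own `38;5;N` fallback for palette indexes instead of trusting `setaf`.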