This is a fairly far-reaching commit. The idea is:
* Introduce a unicode_version config that specifies the default level
of unicode conformance for each newly created Terminal (each Pane)
* The unicode_version is passed down to the `grapheme_column_width`
function which interprets the width based on the version
* `Cell` records the width so that later calculations don't need to
know the unicode version
In a subsequent diff, I will introduce an escape sequence that allows
setting/pushing/popping the unicode version so that it can be overridden
via eg: a shell alias prior to launching an application that uses a
different version of unicode from the default.
This approach allows output from multiple applications with differing
understanding of unicode to coexist on the same screen a little more
sanely.
Note that the default `unicode_version` is set to 9, which means that
emoji presentation selectors are now ignored by default. This was
selected to better match the level of support in widely deployed
applications.
I expect to raise that default version in the future.
Also worth noting: there are a number of callers of
`unicode_column_width` in things like overlays and lua helper functions
that pass `None` for the unicode version: these will assume the latest
known-to-wezterm/termwiz version of unicode to be desired. If those
overlays do things with emoji presentation selectors, then there may be
some alignment artifacts. That can be tackled in a follow up commit.
refs: #1231
refs: #997
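To make that concrete, here's a minimal config sketch; `unicode_version = 9`
is the new default described above, and I'm assuming a higher value such
as 14 is accepted for opting into newer width rules:

```lua
return {
  -- Interpret grapheme widths according to this unicode version.
  -- 9 (the default) ignores emoji presentation selectors; a higher
  -- value (assumed accepted here) opts into newer width rules.
  unicode_version = 14,
}
```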
* mux: unzoom when switching panes
Add `unzoom_on_switch_pane` config option:
When switching to another pane with ActivatePaneDirection, if the
current pane is zoomed, unzoom and then switch instead of doing nothing.
* Apply suggestions from code review
Co-authored-by: Wez Furlong <wez@wezfurlong.org>
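For reference, enabling the new behavior is a one-liner (a minimal sketch
of the option described above):

```lua
return {
  -- If the current pane is zoomed, unzoom and then switch when using
  -- ActivatePaneDirection, rather than doing nothing.
  unzoom_on_switch_pane = true,
}
```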
The issue seems to correlate with fractional pixels produced
by 0.5cell padding, so let's round down to avoid the fraction.
refs: https://github.com/wez/wezterm/issues/1228
We were mistakenly trying to use the Windows font because of inverted
logic. Tidy up, and prefer the GNOME default theme font (Cantarell),
but fall back to DejaVu Sans.
This commit introduces the `Dimension` type, which allows specifying
a value in a variety of units: pixels, points, cells, percent.
`Dimension` needs contextual information to be evaluated as pixel
values, which makes resolving the value from the config slightly
more of a chore.
However, this type allows more flexible configurations that scale
with the font size and display dpi.
refs: #1124, #291, #195
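As a sketch of how this surfaces in the config, padding values can now
carry units; I'm assuming `window_padding` accepts these string spellings:

```lua
return {
  window_padding = {
    left = '0.5cell',  -- scales with the cell size (and thus font size/dpi)
    right = '0.5cell',
    top = '12px',      -- an absolute pixel value
    bottom = '10%',    -- a fraction of the relevant window dimension
  },
}
```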
Avoid accidentally scaling the tab bar when using IncreaseFontSize.
Use a "better" default title font based on the platform.
Avoid a gap between bottom of tab button and dividing line at
certain font sizes.
`use_fancy_tab_bar` switches to an alternate rendering of the tab
bar that uses the window_frame config to obtain a proportional
title font for rendering tabs, and renders a few additional
elements to space out the tabs and make them feel more like tabs.
Computing the number of tabs that fit doesn't respect the alternate
font at this time.
Formatted tab item foreground and background colors are also
not respected at this time.
refs: #1180
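A minimal sketch of how the two options combine (fields assumed per the
window_frame config mentioned above):

```lua
local wezterm = require 'wezterm'

return {
  use_fancy_tab_bar = true,
  window_frame = {
    -- the proportional font used to render the tabs
    font = wezterm.font 'Cantarell',
    font_size = 12.0,
  },
}
```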
* Trigger a paint immediately from invalidate if not throttled
* Otherwise, defer the other events until we're about to sleep waiting
for xcb events, which should maximize coalescing of resize/expose events
refs: #1051
This makes the comparison in https://github.com/wez/wezterm/issues/544
work for me on mac, linux (x11, wayland), and also on Windows, but
only when using WGL.
It looks like we can use the proper colorspace on all targets
except for ANGLE EGL. For whatever reason, the combination of
glium and ANGLE EGL on windows over-gamma corrects.
AFAICT, the framebuffer and perhaps the surfaces it creates
don't indicate srgb support, and whatever combination of status
values they return tickles glium's srgb handling the wrong way.
I think the "solution" is just to directly use WGL by default.
EGL was on by default because it tended to be more survivable
when graphics card drivers were updated, but the last couple
of times I updated mine it still killed wezterm anyway.
refs: #544
This size seems to be roughly the sweet spot for `time cat bigfile`, cutting the
time in half from the prior value. Making the buffer smaller doesn't
improve things, nor does making it bigger; this is balancing the latency
of applying the update to the model with the back-pressured parsing
latency.
I'm not sure if this is needed now that we have a single draw call, but
based on the history and the nuances of different gl/driver/os quirks,
it feels like a good idea to keep this option in the back pocket.
Previously, we'd use a 1MB buffer to read the output from the
associated pty (blocking), and a second buffer of the same size to
perform the non-blocking read on top of that.
For pathological cases (eg: cat 100MB+ files), we could build a
resulting `Vec<Action>` with over 1 million entries, and it could take as much
as 100ms to apply those actions to the terminal model.
This meant that the output could stutter/lag and appear to be processed
more slowly.
This commit introduces a configuration value for the buffer size for the
second stage, and makes it 10KB in size. This helps to constrain the
size of the Action vec and keeps the incremental processing costs down,
while still managing the same throughput.
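For tuning, the new value is exposed in the config; the option name below
is my assumption for illustration, so check the docs for the actual
spelling:

```lua
return {
  -- ASSUMPTION: illustrative option name for the second-stage
  -- (non-blocking) parser buffer; 10240 bytes is the new default.
  mux_output_parser_buffer_size = 10240,
}
```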