This commit adds some plumbing for describing the cursor shape
(block, line, blinking, etc.) and visibility, and feeds that through
the mux and render layers.
The renderer now knows to omit the cursor when it is not visible.
1f81a064ed added support for noticing
that the dpi scale was not 1 on startup, but the timing of this
signal was different between the opengl and software renderers.
When using the software renderer, we'd end up computing a scaling
change with a pre-change pixel size but adjusted by a post-change
scaling factor, and that effectively caused the window to halve
its size on startup.
This commit improves things by also tracking the dpi in our locally
stored dimensions.
@jsgf suggested that it would be nice to have a degree of padding
around the terminal cells. This commit adds some plumbing for this:
```
[window_padding]
left = 10
top = 0
right = 10
bottom = 0
```
The left and top padding are used to compute the top-left coordinates
of the terminal cells. The right and bottom padding act as minimum
values; the actual padding used may be larger depending on the size
of the window and the number of cells that fit in the available space.
The combination of top padding > 0 and the tab bar still needs some work.
The ongoing saga from a465378dc4,
f204ad9a82, and others!
Only do scale adjustment here for glyphs that are taller rather than
wider.
This makes the ligatures in Fira Code look good again, and still
works with v1 of the Operator Mono fonts.
@sunshowers requested a way to turn this off, so here's a top-level
config option to control whether we perform ligature substitution.
Ideally this would be associated with the font rather than globally,
but threading that information through the various layers is more
difficult than a global setting.
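The option name isn't spelled out in this message; as a hedged sketch, assuming it is the global `harfbuzz_features` list that controls which OpenType features are applied during shaping, turning ligatures off in the TOML config would look something like:
```
# Assumed option name: disable the common ligature-related features
harfbuzz_features = ["calt=0", "clig=0", "liga=0"]
```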
@sunshowers mentioned to me that the window appeared blurry on a hidpi
display on startup, and was fixed by changing focus in a tiling window
manager.
I could replicate this using Weston with scaling set to 2; the issue was
that the initial scale factor change event wasn't fully propagated and
bubbled up as a resize event to the terminal layer.
This commit taps into the dpi change event and forces it to be
interpreted as a window configuration change, resulting in crisper
text.
Show the error message using a toast notification, but fall back to
the defaults; this makes it easier to get in and fix the issue,
rather than silently failing.
This is useful when launched from a GUI launcher and stderr is
not conveniently visible.
If a panic occurs, generate a toast notification with the panic
message so that there is a breadcrumb to follow if the application
does terminate in this way.
Every so often I encounter this situation on my Windows machine
during a graphics card driver update, so this should help to run
that down.
Adds the following options to the top-level configuration,
which allow manipulating the quality of the hinting and
antialiasing performed by freetype:
```
font_antialias = "Subpixel" # None, Greyscale, Subpixel
font_hinting = "Full" # None, Vertical, VerticalSubpixel, Full
```
Refs: https://github.com/wez/wezterm/issues/79
The move to anyhow changed the nature of the error objects
that get passed through to the notification system; since
we use the `context` method and were relying on the normal
display presentation (via `to_string`), we were only reporting
the context and not the underlying error.
Switching to the alternate output format (`{:#}`) makes the
error messages more useful by including the underlying problem
and line number.
Refs: https://github.com/wez/wezterm/issues/78
As mentioned in f204ad9a82, this has
gone back and forth a few times.
This version avoids some artifacts by avoiding scaling in most cases.
The test scenario for this is to tab-complete a directory name in
zsh; that causes a bold `/` glyph to be rendered, which selects a
typeface with different metrics and would render a horizontal line
at either the top or bottom of the glyph.
Similarly, a `/` in italics (e.g. comments in vim) would select a
third font with different artifact properties.
Now both of those artifacts are eliminated.
The issue mentioned in the prior commit was due to not breaking
out of the inner fallback loop, so we'd stack up multiple glyphs
with the same cluster value, causing the earlier versions to be
obscured by the later versions.
Adjust the glyph width calculation: the metrics we were getting
from harfbuzz were synchronized with the font render size. When
we're configured to use allsorts, the shaper metrics are not
connected to the render metrics, and we can end up with the raw
emoji glyph width being much larger than the font advance metric,
which then renders a giant heart emoji. The revised calculation
triggers scaling when the glyph width is too large relative to the
advance. I've gone back and forth on that particular line a couple
of times in the past: hopefully this time we've got the right
calculation?
I "fixed" the font metrics we compute and return from allsorts
by rounding them to the nearest integer value. That makes the
spacing look better at my normal font size (8.3 -> 8) and makes
things look tighter. It feels a bit magical IMO.
Adds the ability to specify `--font-shaper Allsorts` and use that
for metrics and shaping.
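The message only shows the command line flag; if the same selection is also exposed through the configuration file (not shown here), the TOML equivalent would presumably be a single top-level key:
```
# Assumes a `font_shaper` key mirroring the --font-shaper flag;
# harfbuzz is the default shaper
font_shaper = "Allsorts"
```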
It is sufficient to expand things like ligatures, but there's something
slightly off about how the metrics are computed and they differ slightly
from the freetype renderer, which leads to some artifacts when rendering
with opengl.
One of my tests is to `grep message src/main.rs` to pull out the line
that has a selection of emoji. The heart emoji is missing from that
line currently.
Refs: https://github.com/wez/wezterm/issues/66