Sometimes I like to play around with running Ladybird tests using
full-blown Ladybird instead of just the headless browser, so that I can
interactively mess around with the test page. One problem with this is
that the internals object is not exposed in that mode.
This commit supports that use case by adding an option to specifically
expose the internals object without needing to run headless-browser in
test mode.
This lets us remove the color/dimension-parsing lambdas, since these
always forwarded to the same methods. This will make it easier to later
convert those methods to take a TokenStream.
We have to unregister link element stylesheets from the old document's
StyleSheetList when moving them into a new document.
This makes it possible to load GitHub contributor graphs. :^)
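Roughly, the fix boils down to something like this sketch (the names
here are illustrative and may not match the exact API):

    #include <LibWeb/CSS/CSSStyleSheet.h>
    #include <LibWeb/CSS/StyleSheetList.h>
    #include <LibWeb/DOM/Document.h>

    // Illustrative only -- names may not match the exact API. When a <link>
    // element's stylesheet moves with it to a new document, the old
    // document's StyleSheetList must drop it; otherwise it keeps a stale
    // entry and the sheet ends up registered in two lists.
    static void move_link_style_sheet(Web::CSS::CSSStyleSheet& sheet,
                                      Web::DOM::Document& old_document,
                                      Web::DOM::Document& new_document)
    {
        old_document.style_sheets().remove_sheet(sheet);
        new_document.style_sheets().add_sheet(sheet);
    }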
This simply reads the current cycle count from the cycle CSR.
x86-64 uses the similar rdtsc instruction here, which also may or may
not tick at a constant rate.
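For reference, reading the counter looks roughly like this (standalone
sketch, the function name is made up):

    #include <cstdint>

    // Reads a raw cycle count: the cycle CSR on RISC-V, rdtsc on x86-64.
    // Neither counter is guaranteed to tick at a constant rate.
    inline uint64_t read_cycle_count()
    {
    #if defined(__riscv) && __riscv_xlen == 64
        uint64_t cycles;
        asm volatile("csrr %0, cycle" : "=r"(cycles));
        return cycles;
    #elif defined(__x86_64__)
        uint32_t lo, hi;
        asm volatile("rdtsc" : "=a"(lo), "=d"(hi));
        return (static_cast<uint64_t>(hi) << 32) | lo;
    #else
    #    error "Unsupported architecture for this sketch"
    #endif
    }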
The strategy of recreating an object a second time after
DynamicLoader::map is interesting, of course, but unfortunately it
doesn't help the comprehensibility of the code.
Previously, the actual behavior of magic lookup did not match the one
described in its commit message. Instead of being weak definitions in a
library that is always at the end of the load order, the definitions
were normal ones and thus were able to override other weak definitions
in LibC. While this was consistent with how DynamicLoader resolves
ambiguity between normal and weak relocations, it is not the behavior
POSIX mandates -- we should always choose the first available definition
with respect to load order. To fix this problem, the patch makes sure we
don't define any of the magic symbols in LibC.
In addition to this, it makes all provided magic symbols functions
(instead of objects), which renders the MagicWeakSymbol class
unnecessary.
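For illustration only (the symbol name is made up), the provider side
is now just a plain weak function definition, with no MagicWeakSymbol
object wrapper and no competing definition in LibC:

    // Illustrative only -- the symbol name is made up. The provider library
    // (which sits at the end of the load order) supplies a plain weak
    // *function* definition; LibC itself no longer defines the symbol at
    // all, so whichever definition comes first in load order wins, as POSIX
    // requires.
    extern "C" {
    __attribute__((weak)) int __magic_symbol()
    {
        return 0; // fallback used only when nothing earlier provides it
    }
    }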
When WebDriver accesses cookies, it specifically says to run
"the first step of the algorithm in RFC6265 to compute cookie-string",
so we should skip the subsequent steps. We already skip step 2, which
sorts the cookies, but we neglected to skip step 3, which updates their
last access time.
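A rough sketch of the intended control flow, using made-up helper names
rather than the actual CookieJar API:

    #include <AK/String.h>
    #include <AK/StringView.h>
    #include <AK/Time.h>
    #include <AK/Vector.h>

    // Illustrative only -- not the actual CookieJar API. WebDriver gets the
    // result of step 1 (selecting matching cookies) of the RFC 6265
    // "compute cookie-string" algorithm, and skips the sorting in step 2
    // and the last-access-time update in step 3 that an HTTP request does.
    struct Cookie {
        String name;
        String value;
        UnixDateTime last_access_time;
    };

    enum class Source { Http, WebDriver };

    Vector<Cookie> select_matching_cookies(StringView url);      // hypothetical helper
    void sort_by_path_length_and_creation_time(Vector<Cookie>&); // hypothetical helper

    Vector<Cookie> cookies_for(StringView url, Source source)
    {
        auto matching = select_matching_cookies(url);      // step 1
        if (source == Source::WebDriver)
            return matching;                               // steps 2 and 3 skipped
        sort_by_path_length_and_creation_time(matching);   // step 2
        for (auto& cookie : matching)
            cookie.last_access_time = UnixDateTime::now(); // step 3
        return matching;
    }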
We can't rely on Element.setAttribute() in cloneNode() since that will
throw on weird attribute names. Instead, just follow the spec and copy
attributes into cloned elements verbatim.
This fixes a crash when loading the "issues" tab on GitHub repos.
They are actually sending us unintentionally broken markup, but we
should still support cloning it. :^)
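The cloning step now amounts to something like this sketch (method
names are approximations, not the exact signatures):

    #include <LibWeb/DOM/Attr.h>
    #include <LibWeb/DOM/Element.h>
    #include <LibWeb/DOM/NamedNodeMap.h>

    // Illustrative only -- method names are approximations, not the exact
    // signatures. Copy each attribute onto the clone verbatim instead of
    // going through Element::set_attribute(), which validates attribute
    // names and throws on the kind of broken names GitHub sends us.
    static void copy_attributes_verbatim(Web::DOM::Element const& original, Web::DOM::Element& clone)
    {
        auto const& attributes = *original.attributes();
        for (size_t i = 0; i < attributes.length(); ++i) {
            auto const* attribute = attributes.item(i);
            clone.append_attribute(attribute->name(), attribute->value());
        }
    }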
This just reuses the layout to compute the bounding box rather than
implementing the full bounding box algorithm. Reusing the layout is
likely the slower option, but it is also much simpler to implement.
This is enough to fix the faces on https://zengm.com/facesjs/.
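In other words, roughly this (placeholder names and types, not the real
code):

    // Placeholder names and types -- a sketch of the shortcut only. Report
    // the rectangle the layout pass already computed for the element,
    // instead of implementing the full SVG bounding-box algorithm over the
    // geometry data.
    struct FloatRect { float x { 0 }, y { 0 }, width { 0 }, height { 0 }; };

    struct LayoutNode {
        FloatRect rect; // filled in by the layout pass
    };

    static FloatRect get_bounding_box(LayoutNode const* layout_node)
    {
        if (!layout_node)
            return {};            // not laid out yet: report an empty box
        return layout_node->rect; // reuse the layout result as the bounding box
    }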
Capturing a struct that owns a bunch of JS::Handles makes it very hard
to understand what keeps these handles alive in the GC graph.
Instead, let's capture only the members of the struct that are actually
used in the callback.
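A small sketch of the pattern, with made-up names:

    #include <LibJS/Heap/Handle.h>
    #include <LibJS/Runtime/Object.h>

    // Made-up names -- only the capture-list pattern matters. Capturing the
    // whole struct hides which JS::Handles the closure keeps alive in the
    // GC graph; capturing the individual members makes it explicit.
    struct PendingOperation {
        JS::Handle<JS::Object> promise;
        JS::Handle<JS::Object> global;
        // ...plus other state the callback never touches...
    };

    auto make_callback(PendingOperation const& operation)
    {
        // Before: [operation] { ... } kept every handle in the struct alive.
        // After: only the handle that is actually used is captured.
        return [promise = operation.promise] {
            // ...resolve the promise...
        };
    }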
With FAT write support, copying a file would show two errors, because
FAT does not support chown and chmod and returns ENOTSUP for them. After
this commit, these errors are ignored in that case.
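The error handling is roughly equivalent to this sketch (not the exact
cp code):

    #include <AK/Format.h>
    #include <errno.h>
    #include <sys/stat.h>
    #include <unistd.h>

    // Illustrative only -- on filesystems like FAT that track neither
    // ownership nor POSIX permission bits, chown() and chmod() fail with
    // ENOTSUP; treat that as harmless instead of reporting two errors for
    // every file copied.
    static void preserve_ownership_and_mode(char const* destination_path, struct stat const& source_stat)
    {
        if (chown(destination_path, source_stat.st_uid, source_stat.st_gid) < 0 && errno != ENOTSUP)
            warnln("cp: failed to preserve ownership of {} (errno {})", destination_path, errno);
        if (chmod(destination_path, source_stat.st_mode) < 0 && errno != ENOTSUP)
            warnln("cp: failed to preserve mode of {} (errno {})", destination_path, errno);
    }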
This is a large commit because it implements a lot of stuff to make
add_child simpler to get working. This allows us to create new files
on a FAT partition.
This is the first part of write support: it allows full file
modification, but not creating or removing files yet.
Co-Authored-By: implicitfield <114500360+implicitfield@users.noreply.github.com>
Caching the cluster list allows us to fill the two fields in the
InodeMetadata. While at it, don't cache the metadata: once we have
write support, having to keep both the InodeMetadata and the FATEntry
correct is going to get very annoying.