Summary:
Write commit cloud sync and clienttelemetry blackbox logs to tracing data.
Note: since the metalog can already answer head changes, I didn't add commit
cloud sync head changes to the tracing data.
Reviewed By: DurhamG
Differential Revision: D19797700
fbshipit-source-id: b89924a7aa5e6027cad5c8138e8988f6b0ea4b2a
Summary: This allows us to query tracing data for watchman commands.
Reviewed By: DurhamG
Differential Revision: D19797711
fbshipit-source-id: 4dfd50fff820da70888faa0fe8f53af25f205137
Summary:
The repo.pull API updates remote bookmarks on its own. Therefore there is no
need to ask the server to listkeys (all) bookmarks.
This also removes the need for listkeys(bookmarks) in the new clone API.
Reviewed By: DurhamG
Differential Revision: D21011393
fbshipit-source-id: b10bdbc82563c32626bdcb2632170fd56819e904
Summary:
The core `clone.streamclone` is the new clean way to do a streaming clone with
selectivepull. Detect the use of it and skip remotenames' own exclone logic.
Reviewed By: DurhamG
Differential Revision: D21011396
fbshipit-source-id: 50fdbf4c2761a96c50e23f21a87ef636fac74afb
Summary:
The current `clone --shallow` command has some issues:
- It fetches *all* remote bookmarks, since selectivepull does not work with
streamclone, then removes most remote bookmarks in a second transaction.
- It goes through remotenames, which is racy, and D20703268 does not fix the
clone case. Possible cause of T65349853.
- Too many wrappers (ex. in remotefilelog, remotenames, fastdiscovery) with
many configurations (ex. narrow-heads on/off) make it hard to reason about.
Instead of band-aiding the clone function, this diff adds a new clone
implementation that aims to solve the issues:
- Use streamclone, but do not pull all remote names.
- Pull selectivepull names explicitly with a working "discovery" strategy
(repo heads should be non-empty with narrow-heads on or off).
- Do clone in one transaction. The outside world won't see an incomplete state.
- Use `repo.pull` API, which is not subject to race conditions.
- Eventually, this might be the only supported "clone" once Mononoke becomes
the single source of truth.
Note: the code path still goes through bookmarks.py and remotenames.py.
They will be cleaned up in upcoming diffs.
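The one-transaction design above can be sketched with a toy transaction object. All names here are illustrative, not the real hg API:

```python
class Transaction:
    """Toy transaction: buffer writes, apply them all-or-nothing."""

    def __init__(self, store):
        self.store = store
        self.pending = {}

    def set(self, key, value):
        self.pending[key] = value

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            # Commit: the outside world only ever sees the complete state.
            self.store.update(self.pending)
        # On error, pending writes are simply dropped.
        return False


def clone(store, stream_data, selective_names):
    """Sketch of the new clone: streamclone + selective pull, one transaction."""
    with Transaction(store) as tx:
        tx.set("changelog", stream_data)        # streamclone, no remote names
        tx.set("remotenames", selective_names)  # pull selectivepull names explicitly
```

If any step raises, nothing is committed, which is the property the real clone wants for avoiding observable half-cloned repos.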
Reviewed By: DurhamG
Differential Revision: D21011401
fbshipit-source-id: d8751ac9bd643e9661e58c87b683be285f0dc925
Summary:
In the past we hid the revlog headrevs API, on the theory that calculating
heads in the DAG was not going to scale, and that heads should be based on
references (remotenames, visible heads). In practice, calculating heads in the
DAG based on the segmented changelog is not painfully slow, so we can probably
afford it.
Therefore let's just re-expose the DAG-based heads API as rawheads. The only
user of it is in dagutil.py.
This will be used in the next diff where streamclone first gets the revlog
changelog copied without remote bookmarks. Then it needs to do a pull
which requires the heads information.
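A DAG-based heads calculation over a plain parents map can be sketched like this (illustrative only; the real code works on the revlog / segmented changelog):

```python
def rawheads(parents):
    """Return the DAG heads: nodes that are not a parent of any other node.

    `parents` maps each node to the list of its parent nodes.
    """
    all_parents = {p for ps in parents.values() for p in ps}
    return sorted(n for n in parents if n not in all_parents)
```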
Reviewed By: DurhamG
Differential Revision: D21296530
fbshipit-source-id: a81a61e3b58c921a3390fda8f716bd7ae0e55ed1
Summary:
Move the logic to write repo hgrc ([paths]) and set [paths] config options
earlier, so other logic can use the [paths] config.
Some tests are changed because remotenames can now write bookmarks in more
cases.
Reviewed By: DurhamG
Differential Revision: D21011397
fbshipit-source-id: 4b921a02c20daeef31d44a03264a89b975303aa5
Summary:
"[paths] being empty" will no longer be a way to test initial clone, use
transaction name instead.
Reviewed By: DurhamG
Differential Revision: D21011395
fbshipit-source-id: e257fe8eb2efd45ac52fad7c74363151b0a8c417
Summary: This will be used by hg-git to test initial clone.
Reviewed By: DurhamG
Differential Revision: D21011400
fbshipit-source-id: 11a1a41631830273a6407e419ebe5ff21964e7de
Summary: It is not used and makes the already complicated clone logic more complicated.
Reviewed By: DurhamG
Differential Revision: D21011394
fbshipit-source-id: 3620f7372a9f3cefc60618052c768c6c2cbe04f9
Summary:
They are not used. Remove them to make it a bit easier to move
stream_out_shallow to core.
Note: this does not remove all include/excludepatterns yet.
Reviewed By: DurhamG
Differential Revision: D21011403
fbshipit-source-id: f6d27a3e2472f6c69f95a958ac99f75a8b8f8b74
Summary: It will be used in the next change.
Reviewed By: DurhamG
Differential Revision: D21011399
fbshipit-source-id: 6bdffc79af0474e42562686109417882a8cb2cd6
Summary:
Add the `hg cloud hide` command. This allows removal of commits, bookmarks and
remote bookmarks from a cloud workspace, even when the items are omitted
locally.
Reviewed By: DurhamG, quark-zju
Differential Revision: D21409384
fbshipit-source-id: 24b64c207c78f9b0258e9cf6a578db7b14c84901
Summary: This will reduce the amount of space they take in scuba.
Reviewed By: xavierd
Differential Revision: D21483472
fbshipit-source-id: 9de49dedef480932f8583dd17fe6625d222a3285
Summary: This allows us to query tracing data for fsmonitor walk events.
Reviewed By: DurhamG
Differential Revision: D19797709
fbshipit-source-id: 1ff76dd6122cf56787e7928711f604f9c3d571cc
Summary:
Pass `configparser::config::ConfigSet` to `repack` in
`revisionstore/src/repack.rs` so that we can use various config values in `filter_incrementalpacks`.
* `repack.maxdatapacksize`, `repack.maxhistpacksize`: the overall max pack size
* `repack.sizelimit`: the size limit for any individual pack
* `repack.maxpacks`: the maximum number of packs we want to have after repack (overrides sizelimit)
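How such limits might combine in `filter_incrementalpacks`, sketched with invented names (the real code is Rust in revisionstore, and this simplification glosses over details such as exactly how maxpacks overrides sizelimit):

```python
def filter_incrementalpacks(packs, size_limit, max_packs, max_total_size):
    """Pick packs to fold into an incremental repack.

    `packs` is a list of (name, size) pairs. Packs over `size_limit` are
    left alone; of the rest, smallest first, at most `max_packs` are
    chosen and the running total is capped at `max_total_size`.
    """
    small = sorted((p for p in packs if p[1] <= size_limit), key=lambda p: p[1])
    chosen, total = [], 0
    for name, size in small:
        if len(chosen) >= max_packs or total + size > max_total_size:
            break
        chosen.append(name)
        total += size
    return chosen
```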
Reviewed By: xavierd
Differential Revision: D21484836
fbshipit-source-id: 0407d50dfd69f23694fb736e729819b7285f480f
Summary:
Let's not allow proceeding with memcommit if the repo is locked. This is what
the normal push flow does, so we should do the same here.
Reviewed By: markbt
Differential Revision: D21502435
fbshipit-source-id: 80e665f065fb0cd882bc99482769a3de01d3de30
Summary:
If http_proxy.no is set, we should respect it and avoid sending traffic
through the proxy for hosts that match it.
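Respecting a no-proxy list boils down to a host / domain-suffix match; a minimal sketch of the assumed semantics (not the actual client code):

```python
def bypass_proxy(host, no_proxy):
    """Return True if `host` matches an entry in the comma-separated
    no-proxy list: '*' matches everything, otherwise an exact host
    or domain-suffix match."""
    for entry in no_proxy.split(","):
        entry = entry.strip().lstrip(".")
        if not entry:
            continue
        if entry == "*" or host == entry or host.endswith("." + entry):
            return True
    return False
```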
Reviewed By: wez
Differential Revision: D21383138
fbshipit-source-id: 4c8286aaaf51cbe19402bcf8e4ed03e0d167228b
Summary:
When Qing implemented all the get methods, the translate_lfs_missing function
didn't exist, and I forgot to add calls to it in the right places when landing
the diff that added it. Fix this.
Reviewed By: sfilipco
Differential Revision: D21418043
fbshipit-source-id: baf67b0fe60ed20aeb2c1acd50a209d04dc91c5e
Summary: Make them reusable in other Python bindings, ex. pymutation.
Reviewed By: sfilipco
Differential Revision: D21486524
fbshipit-source-id: 258455c6a442353c77588fadcb560cb5a170926e
Summary: This makes it easier to visualize a MemNameDag.
Reviewed By: sfilipco
Differential Revision: D21486523
fbshipit-source-id: c65f1fc421bd654dc820faae3c93f2aa57f910d4
Summary:
This will allow clients to operate on MemNameDag.
Unfortunately, it isn't that easy to reuse code in `py_class!`. Since they are
just thin wrappers, I live with the copy-paste for now.
Reviewed By: sfilipco
Differential Revision: D21479015
fbshipit-source-id: ddcc7f5c7ede6bb1e9c73d058779805875b09200
Summary: This would be handy to visualize a MemNameDag.
Reviewed By: sfilipco
Differential Revision: D21486522
fbshipit-source-id: c8d7147dc53a1a7c1b8b09ce055493c69cceba2f
Summary:
Use MemNameDag::from_ascii to simplify the tests. This removes the need for:
- using tempdir
- converting between Id and VertexName manually via an IdMap
- depending on drawdag directly
Reviewed By: sfilipco
Differential Revision: D21486519
fbshipit-source-id: f04061d8892f043de40e7e321273acc51e15308a
Summary:
It seems handy to construct a Dag just from ASCII. Therefore move it to a
public interface.
Reviewed By: sfilipco
Differential Revision: D21486525
fbshipit-source-id: de7f4b8dfcbcc486798928d4334c655431373276
Summary:
They are part of the read-only algorithms that are not specific to a certain
type of NameDag.
Reviewed By: sfilipco
Differential Revision: D21479017
fbshipit-source-id: 3fa58071ac43246d3cd45d84384ee93c7385f414
Summary:
Adds an in-memory NameDag so we can construct the DAG and use its algorithms
by just providing a parents function and heads.
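Conceptually, building a DAG from just a parents function and heads is a graph walk; a minimal Python sketch of the idea (the real implementation is the Rust MemNameDag, with a very different API):

```python
def build_dag(parent_func, heads):
    """Walk `parent_func` from `heads`; return a node -> parents map."""
    dag, stack = {}, list(heads)
    while stack:
        node = stack.pop()
        if node in dag:
            continue
        dag[node] = parent_func(node)
        stack.extend(dag[node])
    return dag


def ancestors(dag, node):
    """All ancestors of `node` in the walked DAG, including itself."""
    seen, stack = set(), [node]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(dag[n])
    return seen
```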
Reviewed By: sfilipco
Differential Revision: D21479021
fbshipit-source-id: e12d53a97afec77b2307d5efbb280bd506dee0ba
Summary: Adds an in-memory IdMap to be used in an in-memory NameDag.
Reviewed By: sfilipco
Differential Revision: D21479018
fbshipit-source-id: bc702762b059e8659c6ab322f3c39f032e95d5b6
Summary:
This allows them to switch to a different IdMap implementation relatively
easily.
Reviewed By: sfilipco
Differential Revision: D21479023
fbshipit-source-id: 8ecb99cafe2093ec7d14b848ffa08581c5300414
Summary: This will allow different IdMap implementations.
Reviewed By: sfilipco
Differential Revision: D21479016
fbshipit-source-id: 852501896fddcb82624338acd9dceee41150e302
Summary:
`NameDag::add_heads` API changes the internal `dag` state without updating
`snapshot_map`. That will cause queries relying on `snapshot_map` to fail.
Update it so that `snapshot_map` gets updated by `add_heads`.
Reviewed By: sfilipco
Differential Revision: D21479019
fbshipit-source-id: 70528aa4a488cef3dc71bf21dd89e45cfe763794
Summary:
This makes it easier to add an "in-memory-only" NameDag with all the algorithms
implemented.
Reviewed By: sfilipco
Differential Revision: D21479020
fbshipit-source-id: c1a73e95f3291c273c800650f70db2a7eb0966d7
Summary:
Commit cloud now uses `mercurial.json` rather than `json`. This doesn't
support the `indent` arg, so remove this from the debug output when
`debugrequests` is enabled.
Reviewed By: farnz
Differential Revision: D21500306
fbshipit-source-id: ae436e9c32d1d2da432eeb93d114115ea80b825b
Summary: If no LFS blobs need uploading, then don't try to connect to the LFS server in the first place.
Reviewed By: DurhamG
Differential Revision: D21478243
fbshipit-source-id: 81fa960d899b14f47aadf2fc90485747889041e1
Summary:
When LFS is enabled, only bundle3 is supported, so in this case we have to
hack the exchange code a bit to always choose bundle3.
This is copied verbatim from the lfs extension.
Reviewed By: DurhamG
Differential Revision: D21459734
fbshipit-source-id: 41c867cec09e2485ec1e9d91545b61da568f4766
Summary: This is according to the suggestion in the discussion referenced in the task. Per quark-zju we do need to change `rage` to use `hg sparse` rather than `hg sparse show`.
Reviewed By: quark-zju
Differential Revision: D21422005
fbshipit-source-id: 6dd0e20125635c7fb9b6ea6c9e2b35c8fb517d5d
Summary: Added the `status` field to the JSON output in order to provide that information to automated clients, as well as to match similar output of `hg status`.
Reviewed By: quark-zju
Differential Revision: D21421494
fbshipit-source-id: 2a8b80068f2068b09930b90c43252003421b324e
Summary: Fixed `hg sparse show` failing on missing profiles. They are now shown in the output with a "!" symbol in cyan, which matches the output for deleted files in `hg status`.
Reviewed By: quark-zju
Differential Revision: D21419278
fbshipit-source-id: 5581e67774686a5240dceb9aac428fac3b1b73c2
Summary:
Remove HgIdDataStore::get_delta and all implementations. Remove HgIdDataStore::get_delta_chain from the trait, remove all unnecessary implementations, and remove all implementations from the public Rust API. Leave the Python API in place and introduce "delta-wrapping".
MutableDataPack::get_delta_chain must remain in some form, as it is necessary to implement get using a sequence of Deltas. It has been moved to a private inherent impl.
DataPack::get_delta_chain must remain in some form for the same reasons, and in fact both implementations can probably be merged, but it is also used in repack.rs for the free function repack_datapack. There are a few ways to address this without making DataPack::get_delta_chain part of the public API. I've currently chosen to make the method pub(crate), i.e. visible only within the revisionstore crate. Alternatively, we could move the repack_datapack function to a method on DataPack, use a trait in a private module, or use some other technique to restrict visibility to only where necessary.
UnionDataStore::get has been modified to call get on its sub-stores and return the first result that matches the given key.
MultiplexDeltaStore has been modified to implement get similarly to UnionDataStore.
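The UnionDataStore::get behavior described above, sketched in Python (the real implementation is Rust in revisionstore):

```python
class UnionDataStore:
    """Query sub-stores in order; return the first hit, or None."""

    def __init__(self, stores):
        self.stores = stores

    def get(self, key):
        for store in self.stores:
            value = store.get(key)
            if value is not None:
                return value
        return None
```

The usage below uses plain dicts as stand-ins for sub-stores, since both expose a `get` that returns None on a miss.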
Reviewed By: xavierd
Differential Revision: D21356420
fbshipit-source-id: d04e18a0781374a138395d1c21c3687897223d15
Summary:
Update `contrib/check-code.py` to Python 3.
Mostly it was already compatible; however, stricter regular expression parsing
revealed a case where one of our tests wasn't working, and as a result lots of
instances of `open(file).read()` existed that this test should have caught.
I have fixed up most of the instances in the code, although there are many
in the test suite that I have ignored for now.
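The pattern the check flags, and the deterministic fix, look roughly like this:

```python
# Bad: the file object is never explicitly closed. CPython closes it
# when the object is garbage-collected, but that is not guaranteed to
# happen promptly on other implementations.
#     data = open(path).read()

# Good: the context manager closes the file deterministically.
def read_file(path):
    with open(path) as f:
        return f.read()
```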
Reviewed By: quark-zju
Differential Revision: D21427212
fbshipit-source-id: 7461a7c391e0ade947f779a2b476ca937fd24a8d
Summary: Reformat using a newer version of Black.
Reviewed By: quark-zju
Differential Revision: D21426337
fbshipit-source-id: 1ac7f6e85a06feec0d41e9509eca09194f421a1d
Summary:
The latest version of Black removes unnecessary parentheses. Mercurial's
test-check-code currently uses extra parentheses to signal untranslated
strings, so Black's reformatting breaks this test.
Capitulate to Black by adding a new `_x` translation marker that means "untranslated".
Reviewed By: quark-zju
Differential Revision: D21426335
fbshipit-source-id: a6c26d7c6365c49530a7dee3a5f9ed71ff166835
Summary:
Implement autopull so non-automation `hg pull -r Dxxxx`, `hg up Dxxxx` will
pull `Dxxxx` automatically.
Since we now autopull the commits, the error messages are removed. The old
code actually caused issues because it raised inside `"D1234" in repo`, which
surprised many code paths, including the revset autopull logic, which
uses `x in repo` to decide whether `x` is unknown.
Note `hg pull -r Dxxxx` is not using the revset layer and needs to be handled
separately. It works for hg servers right now because the server can translate
`Dxxx` to a commit hash. It probably does not work for a Mononoke server.
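The `x in repo` contract matters here: membership tests should return False for unknown names instead of raising, so callers can decide to autopull. A toy sketch with invented names:

```python
class Repo:
    """Toy repo: unknown names trigger an autopull attempt on lookup."""

    def __init__(self, commits, pull_func):
        self.commits = set(commits)
        self.pull_func = pull_func

    def __contains__(self, name):
        # Never raise for unknown names: callers rely on `x in repo`
        # to decide whether an autopull should even be attempted.
        return name in self.commits

    def lookup(self, name):
        if name not in self:
            # Unknown name (ex. "Dxxxx"): try pulling it automatically.
            self.pull_func(self, name)
        if name not in self:
            raise KeyError(name)
        return name
```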
Reviewed By: sfilipco
Differential Revision: D21320626
fbshipit-source-id: 939abe12e3a9a8ed5ca7ed29bb4f90fb39e7674a
Summary:
Change the interface to return infallible `Optional[node]` instead of fallible
`List[rev]`. In the next diff, we're going to use the 'node' information to
implement autopull.
Reviewed By: sfilipco
Differential Revision: D21320628
fbshipit-source-id: 6f7c070faba667cc85313cc78d6149c787ca8593
Summary:
Currently, phrevset picks the "successor" that has the maximum revision
number. That depends on the assumption that larger revision numbers
were modified last. However, that's not always true with commit cloud
sync. For example, in my repo I have D21179514 modified from 63768bf43
to 684612d5d. Phabricator has 63768bf43. The local successor is 684612d5d,
but the revision number of 684612d5d is smaller than that of 63768bf43:
  quark@devvm1939 ~/hg/edenscm/hgext % hg log -r 'D21179514'
  changeset:   63768bf436d01982a8d42ce97160ac6d9ae2cdad  D21179514
  user:        Jun Wu <quark@fb.com>
  date:        Wed, 22 Apr 2020 09:45:50 -0700
  summary:     [hg] commitcloud: log metalog root during update references

  quark@devvm1939 ~/hg/edenscm/hgext % lhg log -r 'D21179514'
  changeset:   684612d5d606b01c224889f2b3f87aff7b93db49  D21179514 (@)
  user:        Jun Wu <quark@fb.com>
  date:        Wed, 22 Apr 2020 10:10:37 -0700
  summary:     [hg] commitcloud: log metalog root during update references

  quark@devvm1939 ~/hg/edenscm/hgext % lhg log -r 'successors(63768bf436d01982a8d42ce97160ac6d9ae2cdad)' -T '{node} {rev}\n'
  684612d5d606b01c224889f2b3f87aff7b93db49 76718
  63768bf436d01982a8d42ce97160ac6d9ae2cdad 95363
Improve it by preferring a non-obsoleted successor.
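The improved selection can be sketched as: among the known successors, prefer one that is not itself obsoleted, falling back to the maximum revision number only if all candidates are obsoleted (toy data model, not the real phrevset code):

```python
def pick_successor(successors, obsoleted):
    """Pick a successor of a commit.

    `successors` is a list of (node, rev) pairs; `obsoleted` is the set
    of nodes that are themselves obsoleted. Prefer a non-obsoleted
    successor; only fall back to the highest revision number when every
    candidate is obsoleted.
    """
    visible = [s for s in successors if s[0] not in obsoleted]
    candidates = visible or successors
    return max(candidates, key=lambda s: s[1])[0]
```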
Reviewed By: sfilipco
Differential Revision: D21267552
fbshipit-source-id: d43d72a7c273c55af70bb41ad967fff0c78a452a
Summary: This is more efficient (if pull attempts have a high chance of succeeding).
Reviewed By: sfilipco
Differential Revision: D21320627
fbshipit-source-id: a2166f5a59f98c7d705c806b9d152ceb9981f3be