bundle: fix performance regression when bundling file changes (issue4031)

Somewhere before 2.7, a change [82beb9b16505] was committed that
introduced a large performance regression when bundling (and therefore
remote cloning) repositories. For each file in the repository, it would
recompute the set of needed changesets even though that set is the same
for all files. This computation dominated bundle runtime in profiler
output (by 10x or more).
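The pattern behind the fix is hoisting a loop-invariant computation out of the per-file path. A minimal sketch (the function and parameter names here are illustrative, not Mercurial's actual API):

```python
def bundle_slow(files, clnodes, cl_rev):
    """Regressed behavior: rebuild the 'needed' set once per file."""
    total = 0
    for fname in files:
        # Recomputed on every iteration, although it never changes (the bug).
        needed = set(cl_rev(x) for x in clnodes)
        total += len(needed)
    return total

def bundle_fast(files, clnodes, cl_rev):
    """Fixed behavior: build the 'needed' set once and reuse it."""
    # Hoisted out of the loop; computed a single time (the fix).
    needed = set(cl_rev(x) for x in clnodes)
    total = 0
    for fname in files:
        total += len(needed)
    return total
```

Both return the same result; the fast variant simply calls `cl_rev` O(changesets) times instead of O(files × changesets).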
Author: Antoine Pitrou
Date:   2013-09-07 21:20:00 +02:00
Parent: fd04e86dd0
Commit: ecc34bd7c0


@@ -354,11 +354,11 @@ class bundle10(object):
         progress(msgbundling, None)
         mfs.clear()
+        needed = set(cl.rev(x) for x in clnodes)
         def linknodes(filerevlog, fname):
             if fastpathlinkrev:
                 ln, llr = filerevlog.node, filerevlog.linkrev
-                needed = set(cl.rev(x) for x in clnodes)
                 def genfilenodes():
                     for r in filerevlog:
                         linkrev = llr(r)