packs: refresh stores after fetching packs

Summary:
The prefetch code was not refreshing the pack stores after writing new
data into pack files. As a result, in some cases we would execute a
prefetch and then fail to find the entry when we went to read it from the
store. That miss triggered a one-by-one download, which happened to
refresh the pack store as a side effect (remotecontentstore called
markforrefresh), finally making the rest of the prefetched data visible.

This fixes it by making prefetch itself responsible for refreshing the pack
stores.
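
In rough terms, the new flow looks like the sketch below. This is a minimal
illustration rather than the actual code (prefetchpacks and the surrounding
structure are simplified), but fileslog, contentstore, metadatastore and
markforrefresh match the diff further down:

    def prefetchpacks(repo, fileservice, fileids, fetchdata, fetchhistory):
        # Sketch only: fetch the requested keys into new pack files on disk.
        datapackpath, histpackpath = fileservice._fetchpackfiles(
            fileids, fetchdata, fetchhistory
        )
        # New behavior: mark the stores for refresh so the entries we just
        # wrote are visible to the very next read, instead of waiting for a
        # periodic refresh or a one-by-one download to trigger one.
        if datapackpath:
            repo.fileslog.contentstore.markforrefresh()
        if histpackpath:
            repo.fileslog.metadatastore.markforrefresh()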

The missing refresh also caused problems when prefetching LFS files: the hg
fetch first retrieves the LFS pointer metadata, and that metadata is then
read to perform the actual LFS fetch. Since the freshly written metadata was
not yet visible (the packs had not been refreshed), this resulted in
effectively infinite recursion until the default 100ms refresh interval
passed.
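
Schematically, the pre-fix LFS path degenerated as in the sketch below (all
names here are illustrative, not the real LFS integration API):

    def getlfsfile(store, fileservice, name, node):
        # hg prefetch writes the LFS pointer metadata into a new pack...
        fileservice.prefetch([(name, node)], fetchdata=True, fetchhistory=False)
        # ...but the pack store has not been refreshed, so this read misses.
        pointer = store.get(name, node)
        if pointer is None:
            # Before this change we recursed here until the default 100ms
            # refresh interval finally made the new pack visible.
            return getlfsfile(store, fileservice, name, node)
        # With the pointer metadata in hand, do the actual LFS blob fetch.
        return store.fetchlfsblob(pointer)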

Differential Revision: D15171701

fbshipit-source-id: 190b1da542675265efaad75a2a6826cbe3c9631c
Durham Goode 2019-05-03 13:06:12 -07:00 committed by Facebook Github Bot
parent 5588453fb6
commit c5b3e4bcb6
3 changed files with 9 additions and 2 deletions


@@ -251,7 +251,6 @@ class remotecontentstore(object):
         self._fileservice.prefetch(
             [(name, hex(node))], force=True, fetchdata=True, fetchhistory=False
         )
-        self._shared.markforrefresh()
 
     def get(self, name, node):
         self._prefetch(name, node)


@@ -419,6 +419,7 @@ class fileserverclient(object):
         self.connect()
         perftrace.traceflag("packs")
         cache = self.remotecache
+        fileslog = self.repo.fileslog
 
         total = len(fileids)
         totalfetches = 0
@@ -433,8 +434,10 @@
         getkeys = [file + "\0" + node for file, node in fileids]
         if fetchdata:
             cache.getdatapack(getkeys)
+            fileslog.contentstore.markforrefresh()
         if fetchhistory:
             cache.gethistorypack(getkeys)
+            fileslog.metadatastore.markforrefresh()
 
         # receive both data and history
         misses = []
@@ -476,6 +479,12 @@
             datapackpath, histpackpath = self._fetchpackfiles(
                 misses, fetchdata, fetchhistory
             )
+            if datapackpath:
+                fileslog.contentstore.markforrefresh()
+            if histpackpath:
+                fileslog.metadatastore.markforrefresh()
 
             self.updatecache(datapackpath, histpackpath)
         finally:
             os.umask(oldumask)


@@ -161,7 +161,6 @@ class remotemetadatastore(object):
         self._fileservice.prefetch(
             [(name, hex(node))], force=True, fetchdata=False, fetchhistory=True
         )
-        self._shared.markforrefresh()
 
     def getancestors(self, name, node, known=None):
         self._prefetch(name, node)