util: increase chunk size from 128KB to 2MB

Summary:
On connections that might experience higher latency, clone throughput tends to
be spiky; raising the chunk size results in more consistent throughput.

Given that hardware these days has plenty of memory, this change shouldn't make
any significant difference to memory usage either.

Reviewed By: farnz

Differential Revision: D14502443

fbshipit-source-id: 0e90f955304e9955df0ec69b5733db7c34b09e83
Johan Schuijt-Li 2019-03-18 07:24:29 -07:00 committed by Facebook Github Bot
parent 3ca494b062
commit 3e436ef80b


@@ -2105,9 +2105,9 @@ class chunkbuffer(object):
         return "".join(buf)
-def filechunkiter(f, size=131072, limit=None):
+def filechunkiter(f, size=2097152, limit=None):
     """Create a generator that produces the data in the file size
-    (default 131072) bytes at a time, up to optional limit (default is
+    (default 2MB) bytes at a time, up to optional limit (default is
     to read all data). Chunks may be less than size bytes if the
     chunk is the last chunk in the file, or the file is a socket or
     some other type of file that sometimes reads less data than is
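
The hunk above only shows the signature and docstring of `filechunkiter`; its body
is not part of this diff. A minimal sketch of such a chunked-read generator,
written from the behavior the docstring describes (not the actual Mercurial
implementation):

```python
def filechunkiter(f, size=2097152, limit=None):
    """Yield chunks of up to ``size`` bytes read from file-like object ``f``,
    stopping after ``limit`` total bytes if a limit is given.

    Sketch based on the docstring in the diff; chunks may be shorter than
    ``size`` at end of file or when the underlying read returns less data.
    """
    while True:
        # Never request more than the remaining limit allows.
        nbytes = size if limit is None else min(limit, size)
        s = nbytes and f.read(nbytes)
        if not s:
            break
        if limit:
            limit -= len(s)
        yield s
```

For example, iterating a 5 MB stream with the new default yields two 2 MB
chunks followed by a final 1 MB chunk, whereas the old 128 KB default produced
forty 128 KB chunks, each requiring a separate read round trip.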