# demandimport.py - global demand-loading of modules for Mercurial
#
# Copyright 2006, 2007 Matt Mackall <mpm@selenic.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
'''
demandimport - automatic demandloading of modules

To enable this module, do:

  import demandimport; demandimport.enable()

Imports of the following forms will be demand-loaded:

  import a, b.c
  import a.b as c
  from a import b,c # a will be loaded immediately

These imports will not be delayed:

  from a import *
  b = __import__(a)

'''
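
# A minimal usage sketch (the imported module name "foo" is hypothetical):
# once enable() replaces builtins.__import__, a plain "import" statement only
# binds a _demandmod proxy; the real module is imported on first attribute
# access.
#
#   import demandimport
#   demandimport.enable()
#   import foo            # no real import yet; "foo" is a _demandmod proxy
#   foo.bar()             # first attribute access triggers the actual import
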
from __future__ import absolute_import

import contextlib
import os
import sys

# __builtin__ in Python 2, builtins in Python 3.
try:
    import __builtin__ as builtins
except ImportError:
    import builtins

contextmanager = contextlib.contextmanager

_origimport = __import__

nothing = object()

# Python 3 doesn't have relative imports nor level -1.
level = -1
if sys.version_info[0] >= 3:
    level = 0
_import = _origimport

def _hgextimport(importfunc, name, globals, *args, **kwargs):
    try:
        return importfunc(name, globals, *args, **kwargs)
    except ImportError:
        if not globals:
            raise
        # extensions are loaded with "hgext_" prefix
        hgextname = 'hgext_%s' % name
        nameroot = hgextname.split('.', 1)[0]
        contextroot = globals.get('__name__', '').split('.', 1)[0]
        if nameroot != contextroot:
            raise
        # retry to import with "hgext_" prefix
        return importfunc(hgextname, globals, *args, **kwargs)
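
# Retry-path sketch (hypothetical names): an extension registered in
# sys.modules as "hgext_myext" that does "import myext.util" first fails with
# ImportError, and is then retried here as:
#
#   importfunc('hgext_myext.util', globals, *args, **kwargs)
#
# because nameroot ('hgext_myext') matches the importing module's root name.
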
class _demandmod(object):
    """module demand-loader and proxy

    Specify 1 as 'level' argument at construction, to import module
    relatively.
    """

    def __init__(self, name, globals, locals, level):
        if '.' in name:
            head, rest = name.split('.', 1)
            after = [rest]
        else:
            head = name
            after = []
        object.__setattr__(self, r"_data",
                           (head, globals, locals, after, level, set()))
        object.__setattr__(self, r"_module", None)

    def _extend(self, name):
        """add to the list of submodules to load"""
        self._data[3].append(name)
    def _addref(self, name):
        """Record that the named module ``name`` imports this module.

        References to this proxy class having the name of this module will be
        replaced at module load time. We assume the symbol inside the importing
        module is identical to the "head" name of this module. We don't
        actually know if "as X" syntax is being used to change the symbol name
        because this information isn't exposed to __import__.
        """
        self._data[5].add(name)

    def _load(self):
        if not self._module:
            head, globals, locals, after, level, modrefs = self._data
            mod = _hgextimport(_import, head, globals, locals, None, level)
            if mod is self:
                # In this case, _hgextimport() above should imply
                # _demandimport(). Otherwise, _hgextimport() never
                # returns _demandmod. This isn't intentional behavior,
                # in fact (see also issue5304 for details).
                #
                # If self._module is already bound at this point, self
                # should already have been _load()-ed during
                # _hgextimport(). Otherwise, there is no way to import
                # the actual module as expected, because (re-)invoking
                # _hgextimport() would give the same result.
                # This is why _load() returns without any further setup
                # and assumes self._module to be already bound.
                mod = self._module
                assert mod and mod is not self, "%s, %s" % (self, mod)
                return
            # load submodules
            def subload(mod, p):
                h, t = p, None
                if '.' in p:
                    h, t = p.split('.', 1)
                if getattr(mod, h, nothing) is nothing:
                    setattr(mod, h, _demandmod(p, mod.__dict__, mod.__dict__,
                                               level=1))
                elif t:
                    subload(getattr(mod, h), t)

            for x in after:
                subload(mod, x)
            # Replace references to this proxy instance with the actual module.
            if locals:
                if locals.get(head) is self:
                    locals[head] = mod
                elif locals.get(head + r'mod') is self:
                    locals[head + r'mod'] = mod
            for modname in modrefs:
                modref = sys.modules.get(modname, None)
                if modref and getattr(modref, head, None) is self:
                    setattr(modref, head, mod)

            object.__setattr__(self, r"_module", mod)

    def __repr__(self):
        if self._module:
            return "<proxied module '%s'>" % self._data[0]
        return "<unloaded module '%s'>" % self._data[0]

    def __call__(self, *args, **kwargs):
        raise TypeError("%s object is not callable" % repr(self))

    def __getattr__(self, attr):
        self._load()
        return getattr(self._module, attr)

    def __setattr__(self, attr, val):
        self._load()
        setattr(self._module, attr, val)

    @property
    def __dict__(self):
        self._load()
        return self._module.__dict__

    @property
    def __doc__(self):
        self._load()
        return self._module.__doc__
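
# Sketch of the proxy in action (the module name "foo" is hypothetical): the
# proxy stands in for the module until an attribute is touched.
#
#   foo = _demandmod('foo', globals(), locals(), level)
#   repr(foo)    # "<unloaded module 'foo'>" -- nothing imported yet
#   foo.bar      # __getattr__ -> _load() -> real import, then delegation
#   repr(foo)    # "<proxied module 'foo'>"
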
_pypy = '__pypy__' in sys.builtin_module_names

def _demandimport(name, globals=None, locals=None, fromlist=None, level=level):
    if locals is None or name in ignore or fromlist == ('*',):
        # these cases we can't really delay
        return _hgextimport(_import, name, globals, locals, fromlist, level)
    elif not fromlist:
        # import a [as b]
        if '.' in name: # a.b
            base, rest = name.split('.', 1)
            # email.__init__ loading email.mime
            if globals and globals.get('__name__', None) == base:
                return _import(name, globals, locals, fromlist, level)
            # if a is already demand-loaded, add b to its submodule list
            if base in locals:
                if isinstance(locals[base], _demandmod):
                    locals[base]._extend(rest)
                return locals[base]
        return _demandmod(name, globals, locals, level)
    else:
        # There is a fromlist.
        # from a import b,c,d
        # from . import b,c,d
        # from .a import b,c,d

        # level == -1: relative and absolute attempted (Python 2 only).
        # level >= 0: absolute only (Python 2 w/ absolute_import and Python 3).
        # The modern Mercurial convention is to use absolute_import everywhere,
        # so modern Mercurial code will have level >= 0.
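
        # For illustration, the level values __import__ passes for the import
        # forms above (module names are hypothetical):
        #
        #   from a import b      # level == -1 (Py2 default) or 0 (absolute)
        #   from . import b      # level == 1
        #   from ..pkg import b  # level == 2
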
        # The name of the module the import statement is located in.
        globalname = globals.get('__name__')

        def processfromitem(mod, attr):
            """Process an imported symbol in the import statement.

            If the symbol doesn't exist in the parent module, and if the
            parent module is a package, it must be a module. We set missing
            modules up as _demandmod instances.
            """
            symbol = getattr(mod, attr, nothing)
            nonpkg = getattr(mod, '__path__', nothing) is nothing
            if symbol is nothing:
                if nonpkg:
                    # do not try relative import, which would raise ValueError,
                    # and leave unknown attribute as the default __import__()
                    # would do. the missing attribute will be detected later
                    # while processing the import statement.
                    return
                mn = '%s.%s' % (mod.__name__, attr)
                if mn in ignore:
                    importfunc = _origimport
                else:
                    importfunc = _demandmod
                symbol = importfunc(attr, mod.__dict__, locals, level=1)
                setattr(mod, attr, symbol)
            # Record that the importing module references this symbol so we
            # can replace the symbol with the actual module instance at load
            # time.
            if globalname and isinstance(symbol, _demandmod):
                symbol._addref(globalname)
        def chainmodules(rootmod, modname):
            # recurse down the module chain, and return the leaf module
            mod = rootmod
            for comp in modname.split('.')[1:]:
                if getattr(mod, comp, nothing) is nothing:
                    setattr(mod, comp, _demandmod(comp, mod.__dict__,
                                                  mod.__dict__, level=1))
                mod = getattr(mod, comp)
            return mod

        if level >= 0:
            if name:
# "from a import b" or "from .a import b" style
rootmod = _hgextimport(_origimport, name, globals, locals,
level=level)
mod = chainmodules(rootmod, name)
elif _pypy:
# PyPy's __import__ throws an exception if invoked
# with an empty name and no fromlist. Recreate the
# desired behaviour by hand.
mn = globalname
mod = sys.modules[mn]
if getattr(mod, '__path__', nothing) is nothing:
mn = mn.rsplit('.', 1)[0]
mod = sys.modules[mn]
if level > 1:
mn = mn.rsplit('.', level - 1)[0]
mod = sys.modules[mn]
else:
mod = _hgextimport(_origimport, name, globals, locals,
level=level)
for x in fromlist:
processfromitem(mod, x)
return mod
        # But, we still need to support lazy loading of standard library and 3rd
        # party modules. So handle level == -1.
        mod = _hgextimport(_origimport, name, globals, locals)
        mod = chainmodules(mod, name)
        for x in fromlist:
            processfromitem(mod, x)
        return mod

ignore = []

def init(ignorelist):
    global ignore
    ignore = ignorelist

def isenabled():
    return builtins.__import__ == _demandimport

def enable():
    "enable global demand-loading of modules"
    if os.environ.get('HGDEMANDIMPORT') != 'disable':
        builtins.__import__ = _demandimport

def disable():
    "disable global demand-loading of modules"
    builtins.__import__ = _origimport

@contextmanager
def deactivated():
    "context manager for disabling demandimport in 'with' blocks"
    demandenabled = isenabled()
    if demandenabled:
        disable()

    try:
        yield
    finally:
        if demandenabled:
            enable()
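
# A minimal end-to-end sketch of the helpers above (the module names
# "somemodule" and "somemodule._speedups" are hypothetical):
#
#   import demandimport
#   demandimport.init(['somemodule._speedups'])  # never demand-load these
#   demandimport.enable()                        # install _demandimport hook
#   with demandimport.deactivated():
#       import somemodule                        # imported eagerly here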