Joaquim Ferrer - 3 months ago
Python Question

Use eval with dictionary without losing imported modules in Python2

I have a string to be evaluated inside my Python program, and I want to replace some variables in the string, like x[1] and x[2], with other values.
I had previously used eval with two arguments (the second being a dict mapping replaced_word: new_word), but I noticed that this way I can't use previously imported modules. So if I do this

from math import log
eval('log(x[1])', {x[1]: 1})

it raises a NameError saying the name log is not defined.
How can I use eval like this without losing the global variables?
I can't really make sense of the documentation, so an explanation would be useful too.


Build your globals dict with globals() as a base:

from math import log

# Copy the globals() dict so changes don't affect real globals
eval_globals = globals().copy()
# Tweak the copy to add desired new global
eval_globals[x[1]] = 1

# eval using the updated copy
eval('log(x[1])', eval_globals)
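As a minimal runnable sketch of the copy-and-tweak approach: the keys of the globals dict must be plain identifier strings, so this uses a hypothetical name x1 as a stand-in for the question's x[1]:

```python
from math import log

# Snapshot the real globals so log (and everything else imported so far)
# stays visible to the eval'd expression.
eval_globals = globals().copy()

# Inject the replacement value under a plain identifier; 'x1' is a
# hypothetical stand-in for whatever name the question's x[1] holds.
eval_globals['x1'] = 1

# log is resolved from the copied globals, x1 from our tweak.
result = eval('log(x1, 2)', eval_globals)
print(result)  # log(1, 2) == 0.0
```

Because only the copy is modified, the module's real globals are left exactly as they were after the eval.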

Alternatively, you can use three-arg eval to use globals() unmodified, but also supply a locals dict that will be checked (and modified) first, in preference to global values:

eval('log(x[1])', globals(), {x[1]: 1})

In theory, the latter approach could allow the expression to mutate the real globals, so adding .copy(), as in eval('log(x[1])', globals().copy(), {x[1]: 1}), minimizes the risk of that happening accidentally. But pathological or malicious code can work around that: eval is dangerous, so don't trust it with arbitrary input no matter how sandboxed you make it.
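The mutation risk can be demonstrated directly: inside eval'd code, the builtin globals() returns whatever dict was passed as eval's globals argument, so the expression can write into it. A small sketch, using a plain dict as a stand-in for a module's real globals:

```python
# Stand-in for a module's real globals dict.
real = {}

# Passing a copy: the eval'd write lands in the copy, 'real' is untouched.
eval("globals().__setitem__('hacked', True)", real.copy())
safe = 'hacked' not in real

# Passing the dict directly: the eval'd code mutates it in place.
eval("globals().__setitem__('hacked', True)", real)
mutated = 'hacked' in real

print(safe, mutated)  # True True
```

(Note that eval also inserts a __builtins__ key into any globals dict that lacks one, which is itself a small side effect on the dict you pass.)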