This question comes from a student of mine asking about the following code, and I'm honestly completely stumped. Any help would be appreciated.
When I run this code (the function wrapper was lost in formatting; test 3 uses the func3 shown in the answer below):

    # test 2
    a = 1
    def func2(x):
        x = x + a
        return(x)
    print(func2(3))

it runs fine. But this version:

    # test 3
    a = 1
    def func3(x):
        a = x + a
        return(x)
    print(func3(3))

fails with:

    UnboundLocalError: local variable 'a' referenced before assignment
    a = 1
    def func3(x):
        global a
        a = x + a
        return(x)
    print(func3(3))

Now it should work.
When you put the statement a = x + a inside the function, the assignment makes a a local variable for the entire function body. The reference on the right-hand side therefore tries to read a local variable that has not been given a value yet, which raises the UnboundLocalError. You have to declare global a before altering the value of a global variable inside a function, so that Python knows which a to refer to.
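The key subtlety is that a single assignment anywhere in the function body makes the name local for the whole function, even on lines before the assignment. A minimal sketch contrasting the two cases (function names here are illustrative, not from the original question):

```python
a = 1

def reads_global(x):
    # No assignment to 'a' anywhere in this body, so the lookup
    # for 'a' falls through to the global scope.
    return x + a

def assigns_local(x):
    # 'a' is assigned in this body, so Python treats it as local
    # for the WHOLE function; the read on the right-hand side
    # raises UnboundLocalError at call time.
    a = x + a
    return x

print(reads_global(3))  # -> 4
try:
    assigns_local(3)
except UnboundLocalError as e:
    print("failed:", e)
```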
The execution of a function introduces a new symbol table used for the local variables of the function. More precisely, all variable assignments in a function store the value in the local symbol table; whereas variable references first look in the local symbol table, then in the local symbol tables of enclosing functions, then in the global symbol table, and finally in the table of built-in names. Thus, global variables cannot be directly assigned a value within a function (unless named in a global statement), although they may be referenced.