tf.function-decorated function tried to create variables on non-first call.
Package:
tensorflow

Exception Class:
ValueError
Raise code
self._graph_deleter = FunctionDeleter(self._lifted_initializer_graph)
self._concrete_stateful_fn = (
    self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
        *args, **kwds))

def invalid_creator_scope(*unused_args, **unused_kwds):
    """Disables variable creation."""
    raise ValueError(
        "tf.function-decorated function tried to create "
        "variables on non-first call.")

self._stateless_fn = self._defun_with_scope(invalid_creator_scope)
self._stateless_fn._name = self._name  # pylint: disable=protected-access
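For context on how this raise is triggered: after the first trace, tf.function re-defines the function under a variable creator scope (invalid_creator_scope above) that raises instead of creating variables. Below is a minimal sketch of that mechanism using the public tf.variable_creator_scope API; the no_new_variables name is just illustrative, not TensorFlow's own code.

import tensorflow as tf

def no_new_variables(next_creator, **kwargs):
    # Mirrors invalid_creator_scope: refuse to create any variable
    # instead of delegating to next_creator.
    raise ValueError("variable creation is disabled inside this scope")

with tf.variable_creator_scope(no_new_variables):
    try:
        tf.Variable(1.0)  # routed through no_new_variables, which raises
    except ValueError as err:
        print("caught:", err)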
Comment explaining raise
Disables variable creation.
Links to the raise (1)
https://github.com/tensorflow/tensorflow/blob/105dbc2326e4fb36a793569e9834efe62e150d8d/tensorflow/python/eager/def_function.py#L764
Ways to fix
A tf.function only supports creating variables once, when it is first called, and then reusing them afterwards. You cannot create tf.Variables in new traces; creating new variables in subsequent calls is currently not allowed, though it may be in the future.
Error to reproduce:
import tensorflow as tf

@tf.function
def f(x):
    v = tf.Variable(1.0)  # tries to create a new variable inside the traced function
    return v

f(1)
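One straightforward fix for this particular snippet (a sketch, not part of the original report) is to create the variable once, outside the traced function, so every call and every retrace captures the same tf.Variable instead of trying to create a new one:

import tensorflow as tf

v = tf.Variable(1.0)  # created once, eagerly, outside the tf.function

@tf.function
def f(x):
    # Each trace captures the existing variable; nothing new is created.
    return v.assign_add(x)

print(f(1.0))  # 2.0
print(f(1.0))  # 3.0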
An optimizer behaves the same way as a variable here: Adam creates variables internally, so it hits the same limitation.
import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")

@tf.function
def loss():
    return -tf.reduce_mean(input_tensor=tf.reduce_sum(input_tensor=tf.math.log(y_N), axis=0))

@tf.function
def run():
    tf.keras.optimizers.Adam(0.5).minimize(loss, var_list=[y_N])

run()
You can create variables inside a tf.function as long as those variables are only created the first time the function is executed.
Fix code:
import tensorflow as tf

class Count(tf.Module):
    def __init__(self):
        self.count = None

    @tf.function
    def __call__(self):
        # The variable is created only during the first trace; later calls reuse it.
        if self.count is None:
            self.count = tf.Variable(0)
        return self.count.assign_add(1)

c = Count()
print(c())  # 1
print(c())  # 2
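This works because the if self.count is None branch creates the variable only during the first call's trace; on later calls and retraces the attribute is already set, so the existing variable is reused and no new variables are created.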
For the optimizer case, you need to pull the optimizer out of the tf.function as well, so the variables Adam creates internally are only created once.
import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")
optimizer = tf.keras.optimizers.Adam(0.5)

@tf.function
def loss():
    return -tf.reduce_mean(input_tensor=tf.reduce_sum(input_tensor=tf.math.log(y_N), axis=0))

@tf.function
def run():
    optimizer.minimize(loss, var_list=[y_N])

run()