
tf.function-decorated function tried to create variables on non-first call.

Package:
tensorflow
Exception Class:
ValueError

Raise code

self._graph_deleter = FunctionDeleter(self._lifted_initializer_graph)
    self._concrete_stateful_fn = (
        self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
            *args, **kwds))

    def invalid_creator_scope(*unused_args, **unused_kwds):
      """Disables variable creation."""
      raise ValueError(
          "tf.function-decorated function tried to create "
          "variables on non-first call.")

    self._stateless_fn = self._defun_with_scope(invalid_creator_scope)
    self._stateless_fn._name = self._name  # pylint: disable=protected-access

  def _clone(self, python_function):
    "" """

Comment explaining raise

Disables variable creation.
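For context, the raise code above installs this raising function as a variable creator scope (`self._defun_with_scope(invalid_creator_scope)`), so any attempt to create a variable during a later trace hits the ValueError. A rough sketch of that mechanism using the public tf.variable_creator_scope API (not the exact internal wiring):

import tensorflow as tf

def invalid_creator(next_creator, **kwargs):
  # Instead of delegating to next_creator, refuse to build the variable.
  raise ValueError("tf.function-decorated function tried to create "
                   "variables on non-first call.")

# Any tf.Variable(...) constructed inside this scope goes through the creator above.
with tf.variable_creator_scope(invalid_creator):
  try:
    tf.Variable(1.0)
  except ValueError as e:
    print(e)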


Ways to fix


A tf.function only supports creating variables once, on the first call, and reusing them afterwards; you cannot create new tf.Variables in subsequent traces. Creating new variables on later calls is currently not allowed, though this may change in the future.

Error to reproduce:

import tensorflow as tf

@tf.function
def f(x):
  v = tf.Variable(1.0)  # a new variable would be created on every trace
  return v

f(1)  # ValueError: tf.function-decorated function tried to create variables on non-first call.
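If the variable does not need to live inside the function, the simplest fix is to create it once outside the tf.function and only read or update it inside. A minimal sketch:

import tensorflow as tf

v = tf.Variable(1.0)  # created once, eagerly, outside the traced function

@tf.function
def f(x):
  return v.assign_add(x)  # reuses the existing variable on every call

print(f(1.0))  # 2.0
print(f(1.0))  # 3.0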

The same limitation applies to optimizers: Adam creates variables internally (its slot variables), so constructing it inside a tf.function fails in the same way.

import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")

@tf.function
def loss():
  return -tf.reduce_mean(input_tensor=tf.reduce_sum(input_tensor=tf.math.log(y_N), axis=0))

@tf.function
def run():
  # Adam is constructed inside the traced function, so its internal slot
  # variables are created during tracing and trigger the same ValueError.
  tf.keras.optimizers.Adam(0.5).minimize(loss, var_list=[y_N])

run()

You can create variables inside a tf.function as long as those variables are only created the first time the function is executed.

Fix code:

import tensorflow as tf

class Count(tf.Module):
  def __init__(self):
    self.count = None

  @tf.function
  def __call__(self):
    if self.count is None:
      # Created only on the first call (first trace); reused afterwards.
      self.count = tf.Variable(0)
    return self.count.assign_add(1)

c = Count()
print(c())
print(c())
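This works because self.count is None only on the first call; later calls take the other branch and simply reuse the existing variable, so no creation happens on a non-first call.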

For the optimizer case, pull the optimizer construction out of the tf.function as well.

import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")
optimizer = tf.keras.optimizers.Adam(0.5)  # created once, outside the traced function

@tf.function
def loss():
  return -tf.reduce_mean(input_tensor=tf.reduce_sum(input_tensor=tf.math.log(y_N), axis=0))

@tf.function
def run():
  optimizer.minimize(loss, var_list=[y_N])

run()
Answered Jul 04, 2021 by anonim
