
tf.function-decorated function tried to create variables on non-first call.

Exception Class: ValueError

Raise code

    self._graph_deleter = FunctionDeleter(self._lifted_initializer_graph)
    self._concrete_stateful_fn = (
        self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
            *args, **kwds))

    def invalid_creator_scope(*unused_args, **unused_kwds):
      """Disables variable creation."""
      raise ValueError(
          "tf.function-decorated function tried to create "
          "variables on non-first call.")

    self._stateless_fn = self._defun_with_scope(invalid_creator_scope)
    self._stateless_fn._name = self._name  # pylint: disable=protected-access


Comment explaining raise

Disables variable creation.


Ways to fix


tf.function only supports creating variables once, when the function is first traced; after that they must be reused. Creating new tf.Variables in subsequent traces is currently not allowed, though it may be supported in the future.

Error to reproduce:

import tensorflow as tf

@tf.function
def f(x):
  v = tf.Variable(1.0)  # a new variable on every trace
  return v

f(1.0)  # first trace: variable creation is allowed
f(2.0)  # new Python value forces a retrace -> ValueError

Optimizers are subject to the same rule: Adam creates variables internally (its slot variables), so constructing it inside a traced function hits the same limitation.

import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")

def loss():
  return -tf.reduce_mean(tf.reduce_sum(tf.math.log(y_N), axis=0))

@tf.function
def run():
  # A fresh Adam (and its internal variables) is created inside the trace
  tf.keras.optimizers.Adam(0.5).minimize(loss, var_list=[y_N])

run()  # ValueError: tried to create variables on non-first call


You can create variables inside a tf.function as long as those variables are only created the first time the function is executed.

Fix code:

class Count(tf.Module):
  def __init__(self):
    self.count = None

  @tf.function
  def __call__(self):
    # The variable is created only on the first trace, then reused.
    if self.count is None:
      self.count = tf.Variable(0)
    return self.count.assign_add(1)

c = Count()
print(c())  # tf.Tensor(1, shape=(), dtype=int32)
print(c())  # tf.Tensor(2, shape=(), dtype=int32)
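The pattern the fix relies on is ordinary lazy initialization: create the state on the first call, reuse it afterwards. A TF-free sketch of the same idiom (the class name LazyCounter is illustrative, not from TensorFlow):

```python
class LazyCounter:
    """Plain-Python sketch of the create-once pattern used in the fix."""

    def __init__(self):
        self.count = None  # nothing created yet, mirrors self.count = None above

    def __call__(self):
        if self.count is None:
            self.count = 0  # "variable" created only on the first call
        self.count += 1
        return self.count

c = LazyCounter()
print(c(), c(), c())  # 1 2 3
```

Every call after the first takes the `is None` branch's else path, so the state survives across calls, which is exactly what tf.function requires of tf.Variables.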

For the optimizer, the fix is the same: pull the optimizer out of the traced function.

import tensorflow as tf

y_N = tf.Variable([1., 2., 3.], name="dd")
optimizer = tf.keras.optimizers.Adam(0.5)  # created once, outside the trace

def loss():
  return -tf.reduce_mean(tf.reduce_sum(tf.math.log(y_N), axis=0))

@tf.function
def run():
  optimizer.minimize(loss, var_list=[y_N])

run()  # slot variables are created on the first call only, then reused

Jul 04, 2021 anonim answer
anonim 13.0k
