
`tape` is required when a `Tensor` loss is passed.

Package: keras (52,268 GitHub stars)
Exception class: ValueError

Raise code

    """Clip gradients according to the clipnorm and clipvalue attributes."""
    # TFOptimizer wrapper has no gradient clipping options.
    return grads

  def minimize(self, loss, var_list, grad_loss=None, tape=None):
    """Mimics the `OptimizerV2.minimize` API."""
    if not callable(loss) and tape is None:
      raise ValueError('`tape` is required when a `Tensor` loss is passed.')
    tape = tape if tape is not None else tf.GradientTape()

    if callable(loss):
      with tape:
        if not callable(var_list):
          tape.watch(var_list)
        loss = loss()
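
The check at the top of `minimize` allows two calling patterns: a callable loss, in which case `minimize` creates its own `tf.GradientTape` and evaluates the loss under it (the `with tape:` block above), or a precomputed `Tensor` loss together with the tape that recorded it. A minimal sketch of both patterns (the SGD optimizer and variable here are illustrative, not from the original post):

import tensorflow as tf

opt = tf.keras.optimizers.SGD(learning_rate=0.1)
var = tf.Variable(4.0)

# Pattern 1: callable loss; minimize records it on an internal tape
opt.minimize(lambda: var ** 2, var_list=[var])

# Pattern 2: tensor loss recorded on an explicit tape passed via `tape=`
with tf.GradientTape() as tape:
    loss = var ** 2
opt.minimize(loss, var_list=[var], tape=tape)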

Ways to fix


Steps to reproduce:

Step 1: Create a test directory using the command

$ mkdir test-keras

Step 2: Navigate to the new test directory using the command

$ cd test-keras

Step 3: Run the command

$ pipenv shell

Step 4: Install the dependencies using the command

$ pipenv install tensorflow

Step 5: Run the code below

import tensorflow as tf

opt = tf.keras.optimizers.Adam(learning_rate=0.2)
var1 = tf.Variable(10.0, name='var1')

loss = (var1 ** 2) / 2.0 # a tensor loss value, not a callable loss function

opt.minimize(loss, var_list=[var1]).numpy() # raises ValueError: minimize expects a callable loss (or a tape)

The above code generates the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-11-acdc197725e0> in <module>()
      6 loss = (var1 ** 2) / 2.0
      7 
----> 8 opt.minimize(loss, var_list=[var1]).numpy()

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss, tape)
    562     # TODO(josh11b): Test that we handle weight decay in a reasonable way.
    563     if not callable(loss) and tape is None:
--> 564       raise ValueError("`tape` is required when a `Tensor` loss is passed.")
    565     tape = tape if tape is not None else backprop.GradientTape()
    566 

ValueError: `tape` is required when a `Tensor` loss is passed.


Fixed version of code:

import tensorflow as tf

opt = tf.keras.optimizers.Adam(learning_rate=0.2)
var1 = tf.Variable(10.0, name='var1')

loss_function = lambda: (var1 ** 2) / 2.0 # the loss wrapped in a lambda so it is callable
opt.minimize(loss_function, var_list=[var1]).numpy()
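
In eager mode, `minimize` applies the update and returns the optimizer's step counter, so the trailing `.numpy()` here yields the current iteration count rather than the loss value.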

Explanation:

The `minimize` function expects its loss argument to be callable. In the reproduction we passed a single tensor loss value, which is not callable, and left `tape` at its default of `None`, so the `ValueError` was raised. To fix it, either wrap the loss in a callable (as above), or keep the tensor loss and also pass the `tape` argument, as sketched below.
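
Alternatively, keep the tensor loss and record its computation on an explicit `tf.GradientTape`, passing that tape through the `tape` argument shown in the raise code above. A sketch of this variant of the fix (it assumes a TensorFlow version whose `minimize` accepts `tape`, as recent TF 2.x releases do):

import tensorflow as tf

opt = tf.keras.optimizers.Adam(learning_rate=0.2)
var1 = tf.Variable(10.0, name='var1')

with tf.GradientTape() as tape:
    loss = (var1 ** 2) / 2.0 # tensor loss computed while the tape is recording

opt.minimize(loss, var_list=[var1], tape=tape).numpy()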

Answered Jun 19, 2021 by umangtaneja98
