
activation should be relu/gelu, not (param1)

Package:
torch
GitHub stars: 50580
Exception Class:
RuntimeError

Raise code

def _get_activation_fn(activation):
    if activation == "relu":
        return F.relu
    elif activation == "gelu":
        return F.gelu

    raise RuntimeError("activation should be relu/gelu, not {}".format(activation))
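
To see the guard in isolation, you can call the helper directly (a minimal sketch; _get_activation_fn is a private helper in torch.nn.modules.transformer, so importing it like this is for demonstration only and may change between versions):

from torch.nn.modules.transformer import _get_activation_fn

print(_get_activation_fn("relu"))   # <function relu ...>
print(_get_activation_fn("gelu"))   # <function gelu ...>
_get_activation_fn("tanh")          # RuntimeError: activation should be relu/gelu, not tanh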

Ways to fix


When initializing TransformerDecoderLayer, the activation parameter must be given a valid value: either "relu" or "gelu".
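
A simple way to catch the mistake before it reaches torch internals is to validate the string yourself (a minimal sketch; make_decoder_layer and VALID_ACTIVATIONS are helper names introduced here for illustration):

from torch import nn

VALID_ACTIVATIONS = {"relu", "gelu"}

def make_decoder_layer(d_model, nhead, activation="relu"):
    # Fail early with a clear message instead of deep inside torch internals
    if activation not in VALID_ACTIVATIONS:
        raise ValueError(f"activation must be one of {VALID_ACTIVATIONS}, got {activation!r}")
    return nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, activation=activation)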

Reproducing the error:

pipenv install torch

import torch
from torch import nn


decoder_layer = nn.TransformerDecoderLayer(d_model=512,
                                           nhead=8,
                                           activation="rel")  # invalid: the valid values are "relu" or "gelu"
memory = torch.rand(10, 32, 512)
tgt = torch.rand(20, 32, 512)
out = decoder_layer(tgt, memory)
print(out.shape)

The error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-7-36493082f32e> in <module>()
      4 decoder_layer = nn.TransformerDecoderLayer(d_model=512, 
      5                                            nhead=8,
----> 6                                            activation="rel")
      7 memory = torch.rand(10, 32, 512)
      8 tgt = torch.rand(20, 32, 512)

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/transformer.py in __init__(self, d_model, nhead, dim_feedforward, dropout, activation, layer_norm_eps, batch_first, device, dtype)
    380         self.dropout3 = Dropout(dropout)
    381 
--> 382         self.activation = _get_activation_fn(activation)
    383 
    384     def __setstate__(self, state):

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/transformer.py in _get_activation_fn(activation)
    426         return F.gelu
    427 
--> 428     raise RuntimeError("activation should be relu/gelu, not {}".format(activation))

RuntimeError: activation should be relu/gelu, not rel

Fixed:

import torch
from torch import nn


decoder_layer = nn.TransformerDecoderLayer(d_model=512, 
                                           nhead=8,
                                           activation="relu")
memory = torch.rand(10, 32, 512)
tgt = torch.rand(20, 32, 512)
out = decoder_layer(tgt, memory)
print(out.shape)

Output:

torch.Size([20, 32, 512])
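
"gelu" is the other accepted value and works just as well; swapping it in produces an output of the same shape (same self-contained setup as above):

import torch
from torch import nn

decoder_layer = nn.TransformerDecoderLayer(d_model=512,
                                           nhead=8,
                                           activation="gelu")
memory = torch.rand(10, 32, 512)
tgt = torch.rand(20, 32, 512)
out = decoder_layer(tgt, memory)
print(out.shape)  # torch.Size([20, 32, 512])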

Answered Jul 14, 2021 by kellemnegasi (31.6k)
