
Make sure `_init_weigths` is implemented for {self.__class__}

Exception Class: NotImplementedError

Raise code

        return None  # Overwrite for models with output embeddings

    def _init_weights(self, module):
        """Initialize the weights. This method should be overridden by derived class."""
        raise NotImplementedError(f"Make sure `_init_weigths` is implemented for {self.__class__}")

    def tie_weights(self):
        """
        Tie the weights between the input embeddings and the output embeddings.

        If the :obj:`torchscript` flag is set in the configuration, can't handle parameter sharing so we are cloning
        the weights instead.
        """

Comment explaining raise

Initialize the weights. This method should be overridden by derived class.


Ways to fix

def _init_weights(self, module):
    """Initialize the weights. This method should be overridden by derived class."""
    raise NotImplementedError(f"Make sure `_init_weigths` is implemented for {self.__class__}")

`_init_weights` is meant to be overridden by a class derived from `PreTrainedModel`. The base implementation raises unconditionally, so no matter what arguments you pass, calling it on a bare `PreTrainedModel` produces this error.
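The mechanism can be sketched without transformers at all: a base class whose hook raises unless a subclass overrides it. The class and attribute names below are illustrative, not the transformers API:

```python
class BaseModel:
    """Mimics PreTrainedModel's contract: subclasses must implement _init_weights."""

    def _init_weights(self, module):
        raise NotImplementedError(
            f"Make sure `_init_weigths` is implemented for {self.__class__}"
        )


class GoodModel(BaseModel):
    def _init_weights(self, module):
        # Any concrete strategy satisfies the contract; here we just zero a dict entry
        module["weight"] = 0.0


base = BaseModel()
try:
    base._init_weights({})  # base class: always raises, whatever we pass
except NotImplementedError as e:
    print(e)

layer = {"weight": 1.5}
GoodModel()._init_weights(layer)  # subclass: works
print(layer["weight"])  # 0.0
```

The argument to the hook is irrelevant in the base class, which is exactly why the repro below fails regardless of the model's contents.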

Error to reproduce:

from transformers import PreTrainedModel, PretrainedConfig
import torch
import torch.nn as nn

# FloatTensor containing pretrained weights
weight = torch.FloatTensor([[1, 2.3, 3], [4, 5.1, 6.3]])
embedding = nn.Embedding.from_pretrained(weight)

# Instantiate the base class directly instead of a concrete model
config = PretrainedConfig()
moda = PreTrainedModel(config)
moda.init_weights()  # raises NotImplementedError via _init_weights

Concrete transformers models (BertModel, GPT2Model, etc.) implement `_init_weights` themselves, which is why the error only appears when you instantiate or initialize the base class directly.
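A minimal sketch of the fix, assuming `transformers` and `torch` are installed; `ToyModel` and its initialization strategy are illustrative (BERT-style normal init), not code from the library:

```python
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel


class ToyModel(PreTrainedModel):
    config_class = PretrainedConfig

    def __init__(self, config):
        super().__init__(config)
        self.embed = nn.Embedding(10, 4)

    def _init_weights(self, module):
        # Concrete per-module strategy: normal init for embedding/linear weights
        if isinstance(module, (nn.Linear, nn.Embedding)):
            module.weight.data.normal_(mean=0.0, std=0.02)


config = PretrainedConfig()
model = ToyModel(config)
model.init_weights()  # applies _init_weights to every submodule; no exception
```

Because the subclass provides the hook, `init_weights()` can walk the module tree and initialize each layer instead of hitting the base class's `NotImplementedError`.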

Jul 15, 2021 — answered by anonim (13.0k)
