Fix Exception
Exceptions for package transformers (50617)

- 6 · 3009: text input must of type `str` (single example), `List[str]` (batch or single pretokenized example) or `List[List[str]]` (batch of pretokenized examples).
- 6 · 1408: word_ids() is not available when using Python-based tokenizers
- 6 · 1206: Some specified arguments are not used by the HfArgumentParser: (remaining_args)
- 6 · 1204: got_ver is None
- 6 · 838: char_to_token() is not available when using Python based tokenizers
- 6 · 241: Both extra_ids ((extra_ids)) and additional_special_tokens ((additional_special_tokens)) are provided to T5Tokenizer. In this case the additional_special_tokens must include the extra_ids tokens
- 5 · 2227: Unable to create tensor, you should probably activate truncation and/or padding with 'padding=True' 'truncation=True' to have batched tensors with the same length.
- 5 · 1434: return_offset_mapping is not available when using Python tokenizers.To use this feature, change your tokenizer to one deriving from transformers.PreTrainedTokenizerFast.
- 5 · 471: Can only automatically infer lengths for datasets whose items are dictionaries with an '(self.model_input_name)' key.
- 5 · 367: Make sure `_init_weigths` is implemented for (self.__class__)
- 5 · 325: `num_return_sequences` has to be smaller or equal to `num_beams`.
- 4 · 1526: Unable to convert output to PyTorch tensors format, PyTorch is not installed.
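The "text input must of type `str` …" entry above is raised by the tokenizer's input validation, which accepts a single string, a batch of strings, or a batch of pretokenized word lists. A minimal pure-Python sketch of that shape check (the helper name `is_valid_text_input` is illustrative, not the library's internal function):

```python
def is_valid_text_input(text):
    """Sketch of the accepted tokenizer input shapes: str,
    List[str], or List[List[str]]."""
    if isinstance(text, str):
        return True  # single example
    if isinstance(text, (list, tuple)):
        if len(text) == 0:
            return True  # an empty batch is accepted
        if all(isinstance(t, str) for t in text):
            return True  # batch of examples, or one pretokenized example
        if all(
            isinstance(t, (list, tuple))
            and all(isinstance(word, str) for word in t)
            for t in text
        ):
            return True  # batch of pretokenized examples
    return False
```

Passing anything else (integers, nested non-string lists, a NumPy array of objects, and so on) fails this check and triggers the exception.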
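The "Unable to create tensor …" entry appears because a rectangular tensor requires every row to have the same length, while tokenized sentences naturally differ in length; `padding=True` (or `truncation=True`) makes the rows uniform. A small sketch of the underlying problem and of what padding does (`pad_batch` is a hypothetical helper mimicking the effect of `padding=True`, not the library's implementation):

```python
def can_batch_into_tensor(sequences):
    # A rectangular tensor needs every row to have the same length.
    lengths = {len(seq) for seq in sequences}
    return len(lengths) <= 1

def pad_batch(sequences, pad_id=0):
    # Right-pad every sequence to the length of the longest one,
    # which is the effect of padding=True in a tokenizer call.
    longest = max(len(seq) for seq in sequences)
    return [seq + [pad_id] * (longest - len(seq)) for seq in sequences]
```

Once padded (or truncated to a common length), the ragged batch becomes rectangular and can be converted to a tensor.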
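The "`num_return_sequences` has to be smaller or equal to `num_beams`" entry reflects a simple constraint of beam search: each returned sequence comes from a distinct beam, so you cannot request more sequences than beams. A sketch of that validation (`check_beam_config` is an illustrative name, not the library's internal function):

```python
def check_beam_config(num_beams: int, num_return_sequences: int) -> None:
    # Beam search keeps num_beams candidate hypotheses; each returned
    # sequence is one of them, so requesting more is impossible.
    if num_return_sequences > num_beams:
        raise ValueError(
            "`num_return_sequences` has to be smaller or equal to `num_beams`."
        )
```

The fix is to raise `num_beams` or lower `num_return_sequences` in the generation call.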
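The "Both extra_ids … and additional_special_tokens …" entry concerns T5's sentinel tokens: `T5Tokenizer` generates `<extra_id_0>` … `<extra_id_{n-1}>` itself, so if a caller also passes `additional_special_tokens`, the two arguments must agree. A simplified sketch of that consistency check (the helper name and the exact condition are assumptions; the real tokenizer's logic differs in detail):

```python
def check_t5_special_tokens(extra_ids, additional_special_tokens):
    # T5 reserves <extra_id_0> ... <extra_id_{n-1}> as sentinel tokens.
    extra_tokens = [f"<extra_id_{i}>" for i in range(extra_ids)]
    if additional_special_tokens is not None and extra_ids > 0:
        missing = [t for t in extra_tokens
                   if t not in additional_special_tokens]
        if missing:
            raise ValueError(
                f"Both extra_ids ({extra_ids}) and additional_special_tokens "
                f"({additional_special_tokens}) are provided to T5Tokenizer. "
                "In this case the additional_special_tokens must include "
                "the extra_ids tokens"
            )
```

In practice the fix is to pass only one of the two arguments, or to include every sentinel token in `additional_special_tokens`.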
Page 1 of 52