transformers
dc8b6eae - Fix contrastive search to correctly handle input with padding (#33507)

Commit
1 year ago
Fix contrastive search to correctly handle input with padding (#33507)

* fix: handle padding in contrastive search for decoder-only models
* fix: handle padding in contrastive search for encoder-decoder models
* tests: move padding contrastive test to test_util, add t5 test
* fix: handle if model_kwargs["decoder_attention_mask"] is None
* refactor: improve padding input contrastive search generation tests
* chore: _ranking_fast to use LongTensor for cosine_matrix_mask
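A minimal sketch of the scenario this commit addresses: a batch of prompts with different lengths is left-padded and decoded with contrastive search (`penalty_alpha` + `top_k`). The checkpoint and prompts below are illustrative assumptions, not taken from the commit or its tests.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # example decoder-only checkpoint, not prescribed by the commit
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
tokenizer.padding_side = "left"            # decoder-only models expect left padding

model = AutoModelForCausalLM.from_pretrained(model_name)

# Prompts of different lengths, so the shorter one gets padded.
prompts = ["The quick brown fox jumps over", "Hello"]
inputs = tokenizer(prompts, return_tensors="pt", padding=True)

# penalty_alpha > 0 together with top_k > 1 selects contrastive search.
outputs = model.generate(
    **inputs,  # passes attention_mask so padded positions are accounted for
    penalty_alpha=0.6,
    top_k=4,
    max_new_tokens=20,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```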