transformers
Fix contrastive search to correctly handle input with padding
#33507 (Merged)
Commits (6)
fix: handle padding in contrastive search for decoder-only models (ducviet00, committed 1 year ago)
fix: handle padding in contrastive search for encoder-decoder models (ducviet00, committed 1 year ago)
tests: move padding contrastive test to test_util, add t5 test (ducviet00, committed 1 year ago)
fix: handle if model_kwargs["decoder_attention_mask"] is None (ducviet00, committed 1 year ago)
refactor: improve padding input contrastive search generation tests (ducviet00, committed 1 year ago)
chore: _ranking_fast to use LongTensor for cosine_matrix_mask (ducviet00, committed 1 year ago)