transformers #33188 (Merged)
Bugfix/alexsherstinsky/fix none check for attention factor in rope scaling 2024 08 28 0
Commits (6)
- Fixing a bug in the way "attention_factor" is validated in ROPE utilities. — alexsherstinsky committed 1 year ago
- Fixing a bug in the way "attention_factor" is validated in ROPE utilities. — alexsherstinsky committed 1 year ago
- Fixing a bug in the way "attention_factor" is validated in ROPE utilities. — alexsherstinsky committed 1 year ago
- Merge branch 'main' into bugfix/alexsherstinsky/fix_none_check_for_attention_factor_in_rope_scaling-2024_08_28-0 — alexsherstinsky committed 1 year ago
- Merge branch 'main' into bugfix/alexsherstinsky/fix_none_check_for_attention_factor_in_rope_scaling-2024_08_28-0 — alexsherstinsky committed 1 year ago
- Merge branch 'main' into bugfix/alexsherstinsky/fix_none_check_for_attention_factor_in_rope_scaling-2024_08_28-0 — alexsherstinsky committed 1 year ago
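The commit messages describe fixing how an optional `attention_factor` value is validated in the RoPE scaling utilities. The PR diff is not shown here, so the following is only a minimal sketch of the general pattern such a fix tends to take: a validation check that compares the value directly (e.g. `attention_factor < 0`) raises a `TypeError` when the config leaves it as `None`, and the fix is to guard the check with an explicit `None` test. The function name and error message below are hypothetical, not taken from the actual patch.

```python
def validate_attention_factor(attention_factor):
    """Validate an optional RoPE scaling attention factor (sketch, not the real patch).

    `attention_factor` is allowed to be None (meaning "use the default");
    a bare comparison like `attention_factor < 0` would raise TypeError
    for None, so we only validate when a value was actually provided.
    """
    if attention_factor is not None:
        if not isinstance(attention_factor, float) or attention_factor < 0.0:
            raise ValueError(
                f"`attention_factor` must be a non-negative float or None, "
                f"got {attention_factor!r}"
            )
```

With this guard, `validate_attention_factor(None)` passes silently, while a negative or non-float value still fails loudly.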