transformers
4e94c6c0 - Fix position embeddings for GPT-J and CodeGen (#22069)

Committed 2 years ago
Fix position embeddings for GPT-J and CodeGen (#22069)

* Revert "[GPT-J] add deprecation warning (#21869)"

  This reverts commit fb76994c41d1eaf09e50020cbd849d3bb686b6a3.

* Fix position embeddings for GPT-J and CodeGen
* Address review comments from @gante
* Fix "Copied from" comment referencing the wrong function
* Fix copy/paste mistake
* Fix training path
* Hopefully make torch.fx happy
* Move position_ids long cast
* Revert "Hopefully make torch.fx happy"

  This reverts commit e41a6f4cad3ff441124c7457b19cfb630d4ca025.

* Changes to help with torch.fx tracing
* Linter fix
* Correct position_ids tensor type hint
* Work around torch.fx tracing issue
* Get the changes to work with torch.fx
* Address review comment from @michaelbenayoun
* Another small adjustment
* Add explanatory comment; small code tidy-up
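The recurring theme in these bullets is how `position_ids` are built and typed: positions must continue from the cached past during generation, and must be integer ("long") values so that embedding lookups and graph tracers like torch.fx behave. As a rough, framework-free sketch of that idea (function names here are illustrative, not the actual Transformers modeling code), GPT-J-style rotary position embeddings derive per-position sin/cos tables from those integer positions:

```python
import math

def make_position_ids(past_length, seq_length):
    # Positions continue from the cached past, as in incremental decoding.
    # They stay integers ("long" in torch terms): float positions would
    # break embedding-style indexing and confuse symbolic tracers.
    return list(range(past_length, past_length + seq_length))

def rotary_sin_cos(position_ids, rotary_dim, base=10000.0):
    # Standard rotary frequencies: inv_freq[i] = base ** (-2*i / rotary_dim).
    inv_freq = [base ** (-(2 * i) / rotary_dim) for i in range(rotary_dim // 2)]
    # One row of sin/cos values per position.
    sin = [[math.sin(p * f) for f in inv_freq] for p in position_ids]
    cos = [[math.cos(p * f) for f in inv_freq] for p in position_ids]
    return sin, cos
```

For example, with a past of 3 cached tokens and 4 new tokens, `make_position_ids(3, 4)` yields `[3, 4, 5, 6]`, and position 0 always produces all-zero sines and all-one cosines.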