Add GPT-2 with flash attention #1889
danieldk force-pushed from e3c55276 to 2261b92c (1 year ago)
danieldk force-pushed from 2261b92c to 0b66a507 (1 year ago)
Add GPT-2 with flash attention (commit 8acd1267)
danieldk force-pushed from 0b66a507 to 8acd1267 (1 year ago)
Narsil approved these changes on 2024-05-15
Narsil merged b5bc6e5c into main (1 year ago)
Narsil deleted the feature/flash-gpt2 branch (1 year ago)