llama.cpp
Support attention_bias on LLaMA architecture
#4283
Merged

RealJosephus Support attention_bias on LLaMA architecture
c48679a8
slaren commented on 2023-12-01
RealJosephus check existence of qkvo bias while loading llama models
e192572d
RealJosephus Update llama.cpp
b1efaed3
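The commits above make the Q/K/V/O bias tensors optional: the loader checks whether each bias exists in the model file, and the attention graph adds a bias only when it was loaded. A minimal standalone sketch of that pattern (hypothetical `Projection` type; llama.cpp itself expresses this with ggml tensors and a conditional `ggml_add` in the compute graph):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of an attention projection whose bias may be absent.
// An empty bias vector stands in for a bias tensor that was not
// present in the model file.
struct Projection {
    std::vector<float> weight; // row-major [n_out x n_in]
    std::vector<float> bias;   // empty when the model has no bias
    size_t n_in, n_out;

    std::vector<float> apply(const std::vector<float> &x) const {
        std::vector<float> y(n_out, 0.0f);
        for (size_t o = 0; o < n_out; ++o) {
            for (size_t i = 0; i < n_in; ++i)
                y[o] += weight[o * n_in + i] * x[i];
            if (!bias.empty()) // add the bias only if it was loaded
                y[o] += bias[o];
        }
        return y;
    }
};
```

Models without attention bias (the original LLaMA checkpoints) take the no-bias path unchanged, while models that ship Q/K/V/O biases get them applied after the corresponding matmul.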
RealJosephus requested a review from ggerganov 2 years ago
ggerganov approved these changes on 2023-12-01
ggerganov merged 03562f3a into master 2 years ago
cebtenzzre commented on 2023-12-01
