llama.cpp
OpenELM support #7359 (Merged)
Commits (16)
Initial OpenELM support (270M only so far) (icecream95, 1 year ago)
Fill out missing entries in llama_model_type_name (icecream95, 1 year ago)
fixup! Initial OpenELM support (270M only so far) (icecream95, 1 year ago)
Merge branch 'master' into openelm (compilade, 1 year ago)
llama : support all OpenELM models (compilade, 1 year ago)
llama : minor spacing changes (compilade, 1 year ago)
llama : use std::array for per-layer hparams (ggerganov, 1 year ago; see the sketch after this list)
llama : fix save/load state (ggerganov, 1 year ago)
llama : do not print hparams for vocab-only models (ggerganov, 1 year ago)
Merge branch 'master' into pr/7359 (ggerganov, 1 year ago)
llama : handle n_head == 0 (ggerganov, 1 year ago)
Merge branch 'master' into pr/7359 (ggerganov, 1 year ago)
llama : use const ref for print_f and fix division by zero (compilade, 1 year ago)
Merge branch 'master' into openelm (compilade, 1 year ago)
llama : fix t5 uses of n_head and n_ff (compilade, 1 year ago)
llama : minor comment (ggerganov, 1 year ago)
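Two of the commits above concern how per-layer hyperparameters are stored and how a zero head count is handled. The sketch below is a minimal illustration of that idea, assuming hypothetical names (model_hparams, n_head_arr, n_ff_arr, n_embd_head); it is not the actual llama.cpp code from this PR.

```cpp
// Hypothetical sketch: per-layer hyperparameters in fixed-size std::array,
// with a guard so a layer with n_head == 0 never causes a division by zero.
// Names and layout are illustrative, not llama.cpp's real identifiers.
#include <array>
#include <cstdint>
#include <cstdio>

constexpr size_t MAX_LAYERS = 128;

struct model_hparams {
    uint32_t n_layer = 0;
    uint32_t n_embd  = 0;

    // OpenELM varies the attention head count and FFN width per layer,
    // so scalar n_head/n_ff fields become per-layer arrays.
    std::array<uint32_t, MAX_LAYERS> n_head_arr{};
    std::array<uint32_t, MAX_LAYERS> n_ff_arr{};

    uint32_t n_head(uint32_t il) const { return n_head_arr[il]; }
    uint32_t n_ff  (uint32_t il) const { return n_ff_arr[il]; }

    // Derived quantity: head dimension. Guard against n_head == 0.
    uint32_t n_embd_head(uint32_t il) const {
        const uint32_t nh = n_head(il);
        return nh > 0 ? n_embd / nh : 0;
    }
};

int main() {
    model_hparams hp;
    hp.n_layer    = 3;
    hp.n_embd     = 1280;
    hp.n_head_arr = {12, 16, 0};       // last layer has no attention heads
    hp.n_ff_arr   = {768, 1024, 1024};

    for (uint32_t il = 0; il < hp.n_layer; ++il) {
        std::printf("layer %u: n_head=%u n_ff=%u n_embd_head=%u\n",
                    il, hp.n_head(il), hp.n_ff(il), hp.n_embd_head(il));
    }
    return 0;
}
```

A fixed-size std::array avoids heap allocation and keeps the struct trivially copyable, at the cost of a compile-time cap on the layer count.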