llama.cpp
llama : switch to floating-point token positions #5679 (Open)

ggerganov wants to merge 4 commits into master from gg/float-pos
ggerganov: llama : switch to floating-point token positions (fc775366)
ggerganov force-pushed from 28ad0a6d to 06f92202 (1 year ago)
ggerganov: ggml : add I32 <-> F32 conversion (8772658b)
ggerganov force-pushed from 06f92202 to 8772658b (1 year ago)
ggerganov: batched.swift : fix build (fff1e8a5)
ggerganov: swift : fix build (608f4498)
mofosyne added labels: refactoring, Review Complexity : High
ggerganov added label: demo

Reviewers: no reviews
Assignees: no one assigned