llama.cpp
Custom RoPE + better memory management for CUDA
#2295
Merged


ikawrakow merged 2 commits into master from ik/context_extend_cuda
ikawrakow
Custom RoPE + better memory management for CUDA
1cdbbbb3
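The commit above adds RoPE with a configurable frequency base and scale. A minimal sketch of how such a custom rotary embedding can be computed, assuming the usual pairwise rotation and two illustrative parameters `freq_base` (classic RoPE uses 10000) and `freq_scale` (linear position scaling for context extension) — names and signature here are illustrative, not llama.cpp's exact API:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Sketch of RoPE with a custom frequency base and position scale.
// Rotates consecutive pairs (x[2i], x[2i+1]) of a head vector at position pos.
std::vector<float> rope_custom(const std::vector<float> & x, int pos,
                               float freq_base, float freq_scale) {
    const int d = (int) x.size();
    std::vector<float> out(d);
    const float p = freq_scale * pos; // scaled position ("linear" context extension)
    for (int i = 0; i < d; i += 2) {
        // per-pair rotation angle: p * freq_base^(-i/d)
        const float theta = p * std::pow(freq_base, -(float) i / d);
        const float c = std::cos(theta);
        const float s = std::sin(theta);
        out[i]     = x[i] * c - x[i + 1] * s;
        out[i + 1] = x[i] * s + x[i + 1] * c;
    }
    return out;
}
```

With `freq_base = 10000` and `freq_scale = 1` this reduces to standard RoPE; halving `freq_scale` compresses positions, which is the common trick for running a model past its trained context length.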
ikawrakow requested a review from JohannesGaessler 2 years ago
Adjusted look ahead in ggml_cuda_pool_malloc to 5%
b068f2f4
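The second commit tunes the look-ahead margin in `ggml_cuda_pool_malloc` to 5%. A simplified sketch of the idea, using host `malloc` in place of `cudaMalloc` and an illustrative `Pool` type (not the actual ggml-cuda code): a cached buffer is reused only if it is at least the requested size and at most 5% larger, and fresh allocations carry the same 5% slack so they can absorb slightly larger future requests:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <vector>

struct PoolBuf { void * ptr; size_t size; };

// Illustrative buffer pool with a 5% look-ahead margin.
struct Pool {
    std::vector<PoolBuf> free_bufs;

    void * alloc(size_t size, size_t * actual_size) {
        const size_t look_ahead = (size_t)(1.05 * size); // 5% slack
        for (size_t i = 0; i < free_bufs.size(); ++i) {
            PoolBuf & b = free_bufs[i];
            // reuse only reasonably tight fits to limit wasted memory
            if (b.size >= size && b.size <= look_ahead) {
                void * ptr = b.ptr;
                *actual_size = b.size;
                free_bufs.erase(free_bufs.begin() + i);
                return ptr;
            }
        }
        // no suitable cached buffer: allocate with the 5% margin
        *actual_size = look_ahead;
        return std::malloc(look_ahead); // cudaMalloc in the real code
    }

    void release(void * ptr, size_t size) {
        free_bufs.push_back({ptr, size});
    }
};
```

The margin trades a small, bounded amount of over-allocation for fewer device allocations when tensor sizes fluctuate slightly between calls.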
ggerganov approved these changes on 2023-07-21
ikawrakow merged d924522a into master 2 years ago
ikawrakow deleted the ik/context_extend_cuda branch 2 years ago
