llama.cpp
ggml-alloc : fix leak when reusing a tensor with a larger size #16679 (Merged)


slaren merged 1 commit into master from sl/fix-alloc-inline-leak
github-actions added the ggml label
slaren changed the title from "ggml-alloc : free leak when reusing a tensor with a larger size" to "ggml-alloc : fix leak when reusing a tensor with a larger size" 55 days ago
slaren added commit 64c66065: "ggml-alloc : fix leak when reusing a tensor with a larger size"
slaren force-pushed to 64c66065 55 days ago
slaren marked this pull request as ready for review 55 days ago
ggerganov approved these changes on 2025-10-20
slaren merged b617cfd2 into master 55 days ago
slaren deleted the sl/fix-alloc-inline-leak branch 55 days ago

