llama.cpp
[WebGPU] Plug memory leaks and free resources on shutdown
#19315
Merged


nikhilJain17 Merge
df604979
nikhilJain17 Merge
5ae7583c
nikhilJain17 Fix memory leaks in shader lib, backend, backend_context, buffer_cont…
4a619f54
nikhilJain17 Free pools
f2eb8b92
nikhilJain17 Cleanup
2395b8a5
nikhilJain17 More cleanup
98cbfd29
nikhilJain17 commented on 2026-02-04
github-actions added the ggml label
nikhilJain17 Run clang-format
3b6a596f
nikhilJain17 Fix arg-parser and tokenizer test errors that free an unallocated buffer
df9e093b
nikhilJain17 Fix device lost callback to not print on device teardown
5dfef31f
nikhilJain17 Fix include and run clang-format
c2aa8a15
nikhilJain17 marked this pull request as ready for review 69 days ago
nikhilJain17 requested a review from reeselevine 69 days ago
reeselevine remove unused
73d367a1
reeselevine approved these changes on 2026-02-09
reeselevine Merge remote-tracking branch 'upstream/master' into nikhilJain17/clea…
6fe1d4d9
reeselevine Update binary ops
a088e051
reeselevine merged 57487a64 into master 64 days ago
