llama.cpp
musa: enable VMM support #9597
Merged

slaren merged 1 commit into ggml-org:master from makllama:musa_vmm
yeahdongcn committed mtgpu: enable VMM (bd95f2c3)
yeahdongcn force-pushed to bd95f2c3 1 year ago
yeahdongcn marked this pull request as ready for review 1 year ago
slaren approved these changes on 2024-09-26
slaren merged 7691654c into master 1 year ago

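For context (the PR carries no description in this capture): in ggml's CUDA backend, "VMM" refers to the growable device memory pool built on the driver's virtual memory management API (cuMemAddressReserve, cuMemCreate, cuMemMap, cuMemSetAccess). Enabling VMM support for MUSA presumably means wiring up the MUSA driver equivalents of those calls so Moore Threads GPUs get the same pool behavior. The sketch below illustrates the technique using the plain CUDA driver API; the assumed 1:1 MUSA mapping and the helper names (vmm_pool, pool_init, pool_alloc) are illustrative, not code from this PR.

```cpp
// Minimal sketch of a VMM-backed growable device pool, in the spirit of the
// pool used by ggml's CUDA backend. Assumption: the MUSA driver API exposes
// equivalents of these cu* calls; plain CUDA driver API is used here.
#include <cuda.h>
#include <cstdio>
#include <cstdlib>

#define CU_CHECK(x) do { CUresult r_ = (x); if (r_ != CUDA_SUCCESS) { \
    fprintf(stderr, "CUDA driver error %d at %s:%d\n", r_, __FILE__, __LINE__); exit(1); } } while (0)

struct vmm_pool {
    CUdeviceptr base        = 0;  // start of the reserved virtual range
    size_t      used        = 0;  // bytes handed out so far
    size_t      mapped      = 0;  // bytes backed by physical memory
    size_t      max_size    = 0;  // size of the reserved virtual range
    size_t      granularity = 0;  // minimum allocation/mapping granularity
    int         device      = 0;
};

// Reserve a large virtual address range up front; no physical memory yet.
// max_size is assumed to be a multiple of the allocation granularity.
void pool_init(vmm_pool & pool, int device, size_t max_size) {
    pool.device   = device;
    pool.max_size = max_size;

    CUmemAllocationProp prop = {};
    prop.type          = CU_MEM_ALLOCATION_TYPE_PINNED;
    prop.location.type = CU_MEM_LOCATION_TYPE_DEVICE;
    prop.location.id   = device;
    CU_CHECK(cuMemGetAllocationGranularity(&pool.granularity, &prop, CU_MEM_ALLOC_GRANULARITY_MINIMUM));

    CU_CHECK(cuMemAddressReserve(&pool.base, max_size, 0, 0, 0));
}

// Hand out memory from the pool, growing the physical backing on demand by
// mapping new allocations at the end of the already-mapped region.
void * pool_alloc(vmm_pool & pool, size_t size) {
    // round up so mapped offsets and sizes stay granularity-aligned
    size = (size + pool.granularity - 1) / pool.granularity * pool.granularity;

    if (pool.used + size > pool.mapped) {
        size_t grow = pool.used + size - pool.mapped;

        CUmemAllocationProp prop = {};
        prop.type          = CU_MEM_ALLOCATION_TYPE_PINNED;
        prop.location.type = CU_MEM_LOCATION_TYPE_DEVICE;
        prop.location.id   = pool.device;

        CUmemGenericAllocationHandle handle;
        CU_CHECK(cuMemCreate(&handle, grow, &prop, 0));
        CU_CHECK(cuMemMap(pool.base + pool.mapped, grow, 0, handle, 0));

        CUmemAccessDesc access = {};
        access.location.type = CU_MEM_LOCATION_TYPE_DEVICE;
        access.location.id   = pool.device;
        access.flags         = CU_MEM_ACCESS_FLAGS_PROT_READWRITE;
        CU_CHECK(cuMemSetAccess(pool.base + pool.mapped, grow, &access, 1));

        // the handle can be released; the mapping keeps the memory alive
        CU_CHECK(cuMemRelease(handle));
        pool.mapped += grow;
    }

    void * ptr = (void *)(pool.base + pool.used);
    pool.used += size;
    return ptr;
}
```

A current driver context is required before these calls (e.g. cuInit(0) plus cuDevicePrimaryCtxRetain/cuCtxSetCurrent, or any runtime API call that initializes the device). The point of reserving the virtual range once and mapping physical memory lazily is that the pool can grow without moving: the base pointer never changes, so pointers into the already-used region stay valid as the pool expands.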