llama.cpp
c95fa362
ci: [SYCL] ggml-ci Use main GPU and enable sysman (#12547)
Committed 167 days ago
References
#12547 - ci: [SYCL] Use main GPU and enable sysman
Author
qnixsynapse
Parents
2b65ae30