llama.cpp · commit c95fa362
Commit (112 days ago)
ci: [SYCL] ggml-ci Use main GPU and enable sysman (#12547)
References: #12547 - ci: [SYCL] Use main GPU and enable sysman
Author: qnixsynapse
Parents: 2b65ae30
Files (1): ci/run.sh
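A minimal sketch of the kind of change the title describes, assuming the SYCL CI job in ci/run.sh pins the run to the primary GPU via ONEAPI_DEVICE_SELECTOR and enables Level Zero Sysman via ZES_ENABLE_SYSMAN. The guard variable GG_BUILD_SYCL and the exact lines are assumptions for illustration, not the actual diff.

    # Hypothetical excerpt (not the actual diff): run the SYCL CI job on the
    # main GPU only and turn on Sysman so the Level Zero driver exposes
    # device memory/utilization queries.
    if [ -n "${GG_BUILD_SYCL}" ]; then
        export ONEAPI_DEVICE_SELECTOR="level_zero:0"   # restrict to the main GPU
        export ZES_ENABLE_SYSMAN=1                     # enable Level Zero Sysman
    fi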