llama.cpp
Pull Request #6464 (Merged)
[SYCL] Fixed minor bug when enabling FP16 for non-Intel targets

Commits (OuadiElfarouki):
40704232  moved INTEL_MKL guard from gemm_impl to gemm (wrapper)
f746e707  Merge branch 'master' into sycl_fix_non_intel_fp16
84ef62e8  Update ggml-sycl.cpp
c0c4b309  Merge branch 'master' into sycl_fix_non_intel_fp16
a7c67582  Merge branch 'master' into sycl_fix_non_intel_fp16
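
The substantive commit relocates the INTEL_MKL guard from the inner gemm_impl into the gemm wrapper in ggml-sycl.cpp, presumably so that FP16 builds for non-Intel targets dispatch to a portable path rather than compiling in an MKL-only one. The sketch below only illustrates that guard-relocation pattern; the macro name, function signatures, and kernel bodies are placeholders, not the actual llama.cpp code.

```cpp
// Minimal sketch (assumed names, not the real ggml-sycl.cpp code) of the
// guard-relocation pattern: the preprocessor check lives in the wrapper,
// and the implementation functions stay guard-free.
#include <cstdio>

// Portable implementation every backend can compile (naive n x n GEMM).
static void gemm_impl_generic(const float * a, const float * b, float * c, int n) {
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) {
            float acc = 0.0f;
            for (int k = 0; k < n; ++k) acc += a[i*n + k] * b[k*n + j];
            c[i*n + j] = acc;
        }
}

#ifdef USE_INTEL_MKL // hypothetical macro, standing in for the real INTEL_MKL guard
// In the real code this branch would call the oneMKL-backed FP16 GEMM;
// here it just reuses the generic kernel so the sketch compiles anywhere.
static void gemm_impl_mkl(const float * a, const float * b, float * c, int n) {
    gemm_impl_generic(a, b, c, n);
}
#endif

// Wrapper: the vendor check is made once, here, so non-Intel targets never
// touch the MKL-only code path.
static void gemm(const float * a, const float * b, float * c, int n) {
#ifdef USE_INTEL_MKL
    gemm_impl_mkl(a, b, c, n);
#else
    gemm_impl_generic(a, b, c, n);
#endif
}

int main() {
    const float a[4] = {1, 2, 3, 4}; // 2 x 2 matrices
    const float b[4] = {5, 6, 7, 8};
    float c[4] = {0};
    gemm(a, b, c, 2);
    std::printf("c = [%g %g; %g %g]\n", c[0], c[1], c[2], c[3]);
    return 0;
}
```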
Reviews:
AidanBeltonS approved these changes on 2024-04-04
abhilash1910 approved these changes on 2024-04-04
abhilash1910 requested a review from ggerganov 1 year ago
abhilash1910 merged 1b496a74 into master 1 year ago
