llama.cpp
Fix FlashAttention debug test, FP32 assert #7684 (Merged)

JohannesGaessler: Fix FlashAttention debug test, FP32 assert (commit 45102363)
github-actions added labels: testing, Nvidia GPU, ggml
slaren approved these changes on 2024-06-01
JohannesGaessler merged e141ce62 into master 1 year ago
