llama.cpp
Add support for new gfx1200 and gfx1201 targets #12372
Merged
IMbackK merged 7 commits into ggml-org:master from slojosic-amd:feature/gfx120X_targets
HIP: Add support for new gfx1200 and gfx1201 targets (2d7a1f9a)
HIP: Avoid fp32->fp16->fp32 conversion on RDNA4 (f2872aa8)
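
The commit above refers to skipping an fp32->fp16->fp32 round trip on RDNA4. As background, the sketch below (a generic illustration, not the PR's actual kernel code) emulates roughly how much precision such a round trip discards by rounding a float to fp16's ~11 significant bits:

    // Generic illustration of fp32 -> fp16 -> fp32 precision loss; not the
    // PR's kernel code. fp16 keeps about 11 significant bits (1 implicit +
    // 10 stored), so a value converted down and back up gets rounded.
    #include <cmath>
    #include <cstdio>

    // Approximate the round trip by rounding the mantissa to fp16 precision
    // (ignores fp16's narrower exponent range; illustration only).
    static float fp16_round_trip(float x) {
        int e = 0;
        float m = std::frexp(x, &e);            // x = m * 2^e, 0.5 <= |m| < 1
        m = std::round(m * 2048.0f) / 2048.0f;  // keep 11 significant bits
        return std::ldexp(m, e);
    }

    int main() {
        const float x = 0.1234567f;
        std::printf("fp32 value      : %.9f\n", x);
        std::printf("after round trip: %.9f\n", fp16_round_trip(x));
        return 0;
    }

Keeping intermediate values in fp32 on hardware that handles fp32 well avoids both this rounding and the extra conversion instructions.
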
slojosic-amd requested a review from JohannesGaessler 190 days ago
github-actions added the documentation, Nvidia GPU, and ggml labels
JohannesGaessler requested a review from IMbackK 187 days ago
IMbackK requested changes on 2025-03-16
cgmb commented on 2025-03-26
Addressed few comments from code review (42840e9f)
HIP: Fixed fp32->fp16->fp32 conversion on RDNA4 (d7680808)
Merge branch 'master' into feature/gfx120X_targets (f763866c)
bugfix (f18ad77d)
slojosic-amd requested a review from IMbackK 177 days ago
IMbackK requested changes on 2025-03-26
Additional code review changes (6b46213d)
IMbackK approved these changes on 2025-03-26
IMbackK merged bd40678d into master 176 days ago
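
With the new targets merged, the HIP backend can be built for RDNA4 cards. A minimal build sketch, assuming the HIP build flow from llama.cpp's build guide around the time of this PR (the GGML_HIP and AMDGPU_TARGETS flags come from that documentation and may differ between llama.cpp versions and ROCm releases):

    # Sketch only: compile llama.cpp's HIP backend for the new RDNA4 targets.
    HIPCXX="$(hipconfig -l)/clang" HIP_PATH="$(hipconfig -R)" \
        cmake -S . -B build \
              -DGGML_HIP=ON \
              -DAMDGPU_TARGETS="gfx1200;gfx1201" \
              -DCMAKE_BUILD_TYPE=Release
    cmake --build build --config Release -- -j 16

If only one of the two targets is needed, the AMDGPU_TARGETS list can be trimmed accordingly.
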
Reviewers: IMbackK, Beinsezii, cgmb, JohannesGaessler
Assignees: No one assigned
Labels: documentation, Nvidia GPU, ggml
Milestone: No milestone