Reply to Official One-Click Local LLM Deployment for 2019 Mac Pro (7,1) Dual W6900X
ROCm does not support the W6900X; it is absent from the supported-hardware list at https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html. You will likely need to use the Vulkan backend instead, or request that AMD add support. The W6800X and W6800X Duo should work fine, as their architecture is the same as the W6800's.

Infinity Fabric Link does not work on Linux. There is an open bug report against the AMDGPU kernel driver (https://gitlab.freedesktop.org/drm/amd/-/work_items/3793), and I doubt AMD will fix it anytime soon. You will also likely need a distribution provided by the T2 Linux project, or a kernel with the T2 patches, since the Mac Pro's T2 chip is otherwise unsupported on Linux.
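For the Vulkan route, one common option is llama.cpp's Vulkan backend. A minimal sketch, assuming the Vulkan SDK and working Vulkan drivers (e.g. Mesa RADV) are installed; the model path is illustrative:

```shell
# Build llama.cpp with the Vulkan backend instead of ROCm/HIP
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j"$(nproc)"

# Confirm both W6900X GPUs are visible to Vulkan
vulkaninfo --summary

# Offload all layers to the GPUs; without Infinity Fabric Link,
# --split-mode layer splits the model across both cards over PCIe
./build/bin/llama-cli -m /path/to/model.gguf -ngl 99 --split-mode layer -p "Hello"
```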
Topic: Machine Learning & AI · SubTopic: General
Apr ’26