AMDGPU.jl

AMD GPU (ROCm) programming in Julia

Quick start

AMDGPU.jl can be installed with the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and run:

pkg> add AMDGPU

Or, equivalently, via the Pkg API:

julia> import Pkg; Pkg.add("AMDGPU")
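
Once installed, a quick way to check that the package works is to move an array to the GPU and broadcast over it. This is a minimal sketch using ROCArray, the GPU array type exported by AMDGPU.jl:

using AMDGPU

a = ROCArray(rand(Float32, 1024))   # copy data to GPU memory
b = a .^ 2 .+ 1                     # broadcasting executes on the GPU
Array(b)                            # copy the result back to the host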

Project Status

The package is tested against, and developed for, Julia 1.6 and above. Only 64-bit Linux is currently supported, and this will remain the case until ROCm is ported to other platforms. A version of Julia built with LLVM 9.0 or higher is recommended. The package is under active maintenance and is reasonably complete; however, not all features, and especially performance, are yet on par with CUDA.jl.

Supported Functionality

Feature                        Supported   Notes
Host-side kernel launches      ✔️          See #58 (usage sketch below)
Dynamic parallelism
Local (shared) memory          ✔️
Coarse-grained memory          ✔️
Page-locked (pinned) memory    ✔️
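
Below is a minimal sketch of a host-side kernel launch, assuming the ROCArray type, the @roc launch macro, and the workitemIdx device intrinsic exported by AMDGPU.jl. Launch keywords and synchronization details have varied between releases, so consult the documentation for your version.

using AMDGPU

# Element-wise addition kernel: each work-item handles one element.
function vadd!(c, a, b)
    i = workitemIdx().x
    @inbounds c[i] = a[i] + b[i]
    return
end

N = 32
a = ROCArray(rand(Float32, N))
b = ROCArray(rand(Float32, N))
c = similar(a)

# Launch a single workgroup of N work-items; N must not exceed the
# device's maximum workgroup size.
@roc groupsize=N vadd!(c, a, b)

# Depending on the AMDGPU.jl version, you may need to synchronize
# (e.g. wait on the event returned by @roc) before reading the result.
Array(c)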

Questions and Contributions

Usage questions can be posted on the Julia Discourse forum under the GPU domain and/or in the #gpu channel of the Julia Slack.

Contributions are very welcome, as are feature requests and suggestions. Please open an issue if you encounter any problems.

Acknowledgment

AMDGPU.jl would not have been possible without the work of Tim Besard and the contributors to CUDA.jl and LLVM.jl.

License

AMDGPU.jl is licensed under the MIT License.
