# Accelerated pseudospectral calculations using KernelAbstractions.jl
- Clone this repo.
- Inside the Julia REPL (preferably within a top-level environment, e.g. `@v1.10`), run `]dev path/to/KAPseudospectra.jl`.
- You may need to run `]instantiate` to resolve the package dependencies of `KAPseudospectra.jl`.
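The REPL steps above can also be scripted with Julia's `Pkg` API; a minimal sketch, assuming the repository has been cloned to `path/to/KAPseudospectra.jl`:

```julia
using Pkg

# Equivalent of `]dev path/to/KAPseudospectra.jl` in the REPL:
Pkg.develop(path="path/to/KAPseudospectra.jl")

# Equivalent of `]instantiate`, resolving the package's dependencies:
Pkg.instantiate()
```

This is handy for CI scripts or for setting up the package non-interactively with `julia -e '...'`.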
Check out `examples/` for three scripts that showcase usage of this package:

- `ihlpsa_backends.jl` -- a good starting point, showing how to switch between device-specific backends (`CUDA` and `AMDGPU` have been tested thus far). Note that `AMDGPU` currently requires running Julia with a single thread (there is a bug in Julia when running with multiple threads, likely related to premature garbage collection).
- `test_real_structured_psa.jl` -- compute structured/unstructured pseudospectra for a matrix using `CPU()` and plot them together.
- `test_ihlpsa_large.jl` -- compute pseudospectra for increasingly large matrices using `CUDABackend()`, writing timing information and plots to `examples/test_large_results/`.
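In KernelAbstractions.jl, backend switching works by passing a backend object when instantiating a kernel, so the same kernel source runs on CPU or GPU. A minimal sketch of that mechanism (independent of this package's own API, which `ihlpsa_backends.jl` demonstrates):

```julia
using KernelAbstractions

# A trivial kernel: y .= a .* x, written once, runnable on any backend.
@kernel function scale!(y, @Const(x), a)
    i = @index(Global)
    y[i] = a * x[i]
end

backend = CPU()                  # swap for CUDABackend() (from CUDA.jl)
                                 # or ROCBackend() (from AMDGPU.jl)
x = rand(Float32, 1024)
y = similar(x)

scale!(backend)(y, x, 2.0f0; ndrange=length(x))
KernelAbstractions.synchronize(backend)
```

For a GPU backend, the arrays would also be allocated on the device (e.g. `CuArray`s); the kernel body itself is unchanged.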
Run `]add LinearAlgebra MatrixDepot Plots LaTeXStrings PyPlot KernelAbstractions` to be able to run `ihlpsa_backends.jl` and `test_real_structured_psa.jl`.

Optionally, run `]add CUDA` if you have a CUDA-enabled device, in order to run `test_ihlpsa_large.jl`.