mmaitre314 / ONNX-DirectML

Python ML inference acceleration on Windows using commodity GPUs

A Jupyter notebook testing DirectML support of ONNX on Windows using an Intel GPU.

Getting Started

Create the conda environment, register it as a Jupyter kernel, and start JupyterLab. Open an Anaconda prompt (Miniconda is sufficient) and run:

conda env create -f conda.yaml
conda activate onnx-directml
python -m ipykernel install --user --name=onnx-directml
jupyter lab
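
To confirm that the environment exposes ONNX Runtime's DirectML execution provider (an optional sanity check, not one of the repo's documented steps), you can run the following in the same prompt:

python -c "import onnxruntime; print(onnxruntime.get_available_providers())"

DmlExecutionProvider should appear in the printed list, assuming conda.yaml installs a DirectML-enabled ONNX Runtime package such as onnxruntime-directml.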

Then open notebook.ipynb and continue there.
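
The notebook is the authoritative walkthrough; as a rough sketch of the kind of inference it exercises, this is how ONNX Runtime is typically pointed at the DirectML execution provider (the model file and input shape below are hypothetical placeholders, not files from this repo):

import numpy as np
import onnxruntime as ort

# Ask for DirectML first and fall back to CPU if it is unavailable.
session = ort.InferenceSession(
    "model.onnx",  # hypothetical model file
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# Feed a random tensor shaped like a typical image-classification input.
input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: batch})
print(outputs[0].shape)

Passing the providers list in priority order is the standard ONNX Runtime way of selecting an execution provider, so the same script still runs (on the CPU) on machines without DirectML.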

Note: if Jupyter does not load in Chrome, open chrome://net-internals/#hsts, use Delete domain security policies to delete localhost, then try again.

About


License: MIT License

