salehjg / llm.sycl

The SYCL version of llm.c (final project for the 2024 HPC course, UNISA)

LLM.SYCL

This project is a partial translation of the llm.c repository from C/CUDA to C++ with SYCL.
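
For a flavor of what the translation involves, the following minimal sketch (illustrative only, not code from this repo) shows how a CUDA kernel and its launch map onto a SYCL parallel_for over an nd_range:

#include <sycl/sycl.hpp>

// CUDA original (illustrative):
//   __global__ void scale(float *x, float a, int n) {
//       int i = blockIdx.x * blockDim.x + threadIdx.x;
//       if (i < n) x[i] *= a;
//   }
//   scale<<<(n + 255) / 256, 256>>>(d_x, a, n);

// SYCL translation: the grid/block launch configuration becomes an
// nd_range, and the kernel body becomes a lambda submitted to a queue.
// `x` is assumed to be USM device memory from sycl::malloc_device.
void scale(sycl::queue &q, float *x, float a, int n) {
    const int block = 256;
    const int grid = (n + block - 1) / block;
    q.parallel_for(sycl::nd_range<1>(grid * block, block),
                   [=](sycl::nd_item<1> it) {
                       int i = it.get_global_id(0);
                       if (i < n) x[i] *= a;
                   }).wait();
}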

How to

Prepare

You need to have the oneAPI and CUDA SDKs installed. The code has been tested with the following versions:

  • oneAPI: 2021.4
  • CUDA: 12.2

Furthermore, you need Python 3 with numpy and torch installed to run the training. The dataset is fetched automatically.
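
After sourcing the environments, you can sanity-check that the SYCL runtime sees your devices (the CUDA back-end should list the NVIDIA GPU). A small stand-alone check, illustrative and not part of the repo, compiled with icpx -fsycl:

#include <sycl/sycl.hpp>
#include <iostream>

int main() {
    // Enumerate every platform/device the SYCL runtime can see.
    for (const auto &p : sycl::platform::get_platforms()) {
        std::cout << p.get_info<sycl::info::platform::name>() << "\n";
        for (const auto &d : p.get_devices()) {
            std::cout << "  " << d.get_info<sycl::info::device::name>() << "\n";
        }
    }
    return 0;
}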

Train

Refer to the README in data/ for instructions on training the model. This step is required before running either the CUDA or the SYCL implementation.

Build

Source the oneAPI and CUDA environments, then:

mkdir build && cd build
CC=icx CXX=icpx cmake ..
# optional: ccmake .. to adjust build options interactively
make -j

This will give you the LLM_SYCL, OrigTrain, and TestAll executables.

Run

To run the original CUDA code, minimally modified to disable training and to dump intermediate tensors as gold values:

./OrigTrain -b 1

To run the SYCL code:

./LLM_SYCL -s --batch 1 -x -g 10 -y

Set -g to a larger value to generate more text. See -h for more details.

To run the test suite:

./TestAll

Verify

The output of the SYCL code should be similar to the output of the CUDA code. For a more detailed comparison against the gold (CUDA) implementation, use the data/compare.py script:

./build/OrigTrain -b 1
./build/LLM_SYCL -s --batch 1 -g 10
python data/compare.py

Note that here the SYCL implementation runs with profiling and intermediate tensor dumping enabled; this is the default configuration for the modified CUDA implementation.
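
Conceptually, the comparison checks each dumped SYCL tensor element-wise against its CUDA counterpart within a tolerance; the actual file handling and thresholds live in data/compare.py. A hedged C++ sketch of that kind of check:

#include <cmath>
#include <cstddef>
#include <cstdio>

// Returns true if every element of `test` lies within an absolute plus
// relative tolerance of the gold (CUDA) value. Illustrative only; the
// repo's real comparison is done by data/compare.py on the dumped files.
bool all_close(const float *gold, const float *test, size_t n,
               float atol = 1e-4f, float rtol = 1e-3f) {
    for (size_t i = 0; i < n; ++i) {
        float tol = atol + rtol * std::fabs(gold[i]);
        if (std::fabs(gold[i] - test[i]) > tol) {
            std::printf("mismatch at %zu: gold=%f, test=%f\n", i, gold[i], test[i]);
            return false;
        }
    }
    return true;
}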

Credits

This repo was developed as the final project for the 2024 HPC course of Prof. B. Cosenza at the University of Salerno. It builds on open-source work, most notably the llm.c repository.

License

MIT License

