karpathy / llm.c

LLM training in simple, raw C/CUDA

apparent compatibility issues with earlier c++ versions after recent pushes

hafezmg48 opened this issue · comments

So I had an earlier version of the code which would compile `train_gpt2cu` just fine on my system, with:

- NVIDIA driver 545.23.08
- CUDA 12.3
- gcc 9.4.0
- Ubuntu 20.04

But the recent updates of the code, specifically the one that added the .cuh files (matmul.cuh), made the code incompatible with my system, and compiling it results in errors:

llmc/matmul.cuh(16): error: namespace "std" has no member "bool_constant"
                                               std::bool_constant<UseAuxBuffer>) {

and other places.
I did some searching; apparently this is a C++ version compatibility issue, and setting the flag -std=c++17 resolves it, since std::bool_constant was introduced in C++17. I was wondering if it is possible to avoid this? Many of us don't have day-of-release updates of these libraries, and updating drivers and compiler versions would cause a lot of other hardware compatibility issues for everyone. So I was wondering if such features could be avoided when they are not necessary, or macros added to keep supporting earlier compiler and CUDA versions.

After all, as @karpathy mentioned, this project also has educational purposes, and for someone like me it is getting a little hard to follow as well. But anyway, thank you everyone for your great work and teachings.

I'm sorry, this is why I am against the use of fancy/new features and will accept PRs that remove them.

Note that g++ 9 had its latest release in 2022, cuBLAS and CUTLASS both officially require C++17, and GCC has had essentially full C++17 support since GCC 7 (core language, though not all library features).

The thing that is actually broken is just that we don't set the correct compile options; C++17 became the default only in later GCC releases.
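For anyone hitting this before the fix is merged, a sketch of the workaround is to pass the standard explicitly to nvcc when building the CUDA trainer. The exact flags and file names below follow common llm.c usage but may differ from your local Makefile, so treat this as illustrative:

```shell
# Sketch: force C++17 for host and device code when compiling the CUDA trainer.
# Check the repo's Makefile for the authoritative flag set.
nvcc -O3 -std=c++17 train_gpt2.cu -lcublas -lcublasLt -o train_gpt2cu
```

Equivalently, the standard flag can be appended to the nvcc flags variable in the Makefile so it applies to every CUDA compilation.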

@ngc92 thanks for clarifying. closing and will merge the other PR.