Well, I did notice how nVidia left out the part where the CUDA C API had a known de-allocation bug that would crash on exit, and it went unfixed for years. Thus, C++ became the de facto interface for their GPU library.
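For context, the usual defensive pattern against exit-time teardown crashes is to release everything explicitly and reset the device before `main()` returns, rather than relying on the runtime's cleanup at process exit. A minimal sketch of that pattern (not NVIDIA's documented fix for the bug described above, just the common workaround style) using the real runtime calls `cudaMalloc`, `cudaFree`, and `cudaDeviceReset`:

```cpp
// Sketch: deterministic CUDA cleanup before exit, instead of relying on
// implicit teardown when the process terminates.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    float *d_buf = nullptr;
    if (cudaMalloc(&d_buf, 1024 * sizeof(float)) != cudaSuccess) {
        std::fprintf(stderr, "cudaMalloc failed\n");
        return 1;
    }

    // ... kernel launches would go here ...

    // Free device memory explicitly rather than leaving it to
    // exit-time cleanup.
    cudaFree(d_buf);

    // Tear down the device context deterministically, before static
    // destructors and atexit handlers run.
    cudaDeviceReset();
    return 0;
}
```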
All code is terrible, but some of it is useful (except rust and VB). ;)