lightvector / KataGo

GTP engine and self-play learning in Go

Home Page: https://katagotraining.org/


Proposal for Docker Integration to Test KataGo Versions Across Various Environments

TTXS123OK opened this issue · comments

Hello,

I've been exploring the KataGo project and appreciate the robust capabilities it offers for Go game analysis. One challenge I've noticed is that KataGo often depends on newer versions of CUDA and TensorRT, and users may not keep their local environments up to date with the latest versions, which can make setup inconvenient.

To address this, I propose establishing a Dockerized environment for building and testing KataGo, which would allow users and developers to run and test KataGo versions in a more controlled and consistent manner, without having to manage CUDA and TensorRT directly on their physical systems.

If this seems like a valuable addition, I am willing to contribute by updating and maintaining a functional Dockerfile for this repo. This Dockerfile would aim to:

  • Enable developers and users to test different versions of KataGo against various versions of CUDA and TensorRT.
  • Facilitate easier bug reporting and feature testing by standardizing the testing environment.
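To make the idea concrete, here is a minimal sketch of the kind of Dockerfile I have in mind. The base-image tag, build arguments, and KataGo ref below are illustrative placeholders, not tested configurations; the build steps follow KataGo's documented CMake-based compilation for the CUDA backend:

```dockerfile
# Sketch only: CUDA_TAG and KATAGO_REF are placeholders, to be pinned
# per test combination via --build-arg.
ARG CUDA_TAG=12.1.0-cudnn8-devel-ubuntu22.04
FROM nvidia/cuda:${CUDA_TAG}

ARG KATAGO_REF=master

# Build dependencies for KataGo's C++ engine
RUN apt-get update && apt-get install -y --no-install-recommends \
        git cmake g++ zlib1g-dev libzip-dev ca-certificates && \
    rm -rf /var/lib/apt/lists/*

RUN git clone https://github.com/lightvector/KataGo.git /katago && \
    cd /katago && git checkout "${KATAGO_REF}"

WORKDIR /katago/cpp
RUN cmake . -DUSE_BACKEND=CUDA && make -j"$(nproc)"

ENTRYPOINT ["./katago"]
```

Running the resulting image would require the NVIDIA Container Toolkit on the host (e.g. `docker run --gpus all ...`), so the host still needs a compatible driver, but not a matching CUDA/TensorRT install.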

I believe this could enhance the accessibility and usability of KataGo for a wider audience.

Please let me know if this aligns with the project's goals, and if there are any specific requirements or preferences for such an integration.

Thank you for considering this proposal.

If you use or can use the nix package manager, I maintain a nix derivation of katago, which also serves the purpose of creating an environment with the exact library versions required to run it, without the overhead of a virtualized system.

Yes, that's indeed a great idea. The nix package manager lets users easily install KataGo in their own environment. However, I still believe that for compatibility testing and benchmarking across different versions of KataGo, CUDA, and TensorRT, Docker offers certain advantages. In time, I might set up my own open-source repository for this purpose. If there's similar demand from others, I'll provide a link in this issue later.
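As a sketch of the kind of matrix testing I mean (the image tags and KataGo versions here are illustrative placeholders, and it assumes a Dockerfile that accepts `CUDA_TAG` and `KATAGO_REF` build arguments), a small script could generate one build per CUDA/KataGo combination:

```shell
#!/bin/sh
# Dry-run sketch: print one docker build command per CUDA/KataGo combination.
# The CUDA image tags and KataGo refs below are placeholders.
gen_builds() {
    for cuda in 11.8.0-cudnn8-devel-ubuntu22.04 12.1.0-cudnn8-devel-ubuntu22.04; do
        for ref in v1.13.0 v1.14.1; do
            # Tag images as e.g. katago:v1.14.1-cuda12.1.0
            echo "docker build --build-arg CUDA_TAG=$cuda" \
                 "--build-arg KATAGO_REF=$ref -t katago:$ref-cuda${cuda%%-*} ."
        done
    done
}

gen_builds
```

Piping the output into `sh` (or a CI job matrix) would then exercise every combination in one pass.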