mzl's repositories
Traditional-CV
Algorithms for image recognition
Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT & GPT-2
ChatGLM2-6B
ChatGLM2-6B: An Open Bilingual Chat LLM (open-source bilingual dialogue language model)
Language: Python | License: NOASSERTION
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Language: Python | License: MIT
NetWork-Chinese-chess-
Online network play for Chinese chess (Xiangqi)
PHPcodeigniter-school-lost-and-found
A school lost-and-found website built with PHP CodeIgniter
DPCT-samples
Samples for Intel oneAPI toolkits
Language: HTML | License: MIT
LLaVA
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
Language: Python | License: Apache-2.0
optimum-habana
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Language: Python | License: Apache-2.0
pratice-for-exam
Practice exercises for exams
Simple-NetWork-Server
A simple Asio network service demo
stanford_alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
Language: Python | License: Apache-2.0