wanlijin's repositories
tensorrt-inference-server
The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.
License: BSD-3-Clause
incubator-brpc
Industrial-grade RPC framework used throughout Baidu, with 1,000,000+ instances and thousands of kinds of services; called "baidu-rpc" inside Baidu.
Language: C++ · License: Apache-2.0
tvm
Open deep learning compiler stack for CPUs, GPUs, and specialized accelerators
Language: Python · License: Apache-2.0
Reco-papers
Classic papers and resources on recommendation
Language: Python · License: MIT
Paddle
PArallel Distributed Deep LEarning
Language: C++ · License: Apache-2.0
toy
Just for testing
License: Apache-2.0
k-vim
Vim configuration
Language: VimL