wanlijin's repositories

tensorrt-inference-server

The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.

License: BSD-3-Clause · Stargazers: 0 · Issues: 0

incubator-brpc

Industrial-grade RPC framework used throughout Baidu, with 1,000,000+ instances and thousands of kinds of services; called "baidu-rpc" inside Baidu.

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

tvm

Open deep learning compiler stack for CPUs, GPUs, and specialized accelerators

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

Reco-papers

Classic papers and resources on recommendation

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

Paddle

PArallel Distributed Deep LEarning (PaddlePaddle)

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

toy

Just for testing

License: Apache-2.0 · Stargazers: 0 · Issues: 0

k-vim

Vim configuration

Language: VimL · Stargazers: 0 · Issues: 0