Yuqi's repositories
Awesome-Implicit-NeRF-Robotics
A comprehensive list of implicit representation and NeRF papers relating to the Robotics/RL domain, including papers, code, and related websites
ChatGLM-6B
ChatGLM-6B: an open-source bilingual dialogue language model
CLIP
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
DearPyGui
Dear PyGui: A fast and powerful Graphical User Interface Toolkit for Python with minimal dependencies
face-alignment
:fire: 2D and 3D face alignment library built using PyTorch
infinigen
Infinite Photorealistic Worlds using Procedural Generation
JNeRF
JNeRF is a NeRF benchmark based on Jittor. JNeRF re-implements Instant-NGP and matches the performance reported in the original paper.
k-diffusion
Karras et al. (2022) diffusion models for PyTorch
LargeScaleNeRFPytorch
1. State-of-the-art, simple, fast unbounded/large-scale NeRFs. 2. A weekly updated, categorized list of NeRF literature.
list-of-surgical-tool-datasets
List of surgical tool datasets organised by task.
monosdf
[NeurIPS'22] MonoSDF: Exploring Monocular Geometric Cues for Neural Implicit Surface Reconstruction
nerfstudio
A collaboration-friendly studio for NeRFs
neural_renderer
A PyTorch port of the Neural 3D Mesh Renderer
nvdiffrast
Nvdiffrast - Modular Primitives for High-Performance Differentiable Rendering
pytorch3d
PyTorch3D is FAIR's library of reusable components for deep learning with 3D data
RAD-NeRF
Real-time Neural Radiance Talking Portrait Synthesis via Audio-spatial Decomposition
sdfstudio
A Unified Framework for Surface Reconstruction
taming-transformers
Taming Transformers for High-Resolution Image Synthesis
TensoRF
[ECCV 2022] Tensorial Radiance Fields, a novel approach to model and reconstruct radiance fields
tiny-cuda-nn
Lightning-fast C++/CUDA neural network framework
torch-ngp
A PyTorch CUDA extension implementation of Instant-NGP (SDF and NeRF), with a GUI.
vid2vid
PyTorch implementation of our method for high-resolution (e.g. 2048x1024) photorealistic video-to-video translation.
VPS
Video Polyp Segmentation: A Deep Learning Perspective (MIR 2022)