Justin W. Lin's repositories
evals
Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.
Language: Python · License: NOASSERTION
GroundingDINO
Official implementation of the paper "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"
Language: Python · License: Apache-2.0
justinlinw
Profile 'About' repository on GitHub.