syiswell's repositories
acl-style-files
Official style files for papers submitted to venues of the Association for Computational Linguistics
awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
Awesome-Pruning
A curated list of neural network pruning resources.
BERT4Rec-VAE-Pytorch
PyTorch implementation of BERT4Rec and the Netflix VAE.
chat_templates
Chat Templates for 🤗 HuggingFace Large Language Models
ChatGPTPapers
Must-read papers, related blogs, and API tools on pre-training and tuning methods for ChatGPT.
ChatPaper
Use ChatGPT to summarize arXiv papers. Accelerates the full research workflow with ChatGPT: full-paper summarization, professional translation, polishing, peer review, and review responses.
CpRec-1
A Generic Network Compression Framework for Sequential Recommender Systems
cross-modal-ablation
Code and data for our paper "Vision-and-Language or Vision-for-Language? On Cross-Modal Influence in Multimodal Transformers", EMNLP 2021.
DeepCCA
An implementation of Deep Canonical Correlation Analysis (DCCA, or Deep CCA) in PyTorch.
DENIM
Code Implementation of ACL24 Paper: Discourse Structure-Aware Prefix for Generation-Based End-to-End Argumentation Mining
Event-Extraction
A survey of recent event extraction methods, covering Chinese event extraction, open-domain event extraction, event data generation, cross-lingual event extraction, few-shot event extraction, zero-shot event extraction, and more, including approaches such as DMCNN, FramNet, DLRNN, DBRNN, GCN, DAG-GRU, JMEE, and PLMEE.
grid-feats-vqa
Grid features pre-training code for visual question answering
IDGL
Code & data accompanying the NeurIPS 2020 paper "Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings".
LMaaS-Papers
Awesome papers on Language-Model-as-a-Service (LMaaS)
MI_bounds_pytorch
PyTorch implementation of variational lower bounds on mutual information.
prompt-lib
A set of utilities for running few-shot prompting experiments on large language models.
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
pumpkin-book
Derivations and explanations of the formulas in Machine Learning (the "Watermelon Book"); read online at https://datawhalechina.github.io/pumpkin-book
python-recommender-system
A simple recommender system in python implementing: ItemKNN, UserKNN, ItemAverage, UserAverage, UserItemAverage, etc.
RepDistiller
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
WSDM2019-nextitnet
A Simple Convolutional Generative Network for Next Item Recommendation