HKUST-KnowComp's repositories
MLMA_hate_speech
Dataset and code of our EMNLP 2019 paper "Multilingual and Multi-Aspect Hate Speech Analysis"
Knowledge-Constrained-Decoding
Official Code for EMNLP2023 Main Conference paper: "KCTS: Knowledge-Constrained Tree Search Decoding with Token-Level Hallucination Detection"
CSKB-Population
Code for the EMNLP 2021 paper: Benchmarking Commonsense Knowledge Base Population (https://aclanthology.org/2021.emnlp-main.705.pdf). An updated version, CKBP v2, is available (https://arxiv.org/pdf/2304.10392.pdf)
LLM-Multistep-Jailbreak
Code for the EMNLP 2023 Findings paper: Multi-step Jailbreaking Privacy Attacks on ChatGPT
AbsPyramid
Official code repository for the paper: AbsPyramid: Benchmarking the Abstraction Ability of Language Models with a Unified Entailment Graph
ConstraintChecker
Official code repository for the EACL2024 paper "ConstraintChecker: A Plugin for Large Language Models to Reason on Commonsense Knowledge Bases"
QaDynamics
Code for the EMNLP 2023 Findings paper: QaDynamics: Training Dynamics-Driven Synthetic QA Diagnostic for Zero-Shot Commonsense Question Answering
IntentionQA
Code for the paper: IntentionQA: A Benchmark for Evaluating Purchase Intention Comprehension Abilities of Large Language Models in E-commerce
LiveSum-TTT
Code and datasets for the paper: Text-Tuple-Table: Towards Information Integration in Text-to-Table Generation via Global Tuple Extraction
ChatGPT-Inter-Sentential-Relations
Official code repository for the EACL2024 paper "Exploring the Potential of ChatGPT on Sentence Level Relations: A Focus on Temporal, Causal, and Discourse Relations"
MIND_Distillation
Code for the paper: MIND: Multimodal Shopping Intention Distillation from Large Vision-language Models for E-commerce Purchase Understanding
PrivateGraphEncoder
Source code for the CIKM 2023 paper "Independent Distribution Regularization for Private Graph Embedding"
MARS
Code and dataset for the paper: MARS: Benchmarking the Metaphysical Reasoning Abilities of Language Models with a Multi-task Evaluation Dataset (https://arxiv.org/pdf/2406.02106).