Nealcly's repositories
BiLSTM-LAN
Hierarchically-Refined Label Attention Network for Sequence Labeling
templateNER
Source code for template-based NER
sarcasm-detection-for-sentiment-analysis
Sarcasm Detection for Sentiment Analysis
KE-Blender
Code for KE-Blender, EMNLP 2021
pytorch-pretrained-BERT
📖 The Big-&-Extending-Repository-of-Transformers: Pretrained PyTorch models for Google's BERT, OpenAI GPT & GPT-2, and Google/CMU Transformer-XL.
cognitive-services-speech-sdk
Sample code for the Microsoft Cognitive Services Speech SDK
MultiTurnResponseSelection
Data and source code for our ACL paper
NCRFpp
NCRF++, an Open-source Neural Sequence Labeling Toolkit. It includes character LSTM/CNN, word LSTM/CNN and softmax/CRF components. (code for COLING/ACL 2018 paper)
Nealcly
Config files for my GitHub profile.
nfc-parser
Code for the paper "Investigating Non-local Features for Neural Constituency Parsing"
NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
RichWordSegmentor
Neural word segmentation with rich pretraining; code for the ACL 2017 paper
self-attentive-parser
High-accuracy NLP parser with models for 11 languages.
simpletransformers
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
word_forms
Accurately generate all possible forms of an English word, e.g. "election" --> "elect", "electoral", "electorate", etc.