Kamel Alrashedy's repositories
bert
TensorFlow code and pre-trained models for BERT
CodeBERT
A pre-trained model for programming and natural languages
CodeGen
CodeGen is an open-source model for program synthesis, trained on TPU-v4 and competitive with OpenAI Codex.
CodeT5
Code for CodeT5, a code-aware pre-trained encoder-decoder model
CodeRL
Official code for the paper "CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning" (NeurIPS 2022)
CodeXGLUE
A machine learning benchmark dataset for code understanding and generation
CoTexT
Code implementation for CoTexT: Multi-task Learning with Code-Text Transformer
DOME
Developer-Intent Driven Code Comment Generation
ecco
Visualize, analyze, and explore NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT-2, BERT, RoBERTa, T5, and T0).
FlexGen
Run large language models like OPT-175B/GPT-3 on a single GPU, with a focus on high-throughput large-batch generation.
Function-level-Vulnerability-Detection
A deep learning-based vulnerability detection framework
google-research
Google Research
kamel773.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
neurips21-self-supervised-bug-detection-and-repair
Replication code for "Self-Supervised Bug Detection and Repair" (NeurIPS 2021)
PLBART
Official code for "Unified Pre-training for Program Understanding and Generation" (NAACL 2021)
ticket-tagger
A machine-learning-driven issue classification bot
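ticket-tagger classifies issues with a trained text classifier; as a rough illustration of the task (not the repo's actual method), a toy keyword-scoring classifier over three common issue labels can be sketched in plain Python. The label names and keyword lists here are hypothetical examples.

```python
from collections import Counter

# Hypothetical label -> keyword lists; a real classifier learns these from data.
LABELS = {
    "bug": ["error", "crash", "exception", "fails", "broken"],
    "enhancement": ["add", "support", "feature", "improve", "request"],
    "question": ["how", "why", "what", "possible", "help"],
}

def tag_issue(text: str) -> str:
    """Score each label by keyword hits in the issue text; return the best label."""
    tokens = Counter(text.lower().split())
    scores = {label: sum(tokens[w] for w in kws) for label, kws in LABELS.items()}
    return max(scores, key=scores.get)
```

For example, `tag_issue("App crash with a null pointer exception")` scores highest for "bug". The real bot replaces the hand-written keyword lists with a model trained on labeled GitHub issues.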
TransCoder
Public release of the TransCoder research project https://arxiv.org/pdf/2006.03511.pdf
TSSB3M
Mining tool and large-scale datasets of single-statement bug fixes in Python
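The core question in single-statement fix mining is whether two versions of a file differ in exactly one statement. A minimal sketch of that check, using only Python's `ast` module, is shown below; `is_single_statement_fix` and its leaf-statement heuristic are illustrative assumptions, not the TSSB3M tool's actual implementation.

```python
import ast

def leaf_stmts(src: str) -> list:
    """Dump every statement that contains no nested statements (e.g. a plain
    `return` or assignment), in traversal order."""
    leaves = []
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, ast.stmt) and not any(
            isinstance(c, ast.stmt) for c in ast.walk(node) if c is not node
        ):
            leaves.append(ast.dump(node))
    return leaves

def is_single_statement_fix(before: str, after: str) -> bool:
    """True if the two versions have the same number of leaf statements and
    exactly one of them changed (no statements added or removed)."""
    a, b = leaf_stmts(before), leaf_stmts(after)
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1
```

Comparing AST dumps rather than raw text makes the check robust to whitespace and comment changes; a production miner would also handle added/removed statements and moves.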
VRepair
Vulnerability repair scripts