Aurelio Jethro (ajppp)


Location: Singapore


Aurelio Jethro's repositories

Language: Python · Stargazers: 0 · Issues: 0 · Issues: 0
Language: Jupyter Notebook · Stargazers: 0 · Issues: 0 · Issues: 0

adapt-mnmt

Dynamic Transfer Learning for Low-Resource Neural Machine Translation

Language: Python · Stargazers: 0 · Issues: 0 · Issues: 0

anfis-pytorch

Implementation of ANFIS using the PyTorch framework

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0
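
As a rough illustration of what an ANFIS layer in PyTorch involves (a generic sketch, not this repository's actual API; the class and parameter names below are hypothetical), a minimal Takagi-Sugeno setup combines Gaussian membership functions, rule firing strengths, and per-rule linear consequents:

    import torch
    import torch.nn as nn

    class TinyANFIS(nn.Module):
        """Takagi-Sugeno style fuzzy inference with Gaussian memberships."""
        def __init__(self, n_inputs: int, n_rules: int):
            super().__init__()
            # Premise parameters: one Gaussian membership per (rule, input).
            self.centers = nn.Parameter(torch.randn(n_rules, n_inputs))
            self.widths = nn.Parameter(torch.ones(n_rules, n_inputs))
            # Consequent parameters: a linear model per rule.
            self.consequents = nn.Linear(n_inputs, n_rules)

        def forward(self, x):                      # x: (batch, n_inputs)
            diff = x.unsqueeze(1) - self.centers   # (batch, n_rules, n_inputs)
            member = torch.exp(-(diff / self.widths) ** 2)
            firing = member.prod(dim=-1)           # rule firing strengths
            weights = firing / firing.sum(dim=-1, keepdim=True).clamp_min(1e-9)
            rule_out = self.consequents(x)         # per-rule linear outputs
            return (weights * rule_out).sum(dim=-1, keepdim=True)

    model = TinyANFIS(n_inputs=2, n_rules=4)
    y = model(torch.randn(8, 2))                   # (8, 1) predictions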

cz2002-assignment

Lab Assignment for CZ2002 2020-2021 Sem 1

Language: Java · Stargazers: 0 · Issues: 0 · Issues: 0

CZ2002-Lab

Lab Exercises for CZ2002 2020

Language: Java · Stargazers: 0 · Issues: 0 · Issues: 0

cz2006-grocery-app

Coursework for CZ2006 (Software Engineering), done in AY20/21 Semester 2

Language: Kotlin · Stargazers: 0 · Issues: 1 · Issues: 1

cz3005-lab

Lab Submissions for CZ3005 (Artificial Intelligence), AY20/21 Semester 2

Language: TeX · Stargazers: 0 · Issues: 0 · Issues: 0

dotfiles

All my dotfiles

Language: Vim Script · Stargazers: 0 · Issues: 1 · Issues: 0

huggingface-experiments

Scripts for training translation models via transfer learning, using Hugging Face

Language: Jupyter Notebook · Stargazers: 0 · Issues: 1 · Issues: 0
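
A typical transfer-learning starting point with Hugging Face is to load a pretrained translation checkpoint and continue training it on new parallel data. The sketch below assumes the standard transformers API and an arbitrary MarianMT checkpoint; it is not taken from this repository's scripts:

    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    # Assumed checkpoint; any pretrained MarianMT/mBART-style model works similarly.
    name = "Helsinki-NLP/opus-mt-en-de"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSeq2SeqLM.from_pretrained(name)

    # One toy parallel example standing in for a low-resource training set.
    # (text_target requires a reasonably recent transformers version.)
    batch = tokenizer(["A low-resource sentence."], text_target=["Ein Satz."],
                      return_tensors="pt", padding=True)

    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
    model.train()
    loss = model(**batch).loss     # cross-entropy against the target tokens
    loss.backward()
    optimizer.step()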

cz4045-project2

Source Code for Project 2 of CZ4045

Language: Jupyter Notebook · Stargazers: 0 · Issues: 0 · Issues: 0

dace

DaCe - Data Centric Parallel Programming

License: BSD-3-Clause · Stargazers: 0 · Issues: 0 · Issues: 0
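
DaCe lets you write numpy-style Python and compile it to a data-centric intermediate representation (an SDFG) that can target CPUs, GPUs, and FPGAs. A minimal example, loosely following the upstream project's axpy sample (exact decorator behavior may vary by DaCe version):

    import numpy as np
    import dace

    N = dace.symbol("N")                 # symbolic size, specialized at call time

    @dace.program
    def scale_add(a: dace.float64, x: dace.float64[N], y: dace.float64[N]):
        y[:] = a * x + y                 # numpy-style code compiled to an SDFG

    x = np.random.rand(1024)
    y = np.random.rand(1024)
    scale_add(2.0, x, y)                 # JIT-compiles and runs the program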

do-you-even-need-attention

Exploring whether attention is necessary for vision transformers

Stargazers: 0 · Issues: 0 · Issues: 0
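
The underlying question is whether a vision transformer still works when the self-attention sublayer is replaced by a feed-forward layer that mixes information across patch positions. A generic PyTorch sketch of such an attention-free block (illustrative only, not this repository's code; names are hypothetical):

    import torch
    import torch.nn as nn

    class FeedForwardOnlyBlock(nn.Module):
        """Transformer-style block with attention replaced by an MLP that
        mixes information across patch positions."""
        def __init__(self, dim: int, n_patches: int, hidden: int):
            super().__init__()
            self.norm1 = nn.LayerNorm(dim)
            self.patch_mix = nn.Sequential(        # acts along the patch axis
                nn.Linear(n_patches, hidden), nn.GELU(), nn.Linear(hidden, n_patches))
            self.norm2 = nn.LayerNorm(dim)
            self.channel_mix = nn.Sequential(      # standard per-patch MLP
                nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))

        def forward(self, x):                      # x: (batch, n_patches, dim)
            x = x + self.patch_mix(self.norm1(x).transpose(1, 2)).transpose(1, 2)
            return x + self.channel_mix(self.norm2(x))

    block = FeedForwardOnlyBlock(dim=192, n_patches=196, hidden=384)
    out = block(torch.randn(2, 196, 192))          # same shape in and out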

pyqsp

Python quantum signal processing

License: NOASSERTION · Stargazers: 0 · Issues: 0 · Issues: 0
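
Quantum signal processing interleaves a fixed single-qubit signal rotation with tunable Z-phase rotations so that the top-left entry of the resulting unitary is a chosen polynomial of the signal value. A small numpy sketch in one common convention (the pyqsp package's own API and phase conventions may differ):

    import numpy as np

    def qsp_unitary(phases, x):
        """Build the QSP unitary for signal value x and a list of phase factors.
        Its (0, 0) entry is a degree-(len(phases) - 1) polynomial in x."""
        w = np.array([[x, 1j * np.sqrt(1 - x**2)],
                      [1j * np.sqrt(1 - x**2), x]])          # signal rotation W(x)
        rz = lambda phi: np.diag([np.exp(1j * phi), np.exp(-1j * phi)])
        u = rz(phases[0])
        for phi in phases[1:]:
            u = u @ w @ rz(phi)
        return u

    # With all phases zero, the implemented polynomial is the Chebyshev T_d(x).
    d, x = 3, 0.7
    u = qsp_unitary(np.zeros(d + 1), x)
    print(u[0, 0].real, np.cos(d * np.arccos(x)))            # both approx T_3(0.7)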

pytorch-a3c

PyTorch implementation of Asynchronous Advantage Actor Critic (A3C) from "Asynchronous Methods for Deep Reinforcement Learning".

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0
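
The core of A3C is an actor-critic network trained from rollouts with a policy-gradient term, a value-regression term, and an entropy bonus; the "asynchronous" part runs many such workers against a shared model. The sketch below shows only the network and the per-rollout loss in generic PyTorch, omits the multiprocess machinery, and is not this repository's code:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ActorCritic(nn.Module):
        """Shared trunk with a policy head and a value head, as in A3C."""
        def __init__(self, obs_dim: int, n_actions: int):
            super().__init__()
            self.body = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU())
            self.policy = nn.Linear(128, n_actions)    # action logits
            self.value = nn.Linear(128, 1)             # state-value estimate

        def forward(self, obs):
            h = self.body(obs)
            return self.policy(h), self.value(h).squeeze(-1)

    def a3c_loss(model, obs, actions, returns, value_coef=0.5, entropy_coef=0.01):
        """Policy-gradient + value-regression + entropy-bonus loss for one rollout."""
        logits, values = model(obs)
        dist = torch.distributions.Categorical(logits=logits)
        advantages = returns - values.detach()
        policy_loss = -(dist.log_prob(actions) * advantages).mean()
        value_loss = F.mse_loss(values, returns)
        entropy = dist.entropy().mean()
        return policy_loss + value_coef * value_loss - entropy_coef * entropy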

QSPPACK

A toolbox for solving phase factors in quantum signal processing.

License: NOASSERTION · Stargazers: 0 · Issues: 0 · Issues: 0

qsvt_experiments

Experiments with quantum singular value transformation (QSVT)

License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

the-story-of-heads

Code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the ACL 2021 paper "Analyzing Source and Target Contributions to NMT Predictions".

Stargazers: 0 · Issues: 0 · Issues: 0

xattn-transfer-for-mt

Code and data to accompany the camera-ready version of "Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation" in EMNLP 2021

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0
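
The paper's central recipe is to keep a pretrained translation model largely frozen and fine-tune only the decoder's cross-attention when adapting to a new language pair. A hedged sketch with Hugging Face transformers (the checkpoint is an arbitrary example, and the "encoder_attn" module name applies to Marian/BART-style decoders; other architectures name it differently):

    from transformers import AutoModelForSeq2SeqLM

    # Assumed Marian-style checkpoint; module naming varies by architecture.
    model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-de")

    # Freeze the pretrained body and leave only the decoder's cross-attention
    # parameters trainable (named "encoder_attn" in Marian/BART decoder layers).
    for name, param in model.named_parameters():
        param.requires_grad = "encoder_attn" in name

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable parameters: {trainable:,}")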