There are 34 repositories under the knowledge-transfer topic.
Awesome Knowledge Distillation
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
PyContinual (An Easy and Extendible Framework for Continual Learning)
An Extensible Continual Learning Framework Focused on Language Models (LMs)
This repository is mainly dedicated to listing recent research advances in applying self-supervised learning to medical image computing.
Code and dataset for ACL2018 paper "Exploiting Document Knowledge for Aspect-level Sentiment Classification"
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Code and pretrained models for paper: Data-Free Adversarial Distillation
[ECCV2022] Factorizing Knowledge in Neural Networks
[Paper][AAAI 2023] DUET: Cross-modal Semantic Grounding for Contrastive Zero-shot Learning
PyTorch implementation of Hinton's knowledge distillation, plus a base class that makes it simple to implement other distillation methods.
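For context, the Hinton-style objective behind such implementations combines a temperature-scaled KL term on soft targets with standard cross-entropy. The sketch below is a minimal, illustrative version; the function name and the default temperature/weighting values are assumptions, not taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def hinton_kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 as suggested in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```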
Code for ECML/PKDD 2020 Paper --- Continual Learning with Knowledge Transfer for Sentiment Classification
Code for NeurIPS 2020 Paper --- Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks
A Comprehensive Survey on Knowledge Distillation
[NeurIPS'23] Source code of "Data-Centric Learning from Unlabeled Graphs with Diffusion Model": A data-centric transfer learning framework with diffusion model on graphs.
KT-BT: A Framework for Knowledge Transfer Through Behavior Trees in Multirobot Systems
Adaptive Model-based Transfer Evolutionary Algorithm
[arXiv 2024] PyTorch implementation of RRD: https://arxiv.org/abs/2407.12073
Implementation of NAACL 2024 main conference paper: Named Entity Recognition Under Domain Shift via Metric Learning for Life Science
[TPAMI'2024] Multi-sensor Learning Enables Information Transfer across Different Sensory Data and Augments Multi-modality Imaging
:drum: Teach a newbie how to perform better.
Functional Knowledge Transfer with Self-supervised Representation Learning (ICIP 2023)
A novel knowledge-guided machine learning (KGML) framework where Geographic Information Systems (GIS) and Remote Sensing (RS) provide structured knowledge guidance to enhance deep learning models for power plant detection.
Learning in Growing Robots: Knowledge Transfer from Tadpole to Frog Robot
The Weizenbaum Institute's "Knowledge Tool" supports interdisciplinary debates by providing a workshop methodology based on predefined instructions and prefabricated workshop materials, with the aim of structuring and recording multi-perspective exploration and analysis.
This project implements knowledge distillation from DINOv2 (Vision Transformer) to convolutional networks, enabling efficient visual representation learning with reduced computational requirements.
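As an illustration of this kind of ViT-to-CNN feature distillation, here is a minimal sketch assuming a frozen transformer teacher and a ResNet-18 student aligned through a learned projection; the module names, feature dimensions, and MSE loss choice are assumptions, not the project's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

# Illustrative dimensions: e.g. a 384-d ViT-S teacher embedding vs. ResNet-18's 512-d features.
teacher_dim, student_dim = 384, 512

student = models.resnet18(weights=None)
student.fc = nn.Identity()                    # expose the 512-d pooled features
proj = nn.Linear(student_dim, teacher_dim)    # map student features into the teacher's space

def feature_distill_loss(images, teacher_feats):
    """MSE between projected student features and precomputed (frozen) teacher embeddings."""
    student_feats = proj(student(images))
    return F.mse_loss(student_feats, teacher_feats)

# Usage with random tensors standing in for a batch and cached teacher outputs:
imgs = torch.randn(8, 3, 224, 224)
t_feats = torch.randn(8, teacher_dim)
loss = feature_distill_loss(imgs, t_feats)
```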
Implementations of multiple methods for transferring knowledge between neural networks, with utilities to save, plot, and compare the results.
"Knowledge transfer in StackOverflow", a CSS seminar project by TUM students
Bonan & Samo. January 2023. Paper on cross-linguistic bias in health-related content in Transformer-based language models.
Promote documentation best practices
🧠 A DNN that uses knowledge transfer to perform localization and an RNN that learns from and generates text, built for a university Deep Learning course
Knowledge transfer sessions I've presented.
The idea is to use pretrained deep learning models such as MobileNet to predict objects in an image.
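A minimal sketch of that idea using torchvision's pretrained MobileNetV2 is shown below; the image path and top-1 readout are illustrative, not part of the listed project.

```python
import torch
from torchvision import models
from torchvision.models import MobileNet_V2_Weights
from PIL import Image

# Load MobileNetV2 pretrained on ImageNet together with its matching preprocessing.
weights = MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights).eval()
preprocess = weights.transforms()

img = Image.open("photo.jpg")          # illustrative path; replace with a real image
batch = preprocess(img).unsqueeze(0)   # add the batch dimension

with torch.no_grad():
    probs = model(batch).softmax(dim=1)

top_prob, top_idx = probs.topk(1)
print(weights.meta["categories"][top_idx.item()], float(top_prob))
```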