Maoxin (Molson) Han (molsonhan)

Company: City University of Hong Kong, Shenzhen Research Institute

Location: Nanshan District, Shenzhen, Guangdong

Maoxin (Molson) Han's repositories

DeepLearning

Introductory deep learning tutorials and selected articles (Deep Learning Tutorial).

Language: Jupyter Notebook | License: Apache-2.0 | Stargazers: 1 | Issues: 0

personality-detection

Implementation of a hierarchical CNN-based model to detect the Big Five personality traits (a minimal sketch of the idea follows this entry).

Language: Python | License: MIT | Stargazers: 1 | Issues: 0
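
The hierarchical CNN idea here is to convolve over words to form sentence vectors, then over sentences to form a document vector that feeds five trait outputs. Below is a minimal PyTorch sketch of that structure only; it is not this repository's code, and every layer size and hyperparameter is an illustrative assumption.

```python
import torch
import torch.nn as nn


class HierarchicalCNN(nn.Module):
    """Toy hierarchy: words -> sentence vectors -> document vector -> 5 trait logits."""

    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=64, n_traits=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.sent_conv = nn.Conv1d(n_filters, n_filters, kernel_size=3, padding=1)
        self.classifier = nn.Linear(n_filters, n_traits)  # one logit per Big Five trait

    def forward(self, docs):
        # docs: (batch, n_sentences, n_words) of token ids
        b, s, w = docs.shape
        x = self.embed(docs.view(b * s, w)).transpose(1, 2)      # (b*s, emb_dim, w)
        sent = torch.relu(self.word_conv(x)).max(dim=2).values   # max-pool over words
        y = sent.view(b, s, -1).transpose(1, 2)                  # (b, n_filters, s)
        doc = torch.relu(self.sent_conv(y)).max(dim=2).values    # max-pool over sentences
        return self.classifier(doc)                              # (b, 5)


# Usage: 2 documents, 4 sentences of 12 tokens each (random ids).
print(HierarchicalCNN()(torch.randint(1, 10000, (2, 4, 12))).shape)  # torch.Size([2, 5])
```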

sklearn-doc-zh

:book: [Translation] Chinese documentation for scikit-learn (sklearn).

Language: CSS | License: NOASSERTION | Stargazers: 1 | Issues: 0

text_classification

Text classification with RNN, LSTM, GRU, FastText, TextCNN, DPCNN, RNN-Attention, and LSTM-Attention; compatible with huggingface/transformers, and also supports using a transformer as the word-embedding model followed by a CNN, RNN, or attention head, with comparisons across the models (a minimal sketch of this pattern follows this entry).

Language: Python | Stargazers: 1 | Issues: 0
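
The "transformer as word embeddings + CNN head" pattern mentioned above can be sketched as follows. This is not this repository's code; the bert-base-chinese checkpoint, the kernel sizes, and the two-class output are assumptions for illustration.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class TransformerTextCNN(nn.Module):
    """Transformer encoder used as contextual word embeddings, followed by a TextCNN head."""

    def __init__(self, encoder_name="bert-base-chinese", n_classes=2, n_filters=100):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)  # assumed checkpoint, swap as needed
        hidden = self.encoder.config.hidden_size
        # Convolutions over the token axis with several window sizes, as in TextCNN.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, n_filters, kernel_size=k) for k in (2, 3, 4)]
        )
        self.fc = nn.Linear(3 * n_filters, n_classes)

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h = h.transpose(1, 2)                                     # (batch, hidden, seq_len)
        pooled = [torch.relu(c(h)).max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(pooled, dim=1))                  # (batch, n_classes)


tok = AutoTokenizer.from_pretrained("bert-base-chinese")
batch = tok(["这部电影很好看", "剧情太差了"], padding=True, return_tensors="pt")
logits = TransformerTextCNN()(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([2, 2])
```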

Awesome-Multimodal-Large-Language-Models

:sparkles::sparkles: Latest Papers and Datasets on Multimodal Large Language Models, and Their Evaluation.

Stargazers: 0 | Issues: 0

d2l-zh

Dive into Deep Learning (《动手学深度学习》): aimed at Chinese readers, with runnable code and a discussion forum. The Chinese and English editions are used for teaching at 400 universities in 60 countries.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0

dive_into_deep_learning

✔️ Study notes for Li Mu's Dive into Deep Learning (动手学深度学习) course: written in PyCharm and implemented with the PyTorch framework.

Language: Python | Stargazers: 0 | Issues: 0

External-Attention-pytorch

🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for further understanding the corresponding papers. ⭐⭐⭐ (A sketch of one such module follows this entry.)

Language: Python | License: MIT | Stargazers: 0 | Issues: 0
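
As an example of the kind of module such a collection re-implements, here is a minimal squeeze-and-excitation channel-attention block in PyTorch. It is a sketch for illustration, not this repository's code; channel count and reduction ratio are assumptions.

```python
import torch
import torch.nn as nn


class SEAttention(nn.Module):
    """Squeeze-and-excitation channel attention: gate each channel by a learned weight."""

    def __init__(self, channels=64, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average per channel
        self.fc = nn.Sequential(                     # excitation: bottleneck MLP -> sigmoid gates
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # reweight the feature map channel-wise


print(SEAttention()(torch.randn(2, 64, 7, 7)).shape)  # torch.Size([2, 64, 7, 7])
```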

latex_resume

An elegant \LaTeX\ résumé template. Mainland China mirror: https://gods.coding.net/p/resume/git

Language: TeX | License: MIT | Stargazers: 0 | Issues: 0

Personality

A design framework for imputing personality from review text.

License: MIT | Stargazers: 0 | Issues: 0

text-preprocessing-techniques

16 Text Preprocessing Techniques in Python for Twitter Sentiment Analysis.

Stargazers: 0 | Issues: 0
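
A few of the usual tweet-cleaning steps (lowercasing, URL and mention removal, hashtag stripping, squeezing elongated words) look roughly like this. The sketch is illustrative only and does not reproduce the repository's 16 techniques.

```python
import re


def preprocess_tweet(text):
    """Common tweet-cleaning steps (illustrative, not the repository's full set)."""
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)      # remove URLs
    text = re.sub(r"@\w+", " ", text)              # remove user mentions
    text = re.sub(r"#(\w+)", r"\1", text)          # keep hashtag word, drop '#'
    text = re.sub(r"(.)\1{2,}", r"\1\1", text)     # squeeze elongated words: "soooo" -> "soo"
    return re.sub(r"\s+", " ", text).strip()


print(preprocess_tweet("LOVED it soooo much!!! @user check https://t.co/xyz #BestMovie"))
# -> "loved it soo much!! check bestmovie"
```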

transformer

An annotated implementation of the Transformer paper.

License: MIT | Stargazers: 0 | Issues: 0
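
The core operation an annotated Transformer implementation walks through is scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. A minimal PyTorch sketch of that formula, not this repository's code:

```python
import math
import torch


def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, the core operation of the Transformer."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)    # (batch, seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)              # attention distribution per query
    return weights @ v, weights


q = k = v = torch.randn(2, 5, 16)                        # batch of 2, 5 tokens, d_k = 16
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)                             # torch.Size([2, 5, 16]) torch.Size([2, 5, 5])
```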

unidiffuser

Code and models for the paper "One Transformer Fits All Distributions in Multi-Modal Diffusion"

Stargazers: 0 | Issues: 0