luosi's repositories

AI_Learning_Materials

Essential materials for learning AI

Stargazers: 1

chatllama

ChatLLaMA 📢 Open-source implementation of a LLaMA-based ChatGPT runnable on a single GPU; 15x faster training process than ChatGPT

Language: Python · Stargazers: 1

albert_zh

A LITE BERT FOR SELF-SUPERVISED LEARNING OF LANGUAGE REPRESENTATIONS; large-scale pre-trained Chinese ALBERT models

Language: Python · Stargazers: 0

bert_in_keras

Some examples of fine-tuning BERT with Keras

Language: Python · Stargazers: 0

books

A curated collection of books

Stargazers: 0

cnn-text-classification-tf

Convolutional Neural Network for Text Classification in TensorFlow

Language: Python · License: Apache-2.0 · Stargazers: 0
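
As a rough illustration of the text-CNN idea behind this repository (not its actual code), a minimal tf.keras sketch could look like the following; the vocabulary size, sequence length, and filter settings are placeholder assumptions.

```python
# Minimal text-CNN sketch in tf.keras (illustrative sizes, not the repo's code).
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20_000   # hypothetical vocabulary size
SEQ_LEN = 128         # hypothetical padded sequence length
NUM_CLASSES = 2

inputs = layers.Input(shape=(SEQ_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, 128)(inputs)
# Parallel convolutions with several kernel sizes, each followed by max-over-time pooling.
pooled = []
for kernel_size in (3, 4, 5):
    conv = layers.Conv1D(100, kernel_size, activation="relu")(x)
    pooled.append(layers.GlobalMaxPooling1D()(conv))
x = layers.Concatenate()(pooled)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```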

deep-learning-from-scratch

Repository for the book Deep Learning from Scratch (『ゼロから作る Deep Learning』)

Language: Python · License: MIT · Stargazers: 0

DeepLearning-MuLi-Notes

Notes for Mu Li's course Dive into Deep Learning

Stargazers: 0

Dive-into-DL-TensorFlow2.0

This project ports the original MXNet implementations in the book Dive into Deep Learning (《动手学深度学习》) to TensorFlow 2.0; the project has been endorsed by Mu Li.

License: Apache-2.0 · Stargazers: 0

ELMo-keras

Re-implementation of ELMo on Keras

Language: Python · Stargazers: 0

huanhuan-chat

Chat-Zhenhuan (Chat-甄嬛) is a chat language model that imitates Zhen Huan's tone, obtained by LoRA fine-tuning ChatGLM2 on all of Zhen Huan's lines and sentences from the script of Empresses in the Palace (《甄嬛传》).

Stargazers: 0
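
For orientation, below is a hedged sketch of what LoRA fine-tuning on ChatGLM2 with the peft library can look like; the model id, target-module name, and hyperparameters are assumptions for illustration and are not taken from the huanhuan-chat project.

```python
# Hedged LoRA fine-tuning setup sketch (assumed model id and hyperparameters).
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "THUDM/chatglm2-6b"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # LoRA rank (illustrative)
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],  # assumed attention projection name in ChatGLM2
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights remain trainable

# From here, the character's dialogue would be formatted into prompt/response
# pairs and fed to a standard causal-LM training loop.
```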

InstructLLaMA

Implements pre-training, supervised fine-tuning (SFT), and reinforcement learning from human feedback (RLHF) to train and fine-tune the LLaMA2 model to follow human instructions, similar to InstructGPT or ChatGPT but on a much smaller scale.

License: MIT · Stargazers: 0
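
One detail such instruction-tuning pipelines commonly rely on is computing the SFT loss only over response tokens, masking out the prompt positions; the snippet below is a minimal, assumed illustration of that idea, not code from InstructLLaMA.

```python
# Illustrative SFT loss with prompt-token masking (names and shapes are assumptions).
import torch
import torch.nn.functional as F

def sft_loss(logits, labels, prompt_len):
    """logits: (seq, vocab); labels: (seq,); ignore the first prompt_len targets."""
    labels = labels.clone()
    labels[:prompt_len] = -100                        # -100 is ignored by cross_entropy
    # Standard next-token shift: position t predicts token t+1.
    return F.cross_entropy(logits[:-1], labels[1:], ignore_index=-100)

logits = torch.randn(12, 1000)                         # 12 tokens, vocabulary of 1000
labels = torch.randint(0, 1000, (12,))
print(sft_loss(logits, labels, prompt_len=5))
```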

keras-bert

Implementation of BERT that can load the official pre-trained models for feature extraction and prediction

Language: Python · License: MIT · Stargazers: 0

keras-monotonic-attention

Seq2seq attention in Keras

Language: Python · License: AGPL-3.0 · Stargazers: 0

keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Language: Python · License: MIT · Stargazers: 0
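
To show the underlying mechanism, here is a minimal NumPy sketch of scaled dot-product self-attention, where each timestep's representation becomes a context-weighted mix of all timesteps; it is only an illustration of the general idea, not the package's implementation (queries, keys, and values are all taken directly from the input, with no learned projections).

```python
# Minimal scaled dot-product self-attention sketch (no learned projections).
import numpy as np

def self_attention(x):
    """x: (seq_len, d_model) -> context-aware representations of the same shape."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                      # pairwise similarity between timesteps
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the sequence axis
    return weights @ x                                 # weighted mix of all timesteps

x = np.random.randn(5, 8)       # 5 timesteps, 8-dimensional features
print(self_attention(x).shape)  # (5, 8)
```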

keras-transformer

Keras library for building (Universal) Transformers, facilitating BERT and GPT models

Language: Python · License: MIT · Stargazers: 0

keras-transformer-xl

Transformer-XL with checkpoint loader

Language: Python · License: MIT · Stargazers: 0

LAW-GPT

A Chinese legal dialogue language model

Stargazers: 0

llm-action

This project shares the technical principles behind large language models, along with hands-on experience.

License: Apache-2.0 · Stargazers: 0

MINI_LLM

A repository for experimenting with and reproducing the pre-training process of an LLM.

Language: Python · Stargazers: 0

NCRFpp

NCRF++, an open-source neural sequence labeling toolkit. It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components (code for the COLING/ACL 2018 paper).

Language: Python · License: Apache-2.0 · Stargazers: 0
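
As a rough sketch of one configuration such a toolkit supports (word-level BiLSTM features with a softmax output layer), a minimal PyTorch module might look like the following; sizes are illustrative and this is not NCRF++'s code.

```python
# Minimal word-level BiLSTM tagger sketch (illustrative sizes, softmax decoding).
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=100, hidden=128, num_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, embed_dim)
        x, _ = self.lstm(x)         # (batch, seq, 2*hidden)
        return self.proj(x)         # per-token tag logits for softmax (or CRF) decoding

model = BiLSTMTagger()
logits = model(torch.randint(1, 10_000, (2, 20)))  # 2 sentences of 20 tokens
print(logits.shape)                                # torch.Size([2, 20, 9])
```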

nlp_overview

Overview of Modern Deep Learning Techniques Applied to Natural Language Processing

Language: CSS · License: MIT · Stargazers: 0

statistic

Collecting books, papers, and docs.

Stargazers: 0

step_into_llm

MindSpore online courses: Step into LLM

License: Apache-2.0 · Stargazers: 0

tensorflow2_tutorials_chinese

Chinese tutorials for TensorFlow 2, continuously updated (current version: TensorFlow 2.0); tags: tensorflow 2.0 tutorials

Language: Jupyter Notebook · Stargazers: 0

TiramisuASR

An automatic speech recognition toolkit built with TensorFlow 2, sweet like tiramisu cake. It supports languages with a small number of characters, such as English, Vietnamese, and German.

Language: Python · License: Apache-2.0 · Stargazers: 0

transformer-word-segmenter

Sequence labeling based on the Universal Transformer (Transformer encoder) and CRF; Chinese word segmentation and part-of-speech tagging with Universal Transformer + CRF

Language: Python · License: Apache-2.0 · Stargazers: 0
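
To illustrate how segmentation is cast as sequence labeling, the hypothetical helper below maps a pre-segmented sentence to per-character B/M/E/S tags, which is the kind of target a Transformer-encoder + CRF tagger would be trained to predict; it is not taken from the repository.

```python
# Illustrative helper: segmented words -> per-character BMES tags.
def to_bmes(words):
    """Map a list of segmented words to (character, tag) pairs."""
    tags = []
    for word in words:
        if len(word) == 1:
            tags.append((word, "S"))                  # single-character word
        else:
            tags.append((word[0], "B"))               # beginning of word
            tags += [(ch, "M") for ch in word[1:-1]]  # middle characters
            tags.append((word[-1], "E"))              # end of word
    return tags

print(to_bmes(["自然语言", "处理", "很", "有趣"]))
```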

transformers-code

A hands-on, step-by-step course on Huggingface Transformers; course videos are updated in sync on Bilibili (B站) and YouTube.

Stargazers: 0