KuoCh'ing's repositories
husky
This project is released by KuoCh'ing Chang. We are shipping this project to solve medical text processing.
ZhongYi-NER2
Trains a downstream NER tagging model for Traditional Chinese Medicine texts on collected data, using BERT upstream as a feature extractor to reduce the downstream model's data requirements.
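The architecture this description names can be sketched as follows: a frozen upstream encoder (a toy `nn.TransformerEncoder` stands in for pretrained BERT here) supplies token features to a small downstream tagging head, so only the head needs task-specific data. The tag set and all sizes are hypothetical, not from the repository.

```python
# Sketch, assuming PyTorch: frozen encoder as feature extractor + NER head.
import torch
from torch import nn

NUM_TAGS = 5   # hypothetical BIO tag set, e.g. O, B-HERB, I-HERB, B-SYMPTOM, I-SYMPTOM
HIDDEN = 32

class NerTagger(nn.Module):
    def __init__(self, encoder: nn.Module, hidden_size: int, num_tags: int):
        super().__init__()
        self.encoder = encoder
        # Freezing the encoder means only the small head is trained,
        # which is what cuts the downstream data requirement.
        for p in self.encoder.parameters():
            p.requires_grad = False
        self.head = nn.Linear(hidden_size, num_tags)

    def forward(self, token_ids):
        with torch.no_grad():                 # encoder acts as a fixed feature extractor
            feats = self.encoder(token_ids)   # (batch, seq_len, hidden)
        return self.head(feats)               # per-token tag logits

# Toy stand-in for a pretrained BERT checkpoint (e.g. bert-base-chinese).
layer = nn.TransformerEncoderLayer(d_model=HIDDEN, nhead=4, batch_first=True)
encoder = nn.Sequential(nn.Embedding(100, HIDDEN),
                        nn.TransformerEncoder(layer, num_layers=2))

model = NerTagger(encoder, HIDDEN, NUM_TAGS)
ids = torch.randint(0, 100, (1, 8))           # one sentence of 8 token ids
logits = model(ids)
print(tuple(logits.shape))                    # (1, 8, 5): a tag score per token
```

In practice the stand-in encoder would be replaced by a pretrained BERT loaded from a checkpoint, with the same freeze-and-train-the-head pattern.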
dnimo
Hello world! This is ZHANG Guoqing, a Master's student at Kyoto University.
LongText_Husky
A Japanese instruction-finetuned LLaMA, forked from japanese-alpaca-lora and maintained and improved by dnimo.
llm-book
GitHub repository for "Introduction to Large Language Models" (Gijutsu-Hyoronsha, 2023)
transformers-bloom-inference
Fast Inference Solutions for BLOOM
FastChat
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and FastChat-T5.
MiniGPT-4
MiniGPT-4: Enhancing Vision-language Understanding with Advanced Large Language Models
Bert-information-Extraction
Entry for the 2021 Service Outsourcing Competition: extracting multi-way relations from financial text.
bert4keras
A Keras implementation of Transformers, for humans
JGLUE
JGLUE: Japanese General Language Understanding Evaluation
PDF_Reader
Updated the write method because pdfminer has been upgraded
minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
rocket-recycling
Rocket-recycling with Reinforcement Learning
OpenHowNet
Core Data of HowNet and OpenHowNet Python API
rustdesk
Yet another remote desktop software
somedecorators
Some very useful decorators for Python
tensorflow_macos
TensorFlow for macOS 11.0+ accelerated using Apple's ML Compute framework.
tensorforce
Tensorforce: a TensorFlow library for applied reinforcement learning
bencode
A `.torrent` file parser
DGCNN-information-Extraction
Adapted from Su Jianlin's project, applied to financial event relation extraction.
OpenNRE
An Open-Source Package for Neural Relation Extraction (NRE)
LeetCode-Go
✅ Solutions to LeetCode in Go, 100% test coverage, runtime beats 100%
google-research
Google Research
ATLOP
Source code for paper "Document-Level Relation Extraction with Adaptive Thresholding and Localized Context Pooling", AAAI 2021
guwenbert
GuwenBERT: a pre-trained language model for Classical Chinese (Literary Chinese)