innerNULL / monolearn

Notes

Repository on GitHub: https://github.com/innerNULL/monolearn

My Leveling-Up

Simple is good, but controllable is best. So instead of using a static site framework, I just use GitHub + Markdown files to manage my personal learning leveling-up.

Activities

2024 Q4

2024-11-02 19:07:00 - Paper Reading

Read paper Self-Attention with Relative Position Representations. This paper is the foundation for understanding how relative position embeddings work. Not hard to understand, but worth reading.
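
To remember the core idea, here is a minimal toy sketch (my own code, not from the paper) of the relative-position terms added to self-attention: learned embeddings indexed by the clipped distance j - i enter both the attention scores and the value aggregation.

```python
# Toy numpy sketch of relative-position self-attention (Shaw et al. style):
# learned embeddings a_k / a_v, indexed by the clipped distance j - i, are
# added to the keys (for scoring) and to the values (for aggregation).
import numpy as np

def relative_self_attention(x, w_q, w_k, w_v, a_k, a_v, max_rel=4):
    """x: (n, d_model); w_*: (d_model, d_head); a_k, a_v: (2*max_rel+1, d_head)."""
    n = x.shape[0]
    d = w_q.shape[1]
    q, k, v = x @ w_q, x @ w_k, x @ w_v                      # (n, d_head)
    # Relative distance j - i, clipped to [-max_rel, max_rel], shifted to [0, 2*max_rel].
    rel = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None], -max_rel, max_rel) + max_rel
    rk, rv = a_k[rel], a_v[rel]                               # (n, n, d_head)
    # Scores: content-content term plus content-position term.
    scores = (q @ k.T + np.einsum("id,ijd->ij", q, rk)) / np.sqrt(d)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    # Output: attention over values plus attention over relative-position values.
    return attn @ v + np.einsum("ij,ijd->id", attn, rv)
```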

2024-10-12 12:37:00 - Paper Reading

Read paper Large Language Models Are Human-Level Prompt Engineers. The idea of automatic prompt engineering is interesting, but the use case is somewhat limited, as it:

  • Needs labeled training data.
  • Cannot optimize prompts at sentence- or paragraph-level granularity, since it searches over whole instructions (see the sketch after this list).
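
As a reminder of how the method works, here is a hedged sketch of the propose-and-score loop described in the paper: an LLM proposes candidate instructions from labeled demonstrations, each candidate is scored on those demonstrations, and the best one is kept. The `propose` and `score` callables are hypothetical stand-ins for actual LLM calls, not APIs from the paper's code.

```python
# Hedged sketch of an APE-style propose-and-score loop. `propose` and `score`
# are hypothetical stand-ins for LLM calls, not APIs from the paper's code.
from typing import Callable, List, Tuple

Demo = Tuple[str, str]  # labeled (input, output) pair

def ape_search(
    demos: List[Demo],
    propose: Callable[[List[Demo]], List[str]],   # LLM: demos -> candidate instructions
    score: Callable[[str, List[Demo]], float],    # LLM: quality of an instruction on demos
    n_rounds: int = 3,
    keep_top: int = 5,
) -> str:
    """Return the highest-scoring whole instruction found."""
    candidates = propose(demos)
    for _ in range(n_rounds):
        # Rank whole instructions by their score on the labeled demos;
        # there is no finer-grained (sentence/paragraph) credit assignment.
        candidates = sorted(candidates, key=lambda c: score(c, demos), reverse=True)[:keep_top]
        # (The paper also resamples new variants around survivors; omitted here.)
    return candidates[0]
```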

2024 Q3

2024-09-19 23:39:00 - Paper Reading

Read paper Parameter-Efficient Transfer Learning for NLP. The approach is very simple, even a bit boring, but it seems effective and helps in understanding what a basic adapter is.
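
For my own reference, a minimal PyTorch sketch of a bottleneck adapter in the spirit of this paper (my own toy code, not the authors' implementation): down-project, nonlinearity, up-project, and a residual connection, with only the adapter weights trained while the base model stays frozen.

```python
# Minimal PyTorch sketch of a bottleneck adapter (Houlsby et al. style):
# down-project, nonlinearity, up-project, residual; only these weights train.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        # Near-identity init so the adapter barely perturbs the frozen model at the start.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(torch.relu(self.down(h)))   # residual around the bottleneck

# Usage idea: insert after each Transformer sub-layer and freeze the base model,
# e.g. `for p in base_model.parameters(): p.requires_grad = False`.
```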

2024-07-13 22:50:12 - Paper Reading

Read paper Unsupervised Extractive Summarization using Pointwise Mutual Information. The style of this paper is similar to Simple Unsupervised Keyphrase Extraction using Sentence Embeddings: low resource requirements, easy to understand, and intuitive.
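
The core idea, as I understand it: a sentence is relevant if it has high pointwise mutual information with the rest of the document, PMI(s; rest) = log p(s | rest) - log p(s), with probabilities estimated by a language model. Below is a hedged sketch of that scoring; `lm_log_prob` is a hypothetical helper, not an API from the paper's code.

```python
# Hedged sketch of PMI-based sentence relevance for extractive summarization.
# `lm_log_prob(text, context)` is a hypothetical language-model helper that
# returns log p(text | context); pass an empty context for the unconditional term.
from typing import Callable, List

def rank_sentences(
    sentences: List[str],
    lm_log_prob: Callable[[str, str], float],
) -> List[int]:
    """Return sentence indices sorted by PMI relevance, highest first."""
    scores = []
    for i, s in enumerate(sentences):
        rest = " ".join(sentences[:i] + sentences[i + 1:])
        pmi = lm_log_prob(s, rest) - lm_log_prob(s, "")   # conditional minus unconditional
        scores.append(pmi)
    return sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
```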

2024-07-11 23:15:25 - Paper Reading

Read paper Simple Unsupervised Keyphrase Extraction using Sentence Embeddings. This is an intuitive and elegant approach that does not depend on many resources (labeled data, GPUs, etc.).
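
A hedged sketch of the idea (EmbedRank style, my own toy code): embed the document and candidate phrases into the same space and rank candidates by cosine similarity to the document embedding. The `embed` callable is a hypothetical stand-in for a sentence-embedding model, and candidate extraction (e.g. noun phrases) is assumed to happen elsewhere.

```python
# Hedged sketch of EmbedRank-style keyphrase ranking: rank candidate phrases
# by cosine similarity to the document embedding. `embed` is a hypothetical
# sentence-embedding function, not code from the paper.
from typing import Callable, List
import numpy as np

def rank_keyphrases(
    doc: str,
    candidates: List[str],                        # e.g. noun phrases from a POS-based extractor
    embed: Callable[[str], np.ndarray],
    top_k: int = 10,
) -> List[str]:
    doc_vec = embed(doc)

    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    ranked = sorted(candidates, key=lambda c: cos(embed(c), doc_vec), reverse=True)
    return ranked[:top_k]
```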
