VE FORBRYDERNE (VE-FORBRYDERNE)

Descartes said "I think, therefore I am" but that does not mean if you cease thinking you cease to exist. That would be denying the antecedent.

VE FORBRYDERNE's repositories

mtj-softtuner

Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance

Language: Python · License: Apache-2.0 · Stargazers: 27 · Issues: 2 · Issues: 4
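A "soft prompt" of the kind mtj-softtuner creates is a small set of trainable embedding vectors prepended to the model's input embeddings, rather than a string of real tokens. A minimal pure-Python sketch of that idea (toy sizes, illustrative names — not the tool's actual code):

```python
# Conceptual sketch of a soft prompt: instead of prepending text tokens,
# prepend trainable embedding vectors directly to the input embeddings.
import random

EMBED_DIM = 4      # toy embedding size (real models use thousands)
N_SOFT_TOKENS = 3  # number of trainable prompt vectors

# Frozen token embedding table (stand-in for the pretrained model's).
vocab_embeddings = {
    "hello": [0.1, 0.2, 0.3, 0.4],
    "world": [0.5, 0.6, 0.7, 0.8],
}

# The soft prompt: the only parameters that would be trained.
random.seed(0)
soft_prompt = [[random.uniform(-0.5, 0.5) for _ in range(EMBED_DIM)]
               for _ in range(N_SOFT_TOKENS)]

def embed_with_soft_prompt(tokens):
    """Prepend the soft-prompt vectors to the embedded input tokens."""
    return soft_prompt + [vocab_embeddings[t] for t in tokens]

seq = embed_with_soft_prompt(["hello", "world"])
print(len(seq))  # N_SOFT_TOKENS + 2 token embeddings = 5
```

During tuning, gradients flow only into `soft_prompt`; the pretrained model's weights stay frozen, which is why this fits in a free Colab TPU instance.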

mesh-transformer-jax

Fork of kingoflolz/mesh-transformer-jax with memory usage optimizations and support for GPT-Neo, GPT-NeoX, BLOOM, OPT and fairseq dense LM. Primarily used by KoboldAI and mtj-softtuner.

Language: Python · License: Apache-2.0 · Stargazers: 22 · Issues: 0 · Issues: 0

gpt-j-6b-filter-test

An experimental version of the GPT-J-6B Colab notebook that supports repetition_penalty from Hugging Face's transformers.

Language: Jupyter Notebook · Stargazers: 2 · Issues: 0 · Issues: 0
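For context, the repetition penalty rule (from the CTRL paper) that Hugging Face's transformers implements rescales the logits of tokens already present in the generated sequence. A plain-Python sketch of the rule — illustrative only, the notebook applies the library's version:

```python
# Sketch of the repetition_penalty rule: positive logits are divided by
# the penalty and negative logits are multiplied by it, so repeated
# tokens always become less likely.

def apply_repetition_penalty(logits, generated_ids, penalty):
    """Discourage tokens that already appear in the generated sequence."""
    out = list(logits)
    for tok in set(generated_ids):
        if out[tok] > 0:
            out[tok] /= penalty
        else:
            out[tok] *= penalty
    return out

logits = [2.0, -1.0, 0.5]
print(apply_repetition_penalty(logits, [0, 1], 2.0))  # [1.0, -2.0, 0.5]
```

A penalty of 1.0 is a no-op; values above 1.0 suppress repetition more strongly.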

Clover-Edition

State-of-the-art AI plays dungeon master to your adventures.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

flask-cloudflared

Run a TryCloudflare tunnel to your Flask app right from code.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0 · Issues: 0

luamin

A Lua minifier written in JavaScript

Language: JavaScript · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

optax

Optax is a gradient processing and optimization library for JAX.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0
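Optax's core abstraction is the composable gradient transformation: small update rules chained together into an optimizer. A minimal pure-Python sketch of that chaining idea (optax itself is built on JAX; the names below are illustrative, not optax's API):

```python
# Conceptual sketch of composable gradient transformations, the pattern
# behind optax.chain. Each transformation maps gradients to gradients.

def clip(max_abs):
    """Transformation: clamp each gradient entry to [-max_abs, max_abs]."""
    return lambda grads: [max(-max_abs, min(max_abs, g)) for g in grads]

def scale(factor):
    """Transformation: multiply every gradient entry by a factor."""
    return lambda grads: [factor * g for g in grads]

def chain(*transforms):
    """Compose transformations left to right."""
    def apply(grads):
        for t in transforms:
            grads = t(grads)
        return grads
    return apply

# SGD with gradient clipping: clip first, then scale by -learning_rate.
sgd = chain(clip(1.0), scale(-0.1))
updates = sgd([5.0, -0.3])  # approximately [-0.1, 0.03]
```

In real optax, transformations also carry optimizer state (momentum, step counts, etc.); this sketch keeps only the stateless composition pattern.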

transformers

🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

mkultra

Prompt tuning toolkit for GPT-2 and GPT-Neo

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

pv

Pipe Viewer Mirror - 1.6

Language: C · Stargazers: 0 · Issues: 0 · Issues: 0