Stewart Slocum's starred repositories
model-alignment
Model Alignment is a Python library from the PAIR team that enables users to create model prompts through user feedback instead of manual prompt writing and editing. The technique uses constitutional principles to align prompts with users' desired values.
overcooked_ai
A benchmark environment for fully cooperative human-AI performance.
bombsquad-remote-android
BombSquad Remote App for Android
ChatGPT-System-Prompts
This repository contains a collection of system prompts for ChatGPT, a conversational AI model developed by OpenAI.
safety-gymnasium
NeurIPS 2023: Safety-Gymnasium: A Unified Safe Reinforcement Learning Benchmark
python-astar
A simple implementation of the A* pathfinding algorithm in Python 🌟
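For context, a minimal A* search can be sketched in a few lines of standard-library Python. This is a generic illustration of the algorithm, not code from the repository above; the `neighbors` and `heuristic` callables and the 5×5 grid demo are assumptions for the example.

```python
import heapq

def astar(start, goal, neighbors, heuristic):
    """A* search: returns a node list from start to goal, or None if unreachable."""
    open_heap = [(heuristic(start, goal), 0, start)]  # (f = g + h, g, node)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]
            while node in came_from:  # walk parents back to start
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if cost > g.get(node, float("inf")):
            continue  # stale heap entry; a cheaper path was already found
        for nxt, step in neighbors(node):
            new_g = cost + step
            if new_g < g.get(nxt, float("inf")):
                g[nxt] = new_g
                came_from[nxt] = node
                heapq.heappush(open_heap, (new_g + heuristic(nxt, goal), new_g, nxt))
    return None

# Demo: 4-connected 5x5 grid with a Manhattan-distance heuristic
def grid_neighbors(p):
    x, y = p
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < 5 and 0 <= ny < 5:
            yield (nx, ny), 1  # unit step cost

manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
path = astar((0, 0), (4, 4), grid_neighbors, manhattan)
print(len(path) - 1)  # shortest 4-connected path length: 8
```

With an admissible heuristic (Manhattan distance never overestimates on a 4-connected grid), A* is guaranteed to return a shortest path.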
safe-grid-gym
A Gym interface for the AI safety gridworlds, which are built on pycolab.
vscode-sftp
A super-fast SFTP/FTP extension for VS Code
Crowded-Valley---Results
This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"
FirstExplicitMethod-HDM
Implementation of Hamiltonian Descent Methods (first explicit method) in Python 3
Grad_CAM_plus_plus
A generalized gradient-based CNN visualization technique