Repositories under the badgerdb topic:
Simple key-value store abstraction and implementations for Go (Redis, Consul, etcd, bbolt, BadgerDB, LevelDB, Memcached, DynamoDB, S3, PostgreSQL, MongoDB, CockroachDB and many more)
A multi-purpose OSINT toolkit with a neat web-interface.
A straightforward implementation of the Raft consensus protocol
A low-code log management solution
A custom Go proxy inspired by the NGINX and Traefik proxies
A flexible and extensible library for key-value storage
Nostr relay with Internet Computer integration for inter-relay synchronization
A CLI dictionary app. Create your own dictionary and save anything you want, not just text :)
A Concurrent Search Engine built with Go
A blockchain implementation in Go (written to understand how a blockchain works)
Kafka X-Ray. Generic web frontend to view, search, explore and visualize keys, headers and payloads.
Multiplayer wordle game 🚀
A lightweight, durable, and embedded job queue for Go applications. Powered by BadgerDB.
sjrpc is a reverse proxy focused on reducing remote Ethereum/Web3 JSON-RPC calls to third-party nodes.
An implementation of a cryptocurrency blockchain network that implements P2P networking, local persistence, transaction merkle trees, memory pooling and a proof-of-work consensus layer. [WIP]
Exporting data out of the badger database of step-ca.
A library for improving interactions with key-value data stores.
Hashicorp Raft LogStore + StableStore backed by dgraph-io's BadgerDB
A RESTful API server written in Go for html5-moving-vue
A distributed load-testing system leveraging Kafka as the communication infrastructure to orchestrate concurrent, high-throughput load tests on web servers
!! OBSOLETE !! Use new repo https://github.com/honey-badger-io/honey-badger
This project implements two simple Go applications that produce and consume messages via a Kafka cluster, plus a third Go application that exposes the consumed data over HTTP. Relevant metrics are pushed to a Prometheus data store and finally visualized in Grafana.