# bigbloom
Memory-efficient Bloom filter implementation for huge datasets with Node.js.
This project implements bindings to the bloom-rs
Rust project from nicklan, using the Neon bindings.
## Motivation
Multiple great projects already implement Bloom filters for Node.js, but they all do so in pure JavaScript. This works well as long as the filter isn't really big (think tens or hundreds of millions of expected entries, or more). Beyond that, you run into the famous `JavaScript heap out of memory` error.
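To get a feel for the numbers, the standard Bloom filter sizing formulas give a rough memory estimate. The helper names below are illustrative only, not part of bigbloom's API:

```js
// Standard Bloom filter sizing: for n expected entries and a target
// false-positive rate p, the bit array needs m = -n * ln(p) / (ln 2)^2 bits,
// and the optimal number of hash functions is k = (m / n) * ln 2.
// (Hypothetical helpers for illustration -- not exported by bigbloom.)

function bloomBits(n, p) {
  return Math.ceil((-n * Math.log(p)) / (Math.LN2 ** 2));
}

function optimalHashes(n, p) {
  return Math.round((bloomBits(n, p) / n) * Math.LN2);
}

// 100 million entries at a 1% error rate already need roughly 114 MiB of
// raw bits -- before any per-element overhead that a pure-JavaScript
// representation (e.g. arrays of numbers) adds on top.
const bits = bloomBits(100_000_000, 0.01);
console.log((bits / 8 / 1024 / 1024).toFixed(0) + ' MiB of raw bit array');
```

Keeping that bit array in native memory on the Rust side, rather than on the V8 heap, is what lets the filter grow past the heap limit.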
## Installation

```sh
npm install bigbloom
```
## Usage
```js
const BloomFilter = require('bigbloom');

const capacity = 1000000;
const errorRate = 0.01;
const filter = new BloomFilter(capacity, errorRate);

filter.contains("foo"); // false
filter.insert("foo"); // true
filter.contains("foo"); // true
filter.insert("foo"); // false, already inserted
```
This also works in TypeScript:

```ts
import { BloomFilter } from 'bigbloom';
```
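The semantics above (`insert` reporting whether the element was newly added, `contains` answering probabilistically) can be illustrated with a tiny pure-JavaScript Bloom filter. This is only a sketch of the technique; bigbloom's actual implementation lives in Rust, and the hash function here is a simple seeded FNV-1a chosen for brevity:

```js
// Toy Bloom filter, for illustration only -- not how bigbloom is implemented.
class ToyBloomFilter {
  constructor(capacity, errorRate) {
    // Standard sizing: m = -n * ln(p) / (ln 2)^2 bits, k = (m / n) * ln 2 hashes.
    this.bits = Math.ceil((-capacity * Math.log(errorRate)) / (Math.LN2 ** 2));
    this.hashes = Math.max(1, Math.round((this.bits / capacity) * Math.LN2));
    this.buffer = new Uint8Array(Math.ceil(this.bits / 8));
  }

  // Seeded FNV-1a hash over the string, reduced to a bit index.
  _hash(value, seed) {
    let h = (2166136261 ^ seed) >>> 0;
    for (let i = 0; i < value.length; i++) {
      h ^= value.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.bits;
  }

  // Returns true if the element was (probably) newly inserted,
  // false if all of its bits were already set.
  insert(value) {
    let changed = false;
    for (let seed = 0; seed < this.hashes; seed++) {
      const bit = this._hash(value, seed);
      const mask = 1 << (bit % 8);
      if (!(this.buffer[bit >> 3] & mask)) {
        this.buffer[bit >> 3] |= mask;
        changed = true;
      }
    }
    return changed;
  }

  // May return false positives, never false negatives.
  contains(value) {
    for (let seed = 0; seed < this.hashes; seed++) {
      const bit = this._hash(value, seed);
      if (!(this.buffer[bit >> 3] & (1 << (bit % 8)))) return false;
    }
    return true;
  }
}
```

Note that `insert` returning `false` only means every bit was already set: with some (small) probability that can happen for an element that was never inserted.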
## TODO
- Add some simple tests
- Add Travis CI for tests and publishing binaries
- Update package.json for npm publishing