There are 97 repositories under the edge-computing topic.
High-Performance server for NATS.io, the cloud and edge native messaging system.
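As an illustration of the lightweight messaging this server provides, here is a minimal sketch of publishing and subscribing from the official JavaScript client (the `nats` npm package, a separate project from the server itself); the subject name and server address are placeholders.

```ts
import { connect, StringCodec } from "nats";

// Connect to a NATS server (the public demo server is used here as a placeholder).
const nc = await connect({ servers: "demo.nats.io:4222" });
const sc = StringCodec();

// Subscribe to a subject and log incoming messages.
const sub = nc.subscribe("edge.greetings");
(async () => {
  for await (const msg of sub) {
    console.log(`received: ${sc.decode(msg.data)}`);
  }
})();

// Publish a message to the same subject.
nc.publish("edge.greetings", sc.encode("hello from the edge"));

// Flush pending messages and close the connection.
await nc.drain();
```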
WasmEdge is a lightweight, high-performance, and extensible WebAssembly runtime for cloud native, edge, and decentralized applications. It powers serverless apps, embedded functions, microservices, smart contracts, and IoT devices.
Tiny and powerful JavaScript full-text search engine for browser and Node
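This description matches MiniSearch's tagline; assuming that is the project in question, a minimal indexing-and-querying sketch looks like this (the documents and field names are made up for illustration).

```ts
import MiniSearch from "minisearch";

// Hypothetical documents to index.
const documents = [
  { id: 1, title: "Edge Computing", text: "Processing data close to where it is produced." },
  { id: 2, title: "Cloud Native", text: "Building applications for elastic cloud platforms." },
];

const miniSearch = new MiniSearch({
  fields: ["title", "text"],  // fields to index for full-text search
  storeFields: ["title"],     // fields to return with search results
});

miniSearch.addAll(documents);

// Returns scored matches, e.g. the "Edge Computing" document first.
const results = miniSearch.search("edge");
console.log(results);
```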
On-device AI inference in minutes: now for MLX & GGUF and Qualcomm NPU, with Android and iOS support coming soon.
Zenoh unifies data in motion, data in use, data at rest, and computations. It carefully blends traditional pub/sub with geo-distributed storage, queries, and computations, while retaining a level of time and space efficiency well beyond mainstream stacks.
Everything about federated learning, including research papers, books, code, tutorials, videos, and more
OpenYurt - Extending your native Kubernetes to the edge (a project under CNCF)
This repository provides code for machine learning algorithms for edge devices developed at Microsoft Research India.
⛓️ RuleGo is a lightweight, high-performance, embedded, next-generation component orchestration rule engine framework for Go.
👶 Tiny S3 client. Edge-computing ready. No dependencies. Written in TypeScript. Works with @cloudflare @minio @backblaze @digitalocean @garagehq @oracle
High-performance multiple object tracking based on YOLO, Deep SORT, and KLT 🚀
A Fetch API-compatible PlanetScale database driver
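Assuming this refers to the @planetscale/database package (the fetch-based driver suits edge runtimes because it needs no TCP sockets), a minimal query sketch might look like this; the environment variable names are placeholders.

```ts
import { connect } from "@planetscale/database";

// Credentials come from the environment; the variable names here are placeholders.
const conn = connect({
  host: process.env.DATABASE_HOST,
  username: process.env.DATABASE_USERNAME,
  password: process.env.DATABASE_PASSWORD,
});

// Parameterized query executed over HTTP via the Fetch API.
const results = await conn.execute("SELECT id, email FROM users WHERE id = ?", [42]);
console.log(results.rows);
```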
☁️ Securely connect anything with WireGuard® and manage all your networks from a single place.
A lightweight header-only library for using Keras (TensorFlow) models in C++.
TinyChatEngine: On-Device LLM Inference Library
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256KB Memory
A server based on the Deno runtime, capable of running JavaScript, TypeScript, and WASM services.
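As a rough sketch of the kind of TypeScript service such a Deno-based server runs (this uses only the standard Deno.serve API, not anything specific to the project above):

```ts
// A minimal HTTP handler; deployment and invocation details depend on the runtime.
Deno.serve((req: Request) => {
  const { pathname } = new URL(req.url);
  return new Response(JSON.stringify({ path: pathname, timestamp: Date.now() }), {
    headers: { "content-type": "application/json" },
  });
});
```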
A small form factor OpenShift/Kubernetes optimized for edge computing
List of papers related to neural network quantization in recent AI conferences and journals.
Python Computer Vision & Video Analytics Framework With Batteries Included
Edge server and user dataset for edge computing research
AI-in-a-Box leverages the expertise of Microsoft across the globe to develop and provide AI and ML solutions to the technical community. Our intent is to present a curated collection of solution accelerators that can help engineers establish their AI/ML environments and solutions rapidly and with minimal friction.
A curated list of awesome edge computing, including Frameworks, Simulators, Tools, etc.