There are 64 repositories under the data-pipelines topic.
Python ETL framework for stream processing, real-time analytics, LLM pipelines, and RAG.
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
An orchestration platform for the development, production, and observation of data assets.
Apache DolphinScheduler is a modern data orchestration platform that makes it easy to create high-performance workflows with low code.
Convert documents to structured data effortlessly. Unstructured is an open-source ETL solution for transforming complex documents into clean, structured formats for language models. Visit our website to learn more about our enterprise-grade platform product for production-grade workflows, partitioning, enrichment, chunking, and embedding.
🦀 Event stream processing for developers to collect and transform data in motion, powering responsive, data-intensive applications.
Preswald is a WASM packager for Python-based interactive data apps: it bundles complex data workflows, particularly visualizations, into single files that run entirely in-browser using Pyodide, DuckDB, Pandas, Plotly, Matplotlib, and more. Build dashboards, reports, and notebooks that run offline, load fast, and share like a document.
Meltano: the declarative code-first data integration engine that powers your wildest data and ML-powered product ideas. Say goodbye to writing, maintaining, and scaling your own API integrations.
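A Meltano project is driven by a declarative `meltano.yml` file. As a rough sketch (the plugin names and variants here are illustrative assumptions, not taken from this listing), a minimal extract-load pipeline might look like:

```yaml
# meltano.yml -- hypothetical sketch; plugin names/variants are assumptions
version: 1
default_environment: dev
plugins:
  extractors:
    - name: tap-github        # pulls data from the GitHub API
      variant: meltanolabs
  loaders:
    - name: target-postgres   # writes the extracted streams to Postgres
      variant: meltanolabs
```

With a file like this in place, `meltano run tap-github target-postgres` would execute the pipeline end to end.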
The dbt-native data observability solution for data & analytics engineers. Monitor your data pipelines in minutes. Available as self-hosted or cloud service with premium features.
The best place to learn data engineering. Built and maintained by the data engineering community.
Fast and efficient unstructured data extraction. Written in Rust with bindings for many languages.
Kickstart your MLOps initiative with a flexible, robust, and productive Python package.
First open-source data discovery and observability platform. We make life easy for data practitioners so you can focus on your business.
Build data pipelines with SQL and Python, ingest data from different sources, add quality checks, and build end-to-end flows.
Dataform is a framework for managing SQL-based data operations in BigQuery.
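Dataform definitions are written as `.sqlx` files: a small config block followed by a SQL query, with `${ref(...)}` resolving dependencies between datasets. A minimal sketch (table, schema, and column names here are assumptions for illustration):

```sqlx
config {
  type: "table",
  schema: "analytics",
  description: "Daily order totals"
}

SELECT
  order_date,
  SUM(amount) AS total_amount
FROM ${ref("raw_orders")}
GROUP BY order_date
```

The `ref("raw_orders")` call both resolves to the fully qualified BigQuery table name and registers a dependency, so Dataform can run the pipeline in the right order.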
Optimus is an easy-to-use, reliable, and performant workflow orchestrator for data transformation, data modeling, pipelines, and data quality management.
Database replication platform that leverages change data capture. Stream production data from databases to your data warehouse (Snowflake, BigQuery, Redshift, Databricks) in real-time.
dbt package that is part of Elementary, the dbt-native data observability solution for data & analytics engineers. Monitor your data pipelines in minutes. Available as self-hosted or cloud service with premium features.
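As with any dbt package, the Elementary package is installed by declaring it in the project's `packages.yml` and running `dbt deps`. A hedged sketch (the version pin is illustrative; check the dbt package hub for the current release):

```yaml
# packages.yml -- version is illustrative, not authoritative
packages:
  - package: elementary-data/elementary
    version: 0.16.1
```

After `dbt deps`, the package's models can be built alongside the rest of the project with a normal `dbt run`.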
One framework to develop, deploy and operate data workflows with Python and SQL.
Dataplane is an Airflow-inspired unified data platform with additional data mesh and RPA capabilities to automate, schedule, and design data pipelines and workflows. Dataplane is written in Golang with a React front end.
Main repo including core data model, data marts, data quality tests, and terminology sets.
A curated list of awesome projects and resources related to Kubeflow (a CNCF incubating project)
A lightweight CLI tool for versioning data alongside source code and building data pipelines.
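Data-versioning tools in this space (DVC is the best-known example) typically describe a pipeline as stages in a `dvc.yaml` file, where each stage lists its command, dependencies, and outputs so that only stale stages are re-run. A sketch of that pattern, with hypothetical script and file names:

```yaml
# dvc.yaml -- stage names, scripts, and paths are hypothetical
stages:
  prepare:
    cmd: python prepare.py
    deps:
      - prepare.py
      - data/raw.csv
    outs:
      - data/clean.csv
  train:
    cmd: python train.py
    deps:
      - train.py
      - data/clean.csv
    outs:
      - model.pkl
```

Running `dvc repro` walks this dependency graph and re-executes only the stages whose inputs have changed.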
Relational data pipelines for the science lab
An open-source PHP reporting framework that helps you write polished data reports and build dashboards in PHP. Works with all PHP versions from 5.6 to the latest 8.0, and is fully compatible with MVC frameworks such as Laravel, CodeIgniter, and Symfony.