Repositories under the datafactory topic:
DataOps for Microsoft Data Platform technologies. https://aka.ms/dataops-repo
Terraform script to deploy almost all Azure Data Services
Threat Detection and Visualization
A data pipeline project built on Databricks and Azure to demonstrate the lifecycle of a cloud data project.
Azure Container Instances Proxy implemented in Azure Function App (Consumption Plan)
Generic pipelines and templates for Data Factory / Synapse Pipelines covering integrations and use cases across different Microsoft offerings.
This pipeline is an ETL application that collects a few minutes of data from Binance's open API for the cryptocurrency pairs BTCUSDT, ADAUSDT, ETHUSDT, BNBUSDT, and LTCUSDT.
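A minimal sketch of the collection step, assuming the public Binance `/api/v3/klines` REST endpoint; the interval, limit, and CSV landing file are illustrative choices, not taken from the repository.

```python
# Fetch a few recent candlesticks per symbol from Binance's public REST API
# and land them as one raw CSV for the downstream ETL steps.
import requests
import pandas as pd

SYMBOLS = ["BTCUSDT", "ADAUSDT", "ETHUSDT", "BNBUSDT", "LTCUSDT"]
BASE_URL = "https://api.binance.com/api/v3/klines"

def fetch_klines(symbol: str, interval: str = "1m", limit: int = 5) -> pd.DataFrame:
    """Return recent klines for one symbol as a DataFrame."""
    resp = requests.get(
        BASE_URL,
        params={"symbol": symbol, "interval": interval, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    cols = ["open_time", "open", "high", "low", "close", "volume",
            "close_time", "quote_volume", "trades",
            "taker_buy_base", "taker_buy_quote", "ignore"]
    df = pd.DataFrame(resp.json(), columns=cols)
    df["symbol"] = symbol
    return df

raw = pd.concat([fetch_klines(s) for s in SYMBOLS], ignore_index=True)
raw.to_csv("binance_raw.csv", index=False)  # illustrative landing zone
```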
Explore the Tokyo Olympics data journey! We ingested a GitHub CSV into Azure via Data Factory, stored it in Data Lake Storage Gen2, performed transformations in Databricks, conducted advanced analytics in Azure Synapse, and visualized insights in Synapse or Power BI.
The ADF Universal Framework is an open-source project designed to provide a comprehensive and flexible solution for building scalable and efficient data integration workflows using Azure Data Factory (ADF).
ETL process
End-to-end data engineering project using Azure Databricks as the cloud service and Tokyo Olympic data.
This repository is a sample demonstration of CI/CD practices for Azure Data Factory using Azure DevOps, showing an example approach to configuring a pipeline for continuous integration and continuous deployment workflows.
Tokyo Olympic Azure Data Engineering Project
Repository for Azure Data Factory (ADF) Custom Activity to dynamically create and process Azure Analysis Service (AAS) Tabular Model Partitions
Logic App that calls the Azure Resource Management REST API to get information about resources in a subscription.
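For illustration, the same ARM call made directly from Python rather than from a Logic App HTTP action; the environment variable name is an assumption, and the Logic App itself would typically authenticate with a managed identity.

```python
# List resources in a subscription via the Azure Resource Manager REST API.
import os
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]  # assumed env variable
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = f"https://management.azure.com/subscriptions/{subscription_id}/resources"
resp = requests.get(
    url,
    params={"api-version": "2021-04-01"},
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for resource in resp.json()["value"]:
    print(resource["type"], resource["name"])
```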
This application shows how to use the Azure .NET library to create an Azure Data Factory and run its pipelines from C# code.
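The repository itself is in C#; as a rough equivalent of the same flow, here is a sketch using the Python management SDK (`azure-mgmt-datafactory`). The resource group, factory, pipeline name, and region are placeholders, and the pipeline is assumed to already exist.

```python
# Create (or update) a data factory, then trigger and poll a pipeline run.
from azure.identity import DefaultAzureCredential               # pip install azure-identity
from azure.mgmt.datafactory import DataFactoryManagementClient  # pip install azure-mgmt-datafactory
from azure.mgmt.datafactory.models import Factory

subscription_id = "<subscription-id>"
rg_name = "<resource-group>"
df_name = "<factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Create or update the data factory resource itself.
adf_client.factories.create_or_update(rg_name, df_name, Factory(location="westeurope"))

# Kick off an existing pipeline and check its status.
run = adf_client.pipelines.create_run(rg_name, df_name, "CopyPipeline", parameters={})
status = adf_client.pipeline_runs.get(rg_name, df_name, run.run_id).status
print("Pipeline run", run.run_id, "status:", status)
```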
Azure Data Factory template to refresh Power BI Dataset
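Such a template typically drives the Power BI REST refresh endpoint from an ADF Web activity; below is the same call made directly from Python for illustration. The workspace and dataset IDs are placeholders, and the identity is assumed to have permission on the workspace.

```python
# Trigger a Power BI dataset refresh via the Power BI REST API.
import requests
from azure.identity import DefaultAzureCredential

GROUP_ID = "<workspace-id>"    # placeholder
DATASET_ID = "<dataset-id>"    # placeholder

token = DefaultAzureCredential().get_token(
    "https://analysis.windows.net/powerbi/api/.default").token

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "NoNotification"},
    timeout=30,
)
resp.raise_for_status()
print("Refresh accepted:", resp.status_code)  # 202 Accepted on success
```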
Building a Data Lakehouse using the Medallion architecture.
This project extracts supply chain data (about 180k records and more than 40 columns) from a CSV file in an Azure Data Lake Storage Gen2 account, performs analysis with Python (pandas) to find the top 3 countries, filters the data to those countries, and writes the result back to the data lake as three files through an ETL pipeline built in ADF.
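A hedged pandas sketch of the "top 3 countries" step; the file name and the column names (`order_country`, `sales`) are assumptions, not taken from the actual dataset.

```python
# Rank countries by total sales, keep the top 3, and write one file per country.
import pandas as pd

df = pd.read_csv("supply_chain.csv")  # ~180k rows, 40+ columns

top3 = (df.groupby("order_country")["sales"]
          .sum()
          .nlargest(3)
          .index)

for country in top3:
    subset = df[df["order_country"] == country]
    subset.to_csv(f"supply_chain_{country}.csv", index=False)
```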
Portfolio of Data Analysis, Data Science, and Data Engineering projects built using Azure services, SQL, and Python.
Azure Data Landing Zone - IaC, Config, Code
Interesting_programs_written_in_Python_language
In this project, I built a comprehensive data pipeline using Azure services to address a business need for insights on customer demographics and sales trends. Data from an on-premises SQL database is ingested, transformed, and visualized in a Power BI dashboard.
This project builds a cloud-based pipeline to extract NYC taxi data from an API and store it in Azure Data Lake Storage (ADLS). Databricks and PySpark are used to transform the data through the medallion architecture (Bronze → Silver → Gold). Delta Lake ensures reliable storage, and Power BI provides visual insights for data-driven decision-making.
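A minimal PySpark sketch of the Bronze → Silver → Gold flow described above; the storage paths, container layout, and column names are placeholders rather than the project's actual ones.

```python
# Medallion-style refinement of NYC taxi data with Delta Lake on ADLS.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("nyc-taxi-medallion").getOrCreate()

bronze_path = "abfss://bronze@<storage-account>.dfs.core.windows.net/nyc_taxi"
silver_path = "abfss://silver@<storage-account>.dfs.core.windows.net/nyc_taxi"
gold_path   = "abfss://gold@<storage-account>.dfs.core.windows.net/nyc_taxi_daily"

# Bronze: raw API extracts landed as-is.
raw = spark.read.parquet(bronze_path)

# Silver: cleaned, typed, deduplicated records stored as Delta.
clean = (raw.dropDuplicates()
            .filter(F.col("fare_amount") > 0)
            .withColumn("pickup_date", F.to_date("tpep_pickup_datetime")))
clean.write.format("delta").mode("overwrite").save(silver_path)

# Gold: business-level daily aggregates for Power BI.
daily = clean.groupBy("pickup_date").agg(
    F.count("*").alias("trips"),
    F.sum("fare_amount").alias("total_fares"))
daily.write.format("delta").mode("overwrite").save(gold_path)
```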
Azure data engineering project including data ingestion, storage, transformation, and visualization.
End-to-End Concept Project: Azure Data Factory Pipeline, Azure IoT Central, Azure Storage Account, Azure SQL, .NET, Angular
Azure data engineering project
Azure data engineering project
This project develops a data engineering pipeline to analyze restaurant data from various cities on the Swiggy platform. Using PySpark, Spark SQL, and Azure Data Factory, the data is processed and transformed to generate insights on ratings, cuisines, and trends, presented through dashboards and reports.
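An illustrative PySpark / Spark SQL step for the ratings-by-city-and-cuisine insight; the input path and schema (`city`, `cuisine`, `rating`) are assumptions, not taken from the repository.

```python
# Aggregate average rating and restaurant count per city and cuisine.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("swiggy-restaurants").getOrCreate()

restaurants = spark.read.option("header", True).csv("/mnt/raw/swiggy_restaurants.csv")
restaurants.createOrReplaceTempView("restaurants")

top_cuisines = spark.sql("""
    SELECT city,
           cuisine,
           ROUND(AVG(rating), 2) AS avg_rating,
           COUNT(*)              AS restaurant_count
    FROM restaurants
    GROUP BY city, cuisine
    ORDER BY avg_rating DESC
""")
top_cuisines.show(10, truncate=False)
```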