Repositories under the aws-quicksight topic:
Clickstream Analytics on AWS source code
Bring your own data Labs: Build a serverless data pipeline based on your own data
Voice of the Customer (VoC): enhance customer experience with a serverless architecture and sentiment analysis, using Amazon Kinesis, Amazon Athena, Amazon QuickSight, Amazon Comprehend, and ChatGPT-style LLMs.
This repository includes some AWS Cloud Quest labs; it does not include the Cloud Practitioner labs.
Understanding DevOps concepts, with hands-on practice and research using AWS developer tools
Build machine learning-powered business intelligence analyses using Amazon QuickSight
A simple, practical, and affordable system for measuring head trauma in sports settings where trained medical personnel are absent, built with Amazon Kinesis Data Streams, Kinesis Data Analytics, Kinesis Data Firehose, and AWS Lambda
Build a Visualization and Monitoring Dashboard for IoT Data with Amazon Kinesis Analytics and Amazon QuickSight
AWS Programming and Tools meetup workshop
You run a script that mimics multiple sensors publishing messages on an IoT MQTT topic, with one message published every second. The events are sent to AWS IoT, where a configured IoT rule captures all messages and sends them to Firehose. From there, Firehose writes the messages in batches to objects stored in S3. On top of the S3 data, you set up a table in Athena and use QuickSight to analyze the IoT data.
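A minimal sketch of the kind of simulator script described above, using boto3's `iot-data` client. The endpoint URL, topic name, and payload fields are placeholders I am assuming, not values taken from the repository:

```python
import json
import random
import time

import boto3

# boto3's "iot-data" client publishes MQTT messages through the AWS IoT
# data endpoint; the endpoint URL below is a placeholder.
iot = boto3.client(
    "iot-data",
    endpoint_url="https://YOUR-IOT-ENDPOINT.iot.us-east-1.amazonaws.com",
)

def publish_reading(sensor_id: str) -> None:
    """Publish one simulated sensor reading to an MQTT topic."""
    payload = {
        "sensor_id": sensor_id,
        "temperature": round(random.uniform(18.0, 32.0), 2),
        "timestamp": int(time.time()),
    }
    iot.publish(
        topic="sensors/telemetry",  # hypothetical topic matched by the IoT rule
        qos=1,
        payload=json.dumps(payload),
    )

if __name__ == "__main__":
    # One message per second, mirroring the cadence described above.
    while True:
        for sensor in ("sensor-1", "sensor-2", "sensor-3"):
            publish_reading(sensor)
        time.sleep(1)
```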
This project integrates real-time data processing and analytics using Apache NiFi, Kafka, Spark, Hive, and AWS services for comprehensive COVID-19 data insights.
aws-quicksight-tool assists in the use of the AWS QuickSight CLI.
Smart City Realtime Data Engineering Project
Convert DMARC reports to TSV (or CSV) format for easier analysis and visualisation
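A minimal sketch of the conversion this describes, assuming the usual DMARC aggregate-report XML layout (`record`/`row`/`policy_evaluated` elements); the column selection and file handling are my assumptions, not the repository's actual output format:

```python
import csv
import sys
import xml.etree.ElementTree as ET

def dmarc_to_tsv(xml_path: str, out=sys.stdout) -> None:
    """Flatten one DMARC aggregate report into TSV rows."""
    tree = ET.parse(xml_path)
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["source_ip", "count", "disposition", "dkim", "spf"])
    for record in tree.getroot().iter("record"):
        row = record.find("row")
        policy = row.find("policy_evaluated")
        writer.writerow([
            row.findtext("source_ip"),
            row.findtext("count"),
            policy.findtext("disposition"),
            policy.findtext("dkim"),
            policy.findtext("spf"),
        ])

if __name__ == "__main__":
    dmarc_to_tsv(sys.argv[1])
```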
Scraped tweets using the Twitter API (for the keyword 'Netflix') on an AWS EC2 instance and ingested the data into S3 via Kinesis Firehose. Used Spark ML on Databricks to build a pipeline for a sentiment classification model, and Athena and QuickSight to build a dashboard
A data pipeline to ingest, process, and store storm event datasets so they can be accessed through different means.
This project repo 📺 provides a pipeline to manage, process, and analyze YouTube video data using AWS services, covering both structured statistics and trending-video metrics.
A demand forecasting pipeline deployed on Azure and AWS
Data lake demo using change data capture (CDC) on AWS
This project demonstrates a complete data pipeline for extracting, transforming, and loading (ETL) Reddit data into an Amazon Redshift data warehouse. The pipeline uses various AWS services and tools including Apache Airflow, PostgreSQL, AWS S3, AWS Glue, AWS Athena, and Amazon Redshift. The project is orchestrated using Docker and Apache Airflow
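A minimal Airflow 2.x DAG sketch showing the orchestration shape such a pipeline might take; the task names and stub callables are hypothetical, not the repository's actual operators:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task callables -- the real extraction/load logic lives in the
# project's own modules; these stubs only show the orchestration shape.
def extract_reddit():
    print("pull posts from the Reddit API and stage them locally")

def upload_to_s3():
    print("upload the staged file to an S3 bucket for Glue/Athena")

def load_redshift():
    print("COPY the transformed data into Amazon Redshift")

with DAG(
    dag_id="reddit_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_reddit", python_callable=extract_reddit)
    to_s3 = PythonOperator(task_id="upload_to_s3", python_callable=upload_to_s3)
    to_redshift = PythonOperator(task_id="load_redshift", python_callable=load_redshift)

    # Linear dependency chain: extract, stage to S3, then load the warehouse.
    extract >> to_s3 >> to_redshift
```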
This project targets legacy applications that process data from positional (fixed-width) files. The objective is to read these positional files when they arrive in AWS S3, send the data to a data warehouse such as AWS Redshift, and finally read the results with a business intelligence tool such as AWS QuickSight.
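A minimal sketch of parsing a positional (fixed-width) file in Python; the field layout here is hypothetical, since the real record layout depends on the legacy application:

```python
from typing import Dict, List

# Hypothetical layout: each field is (name, start, end) using 0-based,
# end-exclusive character offsets into the fixed-width record.
LAYOUT = [
    ("customer_id", 0, 10),
    ("order_date", 10, 18),
    ("amount", 18, 28),
]

def parse_positional_line(line: str) -> Dict[str, str]:
    """Slice one fixed-width record into named, trimmed fields."""
    return {name: line[start:end].strip() for name, start, end in LAYOUT}

def parse_positional_file(path: str) -> List[Dict[str, str]]:
    with open(path, encoding="utf-8") as fh:
        return [parse_positional_line(line.rstrip("\n")) for line in fh if line.strip()]

if __name__ == "__main__":
    for record in parse_positional_file("orders.txt"):  # placeholder file name
        print(record)
```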
A data engineering portfolio project using AWS cloud services to analyze correlations between Malaysian retail performance and fuel prices. Features Terraform IaC, ETL/ELT with AWS S3, Glue, SQL analytics via Athena coupled with data transformation via dbt, and workflow orchestration with Kestra.
A linear regression model for predicting US insurance costs, mainly used to learn about machine learning tools in Amazon Web Services (AWS)
Unveiling job market trends with Scrapy and AWS
A testbed showing how to embed QuickSight dashboards into a web app
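A minimal sketch of the server-side step such an embedding demo typically needs, generating an embed URL for a registered QuickSight user via boto3; the account ID, user ARN, dashboard ID, and region are assumed inputs:

```python
import boto3

quicksight = boto3.client("quicksight", region_name="us-east-1")

def get_dashboard_embed_url(account_id: str, user_arn: str, dashboard_id: str) -> str:
    """Return a short-lived embed URL for a registered QuickSight user."""
    response = quicksight.generate_embed_url_for_registered_user(
        AwsAccountId=account_id,
        UserArn=user_arn,
        SessionLifetimeInMinutes=60,
        ExperienceConfiguration={
            "Dashboard": {"InitialDashboardId": dashboard_id},
        },
    )
    return response["EmbedUrl"]
```

The returned URL is short-lived and is typically handed to the front end, which loads it with the QuickSight embedding SDK.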
Put-away is one of the most crucial processes in the supply chain: if goods are misplaced, every downstream process can be delayed. That is why we chose to improve this process with a multiclass classification machine learning model and dashboarding on AWS.
A presentation I prepared on data analysis
A scalable crypto data pipeline that ingests historical and intraday cryptocurrency data from an API, then transforms, stores, and loads it using AWS services and Snowflake. Airflow orchestrates the entire pipeline, and CI/CD is fully automated through GitHub Actions.
A cost intelligence solution that correlates application performance traces with infrastructure costs, providing transaction-level visibility to drive strategic business decisions.
Back up QuickSight analysis and dataset configurations with Lambda
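A minimal sketch of a Lambda handler that backs up one analysis definition and one dataset description to S3 with boto3; the event shape, environment variables, and S3 key layout are assumptions, not the repository's actual interface:

```python
import json
import os

import boto3

quicksight = boto3.client("quicksight")
s3 = boto3.client("s3")

ACCOUNT_ID = os.environ["AWS_ACCOUNT_ID"]    # assumed environment variables
BACKUP_BUCKET = os.environ["BACKUP_BUCKET"]

def handler(event, context):
    """Back up one QuickSight analysis and one dataset as JSON objects in S3."""
    analysis_id = event["analysis_id"]       # hypothetical event shape
    dataset_id = event["dataset_id"]

    analysis = quicksight.describe_analysis_definition(
        AwsAccountId=ACCOUNT_ID, AnalysisId=analysis_id
    )
    dataset = quicksight.describe_data_set(
        AwsAccountId=ACCOUNT_ID, DataSetId=dataset_id
    )

    s3.put_object(
        Bucket=BACKUP_BUCKET,
        Key=f"quicksight/analysis/{analysis_id}.json",
        Body=json.dumps(analysis["Definition"], default=str),
    )
    s3.put_object(
        Bucket=BACKUP_BUCKET,
        Key=f"quicksight/dataset/{dataset_id}.json",
        Body=json.dumps(dataset["DataSet"], default=str),
    )
    return {"status": "ok"}
```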
This project demonstrates an end-to-end, event-driven workflow for analyzing facial expressions in images using AWS Rekognition. Images are uploaded to S3, processed via Lambda, stored for querying in Athena, and visualized with QuickSight.
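A minimal sketch of the Lambda step described above, calling Rekognition's `detect_faces` with `Attributes=['ALL']` so emotion scores are returned; the result handling is simplified, and the event parsing assumes the standard S3 notification format:

```python
import boto3

rekognition = boto3.client("rekognition")

def handler(event, context):
    """Detect the dominant emotion for each face in newly uploaded S3 images."""
    results = []
    for record in event["Records"]:          # standard S3 event notification shape
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        response = rekognition.detect_faces(
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            Attributes=["ALL"],              # includes the Emotions attribute
        )
        for face in response["FaceDetails"]:
            top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
            results.append({
                "image": key,
                "emotion": top_emotion["Type"],
                "confidence": top_emotion["Confidence"],
            })
    return results
```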
🌳 A sustainable Terraform Package to manage QuickSight resources on AWS
Sales Analytics Dashboard using AWS