SeungHoon Choi's starred repositories
techniques
Techniques for deep learning with satellite & aerial imagery
satellite-image-deep-learning
Resources for deep learning with satellite & aerial imagery
Awesome_DaeJeon_Restaurant
A curated list of good restaurants in Daejeon
django-nginx-upload
Example of using the nginx upload module with Django
libgeos_static
Quickly build GEOS as a static library.
CheatSheet
:eyes: A project collecting development-related documents and sample code that are useful to know
lzukowski.github.io
Build a Jekyll blog in minutes, without touching the command line.
python-dependency-injector
Dependency injection framework for Python
flask-hexagonal-architecture-api
Simple example of Python Flask API following SOLID and Hexagonal Architecture principles
hexagonal-architecture-python
An example backend implementing Hexagonal Architecture in Python using Flask and SqlAlchemy.
go.geojson
Encoding and decoding GeoJSON <-> Go
scientific-visualization-book
An open-access book on scientific visualization using Python and Matplotlib
libtorch_grpc_serving
pytorch during training, libtorch during serving via gRPC
onnxruntime
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
awesome-sushi
🍣 A list of great sushi omakase restaurants in Korea
You-Dont-Know-JS
A book series on JavaScript. @YDKJS on twitter.
dashboards
Responsive dashboard templates 📊✨
open-apis-korea
🇰🇷 A collection of open APIs useful for services targeting Korean-speaking users
awesome-kubernetes
A curated list for awesome kubernetes sources :ship::tada:
Data-Pipelines-with-Airflow
A project for learning the core concepts of Apache Airflow by automating an ETL pipeline and the creation of a data warehouse. It includes custom operators that stage data, load the data warehouse, and run data quality checks as the final step, transforming data from various sources into a star schema optimized for the analytics team's use cases. Technologies used: Apache Airflow, S3, Amazon Redshift, Python.
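The data-quality step described above can be sketched as a small operator-style class. This is a hedged illustration only: in the real project it would subclass `airflow.models.BaseOperator` and query Redshift through a hook, but here a plain class with an injected query function keeps the example self-contained. All names (`DataQualityCheck`, `checks`, `get_records`) are assumptions, not taken from the repository.

```python
# Illustrative sketch of an Airflow-style data-quality check operator.
# In a real DAG this would subclass airflow.models.BaseOperator and use
# a Redshift/Postgres hook; here get_records is any callable that runs a
# SQL string and returns a scalar, so the example runs without Airflow.

class DataQualityCheck:
    def __init__(self, checks, get_records):
        # checks: list of (sql, expected_result) pairs
        # get_records: callable executing sql and returning a scalar
        self.checks = checks
        self.get_records = get_records

    def execute(self):
        failures = []
        for sql, expected in self.checks:
            result = self.get_records(sql)
            if result != expected:
                failures.append((sql, expected, result))
        if failures:
            # Failing the task surfaces bad loads before analysts see them.
            raise ValueError(f"Data quality checks failed: {failures}")
        return len(self.checks)


# Usage with a stubbed warehouse: every fact table must have zero NULL keys.
fake_warehouse = {
    "SELECT COUNT(*) FROM users WHERE user_id IS NULL": 0,
    "SELECT COUNT(*) FROM songs WHERE song_id IS NULL": 0,
}
check = DataQualityCheck(
    checks=[(sql, 0) for sql in fake_warehouse],
    get_records=fake_warehouse.__getitem__,
)
print(check.execute())  # → 2 (both checks passed)
```

Raising an exception on failure is the idiomatic choice here, since Airflow marks a task failed when its `execute` raises, which halts the pipeline before downstream analytics run on bad data.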