Repositories under the hyperparameter topic:
Sequential model-based optimization with a `scipy.optimize` interface (see the SMBO sketch after this list).
Example code for paper "Bilevel Optimization: Nonasymptotic Analysis and Faster Algorithms"
PyTorch implementation of Proximal Gradient Algorithms a la Parikh and Boyd (2014). Useful for Auto-Sizing (Murray and Chiang 2015; Murray et al. 2019). A generic proximal-gradient sketch follows this list.
Hyperparameters-Optimization
Example Code for paper "Provably Faster Algorithms for Bilevel Optimization"
Hyperparameter optimisation utility for lightgbm and xgboost using hyperopt (see the hyperopt sketch after this list).
Some experiments to empirically analyze how the parameters of LWE impact the correctness of the algorithm on a single bit.
This repository consists of course material, assignments, and quizzes attempted in a Coursera specialization course.
Deep Learning Specialization. Master Deep Learning, and Break into AI
A portable configuration tool to manage hyperparameters and settings of your experiments.
Study projects developed during data science courses
Interactive exploration of hyperparameter tuning results with ipywidgets and plotly in a Jupyter notebook (see the notebook sketch after this list).
"oxayavongsa/projects" is a public GitHub repository serving as a diverse AI/ML Project Portfolio. Using Python coding and Juptyer notebook for multiple methodologies to model statistical algorithms.
Project-based internship with Home Credit Indonesia: credit risk classification based on good/bad credit.
Automatically create a config of hyper-parameters from global variables (a plain-Python sketch of the idea follows this list).
Hyper-parameter tuner (for computer vision and reinforcement learning)
A simple Python interface for running multiple parallel instances of a Python program, e.g. grid search (a multiprocessing sketch follows this list).
Experiments with different optimizers, layers, filters, and regularization for Y-Net (CNN) on the CIFAR-10 and CIFAR-100 datasets.
A review of topics from the field of computer science.
Mini project implementing a decision tree with CART.
Assignment titled "A Brief Review of Hyperparameter Optimization Methods for Machine Learning" for Research Methods in Computer Science course at Ryerson University
Presentation titled "A Brief Review of Hyperparameter Optimization Methods for Machine Learning" for Research Methods in Computer Science course at Ryerson University
A Kaggle competition project built using ML classification models and other preprocessing techniques.
This notebook demonstrates an end-to-end, reproducible ML workflow with business-oriented communication: clear EDA, rigorous CV and hyperparameter tuning, interpretable feature importances, visual diagnostics, and an exported pipeline ready for production validation (a condensed scikit-learn sketch of such a workflow follows).
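A few of the entries above lend themselves to short illustrative sketches. First, the sequential model-based optimization entry: its description matches a scikit-optimize-style API, so the following assumes `skopt.gp_minimize`; the objective function and search bounds are invented for illustration.

```python
# Minimal SMBO sketch, assuming a scikit-optimize-style gp_minimize API.
from skopt import gp_minimize

def objective(params):
    # Hypothetical objective: validation loss as a function of two hyperparameters.
    learning_rate, num_layers = params
    return (learning_rate - 0.01) ** 2 + 0.1 * abs(num_layers - 3)

result = gp_minimize(
    objective,                      # function to minimize
    [(1e-4, 1e-1), (1, 8)],         # search space: (low, high) per dimension
    n_calls=30,                     # total number of evaluations
    random_state=0,
)
print(result.x, result.fun)         # best parameters and best objective value
```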
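The PyTorch proximal-gradient entry (Parikh and Boyd 2014) can be illustrated with an ISTA-style loop: a gradient step on the smooth loss followed by soft-thresholding, the proximal operator of the L1 penalty. This is a generic sketch, not the repository's own code.

```python
# Proximal gradient (ISTA) sketch for L1-regularized least squares in PyTorch.
# Generic illustration, not the repository's implementation.
import torch

def soft_threshold(x, thresh):
    # Proximal operator of thresh * ||x||_1.
    return torch.sign(x) * torch.clamp(x.abs() - thresh, min=0.0)

torch.manual_seed(0)
A = torch.randn(50, 20)
b = torch.randn(50)
w = torch.zeros(20, requires_grad=True)

lr, lam = 0.01, 0.1
for _ in range(200):
    loss = 0.5 * torch.sum((A @ w - b) ** 2)    # smooth part of the objective
    loss.backward()
    with torch.no_grad():
        w -= lr * w.grad                        # gradient step on the smooth part
        w.copy_(soft_threshold(w, lr * lam))    # proximal step for the L1 penalty
    w.grad.zero_()

print((w.abs() > 1e-6).sum().item(), "nonzero weights")
```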
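For the hyperopt-based LightGBM/XGBoost tuner, a minimal tuning loop with hyperopt's TPE sampler might look like this; the dataset, search space, and parameter ranges are illustrative assumptions, not the utility's actual configuration.

```python
# Hedged sketch of hyperparameter search for LightGBM via hyperopt's TPE.
from hyperopt import fmin, tpe, hp, Trials
from lightgbm import LGBMClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

space = {
    "num_leaves": hp.quniform("num_leaves", 15, 127, 1),
    "learning_rate": hp.loguniform("learning_rate", -5, 0),
    "min_child_samples": hp.quniform("min_child_samples", 5, 50, 1),
}

def objective(params):
    model = LGBMClassifier(
        num_leaves=int(params["num_leaves"]),
        learning_rate=params["learning_rate"],
        min_child_samples=int(params["min_child_samples"]),
        n_estimators=200,
    )
    # Minimize negative mean cross-validated accuracy.
    return -cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=25, trials=Trials())
print(best)
```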
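For the interactive exploration of tuning results with ipywidgets and plotly, a notebook cell along these lines shows the idea; the results DataFrame and its column names are placeholders.

```python
# Notebook sketch: browse hyperparameter-tuning results with ipywidgets + plotly.
# The DataFrame and column names are illustrative placeholders.
import pandas as pd
import plotly.express as px
from ipywidgets import interact

results = pd.DataFrame({
    "learning_rate": [0.001, 0.01, 0.1, 0.001, 0.01, 0.1],
    "num_leaves":    [31, 31, 31, 127, 127, 127],
    "val_score":     [0.81, 0.86, 0.84, 0.83, 0.88, 0.85],
})

@interact(color_by=["num_leaves", "learning_rate"])
def plot(color_by):
    # Scatter of validation score vs. learning rate, colored by the chosen column.
    fig = px.scatter(results, x="learning_rate", y="val_score",
                     color=results[color_by].astype(str), log_x=True)
    fig.show()
```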
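The "config of hyper-parameters from global variables" entry can be sketched in plain Python by collecting a module's upper-case globals into a dictionary; this illustrates the idea only and is not the tool's actual API.

```python
# Generic sketch: gather UPPER_CASE module-level globals into a hyperparameter config.
# Not the repository's actual API; names here are illustrative.
LEARNING_RATE = 3e-4
BATCH_SIZE = 64
NUM_EPOCHS = 10

def collect_config(namespace):
    """Return all upper-case, non-callable entries of a namespace as a config dict."""
    return {k: v for k, v in namespace.items()
            if k.isupper() and not callable(v)}

config = collect_config(globals())
print(config)   # {'LEARNING_RATE': 0.0003, 'BATCH_SIZE': 64, 'NUM_EPOCHS': 10}
```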
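The parallel-instances utility (e.g. grid search) can be approximated with `itertools.product` and `multiprocessing.Pool`; the `train` worker and the grid values here are hypothetical.

```python
# Sketch of running a grid of configurations in parallel with multiprocessing.
# The train() function and the grid values are hypothetical placeholders.
from itertools import product
from multiprocessing import Pool

def train(config):
    lr, batch_size = config
    # Placeholder "training" returning a fake score; real code would launch a run.
    return {"lr": lr, "batch_size": batch_size, "score": 1.0 / (1.0 + lr * batch_size)}

if __name__ == "__main__":
    grid = list(product([1e-3, 1e-2, 1e-1], [32, 64]))   # all (lr, batch_size) pairs
    with Pool(processes=4) as pool:
        results = pool.map(train, grid)
    for r in sorted(results, key=lambda r: -r["score"]):
        print(r)
```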
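Finally, for the end-to-end notebook entry, a condensed scikit-learn sketch of that kind of workflow (cross-validated tuning, feature importances, and an exported pipeline) is below; the dataset, model, and grid are illustrative, not the notebook's actual choices.

```python
# Condensed sketch of a CV + tuning + export workflow in scikit-learn.
# Dataset, model, and grid are illustrative assumptions.
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([("scale", StandardScaler()),
                 ("model", RandomForestClassifier(random_state=0))])

grid = {"model__n_estimators": [100, 300], "model__max_depth": [None, 5, 10]}
search = GridSearchCV(pipe, grid, cv=5, scoring="roc_auc")
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test ROC AUC:", search.score(X_test, y_test))

# Feature importances from the tuned model, then export the fitted pipeline.
importances = search.best_estimator_.named_steps["model"].feature_importances_
joblib.dump(search.best_estimator_, "model_pipeline.joblib")
```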