There are 27 repositories under the boosted-trees topic.
Boosted trees in Julia
Shows how to perform fast retraining with LightGBM across different business cases
Provably Robust Boosted Decision Stumps and Trees against Adversarial Attacks [NeurIPS 2019]
A Python implementation of GBDT (Gradient Boosted Decision Trees)
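The core idea behind a GBDT implementation like this one can be sketched in a few lines: each new tree is fit to the residuals of the current ensemble (the negative gradient of squared loss), and its predictions are added with a shrinkage factor. This is a minimal sketch under assumed defaults (50 trees, learning rate 0.1, depth-2 trees, synthetic data), not the repo's actual code.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbdt(X, y, n_trees=50, lr=0.1, max_depth=2):
    """Fit a toy GBDT for squared loss; returns (initial constant, trees)."""
    f0 = y.mean()                        # initial constant prediction
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_trees):
        residual = y - pred              # negative gradient of squared loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residual)
        pred += lr * tree.predict(X)     # shrunken additive update
        trees.append(tree)
    return f0, trees

def predict_gbdt(model, X, lr=0.1):
    # lr must match the value used in fit_gbdt
    f0, trees = model
    return f0 + lr * sum(t.predict(X) for t in trees)

# toy regression target: y = x^2 plus noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.05, size=200)

model = fit_gbdt(X, y)
mse = np.mean((predict_gbdt(model, X) - y) ** 2)
```

The shrinkage factor `lr` trades off per-tree influence against the number of trees; smaller values typically generalize better at the cost of more boosting rounds.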
Faster, better, smarter ecological niche modeling and species distribution modeling
This project processes and analyzes the dataset, performs feature engineering, and builds a model that predicts loan outcomes for an applicant.
Package implements decision tree and isolation forest
We downloaded and processed ten years of historic log data from the Tor project. Then we used boosted regression trees and generalized linear models to predict malicious exit nodes.
Classification Trees, Random Forest, Boosting | Columbia Business School
Regression model for predicting house prices of residential homes in Ames, Iowa. Dataset contains 79 explanatory variables. Project includes key topics such as dataset cleaning, feature selection/engineering, EDA and applying grid search to find the best model.
Classifying the survival of passengers aboard the Titanic using various machine learning algorithms.
Compare different classification models for identifying credit card transactions as normal or fraudulent
Custom built Decision Tree + Boosted Trees + KernelPLS in python
Boosting algorithm (machine learning) applied to decision stumps (decision trees with a single split).
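A decision stump on its own is a weak learner; boosting combines many of them into a strong classifier. As a hedged illustration (synthetic data, scikit-learn in place of whatever this repo actually uses): scikit-learn's `AdaBoostClassifier` uses exactly a depth-1 tree, i.e. a decision stump, as its default base estimator, so comparing one stump against a boosted ensemble takes only a few lines.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# synthetic binary classification problem
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# one stump (a single depth-1 tree) vs. 100 boosted stumps
stump_acc = AdaBoostClassifier(n_estimators=1, random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
boost_acc = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
```

Boosting reweights the training examples after each round so that later stumps focus on the cases earlier ones got wrong, which is why an ensemble of such simple learners can fit decision boundaries no single split could.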
Protein classification with deep learning and boosted trees using topological features
R code to reproduce analyses in "Rapid winter warming could disrupt coastal marine fish community structure" (Clark et al, Nature Climate Change, 2020)
This repository implements basic machine learning classifiers for Yelp review classification, treated as a binary classification problem. The models implemented are Naive Bayes, Logistic Regression, Support Vector Machine (linear), Decision Trees, Bagged Decision Trees, Random Forests, and Boosted Decision Trees.
Neural Networks, AdaBoost, Random Forest, KNN, BoostedForest
Classification prediction model
This is a project that was completed while taking the Udemy course - Python for Machine Learning & Data Science Masterclass.
Determining financial factors affecting the health of an individual
This project develops, validates, and tests several classification models that predict whether an office room is occupied from sensor features: temperature (°C), light (lx), humidity (%), CO2 (ppm), and humidity ratio. The data is modeled with logistic regression, a classification tree, bagging/random forests, and gradient boosted trees, using the glmnet, Tree, randomForest, and gbm R packages respectively. The models were trained and then evaluated against validation and test sets using confusion matrices to obtain classification and misclassification rates. The random forest model achieved the best test-set accuracy, with a classification rate of 93.21%. Across all five models, the light sensor was the most significant variable for predicting whether the room is occupied.
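The evaluation pattern described above (confusion matrix → classification rate) is language-agnostic. A minimal sketch follows, with two loud assumptions: synthetic stand-ins replace the real occupancy sensor data, and scikit-learn's `GradientBoostingClassifier` stands in for the R packages the project actually used (glmnet, tree, randomForest, gbm).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
# synthetic stand-ins for the five sensor columns:
# temperature, light, humidity, CO2, humidity ratio
X = rng.normal(size=(n, 5))
# occupancy driven mostly by the "light" column, mirroring the
# project's finding that light was the most significant predictor
y = (X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# classification rate = correctly classified / total,
# read off the diagonal of the confusion matrix
cm = confusion_matrix(y_te, clf.predict(X_te))
classification_rate = np.trace(cm) / cm.sum()
misclassification_rate = 1 - classification_rate
```

The same diagonal-over-total computation yields the 93.21% figure the project reports for its random forest on the real occupancy data.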
This is the repository for my R project on modeling historical weather data in Santa Barbara.
CS760: Machine Learning
Projects using tree methods (CART, Random Forests, Boosted Trees)
R based data analytics on German Cars Market
Analysis of 2015 Mexican Ministry of Health administrative data