There are 25 repositories under the xgboost topic.
Python code for the book Deep Learning (the "flower book"): mathematical derivations, analysis of the underlying principles, and source-level code implementations
Python package for AutoML on tabular data with feature engineering, hyperparameter tuning, explanations and automatic documentation
A library for debugging/inspecting machine learning classifiers and explaining their predictions
Transform ML models into native code (Java, C, Python, Go, JavaScript, Visual Basic, C#, R, PowerShell, PHP, Dart, Haskell, Ruby, F#, Rust) with zero dependencies
Mars is a tensor-based unified framework for large-scale data computation which scales numpy, pandas, scikit-learn and Python functions.
Deep learning API and server in C++14 with support for Caffe, PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and t-SNE
A collection of research papers on decision, classification and regression trees with implementations.
A minimal benchmark for scalability, speed and accuracy of commonly used open source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks etc.).
[UNMAINTAINED] Automated machine learning for analytics & production
MLBox is a powerful automated machine learning Python library.
Distributed ML Training and Fine-Tuning on Kubernetes
A curated list of gradient boosting research papers with implementations.
Time series forecasting with scikit-learn models
The goal of this repo is to provide solutions to data science competitions (Kaggle, Data Hack, Machine Hack, Driven Data, etc.).
Scalable machine 🤖 learning for time series forecasting.
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
A curated collection of information about AI competitions
REST web service for true real-time scoring (<1 ms) of Scikit-Learn, R and Apache Spark models
Use advanced feature engineering strategies and select the best features from your data set with a single line of code. Created by Ram Seshadri. Collaborators welcome.
A Python package for simultaneous hyperparameter tuning and feature selection for gradient boosting models.
📘 The MLOps stack component for experiment tracking
An extension of XGBoost to probabilistic modelling
Fast SHAP value computation for interpreting tree-based models
An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
H2O.ai Machine Learning Interpretability Resources
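Several of the repositories above center on gradient-boosted trees (XGBoost itself, the benchmark in the scalability repo, the curated boosting papers). As a rough illustration of the core idea these tools implement, here is a minimal, self-contained sketch of gradient boosting with depth-1 "stump" learners under squared-error loss. All function names are illustrative, not any library's API, and real implementations (e.g. XGBoost) add regularization, second-order gradients, and far more efficient split finding.

```python
# Minimal gradient-boosting sketch: fit a sequence of decision stumps,
# each one trained on the residuals left by the ensemble so far.

def fit_stump(x, residuals):
    """Find the single split on x that minimizes squared error of residuals."""
    best = None
    for threshold in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue  # degenerate split, skip
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    return best[1], best[2], best[3]

def boost(x, y, n_rounds=50, lr=0.3):
    """Return a predictor built by adding lr-shrunk stumps to a constant base."""
    base = sum(y) / len(y)
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        t, lv, rv = fit_stump(x, residuals)
        stumps.append((t, lv, rv))
        pred = [p + lr * (lv if xi <= t else rv) for xi, p in zip(x, pred)]
    def predict(xi):
        return base + sum(lr * (lv if xi <= t else rv) for t, lv, rv in stumps)
    return predict
```

For example, fitting `boost([1, 2, 3, 4, 5, 6], [1, 1, 1, 5, 5, 5])` recovers the step function almost exactly: each round shrinks the remaining residual by the learning rate, which is why boosting with many weak learners can fit sharp structure that a single stump cannot.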