The following repositories are listed under the xgboost topic.
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
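As a quick illustration of the library above, here is a minimal sketch of training a gradient boosted classifier through XGBoost's scikit-learn-compatible interface; the dataset and hyperparameter values are arbitrary placeholders.

```python
# Minimal XGBoost example using its scikit-learn-compatible API.
# Dataset and hyperparameter values are arbitrary placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```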
Python companion to the book "Deep Learning" (the "flower book"): mathematical derivations, analysis of the underlying principles, and source-level code implementations.
Alink is the Machine Learning algorithm platform based on Flink, developed by the PAI team of Alibaba computing platform.
Python package for AutoML on tabular data with feature engineering, hyperparameter tuning, explanations and automatic documentation
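For the AutoML package above, a minimal sketch of the usual fit/predict workflow, assuming this is the mljar-supervised package; the import path and the `mode` argument are assumptions taken from its documented interface.

```python
# Sketch of a tabular AutoML workflow; assumes the mljar-supervised package,
# whose AutoML class trains several models and writes an automatic report.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from supervised.automl import AutoML  # assumed import path

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = AutoML(mode="Explain")  # "Explain" mode: quick run with reports (assumption)
automl.fit(X_train, y_train)
predictions = automl.predict(X_test)
```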
Transform ML models into native code (Java, C, Python, Go, JavaScript, Visual Basic, C#, R, PowerShell, PHP, Dart, Haskell, Ruby, F#, Rust) with zero dependencies
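For the model-to-code transpiler above, a short sketch of exporting a trained scikit-learn model to Java source, assuming the m2cgen package and its `export_to_java` helper.

```python
# Sketch: train a small model and transpile it to dependency-free Java source.
# Assumes the m2cgen package; export_to_java is one of its language exporters.
import m2cgen as m2c
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression

X, y = load_diabetes(return_X_y=True)
model = LinearRegression().fit(X, y)

java_code = m2c.export_to_java(model)  # returns Java source as a string
print(java_code[:300])
```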
A library for debugging/inspecting machine learning classifiers and explaining their predictions
Mars is a tensor-based unified framework for large-scale data computation which scales numpy, pandas, scikit-learn and Python functions.
Deep learning API and server in C++14 with support for PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and t-SNE
A collection of research papers on decision, classification and regression trees with implementations.
A minimal benchmark for scalability, speed and accuracy of commonly used open source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks etc.).
Provide an input CSV and a target field to predict, and it generates a model plus the code to run it.
Automatically visualize any dataset of any size with a single line of code. Created by Ram Seshadri. Collaborators welcome; permission granted upon request.
[UNMAINTAINED] Automated machine learning for analytics & production
MLBox is a powerful automated machine learning Python library.
Time series forecasting with machine learning models
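As a generic illustration of the "machine learning models for forecasting" approach described above (not any particular library's API), the sketch below turns a univariate series into lag features and fits a gradient boosting regressor on them.

```python
# Generic sketch of ML-based forecasting: build lag features from a
# univariate series and fit a gradient boosted regressor on them.
# Data and lag choices are arbitrary placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
series = pd.Series(np.sin(np.arange(300) / 10) + rng.normal(0, 0.1, 300))

lags = pd.concat({f"lag_{k}": series.shift(k) for k in range(1, 8)}, axis=1).dropna()
y = series.loc[lags.index]

X_train, X_test = lags.iloc[:-30], lags.iloc[-30:]
y_train, y_test = y.iloc[:-30], y.iloc[-30:]

model = GradientBoostingRegressor().fit(X_train, y_train)
print("holdout MSE:", ((model.predict(X_test) - y_test) ** 2).mean())
```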
A curated list of gradient boosting research papers with implementations.
Scalable machine 🤖 learning for time series forecasting.
The goal of this repo is to provide solutions for data science competitions (Kaggle, Data Hack, Machine Hack, Driven Data, etc.).
An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
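As a generic example of the hyperparameter optimization idea above (using scikit-learn's RandomizedSearchCV rather than this repository's own API), here is a short search over an XGBoost classifier.

```python
# Generic hyperparameter search sketch with scikit-learn's RandomizedSearchCV;
# this shows the underlying idea, not the API of the repository above.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, random_state=0)
param_distributions = {
    "n_estimators": randint(50, 300),
    "max_depth": randint(2, 8),
    "learning_rate": uniform(0.01, 0.3),
}
search = RandomizedSearchCV(XGBClassifier(), param_distributions,
                            n_iter=20, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```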
A roundup of information about AI competitions.
Use advanced feature engineering strategies and select the best features from your dataset with a single line of code. Created by Ram Seshadri. Collaborators welcome.
📘 The experiment tracker for foundation model training
An extension of XGBoost to probabilistic modelling
REST web service for true real-time scoring (<1 ms) of Scikit-Learn, R and Apache Spark models
A Python package for simultaneous hyperparameter tuning and feature selection for gradient boosting models.
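For the tuning-plus-feature-selection idea above, a generic sketch combining scikit-learn's recursive feature elimination with a gradient boosted model; this is not the package's own interface, only the general technique.

```python
# Generic sketch of feature selection around a gradient boosting model using
# scikit-learn's RFECV; the package above offers a more integrated approach.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=30, n_informative=8,
                           random_state=0)
selector = RFECV(XGBClassifier(n_estimators=100), step=2, cv=3)
selector.fit(X, y)
print("features kept:", selector.n_features_)
```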
Automatically build multiple ML models with a single line of code. Created by Ram Seshadri. Collaborators welcome; permission granted upon request.
Fast SHAP value computation for interpreting tree-based models
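For SHAP values on tree models, a minimal sketch using the widely used `shap` package's TreeExplainer, which may or may not be the specific implementation referenced above.

```python
# Sketch of computing SHAP values for a tree-based model with the `shap`
# package's TreeExplainer; the repository above targets faster computation
# of the same kind of attributions.
import shap
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = XGBClassifier(n_estimators=100).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # one attribution per feature per row
print(shap_values.shape)
```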
An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
H2O.ai Machine Learning Interpretability Resources