XGBoost [1] is an open-source software library that provides a gradient boosting framework for C++, Java, Python, [2] R, [3] and Julia. [4] It works on Linux, Windows, [5] and macOS. [6] From the project description, it aims to provide a “Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library”. Besides running on a single machine, it also supports the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink. It has gained much popularity and attention recently as it was the algorithm of choice for many winning teams of machine learning competitions. [7]
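To make the gradient boosting idea concrete, here is a minimal, self-contained sketch of boosted regression with squared-error loss and depth-1 "stump" learners. This is our own illustration of the general GBM/GBRT technique, in pure Python; the function names and structure are not XGBoost's internal API, and real implementations add regularization, second-order gradients, and many optimizations.

```python
def fit_stump(xs, residuals):
    """Find the split on a 1-D feature that best fits the residuals,
    returning (threshold, left_value, right_value)."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lv = sum(left) / len(left)
        rv = sum(right) / len(right)
        sse = (sum((r - lv) ** 2 for r in left)
               + sum((r - rv) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lv, rv)
    _, t, lv, rv = best
    return t, lv, rv

def boost(xs, ys, n_rounds=50, learning_rate=0.1):
    """Gradient boosting for squared loss: each round fits a stump to
    the current residuals and adds a shrunken copy of it to the model."""
    base = sum(ys) / len(ys)          # initial prediction: the mean
    preds = [base] * len(ys)
    model = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        t, lv, rv = fit_stump(xs, residuals)
        model.append((t, lv, rv))
        preds = [p + learning_rate * (lv if x <= t else rv)
                 for p, x in zip(preds, xs)]
    return base, model

def predict(base, model, x, learning_rate=0.1):
    """Sum the base prediction and the shrunken stump outputs."""
    out = base
    for t, lv, rv in model:
        out += learning_rate * (lv if x <= t else rv)
    return out

if __name__ == "__main__":
    xs = [1, 2, 3, 4, 5, 6, 7, 8]
    ys = [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.9, 5.1]  # noisy step function
    base, model = boost(xs, ys, n_rounds=100)
    print(predict(base, model, 2))   # near the low plateau
    print(predict(base, model, 7))   # near the high plateau
```

Each round follows the negative gradient of the loss (for squared error, simply the residuals), which is why the ensemble steadily reduces the training error.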


XGBoost primarily started as a research project by Tianqi Chen [8] as part of the Distributed (Deep) Machine Learning Community (DMLC) group. Initially, it started as a terminal application that could be configured using a libsvm configuration file. After winning the Higgs Machine Learning Challenge, it became known in the ML competition circles. Soon after, Python and R packages were built, followed by packages for Julia, Scala, Java, and other languages. This brought the library to more people and made it more popular among the Kaggle community, where it was used for a large number of competitions. [7]

The library has become easier to use within these language communities. It now has integrations with scikit-learn for Python users, and with the caret package for R users. It can also be integrated into Data Flow frameworks like Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit [9] and XGBoost4J. [10] A description of how XGBoost works has also been published by Tianqi Chen and Carlos Guestrin. [11]


  • John Chambers Award (2016) [12]
  • High Energy Physics meets the Machine Learning award (HEP meets ML) (2016) [13]


  1. “GitHub project webpage”.
  2. “Python Package PYPI Index: xgboost”. Retrieved 2016-08-01.
  3. “CRAN package xgboost”. Retrieved 2016-08-01.
  4. “Julia package listing xgboost”. Retrieved 2016-08-01.
  5. “Installing XGBoost for Anaconda in Windows”. Retrieved 2016-08-01.
  6. “Installing XGBoost on Mac OSX”. Retrieved 2016-08-01.
  7. “XGBoost – ML winning solutions (incomplete list)”. Retrieved 2016-08-01.
  8. “Story and Lessons Behind the Evolution of XGBoost”. Retrieved 2016-08-01.
  9. “Rabit – Reliable Allreduce and Broadcast Interface”. Retrieved 2016-08-01.
  10. “XGBoost4J”. Retrieved 2016-08-01.
  11. Chen, Tianqi; Guestrin, Carlos (2016). “XGBoost: A Scalable Tree Boosting System”. CoRR. abs/1603.02754. arXiv:1603.02754.
  12. “John Chambers Previous Winners Award”. Retrieved 2016-08-01.
  13. “HEP meets ML Award”. Retrieved 2016-08-01.