Hands-On Gradient Boosting with XGBoost and scikit-learn: Perform accessible machine learning and extreme gradient boosting with Python
By Corey Wade and Kevin Glynn
Paperback: 310 pages
ISBN-10: 1839218355
ISBN-13: 9781839218354
Product Dimensions: 19.05 x 1.78 x 23.5 cm
Publisher: Packt Publishing (16 Oct. 2020)
Language: English
Get to grips with building robust XGBoost models using Python and scikit-learn for deployment
XGBoost is an industry-proven, open source software library that provides a gradient boosting framework for scaling billions of data points quickly and efficiently.
The book starts with an introduction to machine learning and XGBoost before gradually moving on to gradient boosting. You’ll cover decision trees in detail and analyze bagging in the machine learning context. You’ll then learn how to build gradient boosting models from scratch and extend gradient boosting to big data, where you’ll recognize its limitations. The book also shows you how to implement fast and accurate machine learning models using XGBoost and scikit-learn, and takes you through advanced XGBoost techniques by focusing on speed enhancements, deriving parameters mathematically, and building robust models. With the help of detailed case studies, you’ll practice building and fine-tuning regressors and classifiers and become familiar with new tools such as feature importance and the confusion matrix. Finally, you’ll explore alternative base learners, learn invaluable Kaggle tricks such as building non-correlated ensembles and stacking, and prepare XGBoost models for industry deployment with unique transformers and pipelines.
By the end of the book, you’ll be able to build high-performing machine learning models using XGBoost with minimal errors and maximum speed.
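To give a flavor of the workflow the description outlines, here is a minimal sketch (not taken from the book) of fitting a boosting model with scikit-learn and inspecting a confusion matrix and feature importances. It uses scikit-learn's GradientBoostingClassifier for portability; XGBoost's XGBClassifier exposes the same fit/predict estimator API and could be swapped in directly.

```python
# Illustrative sketch, not the book's code: fit a gradient boosting
# classifier, then inspect a confusion matrix and feature importances.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

model = GradientBoostingClassifier(random_state=2)
model.fit(X_train, y_train)
preds = model.predict(X_test)

# Rows are true classes, columns are predicted classes.
print(confusion_matrix(y_test, preds))
# Index of the feature the model weighted most heavily.
print(model.feature_importances_.argmax())
```

The same pattern — construct, fit, predict, evaluate — carries over unchanged to XGBoost's scikit-learn wrapper, which is what makes the two libraries compose so well.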
What you will learn
Build machine learning bagging and boosting models
Develop XGBoost regressors and classifiers with impressive accuracy and speed
Find out how to analyze variance and bias in machine learning
Compare XGBoost’s results to decision trees, random forests, and gradient boosting
Visualize tree-based models and use machine learning to determine the most important features of a dataset
Implement robust XGBoost models ready for industry deployment
Build non-correlated ensembles and stack XGBoost models to increase accuracy
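The last point above, stacking, can be sketched briefly. This is a hedged illustration, not the book's implementation: the base learners here are plain scikit-learn estimators so the snippet is self-contained, whereas in the book's setting an XGBoost model would typically sit among them.

```python
# Illustrative stacking sketch: diverse (ideally non-correlated) base
# models feed their predictions into a final meta-learner.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=2)),
        ("dt", DecisionTreeClassifier(max_depth=4, random_state=2)),
    ],
    # The meta-learner blends the base models' predictions.
    final_estimator=LogisticRegression(max_iter=1000),
)
score = cross_val_score(stack, X, y, cv=3).mean()
print(round(score, 3))
```

The gain from stacking comes from the base models making different errors; an XGBClassifier added to the `estimators` list would follow the same interface.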