
Hands-On Gradient Boosting with XGBoost and scikit-learn

This is the code repository for Hands-On Gradient Boosting with XGBoost and scikit-learn, published by Packt.

Perform accessible machine learning and extreme gradient boosting with Python

What is this book about?

XGBoost is an industry-proven, open-source software library that provides a gradient boosting framework for scaling billions of data points quickly and efficiently.

This book covers the following exciting features:

  • Build gradient boosting models from scratch
  • Develop XGBoost regressors and classifiers with accuracy and speed (a minimal illustrative sketch follows this list)
  • Analyze variance and bias when fine-tuning XGBoost hyperparameters
  • Automatically correct missing values and scale imbalanced data
  • Apply alternative base learners like dart, linear models, and XGBoost random forests
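
As referenced in the list above, here is a minimal, illustrative sketch of the kind of workflow the book develops. It is not code from the book's notebooks; it simply trains an XGBClassifier on one of scikit-learn's built-in datasets with a few common hyperparameters.

```python
# Illustrative sketch only (not taken from the book's notebooks).
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Load a built-in binary classification dataset as a stand-in for the book's data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# A few common hyperparameters; the book covers tuning these in depth.
model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)
print('Test accuracy: %0.3f' % accuracy_score(y_test, model.predict(X_test)))
```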

If you feel this book is for you, get your copy today!

https://www.packtpub.com/

Instructions and Navigations

All of the code is organized into folders. For example, Chapter02.

The code will look like the following:

cross_val(LogisticRegression()) 
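
Note that cross_val is a helper defined in the book's notebooks rather than a scikit-learn function. A minimal sketch of what such a wrapper might look like (an assumption, not the book's exact definition), using a built-in dataset as a stand-in for the features X and target y, is:

```python
# Minimal sketch of a cross_val-style helper (assumption: the book's own
# definition may differ). It wraps scikit-learn's cross_val_score and prints
# per-fold and mean accuracy for any estimator passed in.
import numpy as np
from sklearn.datasets import load_breast_cancer  # stand-in dataset for illustration
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # placeholder for the book's own data

def cross_val(model, cv=5):
    # Score the model on cv folds and report per-fold and mean accuracy.
    scores = cross_val_score(model, X, y, cv=cv)
    print('Accuracy:', np.round(scores, 2))
    print('Accuracy mean: %0.2f' % scores.mean())

# max_iter raised so the example converges cleanly on this dataset.
cross_val(LogisticRegression(max_iter=10000))
```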

Following is what you need for this book: This book is for data science professionals and enthusiasts, data analysts, and developers who want to build fast and accurate machine learning models that scale with big data. Proficiency in Python, along with a basic understanding of linear algebra, will help you to get the most out of this book.

With the following software and hardware list, you can run all of the code files present in the book (Chapters 1-10).

Software and Hardware List

| Chapter | Software required | OS required |
| ------- | ----------------- | ----------- |
| 1 | Anaconda: Jupyter Notebook / sklearn 0.23 | Windows, Mac OS X, and Linux (Any) |
| 2 | Anaconda: Python 3.7 | Windows, Mac OS X, and Linux (Any) |
| 3 | xgboost 1.2 | Windows, Mac OS X, and Linux (Any) |

We also provide a PDF file that has color images of the screenshots/diagrams used in this book. Click here to download it.

Related products

  • Hands-On Machine Learning with scikit-learn and Scientific Python Toolkits [Packt] [Amazon]

  • Mastering Machine Learning Algorithms - Second Edition [Packt] [Amazon]

Get to Know the Author

Corey Wade, M.S. Mathematics, M.F.A. Writing and Consciousness, is the founder and director of Berkeley Coding Academy, where he teaches machine learning and AI to teens from all over the world. Additionally, Corey chairs the Math Department at the Independent Study Program of Berkeley High School, where he teaches programming and advanced math. His additional experience includes teaching natural language processing with Hello World, developing data science curricula with Pathstream, and publishing original statistics (3NG) and machine learning articles with Towards Data Science, Springboard, and Medium. Corey is co-author of The Python Workshop, also published by Packt.

Suggestions and Feedback

Click here if you have any feedback or suggestions.

Download a free PDF

If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost.
Simply click on the link to claim your free PDF.

https://packt.link/free-ebook/9781839218354
