LightGBM: A Highly Efficient Gradient Boosting Decision Tree

Apr 17, 2018 · Greater London, United Kingdom

Note: Please sign up for the event on the Skills Matter site:

This month we will be looking at the paper ‘LightGBM: A Highly Efficient Gradient Boosting Decision Tree’, presented at NIPS 2017.

Gradient boosting decision trees are a popular algorithm in machine learning, and have demonstrated their utility very visibly through their dominance in competitions such as Kaggle.

LightGBM is a highly efficient and scalable GBDT implementation (up to 20x faster than conventional GBDT in the paper’s experiments), and it has quickly overtaken XGBoost as the connoisseur’s choice for this technique.

How does this algorithm work? What are the trade-offs versus the related approaches? How should we think about applying LightGBM to real world problems? Come along to discuss the paper and the practice.

- LightGBM: A Highly Efficient Gradient Boosting Decision Tree (Ke et al., NIPS 2017)

Background material:
- LightGBM docs -
- Blog - What is LightGBM, How to implement it? How to fine tune the parameters?
- Website with a lot of information on GBM methods -

A note about the Journal Club format:

1. There is no speaker at Journal Club.

2. There is NO speaker at Journal Club.

3. We split into small groups of 6 people and discuss the papers. For the first hour the groups are random to make sure everyone is on the same page. Afterwards we split into blog/paper/code groups to go deeper.

4. Volunteers sometimes seed the discussion by guiding the group through the paper's highlights for 5 minutes. You are very welcome to volunteer in the comments.

5. Reading the materials in advance is really helpful. If you don't have time, please come anyway. We need this group to learn together.
