[c++] Fix dump_model() information for root node #6569
base: master
Conversation
Currently the CI is not passing as #6574 is blocking.
Tests should now be sufficient for the change.
Title changed from "`internal_value_` is not calculated properly" to "`internal_value_` for root node"
@jameslamb
@shiyu1994 or @guolinke could you help with a review of this?
I'm not sure if this will correctly handle these cases:
- custom `init_score` provided (via `Dataset`)
- `boost_from_average=False` passed
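As background for these two cases, here is a minimal plain-Python sketch of the starting scores involved. It does not call LightGBM itself; it mimics my understanding of the documented behavior (for L2 regression, `boost_from_average=True` starts boosting from the label mean, `boost_from_average=False` starts from zero, and a per-row `init_score` on the `Dataset` overrides the averaging), so treat it as an assumption, not LightGBM's actual implementation.

```python
# Sketch of the three starting-score scenarios discussed above.
# NOTE: this mimics the behavior; it does not call LightGBM itself.
labels = [1.0, 2.0, 3.0, 6.0]

# boost_from_average=True (L2 regression): start from the label mean.
start_default = sum(labels) / len(labels)

# boost_from_average=False: start from zero.
start_no_average = 0.0

# Custom init_score passed via Dataset: per-row scores override both.
init_score = [0.5, 0.5, 0.5, 0.5]
start_custom = init_score  # boosting proceeds from these per-row values

print(start_default, start_no_average, start_custom[0])
```

The question above is whether the fixed root-node `internal_value_` stays consistent in all three of these starting conditions.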
@neNasko1 could you also look at #5962 and let us know if you think this change would fix the issue @thatlittleboy reported there?
Thank you for taking the time to look into the PR and for linking a relevant issue.
I think those cases are handled, as the results are consistent with what the leaf values report. I have also reworked the test to boost from the average.
I took the liberty of merging @thatlittleboy's WIP code into mine, additionally fixing the issues that they reported. I will also update the PR description to reflect both fixes.
Title changed from "`internal_value_` for root node" to "`dump_model()` information for root node"
Thanks, I left a few questions for your consideration.
This looks good to me! 🚀
@jameslamb
I'll investigate this when I can, hopefully in the next few days. In the interim, you can help move this forward by resolving merge conflicts and pulling in the latest changes on `master`.
LGTM!
But I'll keep following the discussion about Dask Ranker test (#6569 (comment)).
@jameslamb can you submit a final review on the change, so that we can merge it? Sorry for the inconvenience!
I will look when I can. I have spent most of my limited open source time in the last few weeks investigating and fixing multiple difficult, time-sensitive CI issues in this project, and there is yet another one that is still not done and a primary focus for me right now (#6651). If @StrikerRUS has time to re-review the commits and comments you've pushed since his approval, and if he approves, then my review can be dismissed and this can be merged without another review from me. Otherwise, you will have to be patient a bit longer.
This PR corrects the output of `dump_model()` and other dump-related functions like `trees_to_dataframe()`. There are 2 fixes implemented:

- The `Tree::Split` implementation incorrectly saves the old leaf output value in the `internal_value_` array when called on the root node. This in turn makes inspecting the whole training process from Python incomplete.

Before:
After:
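To illustrate what the fix affects, here is a hedged sketch of reading the root node's `internal_value` out of a `dump_model()`-style dictionary. The dictionary below is hand-written to mimic the shape of LightGBM's JSON dump (`tree_info` / `tree_structure` keys); the numbers are made up for illustration and the helper `root_internal_values` is hypothetical, not part of LightGBM's API.

```python
# Hand-written stand-in for booster.dump_model(); the keys follow
# LightGBM's JSON dump layout, but the values are illustrative only.
dump = {
    "tree_info": [
        {
            "tree_index": 0,
            "tree_structure": {
                "split_feature": 0,
                # The field this PR fixes for the root node:
                "internal_value": 3.0,
                "left_child": {"leaf_value": 2.0},
                "right_child": {"leaf_value": 6.0},
            },
        }
    ]
}

def root_internal_values(model_dump):
    """Return the root node's internal_value for every tree in the dump."""
    return [tree["tree_structure"].get("internal_value")
            for tree in model_dump["tree_info"]]

print(root_internal_values(dump))  # -> [3.0]
```

Before the fix, the root entry held the stale leaf output rather than the root node's actual value, so a traversal like this reported misleading numbers for the first split.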