[MRG+2] Implemented SelectFromModel meta-transformer#4242
MechCoder merged 11 commits into scikit-learn:master
Conversation
|
@amueller @agramfort Do you agree with the general direction in which this PR is going? Basically a meta-transformer is coupled with an estimator to provide the transform methods, instead of inheriting from |
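The wrapping idea described above can be sketched as follows. This is an illustrative, simplified wrapper, not the PR's actual implementation; the class name `FeatureSelectorWrapper` and the `threshold` parameter are invented for the sketch.

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin, clone


class FeatureSelectorWrapper(BaseEstimator, TransformerMixin):
    """Illustrative meta-transformer (hypothetical, not the PR's API):
    delegates fitting to a wrapped estimator and derives transform()
    from that estimator's learnt coef_ / feature_importances_."""

    def __init__(self, estimator, threshold=0.0):
        self.estimator = estimator
        self.threshold = threshold

    def fit(self, X, y=None):
        self.estimator_ = clone(self.estimator).fit(X, y)
        # Use whichever importance attribute the inner estimator exposes.
        scores = getattr(self.estimator_, "feature_importances_", None)
        if scores is None:
            # Note: this simple handling assumes a 1-D coef_ (binary case).
            scores = np.abs(self.estimator_.coef_).ravel()
        self.mask_ = scores > self.threshold
        return self

    def transform(self, X):
        return np.asarray(X)[:, self.mask_]


# Any estimator exposing coef_ or feature_importances_ can be wrapped,
# without that estimator inheriting any selection mixin itself:
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=50, n_features=8, random_state=0)
X_t = FeatureSelectorWrapper(DecisionTreeClassifier(random_state=0)).fit(X, y).transform(X)
```

The point of the design is visible in the last two lines: the selection logic lives entirely in the wrapper, so the wrapped estimator's own API stays uncluttered.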
|
hum. What's the vision? too fuzzy for me now |
|
Can you please summarize the discussion of #3011 in the description of this PR? |
|
Sure, just give me some time. |
|
#3011 was meant to pilot the idea of having selection based on coefficients/importances performed by a metaestimator rather than a mixin to predictors. Part of the point is not to clutter the individual estimators' APIs with parameters only used in rare cases. Instead, a metaestimator should:
By way of Zen tradeoffs, it prefers "explicit is better than implicit" over "flat is better than nested". Is the vision clearer, @agramfort? Also, it's possible @maheshakya had no intention to continue working on this, but @MechCoder it may be wise to check before simply taking it over, even if it's stale. |
|
The main immediate benefits of this are: there's no need to move the |
|
Thank you @MechCoder for bringing this up again. You can continue this :) |
|
@jnothman I think I get the design decision, but my question was more about the end-user point of view. Something along the lines of "Users will be able to do XXX with estimators A, B and C by only adding this piece of code in ...". |
|
@jnothman @maheshakya Sorry for taking this PR over without asking. I incorrectly assumed from the lack of activity for around six months that the pull request was stale. @agramfort A transform can mean two different things now. Though both are indeed transforming, inheriting from So it goes like: "Users will be able to extract the n most important features by wrapping this meta-transformer around any estimator, without having to subclass from _LearntSelectorMixin." |
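Concretely, in the scikit-learn API this PR led to, that usage reads roughly as follows. The snippet uses the released `SelectFromModel` API; the PR-era code may have differed slightly in parameter names and threshold semantics.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

X, y = make_classification(n_samples=100, n_features=10, n_informative=3,
                           random_state=0)

# Wrap any estimator exposing coef_ or feature_importances_; no
# _LearntSelectorMixin subclassing is needed on the estimator itself.
selector = SelectFromModel(
    RandomForestClassifier(n_estimators=50, random_state=0),
    threshold="median",  # keep features whose importance >= the median
)
X_reduced = selector.fit_transform(X, y)
```

With `threshold="median"` roughly half of the features are retained; a float threshold or `"mean"` works analogously.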
(force-pushed 61505ee to 360e81a)
|
ok got it. I like the idea. I guess we need to document this new SelectFromModel class and explain its usage. It should also appear in one of the examples.
|
|
(It's more that stale doesn't strictly imply ripe for adoption.)
|
|
I agree with the direction, though I didn't have too close a look. Is the idea to remove transform from the models that already have the mixin? |
|
Sorry for the huge delay. I just had a look at the pull request today and found that all the hard work has already been done by @maheshakya. I've updated the pull request description to list the TODOs that are left. I will not be able to work on this in the near future. I've labelled this as easy, as it should be a very good issue for a new developer. |
|
@MechCoder maybe this would be a good project for you to finish? It would be very helpful. |
|
OK, if you insist. |
(force-pushed 5dff9a9 to 22bac70)
|
Thinking about it, is it necessary to test each and every model that has a @amueller could you verify that the tests in |
doc/modules/feature_selection.rst
|
doctests raised some |
|
|
(force-pushed 822560e to 4f09146)
|
@amueller thanks for your comments. The last commit should address all of them. |
I think this is one of those cases where default None is not very helpful, but the description should say "By default, [this is its behaviour]..."
I have mentioned that below. (and might be too long to write here)
What I meant is that "default None" could be dropped, and the text below should say "By default, ..." rather than "When None, ...". But it's a really minor nitpick.
|
Once those are fixed up, it's good for merge (LGTM). |
(force-pushed 4f09146 to c805fbc)
|
@jnothman I'll merge after Travis passes? |
|
Yes, with two whats_new entries: one under API changes to tell people their |
[MRG+1] Implemented SelectFromModel meta-transformer
|
Thanks a lot! @maheshakya @glouppe @amueller @jnothman |
|
🍻 what takes a long time turns out well in the end. Thanks @MechCoder :)
Continuation of #3011