Reinstantiate model if needed. Better errors if can't. #722
Conversation
…is not possible. Automatically reinstantiate flow model if possible when run_flow_on_task is called.
mfeurer left a comment:
This looks good, but it would be great if you could check for the new error message in the unit tests.
```python
        self.assertEqual(server_flow.parameters['categories'], '[[0, 1], [0, 1]]')
        self.assertEqual(server_flow.model.categories, flow.model.categories)

    def test_get_flow_reinstantiate_model(self):
```
This appears to be a simple version of tests.test_flows.test_flows:TestFlow.test_sklearn_to_upload_to_flow. Could you please check that?
But it might be good to have such a test next to the one below.
I don't think tests.test_flows.test_flows:TestFlow.test_sklearn_to_upload_to_flow ever actually tests if the model is reinstantiated properly. I think the model is ignored by both assert_flows_equal and _to_xml.
Of course, it is easy to add the check there. But I think it's good to test for this separately from such a complicated flow setup. If this unit test fails, it's very likely due to an error in reinstantiation. If the other test fails, it can be due to any number of reasons, such as serializing subflows, generating xml, etc.
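For illustration only, a minimal version of such a focused test could look roughly like the sketch below; the flow id and estimator class are placeholders and not taken from the actual test suite:

```python
import unittest

import openml
from sklearn.ensemble import RandomForestClassifier


class TestFlowReinstantiation(unittest.TestCase):

    def test_get_flow_reinstantiate_model(self):
        # 1234 is a placeholder id, assumed to refer to a published
        # sklearn RandomForestClassifier flow on the test server.
        flow = openml.flows.get_flow(flow_id=1234, reinstantiate=True)
        # The point of the test: after reinstantiation the flow carries an
        # actual sklearn estimator, not only the server-side description.
        self.assertIsNotNone(flow.model)
        self.assertIsInstance(flow.model, RandomForestClassifier)
```

If this test fails, the cause is almost certainly the reinstantiation itself, which is exactly the isolation argued for above.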
```python
    def test_get_flow_reinstantiate_model_no_extension(self):
        # Flow 10 is a WEKA flow
        self.assertRaises(RuntimeError, openml.flows.get_flow, flow_id=10, reinstantiate=True)
```
Could you please use the more specific assertRaisesRegex? Then you can check for the new error message you added.
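Concretely, the suggested change might look roughly like this; the regular expression is a guess at the new error message, not its exact wording:

```python
import unittest

import openml


class TestFlowErrors(unittest.TestCase):

    def test_get_flow_reinstantiate_model_no_extension(self):
        # Flow 10 is a WEKA flow, so it cannot be turned back into a sklearn model.
        self.assertRaisesRegex(
            RuntimeError,
            "Could not find.*extension",  # assumed wording; match the actual new message
            openml.flows.get_flow,
            flow_id=10,
            reinstantiate=True,
        )
```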
done
Seem to be random errors with …
Addresses some concerns raised in #718. Mainly:

- When `run_flow_on_task` is called and the flow has no model, it is now automatically reinstantiated if requirements are satisfied (a rough sketch of this behavior follows at the end of this description).

Not addressed:
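Regarding the main change above, a rough sketch of the intended behavior is given below. This is not the actual implementation; the helpers `get_extension_by_flow` and `flow_to_model` are assumptions about the extension API, and the error message is illustrative only.

```python
# Illustrative sketch: reinstantiate a flow's model on demand before running it,
# and raise a clear error when reinstantiation is not possible.
import openml


def _reinstantiate_if_needed(flow):
    if flow.model is not None:
        return flow

    # Assumed helper: look up an extension that can handle this flow.
    extension = openml.extensions.get_extension_by_flow(flow)
    if extension is None:
        # Better error: state explicitly why the flow cannot be run locally.
        raise RuntimeError(
            "Could not find a suitable extension to reinstantiate flow "
            "{} ({}).".format(flow.flow_id, flow.name)
        )

    # May still fail if the required package versions are not installed;
    # that exception should point at the missing requirement.
    flow.model = extension.flow_to_model(flow)
    return flow
```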