Fixes #4392: Add AddPredictionEnginePool overload for implementation factory (#4393)
Conversation
eerhardt
left a comment
I think this change looks good. Thanks for the contribution @NeoXtreem!
Once you fix up the doc comments, and the CI is green, I can merge.
Codecov Report
```
@@           Coverage Diff           @@
##            master    #4393   +/- ##
=======================================
  Coverage         ?   74.72%
=======================================
  Files            ?      906
  Lines            ?   159287
  Branches         ?    17147
=======================================
  Hits             ?   119035
  Misses           ?    35444
  Partials         ?     4808
```
|
```diff
     {
         /// <summary>
-        /// Adds a <see cref="PredictionEnginePoolBuilder{TData, TPrediction}"/> to the service collection.
+        /// Adds a <see cref="PredictionEnginePool{TData, TPrediction}"/> and required config services to the service collection.
```
This should stay PredictionEnginePoolBuilder, that is what is added to the service collection.
Are you sure? The code seems pretty clear in that it adds a `PredictionEnginePool<TData, TPrediction>` object: `services.AddSingleton<PredictionEnginePool<TData, TPrediction>>();`. What is returned is a `PredictionEnginePoolBuilder<TData, TPrediction>` object, which simply wraps `IServiceCollection`. Speaking of which, this class doesn't do anything with the generic arguments, which could probably even be removed?
> Speaking of which, this class doesn't do anything with the generic arguments, which could probably even be removed?
You can't change the generic arguments because this class is public and has shipped as a stable API.
Also, the generic arguments are used by the extension methods.
|
@NeoXtreem Sorry for the delay in merging your PR. We have had a bit of instability in our CI builds that we are trying to work through. We checked in a temporary fix yesterday that should improve the situation a bit. Can you please fetch from master, rebase your PR branch, and resubmit? We should be able to merge your PR then.
|
@NeoXtreem any success in rebasing?
|
Is this available in 1.4 already? Can I get the complete code samples?
|
These changes will not be available until the pull request is formally merged, but there are still some test failures preventing that.
|
Thanks for your swift response, @gvashishtha. This has been sitting here since Feb 19. In the meantime, is there an alternative way to load a newly trained model into the prediction engine pool after startup? If so, can you make a sample that checks a folder for a new model and loads it, maybe from a controller or middleware?
|
Guys, can we prioritize this? Correct me if I'm wrong, but with this change in place, I can load a newly trained model into an existing prediction engine pool at runtime (post Startup.cs), e.g. inside a Web API controller... yes?
|
@natke to follow up, as I have since moved teams
|
Fellow Microsoft FTE here. It would be very helpful if this change were made available. I will try to send an email internally to get an update.
|
/azp run
|
Azure Pipelines successfully started running 2 pipeline(s).
|
This PR has passed CI and been merged into master, thanks.
|
Thanks a lot for making good progress, guys! In which version will this be released? Will there be sample code on how I can add/update/remove a model from the engine pool inside a Web API controller or something?
|
@iluveu28 This PR will be released in the new 1.5.2 version, I think; adding @natke and @harishsk to comment on the release date. I don't find any existing sample code on adding/updating/removing a model from the engine pool, but these can be really easy. We have a file change monitor that keeps monitoring the model file, so if there are any changes to the model file, ML.NET will automatically load the new model into memory, and the new model will be ready to use. Below is sample usage from one of our customers that shows how to update a model: https://github.com/alexandermujirishvili/DotnetMLWatchForChanges
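The file-watching behavior described above corresponds to the `watchForChanges` parameter on the pool builder's `FromFile` method in `Microsoft.Extensions.ML`. A minimal sketch of registering a watched model in `Startup.ConfigureServices` (the `ModelInput`/`ModelOutput` types, model name, and file path are placeholders, not taken from this PR):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.ML;

public void ConfigureServices(IServiceCollection services)
{
    // Register a pooled prediction engine backed by a model file on disk.
    // With watchForChanges: true, ML.NET monitors the file and reloads the
    // model automatically whenever the file is overwritten.
    services.AddPredictionEnginePool<ModelInput, ModelOutput>()
        .FromFile(modelName: "MyModel",
                  filePath: "MLModels/model.zip",
                  watchForChanges: true);
}
```

Note this only reloads an existing named model when its file changes; it does not add a model under a new name at runtime, which is the scenario discussed below.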
|
Yes, I know, but when a new model is trained, I need to restart the service in order for it to be loaded into the engine pool. I need to be able to add a new model to the pool outside of Startup.cs. Try to create a Web API controller that accepts a training file, trains on it, and then immediately predicts, and you'll see what I mean.
|
Note that a new model uploaded by users will have a different model name every time. We don't overwrite the same file, as we have multiple models for different use cases. Thanks again.
|
@iluveu28 I see what you mean now. What you are asking is more of an ASP.NET question than an ML.NET question to me.
|
@frank-dong-ms Yes, that might work, but this PR is to make the pool accessible through the middleware, which is a better solution, correct? Also, on your note about watchForChanges: https://github.com/alexandermujirishvili/DotnetMLWatchForChanges. Thanks a lot for the tips!
|
@iluveu28 Yes, check out Eric's reply on SO for how to define a custom model loader. As for the file watch bug, yes, we have already fixed the issue, and this fix should come out with this PR at the next release.
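For reference, a custom loader derives from the abstract `ModelLoader` class in `Microsoft.Extensions.ML` by implementing `GetModel` and `GetReloadToken`. The sketch below is illustrative only: the class name, the `UpdateModel` method, and the swap mechanics are assumptions, not code from this PR.

```csharp
using System.Threading;
using Microsoft.Extensions.ML;
using Microsoft.Extensions.Primitives;
using Microsoft.ML;

// Hypothetical loader that serves whatever ITransformer it was last given.
// Calling UpdateModel (e.g. from a controller after training) swaps the
// model and fires the change token so the PredictionEnginePool refreshes.
public class InMemoryModelLoader : ModelLoader
{
    private ITransformer _model;
    private CancellationTokenSource _reloadSource = new CancellationTokenSource();

    public void UpdateModel(ITransformer newModel)
    {
        _model = newModel;
        // Replace the token source first, then cancel the old one to
        // signal consumers that the model has changed.
        var previous = Interlocked.Exchange(ref _reloadSource, new CancellationTokenSource());
        previous.Cancel();
    }

    public override ITransformer GetModel() => _model;

    public override IChangeToken GetReloadToken()
        => new CancellationChangeToken(_reloadSource.Token);
}
```

The catch discussed in this thread is that wiring such a loader up with middleware services requires an implementation-factory overload, which is exactly what this PR adds.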
|
@frank-dong-ms Sure. This PR is still needed for the custom model loader to work, per the reply below from @NeoXtreem in https://stackoverflow.com/questions/57682393/how-to-add-model-to-predictionenginepool-in-middleware-ml-net/57698799#57698799:

> "I tried using a custom ModelBuilder but found I'm unable to access middleware services during Startup.ConfigureServices, and AddPredictionEnginePool doesn't offer an overload that accepts an implementation factory where I would normally be able to access IServiceProvider as I would with AddSingleton, AddScoped and AddTransient. Using the method I mentioned before (ServiceProvider.BuildServiceProvider) results in a warning that an additional copy of singleton services would be created. I've fixed this and raised a PR here. – Neo Oct 26 '19 at 4:01"

Thanks for making ML.NET better, and I can't wait to try out the next version 👍
|
Hi, is this released already? v1.5.2?
|
We are about to release v1.5.2. (The NuGets have just been uploaded, so in theory you can make use of them. The rest of the release process should be complete by Monday.)
|
Thanks a lot, guys. Is there a method that I can call or override to load a newly trained model into the prediction engine pool? I have two microservices (training and prediction). My users can upload training data to the training microservice and should be able to predict right after training is completed by calling the prediction microservice, without having to restart the service. The model name can be new and different from what was loaded into the engine pool during Startup.cs's ConfigureServices method. I hope there is something like AddPredictionEnginePool() accessible outside Startup.cs, e.g. in the API controller. Sample code would be appreciated.
This is an additional overload for `AddPredictionEnginePool` to allow middleware dependencies to be used in a custom `ModelLoader`. It allows an implementation factory to be supplied in `Startup` in a similar manner to the corresponding overloads on `AddSingleton`, `AddScoped` and `AddTransient`. This fixes issue #4392, and is related to an answer to a question on Stack Overflow given by @eerhardt.