
Fixes #4392 Add AddPredictionEnginePool overload for implementation factory #4393

Merged
frank-dong-ms-zz merged 7 commits into dotnet:master from NeoXtreem:features/prediction-engine-pool-dependency-support on Aug 18, 2020

Conversation

@NeoXtreem
Contributor

@NeoXtreem NeoXtreem commented Oct 26, 2019

This is an additional overload for AddPredictionEnginePool to allow middleware dependencies to be used in a custom ModelLoader. This will allow the following code to be used in Startup in a similar manner to the corresponding overloads on AddSingleton, AddScoped and AddTransient:

services.AddPredictionEnginePool<Foo, Bar>(serviceProvider =>
{
    services.AddOptions<PredictionEnginePoolOptions<Foo, Bar>>().Configure(options =>
    {
        options.ModelLoader = new MyModelLoader(serviceProvider.GetService<IMyService>());
    });
    return new PredictionEnginePool<Foo, Bar>();
});

This fixes issue #4392, and is related to an answer to a question on Stack Overflow given by @eerhardt.
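For reference, the proposed overload mirrors the factory overloads on AddSingleton. A minimal sketch of what such an extension method might look like (the body and parameter names here are assumptions for illustration, not the merged code):

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.ML;

public static class PredictionEnginePoolServiceCollectionExtensions
{
    /// <summary>
    /// Registers a PredictionEnginePool created by an implementation factory,
    /// so the factory can resolve middleware services from IServiceProvider.
    /// </summary>
    public static PredictionEnginePoolBuilder<TData, TPrediction> AddPredictionEnginePool<TData, TPrediction>(
        this IServiceCollection services,
        Func<IServiceProvider, PredictionEnginePool<TData, TPrediction>> implementationFactory)
        where TData : class
        where TPrediction : class, new()
    {
        // Mirror AddSingleton(Func<IServiceProvider, T>): defer construction
        // until the service provider is available, then return the builder
        // wrapper just like the existing overloads do.
        services.AddSingleton(implementationFactory);
        return new PredictionEnginePoolBuilder<TData, TPrediction>(services);
    }
}
```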

@NeoXtreem NeoXtreem requested a review from a team as a code owner October 26, 2019 03:51
Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs Outdated
eerhardt
eerhardt previously approved these changes Oct 28, 2019
Member

@eerhardt eerhardt left a comment


I think this change looks good. Thanks for the contribution @NeoXtreem!

Once you fix up the doc comments, and the CI is green, I can merge.

Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs Outdated
@eerhardt eerhardt dismissed their stale review November 7, 2019 23:07

Additional changes need to be reviewed.

Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs Outdated
Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs Outdated
Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs
@codecov

codecov Bot commented Nov 11, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@b7db4fa). Click here to learn what that means.
The diff coverage is n/a.

@@            Coverage Diff            @@
##             master    #4393   +/-   ##
=========================================
  Coverage          ?   74.72%           
=========================================
  Files             ?      906           
  Lines             ?   159287           
  Branches          ?    17147           
=========================================
  Hits              ?   119035           
  Misses            ?    35444           
  Partials          ?     4808
Flag         Coverage Δ
#Debug       74.72% <ø> (?)
#production  70.09% <ø> (?)
#test        90.11% <ø> (?)

Comment thread src/Microsoft.Extensions.ML/ServiceCollectionExtensions.cs Outdated
-    /// Adds a <see cref="PredictionEnginePoolBuilder{TData, TPrediction}"/> to the service collection.
+    /// Adds a <see cref="PredictionEnginePool{TData, TPrediction}"/> and required config services to the service collection.
Member


This should stay PredictionEnginePoolBuilder, that is what is added to the service collection.

Contributor Author


Are you sure? The code seems pretty clear in that it adds a PredictionEnginePool<TData, TPrediction> object:

services.AddSingleton<PredictionEnginePool<TData, TPrediction>>();

What is returned is a PredictionEnginePoolBuilder<TData, TPrediction> object, which simply wraps IServiceCollection. Speaking of which, this class doesn't do anything with its generic arguments, so could they perhaps even be removed?

Member


Speaking of which, this class doesn't do anything with the generic arguments which could probably even be removed?

You can't change the generic arguments because this class is public and has shipped as a stable API.

Also, the generic arguments are used by the extension methods.

Member

@eerhardt eerhardt left a comment


:shipit:

@harishsk
Contributor

@NeoXtreem Sorry for the delay in merging your PR. We have had some instability in our CI builds that we are working through, and we checked in a temporary fix yesterday that should improve the situation. Could you please fetch from master, rebase your PR branch, and resubmit? We should then be able to merge your PR.

@gvashishtha
Contributor

@NeoXtreem any success in rebasing?

@iluveu28

iluveu28 commented May 8, 2020

Is this available in 1.4 already? Can I get the complete code samples?

@gvashishtha
Contributor

These changes will not be available until the pull request is formally merged, but there are still some test failures that are preventing that.

@iluveu28

iluveu28 commented May 9, 2020

Thanks for your swift response, @gvashishtha.

This has been sitting here since Feb 19. In the meantime, is there an alternative way to load a newly trained model into the prediction engine pool after startup? If so, could you make a sample that checks a folder for a new model and loads it, maybe from a controller or middleware?

@iluveu28

Guys, can we prioritize this? Correct me if I'm wrong, but with this change in place, I could load a newly trained model into an existing prediction engine pool at runtime (after Startup.cs), e.g. inside a Web API controller, yes?

@gvashishtha
Contributor

@natke to follow up, as I have since moved teams.

@sanketshah11

Fellow Microsoft FTE here. It would be very helpful if this change were made available; I will try to send an email internally to get an update.

@frank-dong-ms-zz
Contributor

/azp run

@azure-pipelines

Azure Pipelines successfully started running 2 pipeline(s).

@frank-dong-ms-zz frank-dong-ms-zz merged commit 9141290 into dotnet:master Aug 18, 2020
@frank-dong-ms-zz
Contributor

frank-dong-ms-zz commented Aug 18, 2020

This PR has passed CI and been merged into master. Thanks!

@iluveu28

Thanks a lot for making good progress, guys!

In which version will this be released? Will there be sample code showing how to add/update/remove a model from the engine pool inside a Web API controller or similar?

@frank-dong-ms-zz
Contributor

@iluveu28 This PR will be released in the upcoming 1.5.2 version, I think. Adding @natke and @harishsk to comment on the release date.

I can't find any existing sample code for adding/updating/removing a model from the engine pool, but this can be quite easy. We have a file change monitor that keeps watching the model file, so if the model file changes, ML.NET will automatically load the new model into memory and the new model will be ready to use. Below is sample usage from one of our customers that shows how to update a model: https://github.com/alexandermujirishvili/DotnetMLWatchForChanges
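The file change monitor described above is enabled through the watchForChanges flag when registering a model. A minimal sketch of such a registration (SentimentData/SentimentPrediction, the model name, and the file path are placeholders for illustration):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.ML;

public class SentimentData { public string Text { get; set; } }
public class SentimentPrediction { public bool Prediction { get; set; } }

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // watchForChanges: true tells the pool to monitor the model file and
        // reload the model automatically when the file is rewritten on disk.
        services.AddPredictionEnginePool<SentimentData, SentimentPrediction>()
            .FromFile(
                modelName: "SentimentModel",
                filePath: "sentiment_model.zip",
                watchForChanges: true);
    }
}
```

Note this reloads an existing named model when its file changes; it does not by itself register a model under a brand-new name at runtime.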

@iluveu28

Yes, I know, but when a new model is trained, I have to restart the service for it to be loaded into the engine pool. I need to be able to add a new model to the pool outside of Startup.cs. Try creating a Web API controller that accepts a training file, trains on it, and then immediately predicts, and you'll see what I mean.

@iluveu28

Note that a new model uploaded by users will have a different model name every time; we don't overwrite the same file, as we have multiple models for different use cases. Thanks again.

@frank-dong-ms-zz
Contributor

@iluveu28 I see what you mean now. What you're asking is more of an ASP.NET question than an ML.NET question, to me.
Check out the Stack Overflow thread below on adding a service outside the Startup class:
https://stackoverflow.com/questions/42433321/add-service-outside-startup
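For the common case of simply using an already-registered pool at runtime (rather than re-registering services outside Startup), the pool can be injected into a controller. A hedged sketch, where the controller name, model name, and SentimentData/SentimentPrediction types are placeholders:

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.ML;

public class SentimentData { public string Text { get; set; } }
public class SentimentPrediction { public bool Prediction { get; set; } }

[ApiController]
[Route("[controller]")]
public class PredictController : ControllerBase
{
    private readonly PredictionEnginePool<SentimentData, SentimentPrediction> _pool;

    // The pool registered in Startup is resolved from DI, so no access to
    // Startup.ConfigureServices is needed at request time.
    public PredictController(PredictionEnginePool<SentimentData, SentimentPrediction> pool)
    {
        _pool = pool;
    }

    [HttpPost]
    public ActionResult<SentimentPrediction> Post(SentimentData input)
    {
        // Predict against the named model; the pool manages engine reuse.
        return _pool.Predict(modelName: "SentimentModel", example: input);
    }
}
```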

@iluveu28

@frank-dong-ms Yes, that might work, but this PR is meant to make the pool accessible through the middleware, which is a better solution, correct?

Also, on your note about watchForChanges (https://github.com/alexandermujirishvili/DotnetMLWatchForChanges):
Is that issue resolved? I had the same issue, whereby the retrained model is not refreshed until the service restarts, even though the watchForChanges flag is set to true. My temporary workaround: when the model is stale, I create a new prediction engine instead of using the pool until the service restarts, at which point the IsStale flag is set to false in the DB. I think I could solve this by reloading the model into the pool once this PR is released? Or is there a bug in the watchForChanges implementation that you will fix soon?

Thanks a lot for the tips!

@frank-dong-ms-zz
Contributor

@iluveu28

@frank-dong-ms Sure. This PR is still needed for the custom model loader to work, per the reply below from @NeoXtreem in https://stackoverflow.com/questions/57682393/how-to-add-model-to-predictionenginepool-in-middleware-ml-net/57698799#57698799:

"I tried using a custom ModelBuilder but found I'm unable to access middleware services during Startup.ConfigureServices, and AddPredictionEnginePool doesn't offer an overload that accepts an implementation factory where I would normally be able to access IServiceProvider as I would with AddAddSingleton, AddScoped and AddTransient. Using the method I mentioned before (ServiceProvider.BuildServiceProvider) results in a warning that an additional copy of singleton services would be created. I've fixed this and raise a PR here. – Neo Oct 26 '19 at 4:01"

Thanks for making ML.NET better and I can't wait to try out the next version 👍

@iluveu28

Hi,

Is this released already, in v1.5.2?

@harishsk
Contributor

We are about to release v1.5.2. (The NuGet packages have just been uploaded, so in theory you can make use of them. The rest of the release process should be complete by Monday.)

@iluveu28

Thanks a lot guys.

Is there a method that I can call or override to load a newly trained model into the prediction engine pool?

I have two microservices (training and prediction). My users can upload training data to the training microservice and should be able to predict right after training completes by calling the prediction microservice, without having to restart the service. The model name can be new and different from what was loaded into the engine pool in Startup.ConfigureServices. I hope there is something like AddPredictionEnginePool() accessible outside Startup.cs, e.g. in an API controller.

A sample code would be appreciated.

@ghost ghost locked as resolved and limited conversation to collaborators Mar 20, 2022

Labels: none yet

Projects: none yet

Development: successfully merging this pull request may close these issues.

7 participants