[2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs#6650

Merged
dbczumar merged 14 commits into mlflow:branch-2.0 from dbczumar:remove_sagemaker_deploy
Sep 1, 2022
Conversation

@dbczumar dbczumar commented Aug 30, 2022

Signed-off-by: dbczumar [email protected]

What changes are proposed in this pull request?

Make mlflow.sagemaker.deploy() / delete() APIs internal, since SageMakerDeploymentClient should be used instead.

This is a follow-up for #6651

How is this patch tested?

Existing tests

Does this PR change the documentation?

  • No. You can skip the rest of this section.
  • Yes. Make sure the changed pages / sections render correctly by following the steps below.
  1. Click the Details link on the Preview docs check.
  2. Find the changed pages / sections and make sure they render correctly.

Release Notes

Remove the deprecated mlflow.sagemaker.deploy() / delete() APIs, which have been replaced by SageMakerDeploymentClient with MLflow Deployments.

Is this a user-facing change?

  • No. You can skip the rest of this section.
  • Yes. Give a description of this change to be included in the release notes for MLflow users.

What component(s), interfaces, languages, and integrations does this PR affect?

Components

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/pipelines: Pipelines, Pipeline APIs, Pipeline configs, Pipeline Templates
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

Interface

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

Language

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

Integrations

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations

How should the PR be classified in the release notes? Choose one:

  • rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
  • rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
  • rn/feature - A new user-facing feature worth mentioning in the release notes
  • rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
  • rn/documentation - A user-facing documentation change worth mentioning in the release notes

Signed-off-by: dbczumar <[email protected]>
@dbczumar dbczumar changed the title [2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal [2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs Aug 30, 2022
Signed-off-by: dbczumar <[email protected]>
@dbczumar dbczumar requested review from AveshCSingh and harupy August 30, 2022 19:54
@github-actions github-actions bot added the area/scoring (MLflow Model server, model deployment tools, Spark UDFs) and rn/breaking-change (Mention under Breaking Changes in Changelogs) labels Aug 30, 2022
@dbczumar dbczumar changed the title [2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs [WIP][2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs Aug 30, 2022
Signed-off-by: dbczumar <[email protected]>
from mlflow.utils import env_manager as _EnvManager


def build_docker(
@dbczumar dbczumar Aug 31, 2022

Introduce an API equivalent of mlflow models build-docker so that users don't have to call mlflow.sagemaker.build_and_push_container() with special "no_push" args in order to build an MLflow Model Server Docker image. build_and_push_container should arguably be renamed to push_container, since it duplicates the image-building work done by mlflow models build-docker, but that can happen in a follow-up PR.
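As a rough sketch (the model URI and image name below are illustrative placeholders, not values from this PR), the new API gives a direct equivalent of the existing CLI invocation:

```python
# Sketch only: the model URI and image name are placeholders.
model_uri = "runs:/<run-id>/model"
image_name = "my-mlflow-image"

# The CLI invocation that the new mlflow.models.build_docker() API mirrors,
# expressed as an argv list suitable for subprocess.run().
build_docker_cmd = [
    "mlflow", "models", "build-docker",
    "--model-uri", model_uri,
    "--name", image_name,
]
print(" ".join(build_docker_cmd))
```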

Collaborator Author

This is tested sufficiently by mlflow docker CLI tests, since the CLI calls this method directly. See

```python
@pytest.mark.parametrize("enable_mlserver", [True, False])
def test_build_docker(iris_data, sk_model, enable_mlserver):
    with mlflow.start_run() as active_run:
        if enable_mlserver:
            mlflow.sklearn.log_model(
                sk_model, "model", extra_pip_requirements=[PROTOBUF_REQUIREMENT]
            )
        else:
            mlflow.sklearn.log_model(sk_model, "model")
        model_uri = "runs:/{run_id}/model".format(run_id=active_run.info.run_id)
    x, _ = iris_data
    df = pd.DataFrame(x)
    extra_args = ["--install-mlflow"]
    if enable_mlserver:
        extra_args.append("--enable-mlserver")
    image_name = pyfunc_build_image(model_uri, extra_args=extra_args)
    host_port = get_safe_port()
    scoring_proc = pyfunc_serve_from_docker_image(image_name, host_port)
    _validate_with_rest_endpoint(scoring_proc, host_port, df, x, sk_model, enable_mlserver)


def test_build_docker_virtualenv(iris_data, sk_model):
    with mlflow.start_run():
        model_info = mlflow.sklearn.log_model(sk_model, "model")
    x, _ = iris_data
    df = pd.DataFrame(iris_data[0])
    extra_args = ["--install-mlflow", "--env-manager", "virtualenv"]
    image_name = pyfunc_build_image(model_info.model_uri, extra_args=extra_args)
    host_port = get_safe_port()
    scoring_proc = pyfunc_serve_from_docker_image(image_name, host_port)
    _validate_with_rest_endpoint(scoring_proc, host_port, df, x, sk_model)


@pytest.mark.parametrize("enable_mlserver", [True, False])
def test_build_docker_with_env_override(iris_data, sk_model, enable_mlserver):
    with mlflow.start_run() as active_run:
        if enable_mlserver:
            mlflow.sklearn.log_model(
                sk_model, "model", extra_pip_requirements=[PROTOBUF_REQUIREMENT]
            )
        else:
            mlflow.sklearn.log_model(sk_model, "model")
        model_uri = "runs:/{run_id}/model".format(run_id=active_run.info.run_id)
    x, _ = iris_data
    df = pd.DataFrame(x)
    extra_args = ["--install-mlflow"]
    if enable_mlserver:
        extra_args.append("--enable-mlserver")
    image_name = pyfunc_build_image(model_uri, extra_args=extra_args)
    host_port = get_safe_port()
    scoring_proc = pyfunc_serve_from_docker_image_with_env_override(
        image_name, host_port, gunicorn_options
    )
    _validate_with_rest_endpoint(scoring_proc, host_port, df, x, sk_model, enable_mlserver)


def test_build_docker_without_model_uri(iris_data, sk_model, tmp_path):
    model_path = tmp_path.joinpath("model")
    mlflow.sklearn.save_model(sk_model, model_path)
    image_name = pyfunc_build_image(model_uri=None)
    host_port = get_safe_port()
    scoring_proc = pyfunc_serve_from_docker_image_with_env_override(
        image_name,
        host_port,
        gunicorn_options,
        extra_docker_run_options=["-v", f"{model_path}:/opt/ml/model"],
    )
    x = iris_data[0]
    df = pd.DataFrame(x)
    _validate_with_rest_endpoint(scoring_proc, host_port, df, x, sk_model)
```



def _get_flavor_backend(model_uri, **kwargs):
Collaborator Author

_get_flavor_backend also needs to be called by mlflow.models.build_docker, so it's moved into mlflow.models.flavor_backend_registry and imported from mlflow.models.cli and mlflow.models.



def deploy(
def _deploy(
Collaborator Author

Make deploy internal

Comment on lines +442 to 443
def _delete(
app_name,
Collaborator Author

Make delete internal



def run_local(model_uri, port=5000, image=DEFAULT_IMAGE_NAME, flavor=None):
def run_local(name, model_uri, flavor=None, config=None): # pylint: disable=unused-argument
Collaborator Author

Make run_local conform to the interface required by mlflow deployments run-local -t sagemaker
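A minimal sketch of that interface (the function body below is a stand-in, not the real implementation; only the signature mirrors the deployment-plugin contract shown in the diff):

```python
def run_local(name, model_uri, flavor=None, config=None):
    """Stand-in illustrating the deployment-plugin run_local signature.

    The real implementation launches a local SageMaker-compatible container;
    this sketch only shows how the CLI options map onto the arguments.
    """
    config = config or {}
    port = config.get("port", 5000)               # -C port=<port>
    image = config.get("image", "mlflow-pyfunc")  # -C image=<image>
    return {"name": name, "model_uri": model_uri, "flavor": flavor,
            "port": port, "image": image}


deployment = run_local(
    name="my-local-deployment",
    model_uri="runs:/<run-id>/model",
    flavor="python_function",
    config={"port": 5001},
)
```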

Collaborator Author

These changes are tested by test cases that call score_model_in_sagemaker_docker_container, such as

```python
scoring_response = score_model_in_sagemaker_docker_container(
    model_uri=model_path,
    data=spark_model_iris.pandas_df,
    content_type=pyfunc_scoring_server.CONTENT_TYPE_JSON_SPLIT_ORIENTED,
    flavor=mlflow.pyfunc.FLAVOR_NAME,
)
```

def run_local(model_uri, port, image, flavor):
@dbczumar dbczumar Aug 31, 2022

Obviated by mlflow deployments run-local -t sagemaker

" asynchronously using the `--async` flag, this value is ignored."
),
)
def delete(app_name, region_name, archive, asynchronous, timeout):
Collaborator Author

Obviated by mlflow deployments delete -t sagemaker

" asynchronously using the `--async` flag, this value is ignored."
),
)
def deploy(
Collaborator Author

Obviated by mlflow deployments create/update -t sagemaker

@@ -1,875 +0,0 @@
import os
Collaborator Author

All of the coverage provided in this test suite for mlflow.sagemaker.deploy() / delete() is covered for SageMakerDeploymentClient in https://github.com/mlflow/mlflow/blob/master/tests/sagemaker/test_sagemaker_deployment_client.py

f" -C image=mlflow-pyfunc -C port={port} --flavor {flavor}"
)
proc = _start_scoring_proc(
cmd=["mlflow", "sagemaker", "run-local", "-m", model_uri, "-p", "5000", "-f", flavor],
Collaborator Author

mlflow sagemaker run-local is obviated by mlflow deployments run-local -t sagemaker and is removed above, so we migrate this test helper function that relies on sagemaker local deployment to use the mlflow deployments run-local API
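As a sketch of the migration (the deployment name below is a placeholder), the helper's command line changes roughly like this, with the port moving into a -C config option:

```python
model_uri = "runs:/<run-id>/model"  # placeholder

# Old helper invocation, removed in this PR.
old_cmd = ["mlflow", "sagemaker", "run-local",
           "-m", model_uri, "-p", "5000", "-f", "python_function"]

# New invocation via the deployments CLI.
new_cmd = ["mlflow", "deployments", "run-local",
           "-t", "sagemaker",
           "--name", "test-deployment",  # placeholder deployment name
           "-m", model_uri,
           "-f", "python_function",
           "-C", "port=5000"]
```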

Signed-off-by: dbczumar <[email protected]>
@dbczumar dbczumar changed the title [WIP][2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs [2.0] Make mlflow.sagemaker.deploy() / delete() APIs internal, remove CLIs Aug 31, 2022
Signed-off-by: dbczumar <[email protected]>
To export a custom model to SageMaker, you need an MLflow-compatible Docker image to be available on Amazon ECR.
MLflow provides a default Docker image definition; however, it is up to you to build the image and upload it to ECR.
MLflow includes the utility function ``build_and_push_container`` to perform this step. Once built and uploaded, you can use the MLflow container for all MLflow Models. Model webservers deployed using the :py:mod:`mlflow.sagemaker`
The :py:mod:`mlflow.deployments` and py:mod:`mlflow.sagemaker` modules can deploy
Member

Suggested change
The :py:mod:`mlflow.deployments` and py:mod:`mlflow.sagemaker` modules can deploy
The :py:mod:`mlflow.deployments` and :py:mod:`mlflow.sagemaker` modules can deploy

Collaborator Author

Thanks for catching this!

Comment on lines +1901 to +1902
mlflow deployments run-local -t sagemaker -m <path-to-model> - test the model locally
mlflow deployments -t sagemaker create - deploy the model remotely
@harupy harupy Aug 31, 2022

Suggested change
mlflow deployments run-local -t sagemaker -m <path-to-model> - test the model locally
mlflow deployments -t sagemaker create - deploy the model remotely
mlflow deployments run-local -t sagemaker -m <path-to-model> # test the model locally
mlflow deployments -t sagemaker create # deploy the model remotely

Can we replace `-` with `#`? `- test the model locally` looks like a command argument, but it's not.

Collaborator Author

Absolutely! Fixed!

mlflow deployments run-local --target sagemaker \\
--name my-local-deployment \\
--model-uri "/mlruns/0/abc/model" \\
--flavor python_function\\
Member

Suggested change
--flavor python_function\\
--flavor python_function \\

nit

Collaborator Author

Fixed!

flavor="python_function",
config={
"port": 5000,
"image": mlflow-pyfunc,
Member

Suggested change
"image": mlflow-pyfunc,
"image": "mlflow-pyfunc",

Collaborator Author

Fixed! Thanks for catching this!


harupy commented Aug 31, 2022

Can we revert the changes in examples/pipelines/sklearn_regression?

mlflow sagemaker run-local -m <path-to-model> - test the model locally
mlflow sagemaker deploy <parameters> - deploy the model remotely
mlflow deployments run-local -t sagemaker -m <path-to-model> - test the model locally
mlflow deployments -t sagemaker create - deploy the model remotely
@harupy harupy Aug 31, 2022

Suggested change
mlflow deployments -t sagemaker create - deploy the model remotely
mlflow deployments create -t sagemaker - deploy the model remotely

mlflow deployments -t sagemaker create gave the following error.

```
$ mlflow deployments -t sagemaker create
Usage: mlflow deployments [OPTIONS] COMMAND [ARGS]...
Try 'mlflow deployments --help' for help.

Error: No such option: -t
```

Collaborator Author

Great catch! Fixed!

* :py:func:`deploy <mlflow.sagemaker.deploy>` deploys the model on Amazon SageMaker. MLflow
uploads the Python Function model into S3 and starts an Amazon SageMaker endpoint serving
the model.
* :py:func:`deploy <mlflow.sagemaker.SageMakerDeploymentClient>` deploys the model on Amazon
@harupy harupy Aug 31, 2022

Is this line correct? In the code block below, create deploys the model.

Suggested change
* :py:func:`deploy <mlflow.sagemaker.SageMakerDeploymentClient>` deploys the model on Amazon
* :py:func:`create <mlflow.sagemaker.SageMakerDeploymentClient.creaet_deployment>` deploys the model on Amazon

Collaborator Author

Fixed and made the link read mlflow deployments create -t sagemaker, also updated run-local to mlflow deployments run-local -t sagemaker

Signed-off-by: dbczumar <[email protected]>
@dbczumar dbczumar left a comment

Thanks for the awesome feedback, @harupy ! I've addressed your comments :)

@harupy harupy left a comment
LGTM!

@dbczumar dbczumar merged commit 32ee711 into mlflow:branch-2.0 Sep 1, 2022

Labels

area/scoring (MLflow Model server, model deployment tools, Spark UDFs); rn/breaking-change (Mention under Breaking Changes in Changelogs)
