This issue was moved to a discussion.


How to load model programmatically? #1094


Closed
velociraptor111 opened this issue Oct 16, 2019 · 3 comments
Comments


velociraptor111 commented Oct 16, 2019

I notice that whenever I run a new batch transform job, it creates a new model and saves it.
I can see all the models from my batch transform jobs in my AWS SageMaker dashboard under Inference/Models.

Here is the script that I run:

sagemaker_model = MXNetModel(model_data='s3://' + sagemaker_session.default_bucket() + '/model/yolo_object_person_detector.tar.gz',
                             role=role,
                             entry_point='entry_point.py',
                             py_version='py3',
                             framework_version='1.4.1',
                             sagemaker_session=sagemaker_session)

transformer = sagemaker_model.transformer(instance_count=1, instance_type='ml.m4.xlarge', output_path=batch_output)

transformer.transform(data=batch_input, content_type='application/x-image')

transformer.wait()

I've looked into the source code for the declaration of the MXNetModel class:

class MXNetModel(FrameworkModel):
    """An MXNet SageMaker ``Model`` that can be deployed to a SageMaker ``Endpoint``."""

    __framework_name__ = "mxnet"
    _LOWEST_MMS_VERSION = "1.4"

    def __init__(
        self,
        model_data,
        role,
        entry_point,
        image=None,
        py_version="py2",
        framework_version=MXNET_VERSION,
        predictor_cls=MXNetPredictor,
        model_server_workers=None,
        **kwargs
    ):
        ...

But I don't see anywhere that I can simply load an MXNetModel object by referencing (by name or endpoint URL) one of the models already in my dashboard.

If I go to the console and click one of those models, I see a "Create batch transform job" button, so I know this is possible internally. But I can't find anything in the docs on doing it programmatically.

Also, as a side question:
How many models does the Free Tier provide? The free tier page (https://aws.amazon.com/sagemaker/#/) only lists a number of hours, not a number of models.

@ChoiByungWook
Contributor

Hello @velociraptor111,

As of now, the Python SDK doesn't have the ability to reference a model already defined in the SageMaker platform. I have added an item to the backlog (MLFW-2709), as I believe implementing this ability in conjunction with the existing Model class will take some time.

It is possible to do this using the AWS SDK or boto3 directly; however, that won't be as convenient. If you do go this route, you would specify the model name when generating the endpoint configuration.
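That boto3 route might look like the following sketch. The model name, job name, and S3 paths below are hypothetical placeholders, and the actual API call is left commented out:

```python
# Sketch: run a batch transform job against a model that already exists in the
# SageMaker console, by passing its name to boto3's create_transform_job.
# All names and S3 paths here are hypothetical placeholders.

model_name = "mxnet-inference-2019-10-16-00-00-00-000"  # as shown in the console

request = {
    "TransformJobName": "yolo-person-detector-batch-1",
    "ModelName": model_name,  # reference the existing model by name
    "TransformInput": {
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://my-bucket/batch-input/",
            }
        },
        "ContentType": "application/x-image",
    },
    "TransformOutput": {"S3OutputPath": "s3://my-bucket/batch-output/"},
    "TransformResources": {"InstanceType": "ml.m4.xlarge", "InstanceCount": 1},
}

# import boto3
# sagemaker_client = boto3.client("sagemaker")
# sagemaker_client.create_transform_job(**request)
```

This skips the Python SDK's Model class entirely, so no new model is created; the job just points at the one that already exists.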

Also as a side question:
How many models does the Free tier provide? In the free tier page: https://aws.amazon.com/sagemaker/#/ it just says the number of hours, but not necessarily the number of models

Let me reach out to the corresponding team and get back to you on that.

Thanks!

@laurenyu
Contributor

laurenyu commented Oct 23, 2019

Just wanted to circle back on this:

Also as a side question:
How many models does the Free tier provide? In the free tier page: https://aws.amazon.com/sagemaker/#/ it just says the number of hours, but not necessarily the number of models

The free tier limit for model deployment is 125 hours of m4.xlarge or m5.xlarge for real-time inference and batch transform, and this usage can be for a single model or aggregated across multiple models.

@strawberrypie
@ChoiByungWook is there any update on loading a model from SageMaker by its name?
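For reference, one shape this could take (a sketch only; it assumes a newer SDK release in which `sagemaker.transformer.Transformer` can be constructed directly from an existing model's name, and the model name and S3 paths are placeholders):

```python
# Sketch: build a Transformer from a model that already exists in SageMaker,
# assuming an SDK version whose Transformer accepts model_name directly.
# Model name and S3 paths are hypothetical placeholders.

transformer_kwargs = {
    "model_name": "mxnet-inference-2019-10-16-00-00-00-000",
    "instance_count": 1,
    "instance_type": "ml.m4.xlarge",
    "output_path": "s3://my-bucket/batch-output/",
}

# from sagemaker.transformer import Transformer
# transformer = Transformer(**transformer_kwargs)
# transformer.transform(data="s3://my-bucket/batch-input/",
#                       content_type="application/x-image")
# transformer.wait()
```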

