Amazon SageMaker is an end-to-end platform that data scientists and developers use to build, train, tune, and deploy machine learning (ML) models at scale. Amazon SageMaker lets you begin training your model with a single click in the console or with a simple API call. When training is complete and you’re ready to deploy your model, you can launch it with a single click in the Amazon SageMaker console. After you deploy a model into production using the Amazon SageMaker hosting service, you have a persistent HTTPS endpoint where your model is available to provide inferences via the InvokeEndpoint API action.
The Amazon SageMaker InvokeEndpoint API action now supports a new HTTP header called CustomAttributes. Use this new header to provide additional information with an inference request submitted to a model hosted at an Amazon SageMaker endpoint, and to return additional information in the response that the model sends back. For example, use the CustomAttributes header to carry a Trace ID or another application-specific identifier that you can use to track a request, or to pass other metadata that a service endpoint is programmed to process.
The information provided by you in the CustomAttributes header is an opaque value that is forwarded verbatim. Calls to InvokeEndpoint are authenticated by using AWS Signature Version 4. The custom attributes cannot exceed 1024 characters and must consist of visible US-ASCII characters as specified in Section 3.3.6. Field Value Components of the Hypertext Transfer Protocol (HTTP/1.1).
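As a rough illustration of those limits, a client could pre-check the value before sending it. This is only a sketch, not part of the AWS SDK, and it takes a strict reading of "visible US-ASCII":

# Hypothetical client-side pre-check mirroring the documented limits:
# at most 1,024 characters, visible US-ASCII only (0x21-0x7E).
def custom_attributes_ok(value: str) -> bool:
    return 0 < len(value) <= 1024 and all(0x21 <= ord(c) <= 0x7E for c in value)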
The following code snippets show how you can provide a custom attribute header to your model using the AWS SDK. In these examples, the CustomAttributes value sent by the client application in the request, and returned in the response, is a trace ID.
Sample request:
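A minimal sketch using the AWS SDK for Python (Boto3); the endpoint name, payload format, and trace ID value shown here are placeholder assumptions:

import boto3

# SageMaker runtime client for calling hosted endpoints
runtime = boto3.client("sagemaker-runtime")

response = runtime.invoke_endpoint(
    EndpointName="my-endpoint",            # hypothetical endpoint name
    ContentType="text/csv",
    Body=b"5.1,3.5,1.4,0.2",               # example CSV payload
    CustomAttributes="Trace-Id=1a2b3c4d",  # opaque value, forwarded verbatim to the model container
)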
Sample response:
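Again a sketch rather than the exact output from a real endpoint: the Boto3 call returns a dictionary, and any custom attributes header that the model container sets on its reply surfaces in the CustomAttributes field.

# Continuing from the request above: read the prediction and the
# custom attributes (present only if the container set them on its reply).
prediction = response["Body"].read().decode("utf-8")
trace_id = response.get("CustomAttributes")   # e.g. "Trace-Id=1a2b3c4d"

print(prediction)
print(trace_id)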
To help you debug your endpoints, training jobs, and notebook instance lifecycle configurations, anything that an algorithm container, a model container, or a notebook instance lifecycle configuration sends to stdout or stderr is also sent to Amazon CloudWatch Logs. In addition to debugging, you can use these logs to analyze the progress of your jobs. For the full list of log group and log stream names, see the documentation.
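For example, endpoint container logs can be read back programmatically. This sketch assumes the /aws/sagemaker/Endpoints/<EndpointName> log group naming pattern documented for hosted endpoints; the endpoint name is a placeholder.

import boto3

logs = boto3.client("logs")

# Endpoint container output lands under /aws/sagemaker/Endpoints/<EndpointName>,
# with one log stream per production variant/instance.
group = "/aws/sagemaker/Endpoints/my-endpoint"
for stream in logs.describe_log_streams(logGroupName=group)["logStreams"]:
    events = logs.get_log_events(
        logGroupName=group,
        logStreamName=stream["logStreamName"],
        limit=20,
    )
    for event in events["events"]:
        print(event["message"])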
This feature is available in all AWS Regions where Amazon SageMaker is available. For more information, see the Amazon SageMaker Developer Guide.
About the Author
Urvashi Chowdhary is a Senior Product Manager for Amazon SageMaker. She is passionate about working with customers and making machine learning more accessible. In her spare time, she loves sailing, paddle boarding, and kayaking.