SageMaker TensorFlow Serving Container is an open source project that builds Docker images for running TensorFlow Serving on Amazon SageMaker.

Some of the build and test scripts interact with resources in your AWS account. Be sure to set your default AWS credentials and region using `aws configure` before using these scripts.

Amazon SageMaker uses Docker containers to run all training jobs and inference endpoints. The Docker images are built from the Dockerfiles in `docker/`. The Dockerfiles are grouped by the version of TensorFlow Serving they support, and each supported processor type (e.g. "cpu", "gpu", "ei") has its own Dockerfile in each group.

If you are testing locally, building the image is enough. But if you want to use your updated image in SageMaker, you need to publish it to an ECR repository in your account. You can also run your container locally in Docker to test different models and issue inference requests by hand.
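As a rough sketch of the build-and-publish step: the account ID, region, version, processor type, repository name, and Dockerfile path below are placeholders, not values defined by this project — check `docker/` and your own AWS account for the real ones. The `docker`/`aws` commands are guarded so the snippet is safe to run even where Docker or the AWS CLI is not installed:

```shell
#!/bin/sh
# Placeholders -- substitute your own account, region, and target version.
account=123456789012
region=us-west-2
version=1.15.0
arch=cpu                                  # one of: cpu, gpu, ei

# Construct the fully qualified ECR image URI for the build.
repository=sagemaker-tensorflow-serving   # assumed repository name
tag="${version}-${arch}"
image="${account}.dkr.ecr.${region}.amazonaws.com/${repository}:${tag}"
echo "${image}"

# Build from the matching Dockerfile and push to ECR. This requires Docker,
# the AWS CLI, and valid credentials, so it is skipped when either tool is
# missing; the Dockerfile path is an assumed layout.
if command -v docker >/dev/null 2>&1 && command -v aws >/dev/null 2>&1; then
    docker build -f "docker/${version}/Dockerfile.${arch}" -t "${image}" docker/
    aws ecr get-login-password --region "${region}" \
        | docker login --username AWS --password-stdin \
            "${account}.dkr.ecr.${region}.amazonaws.com"
    docker push "${image}"
fi
```

The pushed image URI is what you reference when creating a SageMaker model. For local hand testing you can instead `docker run` the built image and POST inference requests to its `/invocations` endpoint (`/ping` serves health checks), per the SageMaker inference container contract.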
## Features
- Supported versions of TensorFlow are 1.4.1, 1.5.0, 1.6.0, 1.7.0, 1.8.0, 1.9.0, 1.10.0, 1.11.0, 1.12.0, 1.13.1, 1.14.0, 1.15.0, 2.0.0
- Supported versions of TensorFlow for Elastic Inference are 1.11.0, 1.12.0, 1.13.1, 1.14.0
- ECR repositories for the SageMaker-built TensorFlow Serving Container
- ECR repositories for the SageMaker-built TensorFlow Serving Container for Elastic Inference
- Documentation covering how to build and test these Docker images
- For notebook examples, see Amazon SageMaker Examples