Google launches containers for machine learning applications, three months after AWS did the same

Thu 27 Jun 2019

AWS released containers for machine learning applications in March

Google Cloud Platform (GCP) has launched Deep Learning Containers, portable and consistent environments for developing, testing and deploying machine learning applications.

The Docker images, released in beta, work in the cloud and on-premises, and across GCP products and services, including Google Kubernetes Engine (GKE), AI Platform, Cloud Run, Compute Engine, Kubernetes, and Docker Swarm.

The main benefit for developers and organisations is the ability to test machine learning applications on-premises and then quickly move them to the cloud at scale.
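In practice, that move amounts to deploying the same container image you tested locally onto a managed cluster. A hypothetical sketch using the gcloud and kubectl CLIs (the cluster name, deployment name, and image path are illustrative assumptions, not from Google's announcement):

```shell
# Create a small GKE cluster (names are illustrative).
gcloud container clusters create ml-test --num-nodes=1 --zone=us-central1-a

# Fetch credentials so kubectl talks to the new cluster.
gcloud container clusters get-credentials ml-test --zone=us-central1-a

# Deploy the same Deep Learning Container image tested on-premises.
kubectl create deployment ml-app \
    --image=gcr.io/deeplearning-platform-release/tf-cpu
```

Because the image is identical in both environments, the dependency set that worked on a local workstation is what runs on the cluster.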

The containers also come prepackaged with Jupyter, optimised drivers for Nvidia GPUs and Intel CPUs, and popular machine learning frameworks, and integrate with Google Kubernetes Engine (GKE) clusters. Google said the prepackaged frameworks, libraries, and drivers allow developers to build machine learning prototypes more quickly than before.
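The local side of that workflow is ordinary Docker usage. A minimal sketch based on Google's announcement (the `gcr.io/deeplearning-platform-release` repository and the `tf-cpu` image name follow the beta release, but exact image names and tags vary, so list them first):

```shell
# List the available Deep Learning Container images (beta).
gcloud container images list \
    --repository="gcr.io/deeplearning-platform-release"

# Pull a TensorFlow CPU image; check the listing above for exact names.
docker pull gcr.io/deeplearning-platform-release/tf-cpu

# Run it locally; the bundled JupyterLab server listens on port 8080,
# so the notebook environment is reachable at http://localhost:8080
docker run -d -p 8080:8080 gcr.io/deeplearning-platform-release/tf-cpu
```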

AWS launched Deep Learning Containers in March with support for TensorFlow and Apache MXNet frameworks out of the box and PyTorch support planned.

Google’s ML containers, on the other hand, don’t support Apache MXNet but come pre-installed with PyTorch, TensorFlow, scikit-learn, and R.

“If your development strategy involves a combination of local prototyping and multiple cloud tools, it can often be frustrating to ensure that all the necessary dependencies are packaged correctly and available to every runtime,” said Mike Cheng, software engineer at Google Cloud, in a blog post.

“Deep Learning Containers address this challenge by providing a consistent environment for testing and deploying your application across GCP products and services, like Cloud AI Platform Notebooks and Google Kubernetes Engine (GKE).”
