Microsoft unveils autoscaling for Azure Kubernetes Service
At Microsoft Connect(); 2018 today, Microsoft launched a bevy of updates across its cloud, artificial intelligence (AI), and development product suites. That’s no exaggeration — the company made its Azure Machine Learning service generally available, enhanced several of its core Azure IoT Edge offerings, and debuted Cloud Native Application Bundles alongside the open source ONNX Runtime inferencing engine.
And that only scratched the surface of today’s small mountain of announcements.
First off, virtual nodes for Azure Kubernetes Service (AKS), which let developers elastically provision quick-starting pods inside Azure Container Instances (ACI), are available in public preview starting today. Customers can switch the feature on within the Azure portal, and the resulting pods share a virtual network with other resources.
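The feature can also be switched on from the Azure CLI. A minimal sketch, assuming an existing AKS cluster and a subnet delegated to ACI; all resource names below are placeholders, and exact flags may differ across CLI versions:

```shell
# Enable the virtual-node add-on on an existing AKS cluster.
# myResourceGroup, myAKSCluster, and myVirtualNodeSubnet are
# placeholder names; the subnet must be delegated to ACI.
az aks enable-addons \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --addons virtual-node \
  --subnet-name myVirtualNodeSubnet
```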
Next up is the aforementioned cluster autoscaling for AKS. It’s based on the upstream Kubernetes cluster autoscaler project and automatically adds and removes nodes to meet workload needs (subject to minimum and maximum thresholds, of course). It works in conjunction with horizontal pod autoscaling and enters public preview this week.
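Both layers of autoscaling can be driven from the command line. A hedged sketch with placeholder resource names, using current `az` and `kubectl` syntax, which may differ slightly from the preview-era flags:

```shell
# Turn on the cluster autoscaler for an AKS cluster, keeping the
# node count between 1 and 5 (placeholder names throughout).
az aks update \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --enable-cluster-autoscaler \
  --min-count 1 \
  --max-count 5

# Pair it with horizontal pod autoscaling: scale a deployment between
# 3 and 10 replicas, targeting 50 percent average CPU utilization.
kubectl autoscale deployment myapp --cpu-percent=50 --min=3 --max=10
```

The two work together: the horizontal pod autoscaler asks for more replicas, and when pending pods can no longer be scheduled, the cluster autoscaler adds nodes to make room.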
Dovetailing with those announcements, Microsoft debuted graphics processing unit (GPU) support in ACI, which lets developers run demanding containerized jobs, such as AI model training, on GPU-accelerated virtual machines. The Redmond company also detailed the Azure Serverless Community Library, an open source set of prebuilt components for common use cases, such as resizing images in Blob Storage, reading license plate numbers with OpenALPR, and conducting raffles, all of which can be deployed to an Azure subscription with minimal configuration.
GPU support in ACI launches in preview today, and the Azure Serverless Community Library is available on GitHub and the Serverless Library website.
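Requesting a GPU-backed container instance follows the usual `az container create` pattern. A sketch with placeholder names; the image and GPU SKU are illustrative, and available SKUs vary by region:

```shell
# Run a containerized training job on a single K80 GPU in ACI.
# Resource group and container names are placeholders.
az container create \
  --resource-group myResourceGroup \
  --name gpu-training-job \
  --image tensorflow/tensorflow:latest-gpu \
  --gpu-count 1 \
  --gpu-sku K80
```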
Rounding out today’s news, Microsoft took the wraps off a new consumption plan for Linux-based Azure Functions, which was teased at Microsoft Ignite earlier this year. It’s now possible to deploy Functions built on Linux using the pay-per-execution model, bringing serverless billing to developers with existing Linux code assets or prebuilt containers.
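Creating such a function app is a two-step affair. A hedged sketch, assuming placeholder names and a Python runtime; the flags reflect current CLI syntax and may have shifted since the preview:

```shell
# A storage account backs the function app's state and triggers.
az storage account create \
  --name mylinuxfuncstorage \
  --resource-group myResourceGroup \
  --sku Standard_LRS

# Create the function app on Linux under the pay-per-execution
# (consumption) plan.
az functionapp create \
  --name myLinuxFunctionApp \
  --resource-group myResourceGroup \
  --storage-account mylinuxfuncstorage \
  --consumption-plan-location westus \
  --os-type Linux \
  --runtime python
```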
Finally, Microsoft launched Azure API Management (APIM) — a turnkey solution for publishing APIs to external and internal customers — in public preview under a new consumption-based usage plan (the Consumption tier). Effectively, it allows APIM to be used in a serverless fashion, with instant provisioning, automated scaling, and pay-per-use pricing.
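Provisioning a serverless APIM instance looks roughly like the following sketch. The names and publisher details are placeholders, and the `az apim` command group arrived in later CLI releases, so preview-era tooling may have differed:

```shell
# Create an APIM instance in the serverless Consumption tier.
# Instance name, resource group, and publisher info are placeholders.
az apim create \
  --name my-apim-instance \
  --resource-group myResourceGroup \
  --publisher-name Contoso \
  --publisher-email admin@contoso.com \
  --sku-name Consumption
```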