Category: Software, Data, Artificial Intelligence

The convergence of machine learning operations (MLOps) and DevOps best practices is poised to accelerate rapidly in 2021. At the recent online AWS re:Invent conference, Amazon Web Services (AWS) announced a bevy of new capabilities for Amazon SageMaker, its managed machine learning service, including Amazon SageMaker Pipelines, a continuous integration/continuous delivery (CI/CD) service purpose-built for MLOps.

Rather than requiring teams to acquire and deploy a standalone CI/CD platform for building artificial intelligence (AI) models, AWS is making the case for a managed service that lets developers manage workflows from within Amazon SageMaker Studio, the toolset it already provides for building AI models.

Amazon SageMaker Pipelines also logs each event in Amazon SageMaker Experiments, which enables IT teams to organize and track machine learning experiments and versions of AI models. In addition, a new deep profiling capability for Amazon SageMaker Debugger makes it possible for developers to train AI models more quickly by automatically monitoring system resource utilization and generating alerts whenever bottlenecks are detected.
