MLOps for batch inference with model monitoring and retraining using Amazon SageMaker, HashiCorp Terraform, and GitLab CI/CD
Source: aws.amazon.com | Posted August 29, 2023
Tags: Advanced (300), Amazon EventBridge, Amazon SageMaker, artificial-intelligence, AWS Lambda, Technical How-to