From the course: MLOps Essentials: Model Deployment and Monitoring
Building resiliency in serving
- [Instructor] Resiliency of software is its ability to handle issues gracefully and continue providing services to the end user with minimal interruption. Resiliency is a critical yet often overlooked part of model inference, and it is key to the successful operation of ML services. Without resiliency, these services would suffer from inconsistency, customer concerns, and loss of value. Resiliency should be built in at the model, service, and solution levels. Let's begin with model resiliency. Model resiliency is the ability of the model to overcome issues with input data or resources and continue to meet its expected performance and operational goals. How do we ensure model resiliency? First, all inputs received during inference need to be validated to make sure they comply with the expected integrity of the data. This includes checking for exceptions, unknown values, out-of-distribution values, et cetera. Inference…
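The input-validation step described above could be sketched as follows. This is a minimal illustration, not code from the course: the field names, the known category set, and the numeric range are all hypothetical assumptions standing in for whatever schema a real model would define.

```python
import math

# Hypothetical inference schema -- field names, categories, and ranges are
# illustrative assumptions, not part of the course material.
EXPECTED_FIELDS = {"age": float, "income": float, "segment": str}
KNOWN_SEGMENTS = {"retail", "enterprise", "smb"}
AGE_RANGE = (0.0, 120.0)  # assumed training-data range for a simple OOD check

def validate_input(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is usable."""
    errors = []
    for field, ftype in EXPECTED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
            continue
        value = record[field]
        if not isinstance(value, ftype):
            errors.append(f"bad type for {field}: {type(value).__name__}")
            continue
        # Exception values: NaN or infinity would silently corrupt predictions.
        if isinstance(value, float) and not math.isfinite(value):
            errors.append(f"non-finite value for {field}")
    # Unknown categorical values the model never saw in training.
    segment = record.get("segment")
    if isinstance(segment, str) and segment not in KNOWN_SEGMENTS:
        errors.append(f"unknown segment: {segment}")
    # Simple out-of-distribution check against the assumed training range.
    age = record.get("age")
    if isinstance(age, float) and not (AGE_RANGE[0] <= age <= AGE_RANGE[1]):
        errors.append(f"age out of expected range: {age}")
    return errors
```

A serving layer could call this on every request and reject (or route to a fallback) any record with a non-empty error list, rather than letting the model produce an unreliable prediction.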