From the course: Microservices: Security

Throttling and rate limiting

- [Instructor] As an API gains adoption and its usage increases, it's important to maintain the experience and meet the performance demands of its clients. Often, the operations exposed by an API are composed from those made available by a cluster of microservices, and these microservices must remain stable as traffic increases. Scaling is the first strategy to consider when demand for a microservice begins to exceed its capacity. By design, microservices are intended to scale easily through features like auto-scaling that are provided by container orchestrators. However, auto-scaling has its limitations: constraints such as cost or available resources may cap the number of microservice instances that can be spun up. Underlying each container orchestrator is a plane of host VMs, and once their capacity is exhausted, there's nowhere for another container to be deployed. In some cases, such as a denial of service…
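When scaling reaches those limits, throttling and rate limiting cap how much traffic a service will accept. As a minimal sketch (not from the course, and assuming a single-process service), a token-bucket rate limiter can admit short bursts while enforcing a steady average rate:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: admits bursts up to `capacity`
    requests, refilling at `rate` tokens per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # request admitted
        return False      # request throttled

# A burst of 15 back-to-back requests against a 10-token bucket:
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
```

In a microservices deployment this logic typically lives at the API gateway rather than in each service, and the counters are kept in a shared store so that all gateway instances enforce one global limit.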