Improve API Performance | Exam Question Solution | SAA-C03 | AWS Certified Solutions Architect - Associate

Improving API Performance


Question

You have built a REST API using Amazon API Gateway and distributed it to your customers.

However, your API is receiving a large number of requests and overloading your backend system, causing performance bottlenecks and, eventually, delays and failures in serving requests for your important customers.

How would you improve the API performance? (Choose 2 options)

A. Enable throttling and control the number of requests per second.

B. Create a resource policy with a time-range-based condition to restrict access.

C. Enable API caching to serve frequently requested data from the API cache.

D. Add a load balancer in front of the backend systems.

Answers

Explanations

Answer: A, C.

Option A is correct.

To prevent your API from being overwhelmed by too many requests, Amazon API Gateway throttles requests to your API.

Specifically, API Gateway sets a limit on a steady-state rate and a burst of request submissions against all APIs in your account.

For more information on throttling, refer to the documentation here.

https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-request-throttling.html
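As a rough sketch of how stage-level throttling limits can be configured, the example below uses boto3 (the Python SDK); the REST API ID, stage name, and limit values are placeholders and would need to match your own deployment.

```python
import boto3

apigw = boto3.client("apigateway")

apigw.update_stage(
    restApiId="a1b2c3d4e5",   # placeholder REST API ID
    stageName="prod",         # placeholder stage name
    patchOperations=[
        # Steady-state request rate (requests per second) for all methods in the stage
        {"op": "replace", "path": "/*/*/throttling/rateLimit", "value": "100"},
        # Maximum burst of request submissions allowed above the steady-state rate
        {"op": "replace", "path": "/*/*/throttling/burstLimit", "value": "200"},
    ],
)
```

The account-level limits mentioned above still apply across all APIs in the Region; stage- and method-level settings like these only tighten the limits for that stage.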

Option B is not correct.

This is not a viable solution.

Resource policies cannot have a time-range-based condition.

The following documentation shows the condition keys supported for API Gateway resource policies.

https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-resource-policies-aws-condition-keys.html

Option C is correct.

You can enable API caching in Amazon API Gateway to cache your endpoint's responses.

With caching, you can reduce the number of calls made to your endpoint and also improve the latency of requests to your API.

When you enable caching for a stage, API Gateway caches responses from your endpoint for a specified time-to-live (TTL) period, in seconds.

API Gateway then responds to the request by looking up the endpoint response from the cache instead of making a request to your endpoint.

The default TTL value for API caching is 300 seconds.

The maximum TTL value is 3600 seconds.

TTL=0 means caching is disabled.

For details on enabling caching, refer to the documentation here.

https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-caching.html#enable-api-gateway-caching
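As a minimal sketch of enabling caching on an already-deployed stage, again using boto3 with placeholder identifiers and a placeholder cache size:

```python
import boto3

apigw = boto3.client("apigateway")

apigw.update_stage(
    restApiId="a1b2c3d4e5",   # placeholder REST API ID
    stageName="prod",         # placeholder stage name
    patchOperations=[
        # Provision a cache cluster for the stage (size in GB, passed as a string)
        {"op": "replace", "path": "/cacheClusterEnabled", "value": "true"},
        {"op": "replace", "path": "/cacheClusterSize", "value": "0.5"},
        # Cache responses for 300 seconds (the default TTL) for all methods
        {"op": "replace", "path": "/*/*/caching/ttlInSeconds", "value": "300"},
    ],
)
```

The cache cluster size you choose affects both capacity and cost, so it is usually worth starting with a small cache and adjusting based on observed hit rates.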

Option D is not correct.

We can improve performance by increasing the capacity of the backend systems if the settings above do not help.

Simply adding a load balancer in front of an already overloaded backend does not, by itself, improve performance.

To improve the performance of your REST API and prevent overloading your backend system, there are a number of strategies you can employ. Two of the most effective options are:

A. Enable throttling and control the number of requests per second: API Gateway offers a throttling feature that lets you limit the number of requests per second allowed to reach your API. By setting appropriate limits, you can prevent excessive traffic from overwhelming your backend system, reducing the likelihood of performance bottlenecks and failures. Throttling limits can be set at the account, stage, or method level, and by attaching usage plans you can also control the maximum request rate for each API key. You can additionally configure burst limits that allow short-term spikes of traffic above the steady-state request rate. By controlling the traffic that reaches your backend systems, you can ensure that your API remains responsive and reliable for all your customers.
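If you need per-customer limits, a sketch of the usage-plan approach could look like the following; all IDs and names below are placeholders, and the API key is assumed to already exist.

```python
import boto3

apigw = boto3.client("apigateway")

# Create a usage plan that throttles clients on this plan to 50 req/s with bursts of 100.
plan = apigw.create_usage_plan(
    name="customer-basic",                                  # placeholder plan name
    apiStages=[{"apiId": "a1b2c3d4e5", "stage": "prod"}],   # placeholder API ID / stage
    throttle={"rateLimit": 50.0, "burstLimit": 100},
)

# Attach an existing API key to the plan so its requests count against these limits.
apigw.create_usage_plan_key(
    usagePlanId=plan["id"],
    keyId="abcdef1234",                                     # placeholder API key ID
    keyType="API_KEY",
)
```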

C. Enable API caching to serve frequently requested data from the API cache: API caching stores frequently requested responses in a dedicated cache, reducing the number of requests that have to be processed by your backend system. By caching at the API Gateway level, requests identical to ones already served can be answered directly from the cache, which significantly reduces the load on your backend, lowers latency, and improves the overall performance of your API. You can configure caching for specific resources, methods, or stages of your API, and you can set cache TTLs (time-to-live) to control how long responses are stored in the cache.
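For instance, to cache only a single method instead of the whole stage, the stage's method settings can be patched individually. The sketch below assumes a hypothetical GET /pets method; the "/" in the resource path is escaped as "~1" in the patch path, and the IDs are placeholders.

```python
import boto3

apigw = boto3.client("apigateway")

apigw.update_stage(
    restApiId="a1b2c3d4e5",   # placeholder REST API ID
    stageName="prod",
    patchOperations=[
        # Enable caching only for GET /pets
        {"op": "replace", "path": "/~1pets/GET/caching/enabled", "value": "true"},
        # Override the TTL for this method to 600 seconds
        {"op": "replace", "path": "/~1pets/GET/caching/ttlInSeconds", "value": "600"},
    ],
)
```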

The other two options - B and D - may also be useful in certain scenarios, but they may not be the most effective ways to address the specific problem described in this question. A resource policy (B) can be used to control access to your API, but it does not directly address the issue of overloading your backend system. Similarly, a load balancer (D) can distribute traffic across multiple backend servers, but it may not be sufficient to prevent overloading and performance bottlenecks. It is important to choose the right set of strategies based on the specific needs of your API and the challenges you are facing.