Azure API Management (APIM) offers a powerful caching mechanism that can greatly enhance the performance and efficiency of your APIs. By storing frequently accessed data in the cache and serving responses directly from it, you can significantly reduce the load on your backend services, minimize latency for client applications, and lower costs if your backend is cloud-based. This also improves the overall user experience with faster response times.
Caching saves valuable computing resources by reducing the need for repeated calls to backend services. In Azure APIM, you can apply a caching policy to HTTP operations such as GET so that cached responses are delivered without reaching the backend again. This results in:
- Reduced load on backend services
- Lower latency for client applications
- Lower costs, particularly when the backend is cloud-based
- Faster response times and a better overall user experience
By default, however, Azure APIM caching applies only to the GET HTTP verb. What if you’re using POST for certain requests, such as complex search operations? That’s where custom caching policies come into play.
Let’s start by implementing caching for a GET operation, using a simple REST API as the example.
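With the built-in cache-lookup and cache-store policies, a minimal GET caching policy might look like the sketch below; the 300-second duration and the vary-by-* values are illustrative and should be adjusted to your own API:

<policies>
    <inbound>
        <base />
        <!-- Check the APIM cache before calling the backend -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" downstream-caching-type="none" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Store the backend response in the cache for 300 seconds -->
        <cache-store duration="300" />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>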
Now, when you make a GET request, APIM checks the cache first. On a cache miss, the request is forwarded to the backend, and the response is then stored in the cache for subsequent requests.
When testing the GET request, you can trace it in the APIM test console to see how the cache behaves: the trace shows whether the lookup resulted in a cache hit or in a call to the backend.
While GET operations are inherently cacheable, POST operations typically aren’t. However, there are scenarios where caching POST responses can be beneficial, especially for complex search operations with large request payloads. To implement caching for POST operations, we need custom policies. The approach boils down to three steps:
- Parse the JSON request body and extract a field that uniquely identifies the request (here, the "id" field) to use as the cache key.
- In the inbound section, look up that key with cache-lookup-value and, if a cached response exists, return it immediately without calling the backend.
- In the outbound section, store the backend response with cache-store-value under the same key so subsequent identical requests are served from the cache.
Here’s a sample policy code for caching POST operations:
<policies>
    <inbound>
        <base />
        <!-- Parse the JSON body and extract the "id" field -->
        <set-variable name="requestBody" value="@(context.Request.Body.As<JObject>(preserveContent: true))" />
        <set-variable name="ID" value="@(context.Variables.GetValueOrDefault<JObject>("requestBody")?["id"]?.ToString())" />
        <!-- Attempt to retrieve cached response based on the ID -->
        <cache-lookup-value key="@((string)context.Variables["ID"])" variable-name="cachedResponseValue" />
        <choose>
            <when condition="@(context.Variables.ContainsKey("cachedResponseValue"))">
                <return-response>
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>@((string)context.Variables["cachedResponseValue"])</set-body>
                </return-response>
            </when>
        </choose>
        <set-backend-service id="apim-generated-policy" backend-id="nfacto" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Store the response in a variable -->
        <set-variable name="responseValue" value="@(context.Response.Body.As<string>(preserveContent: true))" />
        <!-- Cache the response using the ID as the key -->
        <cache-store-value key="@((string)context.Variables["ID"])" value="@((string)context.Variables["responseValue"])" duration="400" />
        <!-- Ensure the response is JSON formatted -->
        <set-header name="Content-Type" exists-action="override">
            <value>application/json</value>
        </set-header>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
This policy does the following:
- Parses the JSON request body (preserving it for the backend) and extracts the "id" field into a variable that serves as the cache key.
- Uses cache-lookup-value to check whether a response is already cached under that ID; if so, it returns the cached JSON immediately and the backend is never called.
- Otherwise, the request goes to the configured backend, and in the outbound section the response body is captured and stored with cache-store-value under the same ID for 400 seconds.
- Sets the Content-Type header to application/json so cached and fresh responses look identical to the client.
To ensure your caching policies are working correctly, send the same request a few times and trace it in the APIM test console: the first call should show a cache miss followed by a backend call, while repeat calls should be answered from the cache with a noticeably lower response time.
Azure API Management’s caching mechanism is a simple yet effective way to improve API performance and reduce backend load. While caching is natively supported for GET operations, you can extend this feature to POST operations using custom policies.
Whether you’re working with GET or POST, caching can lead to better performance, reduced latency, and lower costs, especially for high-traffic APIs. If you’re looking to enhance your APIs, implementing caching is a step in the right direction.