Azure APIM Caching Policy | Cache-lookup Policy | HTTP GET | Non-Cache HTTP POST

By Sri Gunnala
Published in Microsoft Azure
September 21, 2024
2 min read

Azure API Management (APIM) offers a powerful caching mechanism that can greatly enhance the performance and efficiency of your APIs. By storing frequently accessed data in the cache and serving responses directly from it, you can significantly reduce the load on your backend services, minimize latency for client applications, and lower costs if your backend is cloud-based. This also improves the overall user experience with faster response times.



Why Use Caching in APIM?

Caching can save valuable computing resources by reducing the need for repeated calls to backend services. In Azure APIM, you can apply a caching policy to HTTP operations like GET, where cached responses are delivered without reaching the backend again. This results in:

  • Reduced backend load: Minimize the number of requests that hit your backend.
  • Lower latency: Serve responses from the cache for faster client-side performance.
  • Cost savings: Especially relevant if your backend is hosted on a cloud platform, as fewer backend calls reduce resource consumption.

By default, however, APIM's built-in caching policy applies only to the GET HTTP verb. What if you use POST for certain requests, such as complex search operations? That's where custom caching policies come into play.

Setting Up Caching for GET Requests

Let’s start by implementing caching for a GET operation, using a simple REST API as the example.

  1. Create or use an existing API in Azure APIM.
  2. In the Design section of your API, select the GET operation.
  3. Click on Add policy in the inbound processing section, and select Cache the response.
  4. You can either apply caching at the operation level or across all operations, but it’s recommended to configure it per operation to have granular control. For instance, query parameters can be used as caching keys.
  5. Set the cache duration (e.g., 60 minutes) and save the configuration.

Now, when you make a GET request, APIM checks the cache first. On a cache miss, it forwards the request to the backend and stores the response in the cache for subsequent requests.
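
Behind the scenes, the portal generates a caching policy for the selected operation. A minimal sketch of what that policy can look like is shown below, assuming (for illustration) that the cache key varies by a hypothetical id query parameter and that responses are kept for 60 minutes (the duration is expressed in seconds):

<policies>
    <inbound>
        <base />
        <!-- Look for a cached response; vary the cache entry by the "id" query parameter -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
            <vary-by-query-parameter>id</vary-by-query-parameter>
        </cache-lookup>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- On a cache miss, store the backend response for 3600 seconds (60 minutes) -->
        <cache-store duration="3600" />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>

Because cache-lookup runs in the inbound section and cache-store in the outbound section, the response is stored only after a cache miss has reached the backend.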

Testing the Cache for a GET Request

When testing the GET request, you can trace the request in APIM to see how the cache behaves:

  • The first request will result in a cache miss, meaning the backend serves the response.
  • Subsequent requests will result in a cache hit, and the response will be served from the cache, bypassing the backend entirely.

Extending Caching to POST Operations

While GET operations are inherently cacheable, POST operations typically aren’t. However, there are scenarios where caching POST responses can be beneficial, especially for complex search operations with large request payloads. To implement caching for POST operations, we need to use custom policies. Here’s a step-by-step guide:

  1. Create a POST operation in your API.
  2. Open the operation’s policy editor.
  3. Implement custom caching logic using policy expressions.

Here’s a sample policy code for caching POST operations:

<policies>
    <inbound>
        <base />
        <!-- Parse the JSON body and extract the "id" field -->
        <set-variable name="requestBody" value="@(context.Request.Body.As<JObject>(preserveContent: true))" />
        <set-variable name="ID" value="@(context.Variables.GetValueOrDefault<JObject>("requestBody")?["id"]?.ToString())" />
        <!-- Attempt to retrieve cached response based on the ID -->
        <cache-lookup-value key="@((string)context.Variables["ID"])" variable-name="cachedResponseValue" />
        <choose>
            <when condition="@(context.Variables.ContainsKey("cachedResponseValue"))">
                <return-response>
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>@((string)context.Variables["cachedResponseValue"])</set-body>
                </return-response>
            </when>
        </choose>
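        <!-- Cache miss: forward the request to the configured backend service -->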
        <set-backend-service id="apim-generated-policy" backend-id="nfacto" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Store the response in a variable -->
        <set-variable name="responseValue" value="@(context.Response.Body.As<string>(preserveContent: true))" />
        <!-- Cache the response for 400 seconds, using the ID as the key -->
        <cache-store-value key="@((string)context.Variables["ID"])" value="@((string)context.Variables["responseValue"])" duration="400" />
        <!-- Ensure the response is JSON formatted -->
        <set-header name="Content-Type" exists-action="override">
            <value>application/json</value>
        </set-header>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>

This policy does the following:

  • Extracts the request body and a unique identifier (e.g., “id”) from the payload.
  • Checks if a cached response exists for the given identifier.
  • If found, returns the cached response immediately.
  • If not found, forwards the request to the backend.
  • Caches the backend response before returning it to the client.
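
As a quick illustration, assume a hypothetical search payload like the one below. The backend response would be stored under the cache key 12345, and any later POST carrying the same id would be answered from the cache without reaching the backend:

{
    "id": "12345",
    "query": "laptops",
    "maxPrice": 1500
}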

Testing and Verifying Caching Behavior

To ensure your caching policies are working correctly:

  1. Send a request to your API operation.
  2. Check the trace information in Azure API Management.
  3. Look for “cache lookup” results:
    • “Miss” indicates the response wasn’t in the cache.
    • “Hit” shows that the response was served from the cache.

Conclusion

Azure API Management’s caching mechanism is a simple yet effective way to improve API performance and reduce backend load. While caching is natively supported for GET operations, you can extend this feature to POST operations using custom policies.

Whether you’re working with GET or POST, caching can lead to better performance, reduced latency, and lower costs, especially for high-traffic APIs. If you’re looking to enhance your APIs, implementing caching is a step in the right direction.
