
Enhancing Web API Performance with Smart Caching

Discover how to implement system-cached lists that deliver sub-second response times for high-traffic APIs while maintaining data freshness and system efficiency.

In web development, performance optimization directly influences user experience and overall system efficiency. Caching mitigates load on backend services, reducing response times and improving the user experience. One particular approach, the System Cached List, offers a smart and efficient way to cache data and serve it seamlessly to users.

The Challenge: Sub-Second Response Times for a Weather API

Performance Requirements
You're running a Web API that serves weather forecasts to a large number of users. Each forecast request takes approximately 10 seconds to complete, yet your consumers expect near-instantaneous responses and your API handles thousands of hits every minute.

Caching Strategy

Data Retrieval

  • Check cache for existing data
  • Serve cached data directly

Cache Miss

  • Wait for data if cache empty
  • Serve stale data if past refresh time

Background Refresh

  • Asynchronous cache refresh
  • Non-blocking main thread
  • Single refresh process control
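The three phases above can be sketched in a few lines of C#. This is a simplified illustration, not the article's implementation: the dictionary-based cache, the GetAsync name, and the single refresh flag are all hypothetical stand-ins.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Illustrative sketch of the three-phase strategy (names are hypothetical).
var cache = new ConcurrentDictionary<string, (List<string> Data, DateTime NextUpdate)>();
var refreshing = 0; // 0 = idle, 1 = a background refresh is in flight

async Task<List<string>> GetAsync(string key, Func<Task<List<string>>> fetch, TimeSpan ttl)
{
  // 1. Data retrieval: serve cached data directly when present.
  if (cache.TryGetValue(key, out var entry))
  {
    // 3. Background refresh: past the refresh time, serve the stale data
    //    now and let a single non-blocking task update the cache.
    if (DateTime.UtcNow > entry.NextUpdate &&
        Interlocked.CompareExchange(ref refreshing, 1, 0) == 0)
    {
      _ = Task.Run(async () =>
      {
        try { cache[key] = (await fetch(), DateTime.UtcNow + ttl); }
        finally { Interlocked.Exchange(ref refreshing, 0); }
      });
    }
    return entry.Data;
  }
  // 2. Cache miss: the first caller must wait for the data.
  var data = await fetch();
  cache[key] = (data, DateTime.UtcNow + ttl);
  return data;
}
```

The key design point is that only the very first caller ever waits; every later caller gets whatever list is cached, even if a refresh is overdue.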

Understanding the System Cached List

The System Cached List is a caching mechanism that maintains a cached dataset, periodically updating it while ensuring minimal impact on the application's performance. The technique combines the MemoryCache class with a custom CachedData class to manage cached data efficiently.

Sample Implementation

I used a minimal API sample project as the starting point for this solution, then added a generic implementation that caches a list of T using system memory as the caching mechanism. You can view the source for this article in my sandbox repository on GitHub: https://github.com/markhazleton/sandbox/tree/main/WebApiCache

Advantages of the System Cached List

Improved Performance

Significantly improves performance by reducing backend load. Cached data provides faster response times and a smoother user experience.

Reduced Latency

With cached data readily available, there's notable latency reduction. Users experience quicker load times and higher satisfaction levels.

Efficient Resource Usage

Memory caching optimizes resource usage and reduces database queries, leading to lower resource consumption.

Customizability

Allows customization of cache expiration times based on data volatility, ensuring freshness while avoiding unnecessary updates.

Concurrency Handling

Synchronization locks prevent concurrency issues during cache updates, ensuring data integrity with single-threaded updates.

Graceful Updates

Smart asynchronous update process avoids blocking the application, maintaining responsive user interactions.

Disadvantages of the System Cached List

Stale Data Risk

Risk of serving outdated information if cache expiration settings are not properly configured or updates are delayed.

Increased Complexity

Requires careful consideration of thread safety, cache management, and update scheduling, potentially leading to maintenance challenges.

Memory Usage

Large datasets in memory can increase consumption, potentially affecting overall system performance.

Cold Start Overhead

Initial cache population creates slight overhead during application start or after cache expiration, momentarily impacting user experience.

Implementing the System Cached List

The implementation showcases two primary components that work together to provide efficient memory caching with smart update strategies.

CachedData Class

This class encapsulates the cached data along with metadata such as the last update time and the expected next update time, providing a complete picture of cache state.

public class CachedData&lt;T&gt;
{
  public DateTime LastUpdateTime { get; set; }
  public DateTime NextUpdateTime { get; set; }
  public List&lt;T&gt; Data { get; set; } = new();
}

SystemValuesCache Class

This static class acts as the core of the caching mechanism. It initializes a MemoryCache instance and provides the GetCachedData method to fetch and manage cached data. It also uses a synchronization lock to ensure thread safety.
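A plausible skeleton for that static class is shown below, declaring the fields the GetCachedData method relies on. The exact field names and initialization in the sample project may differ; this is a sketch of the described structure.

```csharp
using System;
using System.Runtime.Caching; // on modern .NET this requires the System.Runtime.Caching NuGet package

public static class SystemValuesCache
{
  // Shared in-memory cache backing all cached lists.
  private static readonly MemoryCache _cache = MemoryCache.Default;

  // Guards refreshes so only one thread updates the cache at a time.
  private static readonly object LockObject = new object();

  // Timestamp of the most recent successful refresh; MinValue forces
  // the first call to populate the cache.
  private static DateTime _lastUpdateTime = DateTime.MinValue;
}
```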

FetchDataFunction Delegate

The GetCachedData method's fetchDataFunction argument plays a crucial role in the functionality. It is a delegate that defines a function responsible for retrieving the data that will be cached.

This callback function performs the actual data retrieval from any data source (database, web API, file) and returns a Task&lt;List&lt;T&gt;&gt;, allowing non-blocking execution.
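For the weather scenario, such a fetch function might look like the following. The forecast shape and the short delay are illustrative stand-ins for the real ~10-second upstream call; none of these names come from the sample project.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical fetch function for the weather scenario.
async Task<List<(DateTime Date, int TemperatureC, string Summary)>> FetchForecastsAsync()
{
  // Stand-in for the slow (~10 second) upstream forecast call.
  await Task.Delay(50);
  return new List<(DateTime Date, int TemperatureC, string Summary)>
  {
    (DateTime.UtcNow.Date, 21, "Mild"),
    (DateTime.UtcNow.Date.AddDays(1), 18, "Cloudy"),
  };
}
```

Because the delegate only promises a Task of a list, the cache never needs to know where the data comes from or how slow the source is.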

public static CachedData&lt;T&gt; GetCachedData&lt;T&gt;(
  string cacheKey,
  Func&lt;Task&lt;List&lt;T&gt;&gt;&gt; fetchDataFunction,
  double cacheTimeInSeconds)
{
  List&lt;T&gt; cachedValues = _cache.Get(cacheKey) as List&lt;T&gt; ?? new List&lt;T&gt;();
  if (cachedValues.Count == 0 || DateTime.Now - _lastUpdateTime &gt; TimeSpan.FromSeconds(cacheTimeInSeconds))
  {
    lock (LockObject)
    {
      // Double-check inside the lock so only one thread performs the refresh.
      if (cachedValues.Count == 0 || DateTime.Now - _lastUpdateTime &gt; TimeSpan.FromSeconds(cacheTimeInSeconds))
      {
        Task.Run(async () =&gt;
        {
          var data = await fetchDataFunction();
          cachedValues.Clear();
          cachedValues.AddRange(data);
          var cachePolicy = new CacheItemPolicy
          {
            AbsoluteExpiration = DateTimeOffset.Now.AddSeconds(cacheTimeInSeconds)
          };
          _cache.Set(cacheKey, cachedValues, cachePolicy);
          _lastUpdateTime = DateTime.Now;
        }).Wait();
      }
    }
  }
  return new CachedData&lt;T&gt;()
  {
    Data = cachedValues,
    LastUpdateTime = _lastUpdateTime,
    NextUpdateTime = _lastUpdateTime.AddSeconds(cacheTimeInSeconds)
  };
}
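Wiring GetCachedData into a minimal API endpoint might look like this. The route, cache key, toy fetcher, and five-minute cache time are all illustrative, and the snippet assumes the SystemValuesCache class from the article is available in the project.

```csharp
// Hypothetical minimal API wiring; requires the ASP.NET Core web SDK.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;

var app = WebApplication.CreateBuilder(args).Build();

app.MapGet("/forecast", () =>
  // Every request reads from the cached list; the slow fetch runs only
  // when the cache is empty or past its refresh time.
  SystemValuesCache.GetCachedData(
    "WeatherForecasts",
    async () => { await Task.Delay(10_000); return new List<string> { "Mild", "Cloudy" }; },
    cacheTimeInSeconds: 300));

app.Run();
```

With this wiring, only the very first request after startup or expiration pays the 10-second cost; every other request returns in milliseconds.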

Conclusion

The System Cached List provides an effective approach to enhance web application performance by intelligently managing and serving cached data.

Performance Enhancement

Combines memory caching efficiency with smart update strategies for optimal performance.

Reduced Backend Load

Ensures users receive quick responses while significantly reducing backend resource usage.

Implementation Considerations

While this technique offers numerous advantages, carefully consider cache expiration times and manage the added complexity to reap its full benefits. Proper configuration and monitoring are essential for optimal results.