Get a Performance Boost with Client-Side Caching

By Guy Y.

Feb 4, 2018

Network caching is a great way to improve app performance.

It reduces:

  1. Bandwidth usage.
  2. CPU (central processing unit) usage, since there is less data to process.
  3. Redundant computations, since without caching the app may process and render the same data multiple times.

Our goal is to minimize the number of requests the app makes.

For example, consider this user scenario:

  • Visit the profile page (a GET request is made to load the user details).
  • Go back to the home page.
  • Return to profile page.

Every visit to the profile page loads the data, brings it to screen, binds events… a lot can happen.

Chances are the GET responses for the first and second visits to the profile page are exactly the same.

With a SPA (single-page application), such issues are very common. We can minimize the load of each visit by following a strict architecture, but as the project grows, it is natural to see more and more performance slowdowns and issues.

The proper way to fix such issues is by paying down the tech debt and by better managing the app state.

But if you need an easy win – and fast – client-side caching could be a good option.

Server caching

HTTP supports caching. We won't cover this subject here, but it is advisable to cache static assets; it has a drastic impact on app and network usage.

Service workers

Service workers are also a great option, especially if you want to build a PWA (progressive web app). They can be implemented with any framework; however, they might be hard to integrate into mature apps.
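As a sketch of the idea, a service worker can intercept every fetch and answer from the Cache API first. The cache name and the cache-first policy below are illustrative, not a full PWA setup:

```javascript
// sw.js -- minimal cache-first fetch handler (sketch; cache name is arbitrary).
const CACHE_NAME = 'app-cache-v1';

self.addEventListener('fetch', (event) => {
  // In practice you would only cache GET requests.
  event.respondWith(
    caches.open(CACHE_NAME).then((cache) =>
      cache.match(event.request).then((cached) => {
        if (cached) return cached; // Serve from cache when possible.
        return fetch(event.request).then((response) => {
          cache.put(event.request, response.clone()); // Cache a copy for next time.
          return response;
        });
      })
    )
  );
});
```

The worker is registered from the page with navigator.serviceWorker.register('/sw.js').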

AngularJS default cache

The AngularJS $http service supports caching with a simple flag.

We can cache all requests by configuring the $httpProvider.defaults.cache flag,
or specific requests, e.g. $http({url: '...', cache: true}).
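For example (the module name 'app' and the render callback are placeholders):

```javascript
// Enable caching for all GET requests, app-wide:
angular.module('app').config(($httpProvider) => {
  $httpProvider.defaults.cache = true;
});

// Or enable it per request:
$http({ method: 'GET', url: '/api/profile', cache: true })
  .then((response) => render(response.data));

// The shorthand methods accept the same flag:
$http.get('/api/profile', { cache: true });
```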

Enhanced caching

AngularJS caching does a great job out of the box. The downside is that the cache is in-memory: whenever the app reloads, the cache is cleared, so the performance boost doesn't affect the app load time.

We can enhance the request's config.cache by storing the data in persistent storage, e.g. localStorage or sessionStorage.
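The core of the idea is simple: keep a timestamped copy of the data under a Web Storage object, and treat entries older than some threshold as misses. A minimal sketch, with illustrative helper names (setCached/getCached are not part of AngularJS):

```javascript
// Store the response data alongside a timestamp.
// `storage` is any Web Storage-like object (localStorage, sessionStorage).
const setCached = (storage, key, data, now = Date.now()) => {
  storage.setItem(key, JSON.stringify({ timestamp: now, response: data }));
};

// Return the stored data if it is younger than maxAgeMs; otherwise
// evict it and return null so the caller falls back to a real request.
const getCached = (storage, key, maxAgeMs, now = Date.now()) => {
  const value = storage.getItem(key);
  if (!value) return null;
  const json = JSON.parse(value);
  if (now - json.timestamp > maxAgeMs) {
    storage.removeItem(key);
    return null;
  }
  return json.response;
};
```

In the browser you would pass window.localStorage as the storage argument; the interceptor below applies the same pattern.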

But first we need to talk about middlewares.

Very broadly, a middleware is a module that can be placed between two parts of the app, and has access to the data stream between them.

For example, service workers allow us to add a middleware on top of the browser's fetch module.

In our case, we want to place a middleware between AngularJS $http and the browser's XMLHttpRequest.

This will allow us to access a request before it is sent, and a response before it’s processed.

Luckily for us, AngularJS supports such middlewares out of the box – interceptors.

  1. We review every incoming response, decide whether to cache it or not, timestamp it, and store it in persistent storage.
  2. For each outgoing request, if a stored response exists and is not too old, we load it into the $cacheFactory cache and set config.cache so $http serves it from there instead of hitting the network.

$provide.factory('myHttpInterceptor', ($q, $cacheFactory) => {
  // shouldCache_ and buildUrl_ are app-specific helpers, not shown here.
  const storage_ = window.localStorage; // We can use sessionStorage instead.
  const cache_ = $cacheFactory('MyCache');
  const CACHE_EXPIRATION_MS_ = 5 * 60 * 1000; // Tune to your data's freshness needs.

  const response = (response) => {
    try {
      // shouldCache_ decides if the response should be cached. For example,
      // you don't want to cache POST or error responses.
      if (shouldCache_(response)) {
        const key = buildUrl_(response.config);
        const cachedResponse = constructCachedResponse_(response);
        // Convert data to string so it can be kept under storage_.
        const value = JSON.stringify({
          timestamp: Date.now(),
          response: cachedResponse,
        });
        storage_.setItem(key, value);
      }
      return response;
    }
    catch (error) {
      throw `CacheInterceptor.response: ${error}`;
    }
  };

  const request = (config) => {
    try {
      // buildUrl_ generates a key by serializing & concatenating the config
      // params to the url, i.e. http://domain.com?a=1&b=2
      const key = buildUrl_(config);
      const value = storage_.getItem(key);
      if (value) {
        const json = JSON.parse(value);
        updateCache_(key, json);
        // Append cache to request config.
        config.cache = cache_;
      }
      return config;
    }
    // If anything goes wrong, throw an error so we don't fail silently.
    catch (error) {
      throw `CacheInterceptor.request: ${error}`;
    }
  };

  const constructCachedResponse_ = (response) => {
    // Build a cache object to be stored at the $cacheFactory instance:
    // [status, data, headers, statusText, xhrStatus].
    // Based on https://github.com/angular/angular.js/blob/master/src/ng/http.js#L1372.
    return [response.status, response.data, response.headers(),
      response.statusText, response.xhrStatus];
  };

  const updateCache_ = (key, json) => {
    if (Date.now() - json.timestamp > CACHE_EXPIRATION_MS_) {
      // Remove the response from cache if it's too old.
      // By default $http will send the request when no response is found in the cache.
      cache_.remove(key);
      storage_.removeItem(key); // Evict the stale copy from persistent storage too.
      return;
    }
    // Insert the response into the cache.
    // By default $http will load it instead of making a request.
    cache_.put(key, json.response);
  };

  return {
    request,
    response,
  };
});
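Defining the factory is not enough on its own: the interceptor still has to be registered with $httpProvider during the config phase (the module name 'app' below is a placeholder):

```javascript
angular.module('app').config(($httpProvider) => {
  $httpProvider.interceptors.push('myHttpInterceptor');
});
```

Because registration is a single line, turning the interceptor off is as simple as removing the push.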

Conclusions

  1. With this simple interceptor, we were able to improve app load time by 25%, and the overall UX. That's pretty impressive for such a minor change, but it also suggests there are deeper issues with the app design that should be fixed.
  2. The interceptor is a completely separate module; it can be easily switched on and off. We can do A/B testing and see the impact, or whether any bugs surface.
