How To Optimize Mobile Performance

The best designs balance aesthetics and performance by working with mobile in mind from the start. In this article, Danny Bluestone shares the mobile performance optimization processes he uses at Cyber-Duck. The aim is a website that balances aesthetics and performance on mobile and achieves real conversion metrics; a collaborative, iterative performance optimization process will help you get there, built on an understanding of the client- and server-side factors that determine website performance on mobile, right from the start of the project.

You can’t overstate the importance of consistent, high-quality web design across devices of all shapes and sizes. Responsive web design is the way forward — but it’s often linked to performance issues. This is critical when 64% of smartphone users unforgivingly expect websites to load in under four seconds, yet average page weights continue to rise.

The best designs balance aesthetics and performance by working with mobile in mind from the start. From setting strict performance budgets to implementing client- and server-side optimization techniques, I’ll share the current mobile performance optimization processes we use at Cyber-Duck.

Mobile Performance
Designing with mobile in mind from the start helps balance website aesthetics and performance across devices.

Become Mobile-Minded

Performance is a key part of the user experience, so it can’t be an afterthought at the end of the development process. It’s preferable to manage projects through a mobile-minded structure, with designers and developers collaborating from the start.

Collaborative Review

For each project, review the design and development scope with the internal team, and define key performance indicator (KPI) goals. These are the milestone metrics that indicate project success, based on business objectives. Given their importance, performance-related goals should appear here.

Don’t sign off significant project milestones (like the art direction and wireframes) with stakeholders until the entire internal team has reviewed the output. Otherwise, we’ve found developers can request design adjustments (to reduce page size) during implementation. With designs already signed off, changes at this stage can create complications, opening further rounds of client approvals. When developers are involved from the outset, they can estimate the size and programming power required for interfaces, and avoid this.

Cyber-Duck team meeting
Designers and developers should review key milestones together, evaluating potential performance before sending for approval.

Performance Budgets

The best way to get into the mobile mind-set is setting and adhering to a strict performance budget: establishing a target for the final website’s speed and size. When the team is working towards a clear high-performance goal, they must choose whether to implement expensive features like carousels.

Specific business goals and user requirements determine whether we set figure-based performance budgets. For instance, our own website revamp aimed to dramatically improve load times across devices, and drive up mobile conversions. We set strict limits of no more than 40 HTTP requests or 500KB of data for mobile. Google Analytics data can inform which goals to select during revamps, as historical interactions indicate the behavior of your target audience.

Generally we define targets for page size, with a 500KB limit for mobile homepages. Server requests are more difficult to predict, so we’re less likely to set exact figures. These rough guidelines suit our needs for client projects. But Daniel Mall has a great practical guide for adding detail to budgets: from allocating weight for HTML and CSS, to JavaScript, images and web fonts.
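To keep everyone honest, it helps to record the budget somewhere the whole team (and ideally the build process) can check against. Below is a minimal sketch of how that might look; the mobile figures mirror the limits mentioned above, while the per-asset breakdown is purely illustrative, in the spirit of Mall’s approach:

// performance-budget.js: a hypothetical, documented budget for the team to check builds against
var performanceBudget = {
  mobile: {
    maxRequests: 40,   // request ceiling, as in our site revamp above
    maxWeightKB: 500,  // homepage weight target
    breakdownKB: { html: 30, css: 60, js: 120, images: 250, fonts: 40 } // illustrative split
  }
};

module.exports = performanceBudget;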

Optimization Techniques

On mobile, website loading speed is driven by client- and server-side factors. Using targeted optimization techniques that address both of these factors can help you meet the performance budgets set for your project.

Client-Side Optimization

With a varied mobile landscape (over 5,000 unique smartphone devices in 2014), developers have significantly less control over individual device performance than they do over server-side factors. So client-side optimization is crucial. The following techniques aim to reduce the processing time and power mobile devices need to load websites.

Optimize Code

Many developers fall into the trap of thinking they are ‘writing in jQuery’ to power a website. But there is no such thing: in fact, you are writing JavaScript, while using a library of helpful shortcuts and functions. Although this speeds up development (useful when you need to get a product to market quickly), there can be a performance cost. The jQuery library adds weight, and the flexibility of plugins (and functions) means they can often be bloated.

Here’s an example, with JavaScript and jQuery used for the same function. Writing in plain JavaScript avoids pulling another external library into your application, and will save another precious HTTP request.

// jQuery
var con = $('#my_container');
con.css('width', '75%');

// Plain JavaScript
var con = document.getElementById('my_container');
con.style.width = '75%';

You can optimize CSS and JS files further by using build systems like Grunt or Gulp, or front-end compiler apps like Prepros, CodeKit or Hammer. These reduce HTTP requests and file size by performing a variety of tasks: concatenating files; compiling Sass, Less or CoffeeScript; uglifying (compressing) JavaScript; and minifying files for production use.
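As a rough illustration, a Gulp task for this kind of work might look like the one below. The plugins (gulp-concat, gulp-uglify) are real, but the paths and task name are only examples:

// gulpfile.js: concatenate and minify JavaScript for production
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

gulp.task('scripts', function () {
  return gulp.src('src/js/**/*.js')   // gather the source files
    .pipe(concat('app.min.js'))       // combine into one file (one HTTP request)
    .pipe(uglify())                   // compress the JavaScript
    .pipe(gulp.dest('dist/js'));      // write the production-ready file
});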

Prioritize Above The Fold

Google PageSpeed Insights (and similar tools) recommends prioritizing the loading of above-the-fold content. Separate out the CSS used to render the visible part of the page (above the fold) so it loads first; defer the rest of the styles until after the page has been rendered.

Adding the critical CSS directly into the page’s head can do this. But bear in mind that it will not be cached like the rest of the CSS file, so it must be restricted to key content. A variety of tools can help you determine which CSS to separate, including Scott Jehl’s Critical CSS and Paul Kinlan’s bookmarklet tool.
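Once the critical rules are inlined in the head, the remaining stylesheet can be attached after the page has rendered. Here is a minimal sketch of that pattern (the stylesheet path is hypothetical; Filament Group’s loadCSS offers a more robust, production-ready version):

// Append the non-critical stylesheet once the page has finished loading
window.addEventListener('load', function () {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/non-critical.css'; // hypothetical path to the deferred styles
  document.head.appendChild(link);
});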

Optimize Images

Considering the current preference for rich design, it’s unfortunate that images are often the main culprit behind heavy page sizes. But image-led design is still possible if each image is optimized and compressed both before and after export to the right format. Always ensure you use the appropriate image type: color-rich photographs work better as JPEGs, whereas flat-color graphics should be PNG-8. Gradients and more complex icons work best as PNG-24/32 with alpha transparency, or as SVGs.

Photoshop and Fireworks can help you customize the levels of optimization across different areas of the image. This means the main subject can remain high quality, while the rest is optimized to increase performance. Lossless image compression tools like ImageOptim and TinyPNG can squeeze the most out of file size, without losing image quality.

You could also make use of the HTML5 <picture> element and the srcset and sizes attributes for images. These additions to the language help you define responsive images directly in the HTML, so the browser only downloads the image that matches the given condition.

<picture>
  <source media="(min-width: 960px)" srcset="picture-large.png">
  <source media="(min-width: 465px)" srcset="picture-small.png">
  <img src="images/picture.png" alt="Picture alt">
</picture>

However, this technique should be used carefully, as browser support is limited: some modern browsers (like Safari), the stock Android browser, and IE 10/11 (and older) do not support it. Polyfill alternatives can make this method work across older browsers, but these are external JavaScript libraries that have to be loaded separately, and may not be worth it given that other techniques are available. It’s worth considering your target audience, and what technologies they will be using, to see if the extra weight of the polyfill is required.
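If you decide a polyfill is worthwhile for your audience, one way to limit its cost is to load it only when native support is missing. A minimal sketch, assuming Picturefill and a hypothetical script path:

// Load the responsive-image polyfill only in browsers without native <picture> support
if (!window.HTMLPictureElement) {
  var polyfill = document.createElement('script');
  polyfill.src = '/js/picturefill.min.js'; // hypothetical path
  polyfill.async = true;
  document.head.appendChild(polyfill);
}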

Data URLs are a final option. Instead of linking to an external image file, the image data can be converted into a base64-encoded (ASCII) string and embedded directly into the CSS or HTML file. A simple online conversion tool is available. Data URLs are helpful, as they save HTTP requests and can transfer small files more quickly. But, as demonstrated below, the embedded code is larger than a link to an external image. The added length can make HTML and CSS documents harder to maintain, and any image change means re-encoding and re-embedding it.

<img width="32" height="32" alt="Camera" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAgCAYAAABzenr0AAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAYZJREFUeNrsVsttwzAMtYUAvfrck0fIBukIyQAF5AkaTxB0gowQAR3AWcEbdASfeva1p5YEmIAgZEmWZKeHEhD8k2Ty8fFRZZFg3x/PL3DpYFSOac3T65eZ+qiKNLt4fo52Bker7A7AphoudcBU/PlxCQROM+a+TaGgFo7ei4JaIXonCmqF6J0oqJWiv6MgX5QU1R7LJTKyGBtgtKAP15J+3hWPsYOiyB9lZ7Ui7DarN5aXnzDeGeG2nk1GGKj1Pd3fGL+DoX1SjRz4kXlBcjByuvhhiEzjRMlWlGI9tcEmAT5nl0MjxxpwpKfGFYRASAoMbN7MFLCLDQkbAlsP7BhVKzaXOnKvczYN1+wlJ2KU0PCcM57wasL7jr7xdJgcUtzLWnbVuWdtlAOjYLlLR+qptbmOZMkW40Al8jp4mo51bYoDO/HcOua2nrVRDmh+sqFSO4hoB66ojC9BOhCSAmR3I5y4+jpfrhTcUNAzj3E6VIpniVJqM0p1YJF2/Od14N+BrPYrwAAH54zsDNHtwgAAAABJRU5ErkJggg==" />

Automate CMS Media Optimization

Applying the asset optimization techniques from the previous section meant we could choose a classic, image-led design for BAM, enabling them to showcase new construction project photography.

But we also needed to give BAM the freedom to update content without needing us to optimize each image. Of course, no solution would be as effective as optimization by hand, but we did manage to achieve a reasonable degree of automated optimization. We reconfigured their existing Sitefinity CMS to create this flexibility. Standard options were used to resize (and optimize) images automatically, fitting the context of each web page:

<thumbnailResizeSettings
    compositingQuality="HighQuality"
    interpolationMode="HighQualityBicubic"
    smoothingMode="HighQuality">
</thumbnailResizeSettings>

Sitefinity can also resize images on the fly via URL parameters, and even faster rendering can be achieved by caching the resized image, using the following option:

/images/image-opt.jpg?size=480
BAM website on mobile
The homepage of the BAM website relies on regular project photography updates, so we implemented automated image optimization.

Most CMS systems allow some degree of media optimization. For instance, you can define media settings to ensure future users only add images that fit the website templates. Here’s a quick example from WordPress.

WordPress media settings
In WordPress, implement media settings like these to ensure future image uploads fit website templates.
<!-- WordPress example -->
<div class="avatar">
    <?php the_post_thumbnail( 'thumbnail' ); ?>
</div>

Streamline Fonts And Icons

Fonts are an important part of the user experience and branding of a website or application, but might not be the first priority for the user. For this reason, web fonts can be another factor to optimize.

If you defer font loading, the browser will display copy in whatever font it has available to begin with, which means the user always gets the content first. Deferring font loading can be achieved by separating out the part of the CSS that links to the font files and loading it after the rest of the page has been rendered. Note, however, that the text may briefly flash as it changes when the web font loads.

Similarly, icons are another area to optimize, as they are small files that need to be loaded frequently. You could also consider using font files for icons: use a service like Fontello to choose a variety of icons and generate a font file limited to your selection. This technique produces high-quality vector icons for all screen resolutions, with a light performance impact.

Alternatively, image sprites are a well-known option. They combine images into one file (that uses only one request to load) and display just the part required for design by using the background position. Paul Stamatiou describes how this is done and outlines a few limitations.

Loading Technologies

The following techniques avoid sending a website’s entire content to mobile browsers. Instead, only the precise data needed is downloaded, by optimizing for each breakpoint. Mobile loading speed was a key consideration for Velocity Drive’s website, which provides trailer technologies. JavaScript libraries must load at all breakpoints, to test for browser capabilities and avoid glitches. But we optimized assets carefully for each breakpoint: the homepage load size is only 323KB on mobile, rising to 828KB on large desktops.

Take this further with conditional lazy loading techniques to raise the perceived page speed. They load visible sections in stages, with key content placed above the fold. Expensive items (like images) found towards the end of pages aren’t loaded, unless the user chooses to scroll through the content. This technique was key for the ‘Insights’ section of the Niu Solutions website, covering their IT innovations. We used a small jQuery plugin called jScroll to load further articles as the user scrolls down. Here’s a sample of how we would set up this plugin, which simply requires the link to more content:

<a href="articles.php" class="more">Load more</a>

// Insights JavaScript
$('.insights-container').jscroll({
    nextSelector: '.more',
    loadingHtml: '<p>Loading...</p>'
});

Preloading technologies present further opportunities. They can anticipate and prepare for the user’s next move by loading the page they are likely to view next before they do so, to provide a faster experience. However, discovering the typical traffic structure is easier when revamping an existing website, as you can study the behavior flow funnels on Google Analytics.
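One lightweight way to act on that insight is a prefetch hint, asking the browser to fetch the likely next page during idle time; browsers that don’t support the hint simply ignore it. A minimal sketch, assuming analytics identified a hypothetical /insights/ page as the most common next step:

// Prefetch the page users most often visit next (the URL is hypothetical)
var hint = document.createElement('link');
hint.rel = 'prefetch';
hint.href = '/insights/';
document.head.appendChild(hint);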

Enhance From A Core Experience

The BBC’s Responsive News refers to the idea of giving the user the core experience they request, then evaluating the user’s environment and enhancing the experience accordingly. A simple example of this is loading low-resolution images initially, then showing high-resolution versions depending on the bandwidth the user has.
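As a rough sketch of that idea, the markup could ship a small image by default, with a larger version swapped in only when the environment justifies it. Here the check is simply viewport width; a connection check could be layered in where the browser exposes that information. The data attribute and file name are hypothetical:

// Swap in a high-resolution image only on larger viewports
var hero = document.querySelector('img[data-src-large]');
if (hero && window.matchMedia('(min-width: 960px)').matches) {
  hero.src = hero.getAttribute('data-src-large'); // e.g. hero-large.jpg
}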

This idea is part of progressive enhancement, where web technologies are layered to provide the best experience across environments. Progressive enhancement can be based on a number of different factors, including the technology a user has access to (their browser and operating system) and their environment (such as connection speed). Define a basic set of features that must work on the least capable browsers, and only add further complexity after testing whether the browser can handle it.

Detecting whether the browser can support HTML5 and CSS features helps us write conditional code to cover all eventualities: enhancing and adding features when supported, while staying safe and simple for devices and browsers that do not.

Reduce Feature Testing

Incorporating feature-testing libraries like Modernizr or has.js is a common, recommended practice. But too many developers implement the entire library; they test for all capabilities, even though only a small number of results are needed to determine whether to add features.

Tim Kadlec reports the parsing and execution time of the same library (minified jQuery 2.1.1) across a range of devices. This demonstrates that implementing these libraries often carries a larger performance cost on mobile than on desktop, and that the cost varies widely even between old and new devices. We tend to tailor the library, testing only the features the website relies on. This saves time and precious mobile processing power.
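When only a handful of results are needed, hand-rolling the tests can remove the library altogether. A minimal sketch; the two detects below simply illustrate the approach rather than a recommended set:

// Hand-rolled feature tests covering only what this project needs
var supports = {
  srcset: 'srcset' in document.createElement('img'),
  flexbox: 'flexBasis' in document.documentElement.style ||
           'webkitFlexBasis' in document.documentElement.style
};
if (supports.flexbox) {
  document.documentElement.className += ' flexbox'; // hook for conditional CSS
}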

Reducing the size of the Modernizr testing library
Tailoring testing libraries is crucial. This image compares the size of implementing the entire library (top) with limiting tests to just what we needed (bottom).

Server-Side Optimization

Server response time is a key factor in website speed: many aim for less than 200ms. But network latency (the delay as data moves between the server and device) is the real bottleneck for mobile performance, leaving mobile users with a slower experience.

This is influenced by network speed. According to Ofcom, average download speeds on popular 3G and 4G networks in the UK were 6.1Mbps and 15.1Mbps respectively. Some interpret this as a clear limit on maximum website size, but the reality is more complex: speed varies with coverage and environmental context, and users often drop to slow EDGE (E) or GPRS connections when out of range.

There are a variety of techniques available to improve server-side website performance.

Caching, Prerendering, And Static Content

Dynamic web pages require multiple database queries, taking valuable time to process output, format data and render browser-legible HTML. It’s recommended to cache content previously rendered for that device: for returning visitors, the server checks the cache instead of processing everything from scratch, and only sends updates.

Many also choose JavaScript template libraries like Handlebars and Mustache to handle web content. But parsing and executing JavaScript is power- and time-consuming, and mobile devices can’t process these template libraries as quickly as desktop computers can, draining their processing resources. Rendering pages completely on the server is much faster. Twitter opted for this approach as early as 2012, and explained the value on their blog.
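The same principle can be sketched in a few lines on the server. The snippet below uses Node and Express purely to illustrate the pattern; it is a simplification, not the stack or code behind the projects mentioned here:

// Serve previously rendered HTML from an in-memory cache, rendering only on a miss
var express = require('express');
var app = express();
var pageCache = {};

app.get('*', function (req, res, next) {
  if (pageCache[req.originalUrl]) {
    return res.send(pageCache[req.originalUrl]); // returning visitors skip the rendering work
  }
  var send = res.send.bind(res);
  res.send = function (body) {
    pageCache[req.originalUrl] = body;           // store the rendered output
    send(body);
  };
  next();                                        // fall through to the normal render
});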

Recently, our senior front-end developer pushed the boundaries of this technique for his personal portfolio. It was built with the file-based Statamic CMS, which just added html_cache support. When implemented, this feature reduced the average load time of all pages from roughly 1.8 seconds to 225 milliseconds.

Browser Caching

Granular optimization can streamline website loading by preventing the regular transfer of files you know aren’t updated often. Use a server handler (like an .htaccess file) to instruct the browser on which types of content to store and how long to keep copies. Here’s how you can implement browser caching on an Apache server:


<IfModule mod_expires.c>
    ExpiresActive on
    ExpiresDefault                          "access plus 1 month"
  # CSS
    ExpiresByType text/css                  "access plus 1 year"
  # Data interchange
    ExpiresByType application/json          "access plus 0 seconds"
    ExpiresByType application/ld+json       "access plus 0 seconds"
    ExpiresByType application/xml           "access plus 0 seconds"
    ExpiresByType text/xml                  "access plus 0 seconds"
  # Favicon and cursor images
    ExpiresByType image/x-icon              "access plus 1 week"
  # HTML components (HTCs)
    ExpiresByType text/x-component          "access plus 1 month"
  # HTML
    ExpiresByType text/html                 "access plus 0 seconds"

  # JavaScript
    ExpiresByType application/javascript    "access plus 1 year"

  # Manifest files
    ExpiresByType application/x-web-app-manifest+json   "access plus 0 seconds"
    ExpiresByType text/cache-manifest       "access plus 0 seconds"
  # Media
    ExpiresByType audio/ogg                 "access plus 1 month"
    ExpiresByType image/gif                 "access plus 1 month"
    ExpiresByType image/jpeg                "access plus 1 month"
    ExpiresByType image/png                 "access plus 1 month"
    ExpiresByType video/mp4                 "access plus 1 month"
    ExpiresByType video/ogg                 "access plus 1 month"
    ExpiresByType video/webm                "access plus 1 month"
  # Web feeds
    ExpiresByType application/atom+xml      "access plus 1 hour"
    ExpiresByType application/rss+xml       "access plus 1 hour"
  # Web fonts
    ExpiresByType application/font-woff     "access plus 1 month"
    ExpiresByType application/vnd.ms-fontobject  "access plus 1 month"
    ExpiresByType application/x-font-ttf    "access plus 1 month"
    ExpiresByType font/opentype             "access plus 1 month"
    ExpiresByType image/svg+xml             "access plus 1 month"
</IfModule>

Content Delivery Networks (CDNs)

You can improve asset loading by using a CDN like CloudFlare alongside your usual hosting service. Here, static content (like images, fonts and CSS) is stored on a network of global servers. Every time a user requests this content, the CDN detects their location and delivers assets from the nearest server, which reduces latency. It increases speed by allowing the main server to focus on delivering the application instead of serving static files.

Although it adds expense, use a dedicated CDN to improve the loading speed of asset-heavy websites. Aside from initial setup, CloudFlare doesn’t require manual configuration; the cache is built and updated for you, based on historical traffic and which assets are best to serve. But implement this with future independent content management in mind: ensure all assets uploaded from a CMS are also transparently served through the CDN.

A CDN was the best choice for our Eurofighter Typhoon website, as striking high-res photography of defense aircraft was a crucial feature to showcase their ability. In the last 30 days, reports indicate CloudFlare saved 76% of requests and 48% of bandwidth, increasing the speed of the image-heavy website.

Eurofighter website on mobile
We implemented a CDN for Eurofighter Typhoon’s website to speed up the loading of the high-res photography.

Testing

There’s no replacement for testing throughout production. Aim to use various tools to test work in progress by simulating the mobile experience and diagnosing potential performance issues.

As production progresses, always keep an eye on the numbers: from ensuring design assets are properly generated and exported, to checking the page file size and number of HTTP requests via your browser’s developer tools. Here, the Network tab gives you a complete overview of the resources loaded, total file size and rendering time:

Developer Tools - Cyber-Duck Website
Developer Tools gives us a complete overview of the performance metrics of the Cyber-Duck website.

Note the blue and red vertical lines to the right of the timeline in the Chrome inspector above. These represent the DOM Ready and Page Load events respectively. At the bottom of the window, the inspector displays the number of HTTP requests and the total file size loaded at the current breakpoint.

Other tools include:

  • WebPagetest offers a wide variety of options for testing live URLs: from choosing any location around the world, to shaping specific 3G and 4G connection speeds and latency. You can even experience how the website loads for these users, through the filmstrip view and video.
  • Google’s PageSpeed Insights is a more visual, introductory tool for analyzing page speed. It splits results into desktop and mobile, and suggests techniques to improve targeted areas of your site, indicating resources to cache or images to optimize.

Test On Real Devices

But don’t rely on simulators alone. We also test projects throughout production on a variety of real mobile devices.

Create your own device lab or use an Open Device Lab. Ideally, get a sense of the real user experience by avoiding the powerful office Wi-Fi: create a test site on a web server (ideally the same as the live server) that you can access from outside the office network. Then test while on the move, in typical environments like a crowded coffee shop or hotel, over a mobile network connection.

Mobile Performance Summary

Above all, aim to create a website that can balance aesthetics and performance on mobile, and achieve real conversion metrics. A collaborative, iterative performance optimization process will help you achieve this.

Right from the start of the project, encourage the internal team to work together with a mobile mindset by setting a strict performance budget. Build an understanding of the client- and server-side factors that determine website performance on mobile. Then you can meet the goals you set by implementing a mixture of the targeted optimization techniques described here. Of course, in some cases there’s still a trade-off between a striking design, high performance and security; a collaborative design and development team can decide what’s best for the business, checking with relevant project managers and stakeholders.

Our optimization project for a global technology consultancy demonstrates how these techniques can combine to improve loading speed and size significantly. The project involved caching templates and pages, optimizing assets and fonts, and reducing feature testing, among other techniques. So far, tests demonstrate the rendering and total load time has been cut to less than 1.4 seconds, from almost 4 seconds before we began work; similarly, the file size has been reduced to 1MB from over 3MB.
