One problem that remains unsolved – even when using one of the responsive images solutions out there – is that large photos, even in low quality (or at somewhat reduced dimensions), can cause a lot of traffic and long loading times. This is especially true on low-connectivity or mobile networks, which often leave you staring at an empty gray box while you wait for images to download.

The situation

Imagine your webpage has a huge cover photo, a “hero” image, or maybe even one of those full-screen image backgrounds, and on top of that about 10 to 20 further images for teasers, news and so on, in all different sizes. With a setup like this you can easily reach about 800–1000 KB of ballast, or even more, depending on missing or inefficient image optimization.

In 2014 the median e-commerce page took 6.5 seconds to render feature content, and 11.4 seconds to fully load (see Tammy Everts, Radware). Sad but true, one reason why a lot of sites became slower instead of faster over the last few years is the heavy use of (large) image resources.

Keeping in mind that most consumers say they expect pages to load in 3 seconds or less, everything above that is critical. Slow sites are a bad user experience – and Google is all about good UX these days (‘User Experience’ is not yet a ranking factor, but it is mentioned 16 times in the Google quality raters guide).

“2 seconds is the threshold for ecommerce website acceptability. At Google, we aim for under a half second.” – Maile Ohye, Google

What does that mean?

That doesn’t mean simply reducing the resources, at least not in all cases, but we should start to think about and find smarter solutions to speed up page load. Brad Frost created a useful little tool to help you set up performance budgets, which gives you a good impression of how much “weight” your image data may add to the page load. So how do you get something valuable out of 200 KB, when a typical hero image alone is 500 KB or more?
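To get a feeling for the numbers, here is a quick back-of-the-envelope calculation. The individual weights are made-up but realistic for the scenario above: one 500 KB hero image plus a dozen teaser images already blows past a 200 KB image budget several times over.

```javascript
// Hypothetical page inventory (weights in KB) – illustrative numbers only
var imageBudgetKB = 200;
var heroKB = 500;
var teaserCount = 12;
var avgTeaserKB = 35;

// total image weight: 500 + 12 * 35 = 920 KB
var totalKB = heroKB + teaserCount * avgTeaserKB;
var overBudget = totalKB / imageBudgetKB;

console.log(totalKB + ' KB total, ' + overBudget.toFixed(1) + 'x the budget');
// → "920 KB total, 4.6x the budget"
```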

An interesting approach to solve this

One possible solution comes from Facebook. The problem they faced in their native apps was that their “cover photos” often took a while to load, leaving the user with a less-than-ideal experience when the background suddenly changes from a solid color to an image. The (ingenious) solution was to return a tiny image (around 40 pixels wide) and then scale that tiny image up whilst applying a Gaussian blur. This instantly shows a background that looks aesthetically pleasing and gives a preview of how the cover image will look.

With a bit of CSS and some JavaScript this idea is easily ported to the web. This is the outline of how it works:

Solution draft

  1. On load, use a very tiny image (for example 40×23 pixels, 16:9). This can either be a background image or an image element.
  2. Set up some CSS rules to apply a Gaussian blur to these images.
  3. Grab the URL of the “full” image resource from a data attribute (or something similar).
  4. When the “full” image is loaded, remove the blur filter (use an animation for the removal to make it look nice).
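The markup for both variants could look like this (the file names and the data-url attribute are just examples – if the attribute is missing, the script falls back to replacing a “-small” suffix in the file name):

```html
<!-- variant 1: a tiny image element, pointing to the full resource -->
<img class="blurry" src="hero-small.jpg" data-url="hero.jpg" alt="Hero">

<!-- variant 2: a tiny background image -->
<div class="blurry" style="background-image: url(cover-small.jpg)"
	data-url="cover.jpg"></div>
```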

The CSS part:


/* animation to remove the filter */
@keyframes sharpen {
	from {
		-webkit-filter: blur(50px);
		filter: blur(50px);
	}
	to {
		-webkit-filter: blur(0);
		filter: blur(0);
	}
}

/* the gaussian filter for the tiny images */
.blurry {
	-webkit-filter: blur(50px);
	filter: blur(50px);
	-moz-transform: translateZ(0);
	-ms-transform: translateZ(0);
	-webkit-transform: translateZ(0);
	transform: translateZ(0);
}

/* class to apply the animation */
.blurry.enhanced {
	-moz-animation: sharpen 1s both;
	-webkit-animation: sharpen 1s both;
	animation: sharpen 1s both;
}

The JavaScript part:


window.onload = function load() {
	if (!('addEventListener' in window)) { return; }
	var collection = document.querySelectorAll('.blurry'),
		enhancedClass = 'enhanced';
	// loop over all blurry image elements found...
	[].forEach.call(collection, function(elm) {
			// determine the type (image or background)
		var type = (elm.tagName.toLowerCase() === 'img') ? 'src' : 'bg',
			// get the current image or background src
			imgSrc = (type === 'bg') ?
				elm.style.backgroundImage.slice(4, -1).replace(/"/g, '') :
				elm.src,
			// get the data attribute containing the path to the
			// full image, or use a name replacement as fallback
			target = (elm.dataset.url) ?
				elm.dataset.url :
				imgSrc.replace('-small', ''),
			// a new image element to preload the full resource
			img = new Image(),
			doLoad = function() {
				// if already enhanced, return
				if (elm.className.indexOf(enhancedClass) !== -1) {
					return;
				}
				// set the full image source on the element
				if (type === 'bg') {
					elm.style.backgroundImage = 'url(' + target + ')';
				} else {
					elm.src = target;
				}
				elm.className += ' ' + enhancedClass;
			};
		// once loaded, swap the src of the element
		img.onload = doLoad;
		// trigger the load
		img.src = target;
	});
};

Get the code from GitHub

This way you can easily reduce the amount of data needed for the initial load. In combination with concepts like lazy loading and responsive images, this could definitely help speed up your site.