A curling green ocean wave

In an upcoming image gallery article, I wanted to ensure that the images were treated with progressive enhancement to create an increasingly animated, accessible, high-fidelity presentation.

The Chicken & Egg Conundrum

JavaScript can’t be guaranteed to “hide” HTML images before they load: if a script fails to execute in time, a “flash of content” may appear as images load and are then hidden by the script. Conversely, if images are loaded solely via JavaScript and scripting is blocked in the browser or the code fails to work, the page is left bare of any content at all.

A combination of new and old techniques is traditionally used to solve the problem: a 1px × 1px transparent “filler” GIF is placed as the src of each image, while the actual image path is applied as the value of a data- attribute:

<img src="1x1.gif" class="lazy" data-src="real-image.jpg" alt>

Then JavaScript replaces the former with the latter, often initiated by a scroll event:

var lazy = document.getElementsByClassName('lazy');
for (var i = 0; i < lazy.length; i++) {
  lazy[i].src = lazy[i].getAttribute('data-src');
}
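In practice the swap is usually wrapped in a function and bound to a scroll listener. A minimal sketch of that wiring, assuming a browser environment (the function name is my own, and a real lazy-loader would also check each image’s position before swapping):

```javascript
// Copy each element's data-src into its src. Factored into a
// plain function so the swap logic itself can run anywhere;
// only the event wiring below assumes a browser.
function swapLazySources(elements) {
  for (var i = 0; i < elements.length; i++) {
    elements[i].src = elements[i].getAttribute('data-src');
  }
}

// Hypothetical browser wiring: swap everything on first scroll.
if (typeof window !== 'undefined') {
  window.addEventListener('scroll', function () {
    swapLazySources(document.getElementsByClassName('lazy'));
  });
}
```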

However, this doesn’t solve the problem of JavaScript being blocked. To address that, a <noscript> tag containing the real images is placed on the same page:

    <noscript>
        <img src="real-image.jpg" alt>
    </noscript>

The <noscript> content is shown if JavaScript is not supported. If JavaScript is running, the filler GIFs are replaced by the actual images, and the <noscript> content is ignored.

However, this technique has two disadvantages:

  1. The content on the page is repeated twice (once in the data-src images, and again inside the <noscript> tag), complicating code maintenance.
  2. There’s no fallback if the browser supports JavaScript but there is an error in your code: in that case, the <noscript> will not appear, and the image swap will not take place, leaving the page blank.

An Alternative

Over the weekend, I developed an alternative to this approach which works very well, provided the browser supports CSS animations. My particular use case was to load the images onto the page but to have them remain invisible if JavaScript was running.

The images are placed as usual on the page: in this case I’ve placed them inside a <div> with two classes; I’ve left alt values empty for clarity:

<div class="shuffle reveal">
    <img src="umbrellas.jpg" alt>
    <img src="shopping-at-night.jpg" alt>
    <img src="lanterns.jpg" alt>
    <img src="outdoor-dining.jpg" alt>
    <img src="blade-runner.jpg" alt>
    <img src="square-umbrellas.jpg" alt>
</div>

The images are set to an opacity of 0:

.shuffle img {
  width: 33%;
  opacity: 0;
}

The images also have an animation applied that will bring them into full opacity after a 1 second delay: not long enough for anyone to get concerned or confused, but enough time for the JavaScript that follows to do its work:

@keyframes reveal {
  to {
    opacity: 1;
  }
}

.reveal img {
  animation: reveal 1s 1s forwards;
}

The animation can be made more like the final, JavaScript enhanced version of the gallery by adding a sequential fade-in with CSS:

.reveal img:nth-child(1) { animation-delay: .5s; }
.reveal img:nth-child(2) { animation-delay: 1s; }
.reveal img:nth-child(3) { animation-delay: 1.5s; }
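For a long gallery, writing one rule per image gets tedious; the same pattern can be generated instead. A sketch (the helper name is hypothetical; in a real project you might emit these rules from a Sass loop rather than JavaScript):

```javascript
// Build sequential animation-delay rules for a reveal gallery:
// image n (1-based) gets a delay of n × step seconds.
function sequentialDelays(count, step) {
  var rules = [];
  for (var i = 1; i <= count; i++) {
    rules.push(
      '.reveal img:nth-child(' + i + ') { animation-delay: ' + (i * step) + 's; }'
    );
  }
  return rules.join('\n');
}

// sequentialDelays(3, 0.5) produces the three rules shown above,
// with delays of 0.5s, 1s and 1.5s.
```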

The JavaScript

The associated script finds the element with the reveal class using querySelector and removes that class with classList, which stops the animation from occurring:

var reveal = document.querySelector(".reveal");
reveal.classList.remove("reveal");

However, the images inside the referenced element are still available to the script:

var revealedImages = reveal.querySelectorAll("img");

…meaning that the images will be loaded, but remain hidden until they are manipulated with JavaScript.
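As one illustration of that manipulation, the script could stagger the reveal itself, mirroring the CSS delays. This is a hypothetical sketch, not the gallery code itself; the delay calculation is factored out so it can be checked outside a browser:

```javascript
// Delay in seconds for the image at a given zero-based index,
// matching the CSS pattern above: first image at `step`, then
// increasing by `step` each time.
function staggeredDelay(index, step) {
  return (index + 1) * step;
}

// Browser wiring (assumes the reveal class has already been
// removed, so the CSS animation never fires):
if (typeof document !== 'undefined') {
  var imgs = document.querySelectorAll('.shuffle img');
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].style.transition = 'opacity .5s ' + staggeredDelay(i, 0.5) + 's';
    imgs[i].style.opacity = 1;
  }
}
```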


If JavaScript is not supported or the code fails, the images will appear after one second, fading in with the animation, which resembles the full JavaScript version (the fade is sequential, if you’ve chosen to add the extra CSS). If the JavaScript works, the images will remain hidden, waiting for the script to modify them.

I’ve tested the result after throttling the browser down to 2G speeds, and the code works very well. The one aspect I don’t like is the timed nature of the CSS: if anything delays the JavaScript from running in time, the animation will fire anyway. However, a page built to modern performance standards should do well, so I hope it might be a useful technique for you too.

Photograph by Phillip Gibbs, licensed under Creative Commons

Enjoy this piece? I invite you to follow me at twitter.com/dudleystorey to learn more.