Category: "HTML / CSS"

Browser Cache Management - Ensure Updates

This post describes a couple of ways to ensure the JavaScript and CSS files are re-requested by the browser when a new release is distributed.

The first approach is to prefix the file names with a version-release string, and create symlinks to the files during installation or the first execution. Many systems have a version identification mechanism.

To manage the symlinks, the following could be used:


#!/bin/bash

# Remove the old version-prefixed symlinks for files with extension $1
fclearold()
{
        echo 'fclearold'
        for g in $(find . -maxdepth 1 -name "*.$1" -type l); do
                echo "$g"
                rm -f "$g"
        done
}

# Create version-prefixed symlinks ($2.file.$1) pointing at the real files
fcreatenew()
{
        echo 'fcreatenew'
        for g in *.$1; do
                ln -s "$g" "$2.$g"
        done
}

version=`cat "$BASE/config/version"`
for f in 'js' 'css'; do
        echo $f
        pushd $f > /dev/null
        fclearold $f
        echo $version;
        fcreatenew $f $version
        popd > /dev/null
done

fclearold removes the old symlinks and fcreatenew makes new ones. It is assumed the JavaScript is in the js directory and all JavaScript files have a .js extension, and the CSS is in the css directory and all CSS files have a .css extension.
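
For illustration (the version string 2.1.0 and the file name code.js are made up), after the script runs the js directory contains something like:

js/code.js                      # real file
js/2.1.0.code.js -> code.js     # symlink created by fcreatenew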

httpd.conf (or equivalent)

# Cache js and CSS files for 6 months - with a timestamp prefix
# Requires mod_expires, mod_headers and mod_rewrite
<FilesMatch "\.(js|css)$">
  ExpiresActive On
  ExpiresDefault "access plus 6 months"
  Header set Cache-Control "max-age=15552000"
  RewriteEngine On
  # Strip the leading numeric prefix so the prefixed name maps to the real file
  RewriteRule (.*)/[0-9]+\.(.*)$ $1/$2
  FileETag MTime
</FilesMatch>

Timestamp Management Code (PHP)


        // Return the release timestamp, creating and caching it on first use
        function sTimestamp()
        {
                $sTimestampFile = 'cache/timestamp';
                if (is_file($sTimestampFile))
                        $sRetVal = file_get_contents($sTimestampFile);
                else
                {
                        $sRetVal = time();
                        file_put_contents($sTimestampFile,$sRetVal);
                }
                return $sRetVal;
        }

Once the timestamp is set, it is cached in the cache/timestamp file. In this system, the cache is cleared when new releases are installed, so the absence of the timestamp file ensures an update and a new set of requested files.

The timestamp can be applied to .js and .css file requests like so:

<script type="text/javascript" src="js/<?php echo sTimestamp() ?>.code.js"></script>
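
As a sketch of the round trip (the timestamp value is illustrative), the rendered page requests the prefixed name and the rewrite rule maps it back to the real file:

<!-- Rendered HTML - the browser requests js/1234567890.code.js -->
<script type="text/javascript" src="js/1234567890.code.js"></script>
<!-- The RewriteRule above strips the numeric prefix, so Apache serves js/code.js;
     after a release, a new timestamp produces a new URL and bypasses the cached copy -->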

Zend Framework - Navigation View Helper - Active Style Rules

The objective of these rules is to ensure only the active link is highlighted.

The key is the ‘element>element’ (child) selector, which indicates that only a tags that are direct children of li.active will be affected by the rule; links nested deeper in the menu keep the default color.

#mainNav a
{
color:#888;
}
#mainNav li.active>a
{
color:#000;
}
#mainNav a:hover
{
color:#444;
}
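
For reference, here is a sketch of the kind of markup these rules assume (the structure is illustrative, not the helper's exact output):

<ul id="mainNav">
  <li class="active">
    <a href="/products">Products</a>                       <!-- direct child of li.active: #000 -->
    <ul>
      <li><a href="/products/widgets">Widgets</a></li>     <!-- nested link: stays #888 -->
    </ul>
  </li>
  <li><a href="/about">About</a></li>                      <!-- inactive link: #888 -->
</ul>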

Quick Sprite Builder

Sprites allow you to combine many images into a single file, reducing both the number of requests and bandwidth required to deliver pages.

I had 51 images, each was about 10K, so the total was about 510K.

These images had dimensions of about 250px width and 125px height.

I wanted to combine them all into a single image, and generate the CSS to compute the offsets into the sprite.


#!/bin/bash

# Remove the prior appended image
rm -f appended.jpg

# Create a list of all the original jpg files
ls *.jpg > jpg_files

# Resize all the images to ensure they have the same height and width
for f in $(cat jpg_files); do
        convert "$f" -resize 250x125! +repage "$f";
done

# Break the list into rows of 10 images
split -l 10 jpg_files jpg_row.

# Generate the ImageMagick command to append the images into rows 
c=convert
for f in jpg_row.*; do
        h=`cat "$f" | tr '\n' ' '`
        c="$c"' ( '"$h"' +append ) '
done
# Combine the rows into the appended image, reduce the quality to save space
c="$c"' -background transparent -append -quality 70  appended.jpg'
$c

echo '.tag{height:125px;width:250px;overflow:hidden;background-color:transparent;background-image:url("appended.jpg");}' > ap.css

# Generate the CSS
r=0
for f in jpg_row.*; do
        c=0
        for g in $(cut -f 1 -d '.' "$f"); do
                echo ."$g"'{background-position:-'$((c*250))'px -'$((r*125))'px;}' >> ap.css
                c=$((c+1))
        done
        r=$((r+1))
done

The final image was about 260K, still large, but the quality is good. Compressed for transfer, this image will serve well.

This code isn’t generalized; if you would like to use it, you’ll need to adjust the image dimensions and the numbers used to calculate the offsets.
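
As an example of using the output (the file name beach.jpg is made up), an original image named beach.jpg produces a .beach rule in ap.css, and that sprite cell is displayed with:

<link rel="stylesheet" type="text/css" href="ap.css" />
<div class="tag beach"></div>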

Great New Web Resource

CoderZone.org launched recently.

It’s a great new resource for web people, from ‘n00bs’ to ‘w00ts’. What makes it special:

  • A great team of moderators. These guys are experienced and know the web.
  • A library of code snippets, little bits of code that will save you a tremendous amount of time. You can contribute code, too.
  • XHTML/HTML & CSS sandboxes so you can test out ideas quickly.
  • An SQL sandbox for testing queries.
  • It’s free.
  • A very cool design.
  • No ads; the forum is there to help people, not distract you with ads you aren’t going to click on anyway.

An Ounce of Performance and Prevention

The Wicked Good Gallery was a lot of fun to build. It meets the stated requirements, I like the design, and I learned some good stuff.

One issue that always concerns me is performance. The Wicked Good Gallery demo has seven images, and on each page load it reads the image directories, tests for a thumbnail and a detailed image, and if they don’t exist, creates them from the first .jpg it finds in the directory. The issue is that in most cases, once images are posted, they won’t change. This isn’t a versioned gallery; only the last image uploaded will be offered to site visitors.

For that reason, you can assume the only time you need to test for new files is when the list of directories changes. You could go further and rebuild only the new directories, but then content is tested only in those new directories, and changes to images in existing directories are never refreshed.

Therefore, a simple caching solution was implemented. When the page loads, the code tests for the presence of a cache file. If the file is present, it compares the directories listed in the cache file to those it reads on the server. If they match, the cached list of artwork is used; otherwise, the cache is rebuilt.

$bUpdateCache=false;
$aDirectoryCache=array();
// $sDirectoryCache, $sArtworkDir, $sArt, $sThumb and the image heights are set elsewhere
if (is_file($sDirectoryCache))
        include $sDirectoryCache;   // defines $aDirectoryCache and $aPieces
else
        $bUpdateCache=true;
$aDirectories=glob($sArtworkDir.'/*');
if ($aDirectoryCache!=$aDirectories)
{
        // The directory list changed - rebuild the list of pieces from scratch
        $aPieces=array();
        foreach ($aDirectories as $k => $v)
        {
                if (is_file($v.'/'.$sArt) && is_file($v.'/'.$sThumb))
                        $aPieces[]=$v;
                else
                        if (make_images($v,$sArt,$iArtHeight,$sThumb,$iThumbHeight))
                                $aPieces[]=$v;
        }
        $bUpdateCache=true;
}
else
        $aDirectories=$aDirectoryCache;
if ($bUpdateCache)
        file_put_contents($sDirectoryCache,'<?php $aDirectoryCache='.var_export($aDirectories,true).';'.PHP_EOL.
                '$aPieces='.var_export($aPieces,true).'; ?>');

There was about a 0.0002 second improvement in performance. Although that sounds trivial, more rigorous testing, with more images, would likely justify the implementation. There is a performance penalty when the cache has to be rebuilt, but it’s very small, since the cache is written with var_export.

The prevention added was a base tag (http://www.w3schools.com/TAGS/tag_base.asp). This tag ensures all files are drawn from the correct directories, while still allowing relative references in the code.


<base href="http://wirehopper.com/wickedgoodgallery/" />
<link rel="stylesheet" type="text/css" href="css/style.css" />

In this case, the base URL is http://wirehopper.com/wickedgoodgallery/, so the stylesheet is drawn from http://wirehopper.com/wickedgoodgallery/css/style.css. Regardless of any extra slashes submitted in the URI, the page will display properly, and future references derived from page navigation (clicking on images and links) will be valid.