Slimming Down the Gallery: Build-Time Thumbnail Preprocessing in Practice
My personal website, yuxu.ge, is a place for my thoughts, projects, and—importantly—my photos. The gallery is one of my favorite sections, a visual diary of travels and moments. But recently, I noticed it was getting heavy. Really heavy. A quick peek into the browser's developer tools confirmed my suspicion: the gallery page was a behemoth, and it was time to put it on a diet.
This is the story of how I implemented a build-time thumbnail preprocessing pipeline that cut my gallery's first-screen load from a staggering 8.5MB to a lean 0.7MB—a ~91% improvement—with zero runtime cost.
The Problem: Serving Full-Course Meals as Appetizers
The core issue was simple: I was serving full-resolution images for the thumbnail grid. My photography workflow involves exporting images from my Sony A7C at a reasonable web resolution, typically around 2000px wide. After running them through an initial compression script, they land at an average of ~469KB each.
That's perfectly fine for a full-screen lightbox view, but completely unnecessary for a thumbnail grid where each image is displayed in a cell that's maybe 300px wide on a desktop monitor.
Let's do the math. The initial view of my gallery loads about 18 images.
18 images × 469 KB/image ≈ 8,442 KB
That's ~8.5MB just to render the initial grid. Ouch. On a fast connection, it's a noticeable delay. On a mobile network, it's a data-guzzling, battery-draining disaster.
The problem wasn't confined to the gallery page. The homepage features a "Life Moments" film strip, a little carousel of recent photos. The cells there are even smaller, just 120×80px. Yet, it was also pulling down the same ~469KB originals. It was inefficient, wasteful, and delivering a subpar user experience.
The Solution: Pre-computation is the Best Computation
For a static site, the best optimizations are the ones you can do at build time. There's no need for a fancy on-demand image resizing service or a complex server-side setup. The solution was clear: generate smaller versions—thumbnails—of each image when I build the site.
Here was the plan:
- Generate Thumbnails: Create a second, smaller version of every photo. I settled on a width of 600px and a JPEG quality of 75%. This size is large enough to look crisp on high-density displays (2x for a 300px cell) but small enough to be lightweight.
- Update the Data Source: Modify the script that builds the JSON data for my gallery to include paths to these new thumbnails.
- Update the Frontend: Change the gallery's HTML to load the thumbnails in the grid view, while keeping the full-resolution images for the lightbox click-through.
The results of this approach were immediately promising. The new 600px thumbnails clocked in at an average of just ~25KB, a staggering 94.7% reduction per image. This was the key to unlocking massive performance gains.
Implementation 1: Beefing Up the Bash Script
I already had a compress-photos.sh script that handled my initial image processing (resizing to 2000px, setting quality to 85%). I decided to extend this script to generate the thumbnails in a second pass. This keeps all image-related logic in one place.
The core of the logic uses ImageMagick, a command-line powerhouse for image manipulation.
# Thumbnail settings
THUMB_WIDTH=600
THUMB_QUALITY=75
# ... inside the main loop iterating over each image `img` ...
# Naming: insert -thumb before extension
# e.g., A7C00748.JPG → A7C00748-thumb.JPG
ext="${img##*.}"
base="${img%.*}"
thumb_file="${base}-thumb.${ext}"
# Generate the thumbnail
magick "$img" \
  -auto-orient \
  -resize "${THUMB_WIDTH}x>" \
  -quality "$THUMB_QUALITY" \
  -strip \
  -interlace Plane \
  "$thumb_file"
Let's break down those magick flags:
- -auto-orient: Reads EXIF data from the camera to automatically rotate the image to the correct orientation. Essential for photos taken in portrait mode.
- -resize "${THUMB_WIDTH}x>": A clever bit of ImageMagick geometry syntax. It resizes the image to a width of 600px. The > suffix means the resize only happens if the image is larger than the target width, preventing upscaling of smaller images. The x with no height value preserves the aspect ratio.
- -quality "$THUMB_QUALITY": Sets the JPEG compression quality. 75 is a great sweet spot between file size and visual fidelity for web thumbnails.
- -strip: Removes all profile and comment data (like EXIF) from the image. That metadata is useful in the original but just dead weight in a thumbnail.
- -interlace Plane: Creates a progressive JPEG, which loads in successive passes from blurry to sharp and can improve perceived performance for users on slower connections.
To make the script efficient, I added a simple caching mechanism. It touches a file named .thumb-cache in each directory after processing. On subsequent runs, if this cache file exists, the script skips that directory entirely. This is perfect for a build process, as it avoids regenerating thousands of thumbnails every time I add one new photo album.
I also added --force and --dry-run flags for better control—--force to ignore the cache and regenerate everything, and --dry-run to see what the script would do without actually touching any files. Finally, I made sure the script filters out any files that already have -thumb in their name, preventing it from creating thumbnails of thumbnails recursively.
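Putting those pieces together, the control flow looks roughly like this. This is a condensed sketch, not the script verbatim: only the --force and --dry-run flag names come from the text above; the helper name and structure are illustrative.

```shell
#!/usr/bin/env bash
# Condensed sketch of the cache / flag / filter flow described above.
FORCE=0
DRY_RUN=0
for arg in "$@"; do
  case "$arg" in
    --force)   FORCE=1 ;;
    --dry-run) DRY_RUN=1 ;;
  esac
done

process_dir() {
  local dir="$1"
  # Cache hit: skip the whole directory unless --force was given.
  if [[ -f "$dir/.thumb-cache" && "$FORCE" -eq 0 ]]; then
    echo "skip (cached): $dir"
    return 0
  fi
  for img in "$dir"/*; do
    [[ -f "$img" ]] || continue
    # Only process JPEGs, and never thumbnail a thumbnail.
    [[ "$img" =~ \.[Jj][Pp][Ee]?[Gg]$ ]] || continue
    [[ "$img" == *-thumb.* ]] && continue
    if [[ "$DRY_RUN" -eq 1 ]]; then
      echo "would thumbnail: $img"
    else
      echo "thumbnail: $img"   # the real script invokes magick here
    fi
  done
  # Mark the directory as processed for future runs.
  if [[ "$DRY_RUN" -eq 0 ]]; then
    touch "$dir/.thumb-cache"
  fi
}
```

One nice property of the single .thumb-cache marker per directory: the skip check is one file stat per album, not one per photo.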
Implementation 2: Updating the Photo JSON
My site's frontend is powered by a JavaScript file, build-photos-json.js, which scans the photo directories and generates a photos.json file. The gallery page fetches this JSON to know what to display.
I needed to update this script to be aware of the new thumbnails. The goal was to produce a JSON structure where each album had parallel arrays for full-size images and their corresponding thumbs.
The logic for constructing the thumbnail path is straightforward string manipulation:
// Inside a loop processing each photo group (year/event)
groupThumbs = groupFiles.map(img => {
  const dot = img.lastIndexOf('.');
  return `/_content/photos/${year}/${event}/${img.substring(0, dot)}-thumb.${img.substring(dot + 1)}`;
});
This takes a filename like A7C00748.JPG, finds the last dot, and injects -thumb right before it. The resulting JSON output for an album now looks like this:
{
  "year": "2026",
  "location": "Newcastle upon Tyne",
  "images": ["/_content/photos/2026/20260328-Newcastle/A7C00748.JPG"],
  "thumbs": ["/_content/photos/2026/20260328-Newcastle/A7C00748-thumb.JPG"]
}
This clean, parallel structure makes it trivial for the frontend to access either the thumbnail or the original image.
Implementation 3: The Frontend (Gallery & Homepage)
With the thumbnails generated and the JSON updated, the final step was to update the HTML templates.
Gallery Page
In gallery/index.html, the code that renders the grid cells was changed to use the thumb path for the src attribute. The full-resolution img path is passed to the lightbox for the click-through view.
I also added a simple but effective fallback mechanism using the onerror attribute:
<img src="${thumb}" alt="${group.location}" loading="lazy"
onerror="this.onerror=null;this.src='${encodedImg}'">
This is a neat little trick for robustness. If a thumbnail fails to load for any reason (maybe the build script failed for that one image, or there's a network glitch), the browser will execute the onerror code. This code does two things:
- this.onerror=null;: Removes the onerror handler to prevent an infinite loop if the fallback image also fails to load.
- this.src='${encodedImg}': Tells the <img> tag to try loading the full-resolution original instead.
The user might see a slightly longer load time for that one image, but the gallery remains functional. It's a graceful failure.
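Put together, the cell rendering amounts to a small template function. The names below (renderCell, openLightbox) are placeholders of mine for illustration, not the gallery's actual code:

```javascript
// Illustrative grid-cell renderer; function and handler names are
// placeholders, not the site's actual code.
function renderCell(group, i) {
  const encodedImg = encodeURI(group.images[i]);
  // Prefer the thumbnail; fall back to the original if none exists.
  const thumb = (group.thumbs && group.thumbs[i]) || group.images[i];
  return `<a href="${encodedImg}" onclick="openLightbox('${encodedImg}'); return false;">
  <img src="${encodeURI(thumb)}" alt="${group.location}" loading="lazy"
       onerror="this.onerror=null;this.src='${encodedImg}'">
</a>`;
}
```

A side benefit of this shape: the anchor's href doubles as a no-JavaScript fallback, since clicking still opens the original image even if the lightbox handler never runs.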
Homepage Film Strip
The "Life Moments" film strip on the homepage got a similar treatment. The logic that prepares the flat list of all photos now intelligently picks the thumbnail if it exists, falling back to the original if it doesn't.
let allPairs = photos.flatMap(p =>
  p.images.map((img, i) => ({
    thumb: p.thumbs && p.thumbs[i] ? p.thumbs[i] : img,
    original: img
  }))
);
Since the film strip cells are tiny (120×80px), the 600px thumbnails are more than enough to look sharp, while being a fraction of the original's file size.
Bonus Round: The Avatar
This optimization journey didn't stop at the gallery. My avatar image, displayed in the site's header and about page, was a 352KB file. It was needlessly large for its small display size. I applied the same thumbnailing logic, creating a 32KB version. Another quick 91% reduction!
The Pitfall: A Tale of Two File Systems
Everything seemed perfect... until I deployed. On my local macOS machine, all the thumbnails loaded correctly. On my Linux production server, they were all broken.
After some head-scratching, I found the culprit: case-sensitive file extensions.
My camera saves files with an uppercase .JPG extension. My original JavaScript code in build-photos-json.js used Node's path module to construct the thumbnail names:
// The old, buggy way
const ext = path.extname(img).toLowerCase(); // -> '.jpg' (lowercased!)
const base = path.basename(img, path.extname(img)); // -> 'A7C00748'
const thumbFile = `${base}-thumb${ext}`; // -> 'A7C00748-thumb.jpg'
The problem is that this code lowered the extension's case while building the name. (Node's path.extname itself preserves case; the .toLowerCase() normalization is where the damage happened.) My bash script, however, preserved the original case (A7C00748-thumb.JPG).
macOS, by default, uses a case-insensitive file system (APFS). So, a request for ...-thumb.jpg would happily find the file ...-thumb.JPG. Linux, however, uses case-sensitive file systems like ext4. To Linux, .jpg and .JPG are completely different files. The browser requested the lowercase version, the server couldn't find it, and I got a sea of broken images.
The fix was to abandon the path module for this specific task and use simple string manipulation, which doesn't alter the case:
// The new, robust way
const dot = img.lastIndexOf('.');
return `.../${img.substring(0, dot)}-thumb.${img.substring(dot + 1)}`;
// -> '.../A7C00748-thumb.JPG' ✓
This preserved the original .JPG casing, matched the files generated by the bash script, and fixed everything on the production server. It was a classic "works on my machine" bug and a great reminder of the subtle differences between development and production environments.
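For reuse (and easy unit testing), the case-preserving logic can live in a tiny helper. The function name here is mine, not the script's:

```javascript
// Case-preserving "-thumb" suffix insertion: plain string slicing,
// so '.JPG' stays '.JPG'. Hypothetical helper, named for illustration.
function thumbName(filename) {
  const dot = filename.lastIndexOf('.');
  if (dot === -1) return `${filename}-thumb`; // no extension: just append
  return `${filename.substring(0, dot)}-thumb.${filename.substring(dot + 1)}`;
}
```

thumbName('A7C00748.JPG') yields 'A7C00748-thumb.JPG', uppercase extension intact.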
The Results: By the Numbers
The impact of this change was immediate and dramatic.
| Scenario | Before | After | Reduction |
|---|---|---|---|
| Gallery first screen (~18 images) | ~8.5MB | ~0.7MB | ~91% |
| Per image average | ~469KB | ~25KB | ~94.7% |
| Homepage film strip (30 images) | ~14MB | ~0.75MB | ~94.6% |
| Avatar | 352KB | 32KB | 91% |
A more than tenfold reduction in payload size is not a micro-optimization; it's a fundamental improvement to the user experience. Pages feel snappy, mobile users are spared massive data downloads, and the whole site feels more professional and polished.
Conclusion
Implementing a build-time thumbnail pipeline was one of the highest-impact optimizations I've done for my personal site. By pre-processing images when the site is built, I offloaded all the performance cost from the user's browser, resulting in a drastically faster experience with zero runtime overhead.
The approach is simple, robust (thanks to the onerror fallback), and easily generalizable to any image-heavy static site. It's a powerful reminder that sometimes the most effective solutions aren't about complex frameworks or expensive services, but about doing simple work at the right time.