OneImage
·
technical · webassembly · performance

Building an Enhanced Squoosh: High-Performance Local Image Compression with libimagequant-wasm

A deep dive into how we built a privacy-first, high-performance image compression tool by combining Google Squoosh's architecture with libimagequant-wasm, Web Workers, and modern web APIs.

When Google stopped actively developing Squoosh in early 2023, the web development community risked losing a valuable tool for client-side image compression. At OneImage, we saw an opportunity not just to preserve this functionality, but to enhance it. This article details the technical approach we took to build a production-ready, enhanced version of Squoosh that prioritizes privacy, performance, and developer experience.

Understanding Google Squoosh's Architecture

Google Squoosh pioneered the concept of running image encoders entirely in the browser using WebAssembly (WASM). The original architecture consisted of:

  • Client-side processing: All compression happens locally, ensuring privacy
  • WebAssembly codecs: Native-speed encoders compiled to WASM (MozJPEG, OxiPNG, WebP, AVIF)
  • Web Workers: Offloading heavy computation to prevent UI blocking
  • Canvas API: Image manipulation and preview generation

While groundbreaking, Squoosh had limitations:

  • PNG compression relied mainly on OxiPNG's lossless optimization, which prioritizes compression ratio over speed; palette-based quantization was only available as a separate manual step
  • No built-in batch processing capabilities
  • Limited preset configurations for common use cases
  • Tightly coupled UI and compression logic

Our Enhancement Strategy

1. Integrating libimagequant-wasm for Superior PNG Compression

The cornerstone of our enhancement is libimagequant-wasm, a WebAssembly port of the industry-standard pngquant library. This library uses a sophisticated color quantization algorithm that produces visually superior results compared to simple palette reduction.

Why libimagequant?

  • Perceptual quality: Uses a modified median cut algorithm optimized for human perception
  • Adaptive palettes: Generates optimal palettes of 2-256 colors based on image content
  • Transparency handling: Preserves alpha channels while compressing
  • Performance: Runs at near-native speed thanks to WASM

Implementation Details

Here's how we integrated libimagequant into our compression pipeline:

import LibImageQuant from '@fe-daily/libimagequant-wasm'; 
import * as wasmModule from '@fe-daily/libimagequant-wasm/wasm/libimagequant_wasm.js';

async function compressPNG(imageData: ImageData, level: number): Promise<Uint8Array> {
  const quantizer = new LibImageQuant({ wasmModule });
  
  // Map compression level (0-10) to color count (256-2)
  const maxColors = Math.max(2, 256 - (25.6 * level));
  
  const quantized = await quantizer.quantizeImageData(imageData, {
    maxColors: Math.floor(maxColors),
    speed: 1,          // Balance between quality and speed
    quality: {
      min: 0,
      target: 100      // Aim for highest quality within color limit
    }
  });
  
  return new Uint8Array(quantized.pngBytes);
}

Key parameters explained:

  • maxColors: Controls the palette size. Fewer colors = smaller file, but potentially worse quality
  • speed: Range 1-10, where 1 is slowest but highest quality
  • quality.target: Sets the target quality threshold (0-100)

Our Squoosh tool uses this implementation to deliver file-size reductions of roughly 60-80% with minimal perceptual loss.
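
For context, here is roughly how the function above can be driven from a decoded file. The createImageBitmap/OffscreenCanvas plumbing below is illustrative rather than our exact production code:

// Illustrative caller: decode a File to ImageData, quantize it, and wrap the result as a Blob
async function compressFileToPng(file: File, level: number): Promise<Blob> {
  const bitmap = await createImageBitmap(file);
  const canvas = new OffscreenCanvas(bitmap.width, bitmap.height);
  const ctx = canvas.getContext('2d')!;

  ctx.drawImage(bitmap, 0, 0);
  const imageData = ctx.getImageData(0, 0, bitmap.width, bitmap.height);
  bitmap.close(); // free the decoded bitmap as early as possible

  const pngBytes = await compressPNG(imageData, level);
  return new Blob([pngBytes], { type: 'image/png' });
}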

2. Architecting a Robust Web Worker System

To prevent the browser from freezing during compression (especially for large images or batch operations), we built a dedicated Web Worker architecture:

// compression-worker.ts
import { EncoderOptions, CompressResult } from './squoosh-types';
import LibImageQuant from '@fe-daily/libimagequant-wasm'; 
import * as wasmModule from '@fe-daily/libimagequant-wasm/wasm/libimagequant_wasm.js';

interface CompressMessage {
  type: 'compress';
  imageData: ImageData;
  options: EncoderOptions;
}

self.onmessage = async (e: MessageEvent<CompressMessage>) => {
  const { type, imageData, options } = e.data;

  if (type === 'compress') {
    try {
      const result = await compress(imageData, options);
      self.postMessage({ success: true, result });
    } catch (error) {
      self.postMessage({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error',
      });
    }
  }
};

async function compress(
  imageData: ImageData,
  options: EncoderOptions
): Promise<CompressResult> {
  switch (options.type) {
    case 'png':
      return await compressPNGWithQuantization(imageData, options);
    case 'jpeg':
      return await compressJPEG(imageData, options);
    case 'webp':
      return await compressWebP(imageData, options);
    case 'avif':
      return await compressAVIF(imageData, options);
  }
}

Worker benefits:

  • Non-blocking UI during compression
  • Ability to cancel long-running operations
  • Parallel processing for batch operations (multiple workers)
  • Memory isolation preventing main thread memory leaks
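
On the main thread, the worker is easiest to consume through a small promise-based wrapper. The sketch below assumes the new Worker(new URL(...)) instantiation pattern (the exact mechanism depends on your bundler setup) and mirrors the message shapes from compression-worker.ts:

function compressInWorker(
  imageData: ImageData,
  options: EncoderOptions
): Promise<CompressResult> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(
      new URL('./compression-worker.ts', import.meta.url),
      { type: 'module' }
    );

    worker.onmessage = (e) => {
      worker.terminate();
      if (e.data.success) {
        resolve(e.data.result);
      } else {
        reject(new Error(e.data.error));
      }
    };

    worker.onerror = (err) => {
      worker.terminate();
      reject(new Error(err.message));
    };

    // ImageData is structured-cloneable, so it can be posted to the worker directly
    worker.postMessage({ type: 'compress', imageData, options });
  });
}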

3. Building Smart Compression Presets

Rather than exposing raw encoder parameters, we created three presets optimized for common use cases:

const PRESET_CONFIGS = {
  highQuality: {
    png: { level: 3 },     // ~200 colors
    jpeg: { quality: 90 },
    webp: { quality: 90 },
    avif: { quality: 85 }
  },
  balanced: {
    png: { level: 5 },     // ~128 colors
    jpeg: { quality: 80 },
    webp: { quality: 80 },
    avif: { quality: 70 }
  },
  minSize: {
    png: { level: 8 },     // ~50 colors
    jpeg: { quality: 60 },
    webp: { quality: 60 },
    avif: { quality: 50 }
  }
};

These presets were calibrated through extensive testing on diverse image types (photos, illustrations, screenshots, UI elements) to find optimal quality-size trade-offs.
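
Internally, a preset is just a two-level lookup keyed by preset name and output format; a minimal (hypothetical) resolver is all the UI needs:

type PresetName = keyof typeof PRESET_CONFIGS;                  // 'highQuality' | 'balanced' | 'minSize'
type OutputFormat = keyof (typeof PRESET_CONFIGS)['balanced'];  // 'png' | 'jpeg' | 'webp' | 'avif'

function resolvePreset(preset: PresetName, format: OutputFormat) {
  return PRESET_CONFIGS[preset][format];
}

// resolvePreset('balanced', 'webp')  // → { quality: 80 }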

4. Implementing Efficient Batch Processing

For users compressing multiple images, we built a queue system with progress tracking:

class BatchCompressor {
  private queue: BatchItem[] = [];
  private activeWorkers: Set<Worker> = new Set();
  private maxConcurrency = navigator.hardwareConcurrency || 4;

  async processBatch(files: File[], options: ProcessorOptions) {
    const batchId = Date.now();
    
    for (const file of files) {
      this.queue.push({
        id: `${batchId}-${file.name}`,
        file,
        status: 'pending',
        options
      });
    }

    await this.processQueue();
  }

  private async processQueue() {
    const inFlight: Promise<void>[] = [];

    while (this.queue.some(item => item.status === 'pending')) {
      if (this.activeWorkers.size < this.maxConcurrency) {
        const item = this.queue.find(i => i.status === 'pending');
        if (item) {
          item.status = 'processing';
          // Launch without awaiting so up to maxConcurrency items run in parallel;
          // processItem registers its worker in activeWorkers before its first await
          inFlight.push(this.processItem(item));
        }
      } else {
        // All worker slots are busy; back off briefly before re-checking
        await new Promise(resolve => setTimeout(resolve, 100));
      }
    }

    // The queue is drained, but the last items may still be running
    await Promise.all(inFlight);
  }
}
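
processItem is elided above. One possible shape for it, assuming a per-item worker, that BatchItem also carries result/done/error states, and that decodeToImageData and postCompressMessage are helpers along the lines of the loading and worker-wrapper code shown elsewhere in this article (both names are hypothetical):

// Inside BatchCompressor — a possible processItem implementation
private async processItem(item: BatchItem): Promise<void> {
  const worker = new Worker(
    new URL('./compression-worker.ts', import.meta.url),
    { type: 'module' }
  );
  this.activeWorkers.add(worker); // registered synchronously, before the first await

  try {
    const imageData = await decodeToImageData(item.file);                     // hypothetical decoder
    item.result = await postCompressMessage(worker, imageData, item.options); // hypothetical wrapper
    item.status = 'done';
  } catch (err) {
    item.status = 'error';
  } finally {
    this.activeWorkers.delete(worker);
    worker.terminate();
  }
}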

This approach:

  • Maximizes CPU utilization by running parallel workers
  • Prevents browser crashes by limiting concurrency
  • Provides real-time progress updates to users

Try our batch processing in action at OneImage Squoosh.

Performance Optimizations

Memory Management

Large images can quickly exhaust browser memory. We implemented several mitigation strategies:

async function processLargeImage(file: File, options: EncoderOptions): Promise<CompressResult> {
  const MAX_DIMENSION = 4096;
  const img = await loadImage(file);
  
  // Downscale if necessary
  let { width, height } = img;
  if (width > MAX_DIMENSION || height > MAX_DIMENSION) {
    const scale = Math.min(MAX_DIMENSION / width, MAX_DIMENSION / height);
    width = Math.floor(width * scale);
    height = Math.floor(height * scale);
  }
  
  // Use OffscreenCanvas when available for better memory handling
  const canvas = typeof OffscreenCanvas !== 'undefined'
    ? new OffscreenCanvas(width, height)
    : document.createElement('canvas');
  
  canvas.width = width;
  canvas.height = height;
  
  const ctx = canvas.getContext('2d');
  if (!ctx) {
    throw new Error('Could not acquire a 2D rendering context');
  }
  ctx.drawImage(img, 0, 0, width, height);
  
  const imageData = ctx.getImageData(0, 0, width, height);
  
  // Release the object URL created when the image was loaded
  URL.revokeObjectURL(img.src);
  
  return await compress(imageData, options);
}
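
The loadImage helper used above isn't shown; a minimal version, assuming it returns an HTMLImageElement backed by an object URL (which is why the caller revokes img.src afterwards):

async function loadImage(file: File): Promise<HTMLImageElement> {
  const url = URL.createObjectURL(file);
  const img = new Image();
  img.src = url;
  await img.decode(); // resolves once the image is fully decoded and drawable
  return img;         // the caller revokes the object URL via img.src when finished
}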

WASM Module Caching

WebAssembly modules benefit from aggressive caching:

type QuantWasmModule =
  typeof import('@fe-daily/libimagequant-wasm/wasm/libimagequant_wasm.js');

let cachedWasmModule: QuantWasmModule | null = null;

async function getWasmModule() {
  if (!cachedWasmModule) {
    cachedWasmModule = await import(
      '@fe-daily/libimagequant-wasm/wasm/libimagequant_wasm.js'
    );
  }
  return cachedWasmModule;
}

This reduces initialization time from ~500ms to near-instant on subsequent compressions.

Expanding the Toolkit

While Squoosh focuses on compression, we built a full suite of complementary tools around the same core.

All tools share the same architectural principles: privacy-first, WASM-powered, and fully client-side.

Browser Extension Integration

We extended the web app architecture into a browser extension, enabling:

  • Instant access via toolbar popup
  • Context menu integration for right-click compression
  • Tab-based state management
  • Local storage for preset preferences

The extension reuses the same Web Worker and WASM infrastructure, ensuring consistent behavior across platforms.
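
As a rough sketch, the right-click integration comes down to a few context-menu calls in the Manifest V3 background service worker; the IDs and message shape here are illustrative rather than the extension's actual code:

// background.ts (MV3 service worker) — illustrative context-menu wiring
chrome.runtime.onInstalled.addListener(() => {
  chrome.contextMenus.create({
    id: 'oneimage-compress',
    title: 'Compress image with OneImage',
    contexts: ['image'],
  });
});

chrome.contextMenus.onClicked.addListener((info, tab) => {
  if (info.menuItemId === 'oneimage-compress' && info.srcUrl && tab?.id) {
    // Hand the image URL to the extension UI, which runs the shared WASM pipeline
    chrome.tabs.sendMessage(tab.id, { type: 'compress-image', srcUrl: info.srcUrl });
  }
});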

Deployment and Infrastructure

Edge Computing with Cloudflare

We deploy OneImage Squoosh to Cloudflare Pages, leveraging:

  • Global CDN for <50ms initial load times worldwide
  • HTTP/3 and Brotli compression for assets
  • Smart caching headers for WASM modules (see the example below)
  • Zero cold starts (static assets only)
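
For the caching bullet above, Cloudflare Pages reads per-path rules from a _headers file in the build output; something along these lines (the paths shown are illustrative) lets hashed WASM and static assets be cached aggressively:

# _headers — illustrative caching rules for Cloudflare Pages
/_next/static/*
  Cache-Control: public, max-age=31536000, immutable

/wasm/*
  Cache-Control: public, max-age=31536000, immutable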

Build Optimization

Our Next.js configuration includes:

// next.config.js
module.exports = {
  webpack: (config, { isServer }) => {
    // Support for .wasm files (preserve whatever experiments Next.js already sets)
    config.experiments = {
      ...config.experiments,
      asyncWebAssembly: true,
      layers: true,
    };
    
    // Optimize worker imports
    config.module.rules.push({
      test: /\.worker\.(ts|js)$/,
      use: { loader: 'worker-loader' }
    });
    
    return config;
  },
  
  // Aggressive code splitting
  experimental: {
    optimizePackageImports: [
      '@jsquash/jpeg',
      '@jsquash/png',
      '@jsquash/webp',
      '@jsquash/avif'
    ]
  }
};

This ensures encoders are loaded on-demand, keeping the initial bundle under 100KB (gzipped).
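
The on-demand loading mentioned above comes down to a dynamic import() inside each codec wrapper. A sketch for the WebP path, assuming @jsquash/webp's named encode export and its quality option (mirroring Squoosh's WebP encoder options):

async function compressWebP(
  imageData: ImageData,
  options: { quality: number }
): Promise<Uint8Array> {
  // The JS glue and WASM binary are only fetched the first time WebP output is requested
  const { encode } = await import('@jsquash/webp');
  const buffer = await encode(imageData, { quality: options.quality });
  return new Uint8Array(buffer);
}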

Testing and Quality Assurance

We maintain comprehensive test coverage for compression logic:

// __tests__/advanced-compressor.test.ts
describe('AdvancedImageCompressor', () => {
  it('should compress PNG with libimagequant', async () => {
    const compressor = new AdvancedImageCompressor();
    const mockFile = createMockImageFile('test.png', 1000, 1000);
    
    const result = await compressor.compress(mockFile, {
      encode: { type: 'png', options: { level: 5 } }
    });
    
    expect(result.format).toBe('png');
    expect(result.size).toBeLessThan(mockFile.size);
    expect(result.data).toBeInstanceOf(Uint8Array);
  });
  
  it('should handle batch processing with concurrency', async () => {
    const files = Array(10).fill(null).map((_, i) => 
      createMockImageFile(`test-${i}.png`, 500, 500)
    );
    
    const startTime = Date.now();
    await batchCompress(files, { preset: 'balanced' });
    const duration = Date.now() - startTime;
    
    // Should be faster than sequential processing
    expect(duration).toBeLessThan(10 * 1000); // <1s per image
  });
});

Lessons Learned

  1. WASM is production-ready: With proper module loading and caching, WASM performance rivals native applications
  2. Web Workers are essential: For any CPU-intensive task, offloading to workers is non-negotiable
  3. User presets > raw controls: Most users prefer "good defaults" over granular tuning
  4. Memory matters: Always profile memory usage on large images and implement safeguards
  5. Privacy sells: Emphasizing "no server uploads" resonates strongly with users

Open Source and Community

While OneImage Squoosh is a commercial product, we contribute to the ecosystem:

  • Bug reports and PRs to @jsquash maintainers
  • Documentation improvements for libimagequant-wasm
  • Sharing performance benchmarks and best practices

Conclusion

Building an enhanced Squoosh required more than just integrating libimagequant-wasm. It demanded careful architectural decisions around Web Workers, memory management, user experience, and deployment infrastructure. The result is a tool that respects user privacy while delivering professional-grade compression performance.

Try OneImage Squoosh today, or explore our browser extension for even faster access. For developers building similar tools, we hope this technical deep-dive provides a useful blueprint.
