I fell down a rabbit hole last weekend and came back with a thing: CETsize — a tiny web app that measures whale body length from drone photos, right in your browser, no uploads, no servers.

Live: whales.luisg.me

Here’s a quick peek at the app UI — load a photo, zoom/pan, and lay down straight or curved measurements:

[Screenshot: CETsize UI showing the loaded image, toolbar, and measurements]

This started with Jan Hofman’s R program (massive shout‑out 🙌). I loved the idea and wanted something biologists could run anywhere without installing R or touching a terminal. Also… I’ve been exploring agents + GitHub Copilot, so this was the perfect excuse to see how far I could push a browser‑only build.

The plan

  • Keep everything client‑side (privacy first).
  • Read the metadata we already have (EXIF/XMP/DNG).
  • Do the math once and make the UI feel nice (zoom/pan, stable handles).
  • Ship it fast on Cloudflare Pages.

What it actually does

  • Parses image metadata in‑browser (see the sketch after this list):
    • EXIF: GPSAltitude (MSL), focal length, image width
    • DJI XMP: RelativeAltitude and other goodies when present
    • DNG/TIFF: basic tag parsing + preview extraction
  • Measures:
    • Straight distance (two points)
    • Curved length (polyline) — handy for following body curvature
  • Stays readable at any zoom (handles/lines don’t scale visually)
  • Exports results as JSON or CSV
  • Has a built‑in sensor width cheat‑sheet (by camera/module codes like DJI FC####), with manual override
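
To give a flavor of the DJI XMP part: DJI drones embed an XMP packet in the JPEG that usually carries a drone-dji:RelativeAltitude attribute, so a plain text scan is enough to recover the height above the take‑off point. Here’s a minimal sketch of that idea (illustrative only; not the app’s actual parser, and readRelativeAltitude is a made‑up name):

```ts
// Minimal sketch: pull DJI's RelativeAltitude out of the XMP packet in a JPEG,
// entirely in the browser. A real parser would scan only the APP1/XMP segment
// instead of decoding the whole file as text, but this shows the idea.
async function readRelativeAltitude(file: File): Promise<number | null> {
  const text = new TextDecoder("latin1").decode(await file.arrayBuffer());
  const match = text.match(/drone-dji:RelativeAltitude="([-+]?\d+(?:\.\d+)?)"/);
  return match ? parseFloat(match[1]) : null;
}
```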

The math (aka turning pixels into meters)

Take altitude, focal length, sensor width, and image width, and produce a meters‑per‑pixel scale. The app prefers above‑ground altitude (AGL, e.g. DJI’s RelativeAltitude) when available; otherwise it falls back to EXIF GPSAltitude (MSL) with a clear hint to double‑check.

This is the panel where you can review/override camera parameters (sensor width, focal length, altitude) when metadata is missing or needs correction:

[Screenshot: camera parameters panel where sensor width, focal length, and altitude can be edited]

Roughly:

meters_per_px = altitude_m * (sensor_width_mm / focal_length_mm) / image_width_px

All calculations happen in original image pixels to avoid rounding issues. The overlay is SVG on top, but the math stays in image space.
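
In code, the whole scale step plus the curved‑length measurement boils down to a few lines. This is a sketch with illustrative names and types, not the app’s internals:

```ts
// All points are in original image pixel coordinates, matching the note above.
interface Point { x: number; y: number }

// Ground scale: meters of sea surface covered by one image pixel.
// altitudeM should be height above the water (AGL) when available,
// otherwise EXIF GPSAltitude (MSL) with the usual caveats.
function metersPerPixel(
  altitudeM: number,
  sensorWidthMm: number,
  focalLengthMm: number,
  imageWidthPx: number
): number {
  return (altitudeM * (sensorWidthMm / focalLengthMm)) / imageWidthPx;
}

// Curved length: sum the straight segments along the polyline, then scale.
function polylineLengthM(points: Point[], mPerPx: number): number {
  let px = 0;
  for (let i = 1; i < points.length; i++) {
    px += Math.hypot(points[i].x - points[i - 1].x, points[i].y - points[i - 1].y);
  }
  return px * mPerPx;
}
```

With plausible numbers (40 m altitude, 13.2 mm sensor, 8.8 mm focal length, 5472 px wide image) that comes out to roughly 0.011 m per pixel, so a 1000 px polyline is about 11 m of whale.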

The fun bits

  • Zoom/pan that feels natural (cursor‑centered zoom; sketched just after this list).
  • Non‑scaling UI: control points and stroke widths stay readable at any zoom.
  • Gentle “guard rails”: if we can’t find sensor width or focal length, the app asks — you’re always in control.
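
The cursor‑centered zoom is mostly one small piece of algebra: keep the image point under the cursor fixed while the scale changes. A sketch, with illustrative names rather than the app’s actual code:

```ts
// View transform: screen = image * scale + (tx, ty).
interface View { scale: number; tx: number; ty: number }

// Zoom by `factor` while keeping the image point under the cursor fixed,
// i.e. solve for the new translation so the cursor still maps to that point.
function zoomAtCursor(view: View, cursorX: number, cursorY: number, factor: number): View {
  return {
    scale: view.scale * factor,
    tx: cursorX - (cursorX - view.tx) * factor,
    ty: cursorY - (cursorY - view.ty) * factor,
  };
}
```

For the non‑scaling handles and strokes, SVG’s vector-effect="non-scaling-stroke" (plus sizing handle radii in screen units) is one common way to get that behavior; I’m not claiming that’s exactly what CETsize does under the hood.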

Agents + Copilot as co‑pilots

I used agents and GitHub Copilot to:

  • Scaffold components fast (the measurement overlay came together in hours, not days).
  • Iterate on EXIF/XMP/DNG parsing without yak‑shaving over libraries.
  • Fuzz edge cases (missing EXIF, 35mm‑eq only, negative altitudes).
  • Draft tests and error messages, then I tightened them up.

The human bits were the important ones: picking the architecture, validating the math, and polishing the UX.

Tech + deployment

  • Frontend: React + SVG overlay
  • Parsing: lightweight EXIF/XMP/DNG utilities tailored for the browser
  • Node 18+ toolchain
  • Cloudflare Pages for deploys (push → build → live)

What’s next

  • More camera presets
  • Optional annotated export (image + overlay)
  • Mobile polish
  • A short guide for consistent field workflows

If you work with drone imagery or marine biology and have ideas, I’d love to hear them.

Live app: whales.luisg.me

Huge thanks again to Jan Hofman for the original R inspiration. This was a fun weekend ship — and a neat exploration of what orchestrating an agentic LLM feels like.