A shadow is the cheapest witness in any investigation. It can't be bribed, it doesn't get tired, and it has no political opinions — it just falls where physics tells it to fall. If a photo claims to be from Donetsk at 11:40 in July but the shadows are pointing the wrong way, you don't need a confession. The sun already gave you one.
This is GEOINT shadow analysis — the sub-discipline of geospatial intelligence that turns the geometry of light into evidence. It's used to verify or break claims about where and when a photo or video was taken, to narrow down a continent-sized search to a county-sized one, and to catch fakes that look perfect to the eye and absurd to the math.
What "shadow and sun" actually means in OSINT
The sun's position in the sky is described by two angles: azimuth (its compass direction relative to north) and elevation (its angle above the horizon). Azimuth controls the direction of a shadow. Elevation controls its length. Both depend on three inputs only: latitude/longitude, date, and time. Lock those down and the sky becomes deterministic.
The relationship is brutally simple. Shadow length equals object height divided by the tangent of solar elevation — L = h / tan(α). Solar elevation itself is given by α = arcsin(sin φ · sin δ + cos φ · cos δ · cos H), where φ is latitude, δ is the sun's declination for that date, and H is the hour angle. You don't need to memorise it. You need to know that any shadow you can measure in an image is a constraint on where and when it was taken.
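Both formulas fit in a few lines of code. This is a minimal sketch assuming flat ground and a truly vertical object; the function names are my own:

```python
import math

def solar_elevation(lat_deg, decl_deg, hour_angle_deg):
    """α = arcsin(sin φ · sin δ + cos φ · cos δ · cos H), all angles in degrees."""
    phi, delta, H = map(math.radians, (lat_deg, decl_deg, hour_angle_deg))
    return math.degrees(math.asin(
        math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.cos(H)))

def shadow_length(height, elevation_deg):
    """L = h / tan(α); valid for flat ground and a vertical object."""
    return height / math.tan(math.radians(elevation_deg))
```

Sanity check: at the equator on an equinox (φ = 0, δ = 0) with the sun 45° of hour angle past noon, elevation is 45° and a 1 m pole casts a 1 m shadow.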
That's the whole game: convert pixels into angles, angles into a constraint, constraints into a verdict.
The core workflow
Forget the marketing copy. In an actual investigation, the loop looks like this:
- Find an upright object of known orientation in the frame — a lamppost, a wall corner, a person standing straight, a flagpole. Vertical is your reference.
- Draw a line from the base of the object to the tip of its shadow. A shadow points away from the sun, so reverse that line; transferred onto a map, it gives the solar azimuth in the world frame of the photo.
- Take the suspect location, suspect date, and suspect time. Plug them into a sun calculator. Read off the predicted azimuth and elevation.
- Compare. If your measured azimuth from the photo and the predicted azimuth from the calculator agree within tolerance, the claim survives. If they're off by 30 degrees, somebody's lying — to you, to the camera, or to themselves.
That's it. The whole field — every tool, every technique below — is a refinement on this four-step loop.
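The comparison in step four is worth automating, because compass bearings wrap at 360° and a naive subtraction will score 355° against 5° as a 350° disagreement instead of a 10° one. A minimal sketch (the function name and the 10° default tolerance are my own choices, not a field standard):

```python
def azimuth_agrees(measured_deg, predicted_deg, tolerance_deg=10.0):
    """Compare two compass bearings, taking the shorter way around the circle."""
    diff = abs(measured_deg - predicted_deg) % 360.0
    return min(diff, 360.0 - diff) <= tolerance_deg
```

So `azimuth_agrees(355, 5)` passes (they are 10° apart across north), while `azimuth_agrees(120, 150)` fails under the default tolerance.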
The tools that do the heavy lifting
None of these are magic. They're calculators with maps glued on. The skill is in the operator.
SunCalc.org is the workhorse. Drop a pin, set the date and time, and it draws the sun's azimuth as a yellow line and the shadow direction as the opposite. Bellingcat's toolkit explicitly recommends it for chronolocation work — the catch is that it takes a location as input and lets you scrub time, so you confirm or reject candidate locations rather than discovering them. Trial-and-error, but fast trial-and-error.
PhotoPills is a paid mobile app aimed at landscape photographers, which is exactly why it's good for OSINT. Photographers obsess over the same questions investigators do: where is the sun at 17:42 on March 14, what does the shadow do at sunrise, when does golden hour hit this wall. Augmented-reality overlays on the camera let you walk a scene mentally without being there.
Stellarium is open-source planetarium software. It does the sun, but it earns its place when shadows are weak, when the sun is below the horizon, or when you can see stars or planets in the frame. Night-time chronolocation, dawn or dusk shots, identifiable constellations behind a window — Stellarium covers what SunCalc can't.
Bellingcat's Shadow Finder inverts the problem. Instead of testing one location, it answers: given a measured shadow length, an object height, and a date/time, where on Earth could this physically have happened? The output is a thin band of latitudes that wraps the globe — usually a circle on the map. Combine that with a hemisphere assumption or a daylight constraint, and you've gone from "anywhere on the planet" to a manageable strip in minutes. It's open-source under MIT and runs in a Colab notebook, so no install drama.
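The inversion itself is nothing exotic: for a fixed date and UTC time, scan a grid of candidate coordinates, predict the shadow-to-height ratio at each, and keep the points that match the measurement. A crude sketch of the idea, not Shadow Finder's actual code: declination is taken as a given input, and the hour angle approximates local solar time from longitude alone.

```python
import math

def solar_elevation(lat_deg, decl_deg, hour_angle_deg):
    phi, delta, H = map(math.radians, (lat_deg, decl_deg, hour_angle_deg))
    return math.degrees(math.asin(
        math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.cos(H)))

def candidate_band(shadow_len, obj_height, decl_deg, utc_hour,
                   tol_ratio=0.05, step_deg=1.0):
    """Grid-scan the globe; keep points whose predicted shadow-to-height
    ratio is within tol_ratio of the measured one."""
    target = shadow_len / obj_height  # measured ratio = 1 / tan(elevation)
    hits = []
    lat = -60.0
    while lat <= 60.0:
        lon = -180.0
        while lon < 180.0:
            # Hour angle: 15° per hour of local solar time away from noon.
            H = 15.0 * (utc_hour + lon / 15.0 - 12.0)
            elev = solar_elevation(lat, decl_deg, H)
            if elev > 0:  # sun must be up for a shadow to exist
                ratio = 1.0 / math.tan(math.radians(elev))
                if abs(ratio - target) <= tol_ratio * target:
                    hits.append((lat, lon))
            lon += step_deg
        lat += step_deg
    return hits
```

With a shadow as long as its object (elevation 45°) at an equinox (δ = 0) and 12:00 UTC, the band passes through 45° N and 45° S on the Greenwich meridian — and excludes the subsolar point, where shadows vanish.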
ShadowCalc visualises the shadow footprint of buildings — useful when the object you're working from is architectural rather than a single pole. ShadeMap goes a step further and renders terrain shadows from mountains, trees, and 3D buildings dynamically. If your photo is in a valley at 16:00 and the sun "should" be hitting it but isn't, ShadeMap can tell you a ridge ate the light.
Google Earth Pro with the Sunlight tracker is rougher than SunCalc but indispensable for the ground-overlay trick — drop a screenshot from the suspect photo onto the 3D model of a candidate location, align landmarks, and see if the modelled shadows line up with the photographed ones. Marble (KDE) is a lightweight open-source alternative when you don't want Google in your workflow. GeoSetter handles the EXIF side: it's where you check if a photo's claimed timestamp and GPS metadata even survive a sanity test before you start measuring shadows.
Hugin doesn't compute sun angles, but it's the photogrammetry layer underneath everything. When the photo you're analysing is a wide shot or a stitched panorama, Hugin lets you correct lens distortion and recover true angular geometry — without that step, your measured azimuth is junk.
The techniques separating amateurs from operators
Plugging a screenshot into SunCalc is the entry-level move. The interesting work happens after.
The pinhole-shadow method. Find any small bright spot of sunlight passing through a gap — a hole in a fence, a chink in shutters, a tree-canopy speckle. The geometry of that light point gives you a tiny, sharp solar elevation reading even when no upright object is conveniently placed. Useful indoors and in cluttered scenes.
Virtual cardboard cutout. If the only vertical reference is a person, model them as a rigid stick of estimated height, drop them into a 3D scene built from Google Earth or your own photogrammetry, and check whether the cast shadow in the model matches the cast shadow in the photo. Cheap, ugly, effective.
Reverse-compute latitude from noon elevation. If you can establish solar noon (the moment shadows are shortest that day), you can derive latitude directly from the elevation angle using φ = 90° − α_noon + δ (the sign conventions assume the sun culminates to your south; flip them when it culminates to your north). No location needed as input. Pair it with an EXIF date you trust, and you've gone from no-info to a latitude band in one calculation.
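As a sketch, again assuming a vertical object on flat ground and a sun that culminates to the observer's south (function names are mine):

```python
import math

def elevation_from_shadow(obj_height, shadow_len):
    """α = arctan(h / L) for a vertical object on flat ground."""
    return math.degrees(math.atan2(obj_height, shadow_len))

def latitude_from_noon(noon_elevation_deg, decl_deg):
    """φ = 90° − α_noon + δ, valid when the sun transits south of the observer."""
    return 90.0 - noon_elevation_deg + decl_deg
```

A 1 m pole casting a 1 m shadow at solar noon on an equinox (δ = 0) puts you near 45° N; a sun directly overhead at the June solstice (δ ≈ 23.44°) puts you on the Tropic of Cancer.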
Sub-pixel measurement of shadow length. The single biggest source of error in beginner work is reading shadow length to the nearest pixel. Don't. Upscale, work in floating-point coordinates, and average across multiple parallel shadows in the same frame. A 1-pixel error on a 50-pixel shadow is a 2% error, which already shifts a Shadow Finder band by tens of kilometres.
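How big is big? Propagating a fractional length error through α = arctan(h/L) and the noon-latitude formula gives the shift directly. A back-of-envelope sketch, with the function name my own and 111 km taken as the standard length of one degree of latitude:

```python
import math

def latitude_shift_km(obj_height, shadow_len, length_error_ratio):
    """How far the inferred noon latitude moves for a given fractional
    error in the measured shadow length."""
    a1 = math.degrees(math.atan2(obj_height, shadow_len))
    a2 = math.degrees(math.atan2(obj_height,
                                 shadow_len * (1.0 + length_error_ratio)))
    return abs(a1 - a2) * 111.0  # one degree of latitude ≈ 111 km
```

For a 45° sun, a 2% length error moves the band by roughly 60 km, and a 5% error by roughly 150 km — which is why averaging several shadows in the same frame is worth the effort.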
Horizon-edge sun for low-light photos. When the sun is at or near the horizon, the shadow gets unreliable (long, fuzzy, terrain-warped). Instead, find the sun's centre against the horizon line in the image. Its azimuth there is its azimuth, full stop, no shadow needed.
Multi-shadow consistency check. Every shadow in a real photo points away from the same sun. Pick three or four objects in the frame, project their shadow vectors, and they should converge on a single solar azimuth. If they don't converge, the photo is composited. This is one of the cheapest and most reliable forgery detectors in OSINT.
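On a rectified, roughly overhead view, where perspective convergence is negligible and shadows should be parallel, the check reduces to circular statistics on the base-to-tip bearings. A sketch under that assumption (the names and the 5° tolerance are mine):

```python
import math

def shadow_azimuth(base, tip):
    """Compass bearing of a base→tip shadow vector in a north-up image.
    Image x grows east, y grows south (the usual raster convention)."""
    dx = tip[0] - base[0]
    dy = tip[1] - base[1]
    return math.degrees(math.atan2(dx, -dy)) % 360.0

def shadows_consistent(pairs, tolerance_deg=5.0):
    """True if every base/tip pair points within tolerance of the mean bearing."""
    az = [shadow_azimuth(b, t) for b, t in pairs]
    # Circular mean avoids the 359°-versus-1° wrap-around problem.
    mx = sum(math.cos(math.radians(a)) for a in az)
    my = sum(math.sin(math.radians(a)) for a in az)
    mean = math.degrees(math.atan2(my, mx)) % 360.0
    def diff(a):
        d = abs(a - mean) % 360.0
        return min(d, 360.0 - d)
    return all(diff(a) <= tolerance_deg for a in az)
```

Three parallel shadows pass; splice in one object whose shadow points the wrong way and the check fails, which is exactly the composite-detection signal described above.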
This actually works in real cases
Bellingcat's MH17 investigation used shadow analysis to peg the time of a key Buk transporter photograph at roughly 12:30 — a number that was later corroborated by witnesses on the ground. The same playbook has been used dozens of times since for Syria chemical weapons claims, Sudan atrocity verification, and countless geolocation puzzles in the daily Quiztime cycle.
The people pushing the technique forward are publicly visible. Benjamin Strick at the Centre for Information Resilience writes some of the clearest applied tutorials in the field. Sector035 on Medium documents real-case shadow workflows. Bellingcat's own resources page is the closest thing the discipline has to a textbook. Read them in that order.
Where it breaks
Shadow analysis isn't a confession machine. It has hard limits. Without a known timestamp, Shadow Finder gives you a circle around the planet and not much else — and social platforms strip EXIF on upload, so timestamps disappear by default. Overcast skies kill shadows entirely. Wide-angle lenses warp angles unless you correct for them. Tall surrounding terrain or buildings can amputate shadows at random. And every measurement carries error: a 5% error on shadow length pushes the location band by tens to hundreds of kilometres.
None of that is a reason to skip the technique. It's a reason to combine it with everything else — landmark matching, satellite imagery, weather-data cross-reference, social media chatter — until the constraints intersect at one spot and one time. Shadows narrow the search. Other evidence closes it.
And when a photo's shadows simply refuse to behave — when the multi-shadow check fails, when the azimuth is off by 40 degrees, when the sun is on the wrong side of the building — you have something better than a location. You have proof the image is a lie. The sun doesn't lie. Photoshop does.
