Why Your Phone Camera Can't See Infrared
(But Your Old Digicam Could)
I actually used to own one of these Sony cameras with NightShot. Remembering that, I asked Claude to review how IR imaging with digital cameras evolved, and why:
Digital cameras from the early 2000s would pick up near-infrared light with little effort — point a TV remote at the lens and the sensor would register the burst clearly. Attach a dark IR-pass filter and you could shoot striking infrared landscapes in reasonable exposure times. Modern phone cameras won’t do any of this. The reason is a hardware filter, not a sensor limitation or a software restriction.
Silicon Sees More Than You Do #
The answer starts with physics. The silicon photodiodes at the heart of every digital sensor — CCD or CMOS — respond to a much broader slice of the electromagnetic spectrum than the human eye. Our vision covers roughly 380–700 nanometres. A bare silicon sensor keeps responding well past 1000nm, deep into near-infrared territory.
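To make the mismatch concrete, here is a toy sketch. The band edges are approximations: 940nm is a typical IR remote-control LED wavelength, and silicon's long-wave cutoff around 1100nm follows from its ~1.1 eV bandgap.

```python
# Approximate sensitivity windows in nanometres. Silicon's long-wave
# cutoff (~1100nm) is set by its ~1.1 eV bandgap: photons with less
# energy cannot lift an electron across it.
HUMAN_EYE = (380, 700)
BARE_SILICON = (300, 1100)

def responds(band, wavelength_nm):
    """True if a wavelength falls inside a sensitivity window."""
    lo, hi = band
    return lo <= wavelength_nm <= hi

REMOTE_LED_NM = 940  # typical IR remote-control LED

print(responds(HUMAN_EYE, REMOTE_LED_NM))     # False: invisible to you
print(responds(BARE_SILICON, REMOTE_LED_NM))  # True: the bare sensor registers it
```

That gap between the two windows is exactly the light a remote-control test exposes.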
Left unchecked, this causes real problems. Infrared-heavy scenes blow out in strange ways. Green foliage turns white because chlorophyll strongly reflects IR. Skin tones shift. Blue channels underexpose relative to the others. A sensor that sees too much of the spectrum actually produces worse colour rendition than one that sees less.
The solution is to physically block the light you don’t want.
The Hot Mirror: A Filter That Changed Everything #
Manufacturers place an IR-cut filter — often called a hot mirror — directly in front of the sensor, as part of the sealed optical stack. It reflects or absorbs near-infrared wavelengths before they ever reach a photosite. Problem solved, mostly.
The strength of this filter is a tunable design decision, and that decision has shifted dramatically over the past twenty-five years:
- Late 1990s – early 2000s: Early consumer digicams used relatively thin, inexpensive IR-cut filters. IR rejection was adequate for normal photography but far from total. With a cheap IR-pass filter (such as a Hoya R72, which blocks all visible light below ~720nm) and a sunny day, exposure times of a few seconds were enough to capture striking infrared images.
- Mid-to-late 2000s: As sensor technology matured and image quality expectations rose, filter strength increased. IR photography on unmodified cameras moved from “awkward but feasible” to “technically possible but requiring multi-minute exposures,” effectively killing casual experimentation.
- Phone cameras: Miniaturisation pushed the optical stack into a tight, fixed-focus package where a weak IR-cut filter causes even more severe colour problems than in a larger camera. Phone manufacturers fitted aggressive filters early, and they have only improved since.
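The jump from “a few seconds” to “multi-minute” can be sketched with back-of-envelope arithmetic. Every number below is an assumption chosen to illustrate the scale of the difference, not a measured figure.

```python
import math

def ir_exposure(base_shutter_s, hot_mirror_ir_transmission):
    """Rough shutter time for shooting through an R72-style IR-pass filter.

    base_shutter_s: metered shutter time for the unfiltered scene
    hot_mirror_ir_transmission: fraction of near-IR the hot mirror
        still lets through to the sensor (assumed values below)
    """
    # Assume near-IR carries ~1/10 of the scene's metered signal
    # before the hot mirror cuts it down further.
    usable_fraction = 0.1 * hot_mirror_ir_transmission
    stops_lost = math.log2(1 / usable_fraction)
    return base_shutter_s / usable_fraction, stops_lost

# Early-2000s digicam: thin hot mirror passing ~10% of near-IR
t, stops = ir_exposure(1 / 250, 0.10)
print(f"{t:.1f}s ({stops:.1f} stops)")   # ~0.4s: a brief, usable exposure

# Later camera: strong hot mirror passing ~0.03% of near-IR
t, stops = ir_exposure(1 / 250, 0.0003)
print(f"{t:.0f}s ({stops:.1f} stops)")   # ~133s: multi-minute territory
```

The point is not the specific values but the multiplier: each factor-of-ten improvement in IR rejection adds more than three stops, and exposure times scale with it.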
The Brief Window of Phone IR #
There was a short period — roughly coinciding with the iPhone 3G and 4, and many Android contemporaries from 2008–2011 — where phone camera sensors had modest enough IR-cut filters that some sensitivity remained. It wasn’t particularly useful for artistic infrared photography, but it was detectable: point a remote control at the camera in a darkened room and you’d see a faint flicker. By the iPhone 5 generation this window had largely closed, and on current hardware it is gone entirely for practical purposes.
Why Software Can’t Help #
A common misconception is that this might be a software restriction that could be unlocked — similar to how camera apps sometimes expose RAW modes or manual controls hidden from the default interface. It isn’t.
The filter acts before the light reaches the sensor. There is no infrared signal in the raw data to recover, no matter how deep into the image pipeline you go. Software can only process information that was captured; it cannot reconstruct wavelengths that were physically stopped at the glass.
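The arithmetic behind that claim is stark. In this sketch the numbers are schematic, but the logic is exact: gain multiplies whatever the sensor recorded, and the filter ensures it recorded nothing.

```python
# Once the hot mirror has stopped a wavelength, the raw file contains
# no contribution from it. Post-processing can only scale what was
# captured, and any gain applied to a missing signal yields nothing.
ir_signal_at_sensor = 0.0            # blocked at the glass
boosted = ir_signal_at_sensor * 1e6  # arbitrarily aggressive software gain
print(boosted)                       # 0.0 — there is nothing to amplify
```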
The only way to restore infrared sensitivity to a camera is to physically remove or replace the hot mirror — a process that camera conversion services offer for dedicated IR work, permanently sacrificing normal colour photography in the process.
A Footnote: Sony NightShot and the Fabric Incident #
One of the few cases where software and IR genuinely intersected came from Sony in the early 2000s. Several camcorder models included a NightShot mode designed for low-light recording, which electrically disabled the IR-cut filter and activated onboard infrared LEDs for illumination. The feature worked as intended — until it was discovered that in bright sunlight with the filter disabled, the sensor could render certain thin synthetic fabrics effectively transparent.
Sony’s fix was a hardware-enforced interlock that prevented NightShot from activating above a light level threshold. It remains one of the more memorable demonstrations of the fact that, ordinarily, the hot mirror isn’t just a quality-of-life feature — it defines the boundary of what the sensor is permitted to see.
The Takeaway #
The disappearance of accidental infrared sensitivity from consumer cameras is not a conspiracy, a corporate restriction, or a software lock. It is the predictable outcome of manufacturers building progressively better optical filters to produce progressively better colour images. The sensors themselves haven’t changed much in their fundamental response to infrared light. The glass in front of them has simply gotten better at saying no.