HDR vs High Resolution - which is better? cheaper?

… - interesting approach! Things are moving fast these days with camera technology.

About 17 years ago, I worked on the development of a real-time 3D camera using two Neuricam NC1802 “Pupilla” chips - these were 640x480 pixel CMOS optical sensors which used photodiodes with a logarithmic response as photosensitive elements. Because of this, the chip had an interesting dynamic range of about 120 dB (equivalent to roughly 20 bits). The camera worked great and was able to resolve pedestrians dressed in dark clothes standing between the high beams of a car. However, because each single pixel of that chip had a slightly different response curve, we needed to calibrate and correct every single pixel of the sensor individually in an FPGA in order to obtain a usable image. Also, the sensor resolution obviously wasn’t that great … :smirk:
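To unpack the “120 dB ≈ 20 bits” figure, and to illustrate what a per-pixel correction roughly looks like, here’s a small Python sketch. The gain/offset model and the two-flat-field calibration are just my simplification for illustration - the actual FPGA correction for a logarithmic sensor would have used a more elaborate per-pixel response curve:

```python
import numpy as np

# Dynamic range in dB -> equivalent linear bit depth:
# 120 dB = 20*log10(ratio) -> ratio = 10^(120/20) = 10^6 ~ 2^19.9,
# hence the "roughly 20 bits" above.
db = 120.0
bits = np.log2(10.0 ** (db / 20.0))  # ~19.93

def calibrate(dark_frame, bright_frame, lo_ref, hi_ref):
    """Derive a per-pixel gain/offset from two flat-field frames
    taken at known uniform illumination levels lo_ref and hi_ref.
    (Hypothetical simplification - a real log sensor needs a full
    per-pixel response curve, not a linear two-point fit.)"""
    gain = (hi_ref - lo_ref) / np.maximum(bright_frame - dark_frame, 1e-6)
    offset = lo_ref - gain * dark_frame
    return gain, offset

def correct_frame(raw, gain, offset):
    """Apply the per-pixel correction to a raw 640x480 frame."""
    return raw * gain + offset
```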

By the way, here’s a scan comparison of a 12-bit RAW scan (via a Raspberry Pi HQ cam) and a classical HDR scan with 5 different exposures, captured with the same IMX477R sensor. There’s not much difference in visual appearance between the two. That somewhat supports your statement that a single RAW capture with a good sensor should nowadays be able to handle standard film stock in a single scan pass.
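For reference, a classical multi-exposure merge along these lines could look like the sketch below. The triangle weighting and function names are my choices for illustration, not the exact pipeline used for the scan above - the idea is simply to scale each bracketed frame to a common radiance and trust the mid-tones more than values near the noise floor or the clipping point:

```python
import numpy as np

def merge_exposures(frames, exposure_times, white=4095):
    """Merge bracketed 12-bit frames into a linear HDR radiance map.

    frames: list of float32 arrays (raw sensor values, 0..white)
    exposure_times: matching list of shutter times in seconds
    """
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for frame, t in zip(frames, exposure_times):
        # Triangle weighting: 1.0 at mid-gray, 0.0 at black / clipping.
        w = np.clip(1.0 - np.abs(frame / white - 0.5) * 2.0, 0.0, 1.0)
        num += w * frame / t  # scale each frame to common radiance
        den += w
    return num / np.maximum(den, 1e-9)
```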