Hi Alex, thanks for the speedy response.

I'm not using LibRaw directly, so I can't set the maximum or dynamically compute the adjustment threshold in code. But even without any of that, I believe I should be able to test the effect of either setting by passing the corresponding values (`-c` for `adjust_maximum_thr`, and `-S` for the saturation point) on the command line when calling `dcraw_emu`.
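For example (the file name here is just a placeholder), a test run looks like `dcraw_emu -c 1.0 -S 16200 -H 0 sky.cr2`, after which I inspect the resulting `.ppm`.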

As I mentioned, `adjust_maximum_thr` has no effect: I've tried 0.01, 0.1, 0.5, 1.0, and 1.5, and all produce pink skies. As for the saturation point, setting it to 16200 (keeping `-H 0` to clip highlights) doesn't help; in fact, it looks worse. Interestingly, reducing the saturation point to about 9000 (while still clipping highlights) does help.

I'm not entirely familiar with the steps the library takes. Details seem to be scattered around this forum, but there's no central piece of documentation explaining exactly what the conversion process is; the API docs list the different flags but provide little context on how they all tie together. All this to say: I imagine the saturation point is used to normalize the data to 16 bits, along these lines (with a rough code sketch after the list):

1. Load raw values, with data in range [min_sensor_value, max_sensor_value]
2. Subtract black point, taking data to range [0, max_sensor_value - min_sensor_value]
3. Divide by saturation point, taking data to range [0.0, ~1.0] (nominally at most 1.0, but values above saturation can exceed it)
4. Clip (depending on highlight mode), taking data to range [0.0, 1.0]
5. Scale to `uint16_t` range, taking data to range [0, 65535]
6. Remaining steps.
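
To make that concrete, here is a minimal C++ sketch of steps 1-5 as I picture them. This is purely my reading of the process, not LibRaw's actual code; `kBlackPoint`, `kSaturation`, and `normalize` are made-up names, and the constants are stand-ins for camera-specific values.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical constants for illustration; real values are camera-specific.
constexpr int kBlackPoint = 2048;   // min_sensor_value (black level)
constexpr int kSaturation = 16200;  // saturation point (what -S would set)

// Steps 1-5 from the list above, as I imagine them.
std::vector<uint16_t> normalize(const std::vector<uint16_t> &raw)
{
    std::vector<uint16_t> out;
    out.reserve(raw.size());
    for (uint16_t v : raw)
    {
        // Steps 1-2: subtract the black point, clamping at zero.
        double x = std::max(0, v - kBlackPoint);
        // Step 3: divide by the (black-subtracted) saturation point,
        // assuming -S is measured on raw values; a pixel at saturation
        // lands at 1.0, and anything above it exceeds 1.0.
        x /= static_cast<double>(kSaturation - kBlackPoint);
        // Step 4: clip highlights, as -H 0 would.
        x = std::min(x, 1.0);
        // Step 5: scale to the full uint16_t range [0, 65535].
        out.push_back(static_cast<uint16_t>(x * 65535.0 + 0.5));
    }
    return out;
}

int main()
{
    // A few sample raw values: black, mid-tone, at saturation, above it.
    std::vector<uint16_t> raw = {2048, 9000, 16200, 16383};
    for (uint16_t v : normalize(raw))
        std::printf("%u\n", v);
}
```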

If the above is correct, then why does setting the saturation point to exactly 16200 still produce pink skies? You can test this on your end with the RAW file to see exactly what I mean. Thanks again for your help.

Regards,
Yusuf.