Strange 14bit output with libraw.dll + unprocessed_raw.exe

Hi!

While using libraw.dll together with unprocessed_raw.exe to decode various RAW file formats into simple-to-read PGM files for experimentation with my program, I noticed something strange that happens only with Sony ARW files: even for input files that are supposed to be 12-bit (RAW pixel values somewhere in the range 0-4095), I get output files with RAW pixel values in the range 0-16383 (corresponding to 14-bit).

Looking at the histograms confirms that "empty spaces" are added.

For example, an ARW file from my Sony NEX-5 decoded with dcraw (options: -v -r 1 1 1 1 -o 0 -E -j -4 -t 0) gives a PGM file with pixel values ranging from 53 to 4090, and the histogram looks like this (showing values 0-4090, with each primary color representing one RAW channel): https://postimg.cc/CBmGwLgV

The same file decoded with libraw.dll + unprocessed_raw.exe gives a PGM file with pixel values ranging from 212 to 16360, and the histogram looks like this (showing only values 0-9999, since postimg won't allow bigger images): https://postimg.cc/zbrNq8kp
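
For anyone who wants to reproduce the check, here is a minimal sketch (my own little helper, not part of LibRaw or dcraw) that reads a 16-bit binary PGM as written by both tools (it assumes a plain P5 header without comment lines and big-endian samples on a little-endian host) and prints the min/max plus the number of distinct values actually populated, which is where the "empty spaces" show up:

#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main(int argc, char **argv) {
    if (argc < 2) { std::cerr << "usage: pgmstats file.pgm\n"; return 1; }
    std::ifstream in(argv[1], std::ios::binary);
    std::string magic;
    int w = 0, h = 0, maxval = 0;
    in >> magic >> w >> h >> maxval;   // assumes a plain header with no comment lines
    in.get();                          // eat the single whitespace before the pixel data
    if (!in || magic != "P5" || maxval < 256 || maxval > 65535) {
        std::cerr << "not a 16-bit P5 PGM\n";
        return 1;
    }

    std::vector<uint16_t> px(size_t(w) * size_t(h));
    in.read(reinterpret_cast<char *>(px.data()), px.size() * 2);

    std::vector<uint64_t> hist(65536, 0);
    uint16_t lo = 65535, hi = 0;
    for (uint16_t v : px) {
        v = uint16_t((v >> 8) | (v << 8));  // PGM stores 16-bit samples big-endian; swap on a little-endian host
        ++hist[v];
        if (v < lo) lo = v;
        if (v > hi) hi = v;
    }
    size_t used = 0;
    for (uint64_t c : hist) if (c) ++used;

    std::cout << "min=" << lo << " max=" << hi << " distinct values used=" << used << "\n";
    return 0;
}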

I do not see a question here, so just a general remark.

Sony ARW 2.3 (lossy compressed) format is:
- 11-bit values (11-bit base + 7-bit deltas)
- expanded by an expansion curve with a ~0..16383 range (really slightly more)

dcraw converts it to a ~0...4095 range; this is OK for old cameras but not for the A7 series. LibRaw uses the 'natural' camera range.
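
If you need a dcraw-like ~0..4095 scale, you can rescale yourself from the data maximum LibRaw reports. A minimal sketch with the standard LibRaw C++ API (program and file names are just examples; black level handling is left out for brevity):

#include <libraw/libraw.h>
#include <cstdio>

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: arwrange file.arw\n"); return 1; }

    LibRaw proc;
    if (proc.open_file(argv[1]) != LIBRAW_SUCCESS) return 1;
    if (proc.unpack() != LIBRAW_SUCCESS) return 1;

    // Camera's natural data range as seen by LibRaw (~16383 for ARW 2.3)
    unsigned black = proc.imgdata.color.black;
    unsigned maximum = proc.imgdata.color.maximum;
    printf("black=%u maximum=%u\n", black, maximum);

    // Bayer sensor data, one value per pixel (NULL for non-Bayer formats)
    unsigned short *raw = proc.imgdata.rawdata.raw_image;
    if (!raw || !maximum) return 1;
    size_t count = (size_t)proc.imgdata.sizes.raw_width * proc.imgdata.sizes.raw_height;

    // Optional: rescale to a dcraw-like ~0..4095 range
    for (size_t i = 0; i < count; i++)
        raw[i] = (unsigned short)((unsigned long)raw[i] * 4095 / maximum);

    proc.recycle();
    return 0;
}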

-- Alex Tutubalin @LibRaw LLC