Using only green pixels to create final image (discard red and blue)


I was hoping for some advice on how to have LibRaw completely discard the red and blue pixels and use only the green pixels to interpolate the final image. This is for a scientific application. I have done a fair amount of research, and I believe there is no way to do this without modifying the source code. That is fine with me.

The issue is that I am inexperienced with C++, and after spending a few hours going through libraw_cxx.cpp I have no idea how to approach this. My guess was that I could insert some zeros into the constant color-conversion matrices so that the red and blue values get set to zero.

I am hoping to do this in the most efficient and fastest way possible; any advice would be appreciated.

Thank you.

In case you are wondering why I would want to do this: we are looking at a phosphor that emits only green light, and we only care about this signal. Therefore any data coming from the red and blue pixels can only add noise.


The red and blue filters on most digital cameras are 'wide', so these channels will respond to your (monochrome green) signal.

BTW, if you want to ignore R/B data completely, you may use:

LibRaw lr;
lr.open_file("file.nef");
lr.unpack();
lr.raw2image();

After this piece of code, your image pixels will reside in the lr.imgdata.image[][4] array.
To access a pixel value you may use lr.imgdata.image[(row*width)+col][C], where C is the color index (R: 0, G: 1, B: 2, Green2: 3).
So you may interpolate anything you want using only the G and G2 channel data.

Use lr.COLOR(row, col) to get the color index for a given pixel (i.e. which Bayer color corresponds to that row/col pair).
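The row/col-to-color mapping can be sketched in Python. The RGGB quad layout below is an assumption purely for illustration; real sensors vary, which is exactly why LibRaw exposes COLOR(row, col) instead of letting you hard-code it:

```python
import numpy as np

# Color indices as in LibRaw: R=0, G=1, B=2, Green2=3.
# Assumed RGGB layout (top-left 2x2 quad = R G / G2 B) -- an assumption
# for illustration only; use lr.COLOR(row, col) for the real pattern.
def bayer_color(row, col):
    if row % 2 == 0:
        return 0 if col % 2 == 0 else 1   # R, G
    else:
        return 3 if col % 2 == 0 else 2   # G2, B

# Keep only the green sites of a synthetic mosaic; zero out R and B sites.
mosaic = np.arange(16, dtype=np.uint16).reshape(4, 4)
green_only = np.array([[mosaic[r, c] if bayer_color(r, c) in (1, 3) else 0
                        for c in range(4)] for r in range(4)])
```

From `green_only` you could then interpolate the missing sites using only G and G2 data.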

-- Alex Tutubalin

Hi Alex,

Thank you very much for your help! I have been working on this a lot over the past few days and have made significant progress. I have a problem, though, and I'm not sure whether you might be able to point me in the right direction.

My code works perfectly with files from the D7100 and D5200, but fails on all other .NEF files. I understand Nikon made some changes with the D7100 and D5200, but I can't seem to figure out how this would affect my use of LibRaw.

The LibRaw executables alone still work fine with every NEF file I try, but for some reason my code only accepts D7100 or D5200 files, and I have no idea why.

Thanks again.
Edit: more info. My code segfaults whenever I use NEFs that aren't from the D7100 or D5200.

Segfaults: please show the code. It is possible that you are running past the image[] array bounds.

-- Alex Tutubalin

Hi Alex,
I am using Python ctypes to call the C API of LibRaw. With D5200 and D7100 NEF files it works, but with the D4, D700, D800, etc. it fails. I noted that the D700 actually has a smaller image[] array than the D7100.

Here is my code (I did not include the massive struct declarations from the top of the source, for brevity):

from ctypes import *
from ctypes.util import find_library
import numpy as np
from PIL import Image

#load the libraw shared library
lr = CDLL(find_library('raw'))
#read test file into buf
buf = create_string_buffer(open('colortest.nef', 'rb').read())
#make it so init returns a pointer to the struct declared above
lr.libraw_init.restype = POINTER(libraw_data_t)
handle = lr.libraw_init(0)
lr.libraw_open_buffer(handle, buf, len(buf))
lr.libraw_unpack(handle)
lr.libraw_raw2image(handle)

rwidth = handle.contents.sizes.raw_width
rheight = handle.contents.sizes.raw_height
size = 4*rwidth*rheight*2 #size of image[], 2 is np.uint16 size
#open handle.contents.image as a numpy array of unsigned 16-bit integers
buffer_from_memory = pythonapi.PyBuffer_FromMemory
buffer_from_memory.restype = py_object
buffer = buffer_from_memory(handle.contents.image, size)
#reshape from 1d to 2d array (4 components per pixel)
a = np.frombuffer(buffer, np.uint16).reshape(-1, 4)
#grab only the red pixel array and store it in rs
rs = a[:, 0].reshape(rheight, rwidth)
#save rs as a 16-bit tiff
im = Image.fromarray(rs, mode='I;16')'testred.tif')

So if I run this code with my NEF files, as long as the NEF file has D7100 in the EXIF data, it will save a TIFF. If it finds, for example, D800, I get a segfault that mentions memcpy(). Because I am using Python with ctypes, I do not get much more information than that.

I tested several of the executables included with LibRaw and they all work with every one of my NEF files.

You're using raw_width/raw_height as the image[] array size, so you run past the array bounds.
The image array is sizes.iwidth * sizes.iheight in size.

More details:
The raw image from the sensor contains the image itself plus some 'masked pixels' (optical black pixels, or a 'masked frame') at the image edges.
The full sensor (raw data) size is raw_width x raw_height
The visible area size is sizes.width x sizes.height
The image array size is sizes.iwidth x sizes.iheight ( iwidth == width and iheight == height unless you set params.half_size to non-zero).

So you need to set your rwidth variable to sizes.iwidth (and same for rheight = sizes.iheight).
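To see the overrun in numbers, here is a small sketch; the dimensions below are made up for illustration and not taken from any particular camera:

```python
# Hypothetical dimensions: the raw sensor frame is larger than the
# image[] array because of the masked border pixels.
raw_width, raw_height = 4352, 2868  # full sensor frame (made-up numbers)
iwidth, iheight = 4288, 2844        # image[] dimensions (made-up numbers)

def image_bytes(w, h):
    # image[] holds 4 uint16 components per pixel: w * h * 4 * 2 bytes
    return w * h * 4 * 2

allocated = image_bytes(iwidth, iheight)        # what LibRaw allocates
requested = image_bytes(raw_width, raw_height)  # what the buggy code reads

# Reading `requested` bytes from an `allocated`-byte array overruns it,
# which is exactly the kind of access that ends in a segfault.
overrun = requested - allocated
assert overrun > 0
```

With cameras where raw dimensions happen to be close to (or padded like) the image dimensions, the out-of-bounds read may land in still-mapped memory and appear to "work", which would explain why only some models crash.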

-- Alex Tutubalin

Thank you Alex.

That fixed it instantly! Your help is much appreciated.

Take a look at the 4channels sample in LibRaw/samples.
It works very simply:
- it uses half-mode to fill all elements of the image[] array (the array is 1/4 the size, so every four Bayer pixels are folded into one image[][4] element)
- it demonstrates access to image[][] elements for auto-scaling
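The half-mode folding described above can be mimicked with a small sketch; the RGGB component order below is an assumption for illustration, as real cameras vary:

```python
import numpy as np

# A synthetic 4x4 Bayer mosaic; in half-mode each 2x2 quad
# collapses into a single image[][4] entry.
h, w = 4, 4
mosaic = np.arange(h * w, dtype=np.uint16).reshape(h, w)

# Fold every 2x2 quad into one pixel with 4 components, one per
# Bayer site (the component order here assumes an RGGB quad).
folded = np.stack([mosaic[0::2, 0::2],   # component 0 (e.g. R)
                   mosaic[0::2, 1::2],   # component 1 (e.g. G)
                   mosaic[1::2, 1::2],   # component 2 (e.g. B)
                   mosaic[1::2, 0::2]],  # component 3 (e.g. G2)
                  axis=-1)
# folded.shape == (2, 2, 4): 1/4 the pixel count, 4 values per pixel
```

Each output pixel thus keeps all four Bayer samples of its quad, with no interpolation, which is why half-mode is convenient for per-channel analysis.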

-- Alex Tutubalin

Here is the really strange thing: if I modify the EXIF data of the D800 NEF file and change it to say D7100, it works perfectly (I only need the unpacked image data, since I handle post-processing in my program). No more segfaults. I have no idea why this is the case.

Also, you were right: the red and blue channels do contain some signal that we want to keep.

What do you mean by 'fails' (and 'works properly')?
Are the data not in the image[] array, or what?

-- Alex Tutubalin

Sorry I was not very descriptive.
Whenever it fails, I get a segmentation fault when I try to read the values out of the image[] array to save them. When it works, my code successfully saves a TIFF (just a test case: I save only the red pixels, the image[][0] slice).

You're running past the array bounds; see my previous comment above.

-- Alex Tutubalin