David Sacco - 3 months ago
iOS Question

Preserving negative values after filtering

Consider the really simple difference kernel

kernel vec4 diffKernel(__sample image1, __sample image2)
{
    return vec4(image1.rgb - image2.rgb, 1.0);
}

When used as a CIColorKernel, this produces the difference between the two images. However, any values for which image1.rgb < image2.rgb (pointwise) will be forced to zero, because the output of a Core Image kernel is clamped to [0, 1].
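To make the problem concrete, here is a pure-Swift sketch (no Core Image) of the per-channel arithmetic: the first function is what the kernel computes, the second is what survives after the output is clamped to [0, 1].

```swift
// What diffKernel computes for one channel, before clamping:
func signedDiff(_ a: Double, _ b: Double) -> Double {
    return a - b
}

// What the pipeline actually delivers after the kernel's output
// is clamped to [0, 1] per channel:
func clampedDiff(_ a: Double, _ b: Double) -> Double {
    return min(max(a - b, 0.0), 1.0)
}

let a = 0.25, b = 0.75
print(signedDiff(a, b))   // -0.5: the value the algorithm needs
print(clampedDiff(a, b))  // 0.0: the negative half is gone
```

Wherever image2 is brighter than image1, the clamped result collapses to zero and the sign information is lost.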

For many image processing algorithms, such as those involving image pyramids (see my other question on how this can be achieved in Core Image), it is important to preserve these negative values for later use (reconstructing the pyramid, for example). If 0's are used in their place, you will get an incorrect output.

One way I've seen is to store abs(image1.rgb - image2.rgb), then make a second image whose RGB values store 0 or 1 depending on whether the corresponding difference is negative, and finally use that mask in a multiply blend to reapply the factor of -1 in the correct places.
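The sign/magnitude scheme above can be sketched per channel in plain Swift (the function names are hypothetical, and in practice the encode and decode steps would each be a separate kernel pass):

```swift
// Encode: store |a - b| as the magnitude, and a 0/1 mask that is 1
// where the difference is negative. Both outputs stay inside [0, 1],
// so neither is affected by the kernel's output clamping.
func encode(_ a: Double, _ b: Double) -> (magnitude: Double, signMask: Double) {
    let d = a - b
    return (abs(d), d < 0 ? 1.0 : 0.0)
}

// Decode: multiply the magnitude by -1 wherever the mask is set,
// recovering the original signed difference.
func decode(magnitude: Double, signMask: Double) -> Double {
    return magnitude * (signMask == 1.0 ? -1.0 : 1.0)
}

let (m, s) = encode(0.25, 0.75)
print(decode(magnitude: m, signMask: s))  // recovers -0.5
```

The cost is an extra intermediate image and an extra pass, which is why a floating-point working format (below in the answer) is usually preferable when available.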

What are some other ways to store the sign of a pixel value? Perhaps we could use the alpha channel, since it is unused here?

Answer

I actually ended up figuring this out -- you can pass an option to CIContext so that computations use a floating-point pixel format (the kCIFormatAf key). Any calculations done in that context are then carried out in floating-point precision, so values outside [0, 1] are preserved from one filter to the next!
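In modern Swift this is a one-line configuration sketch (assuming iOS/macOS with Core Image available). The working-format option takes a CIFormat; .RGBAf is the four-channel 32-bit float format, the multi-channel counterpart of the single-channel kCIFormatAf mentioned above:

```swift
import CoreImage

// Create the context with a floating-point working format so that
// intermediate pixel values are not clamped to [0, 1] between
// filter stages.
let context = CIContext(options: [
    .workingFormat: CIFormat.RGBAf
])
```

Note that float intermediates use more memory and bandwidth than the default 8-bit format, so it is worth enabling this only for pipelines that actually need out-of-range values.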