I'm trying to use CIColorMatrix in Swift on iOS, and I'm coming up with a result that doesn't match my understanding of how it is supposed to work.
The documentation defines it as:
s.r = dot(s, redVector)
s.g = dot(s, greenVector)
s.b = dot(s, blueVector)
s.a = dot(s, alphaVector)
s = s + bias
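In other words, each output channel is the dot product of the original input pixel with the corresponding vector, then a bias is added. As a sanity check, here's a toy Swift re-implementation of that math (my own sketch, not Core Image code), using normalized 0-to-1 channel values:

```swift
import simd

// Toy version of the documented CIColorMatrix math (not Core Image itself).
// s is an RGBA pixel with channels normalized to 0...1.
func colorMatrix(_ s: SIMD4<Float>,
                 red: SIMD4<Float>, green: SIMD4<Float>,
                 blue: SIMD4<Float>, alpha: SIMD4<Float>,
                 bias: SIMD4<Float>) -> SIMD4<Float> {
    // Each output channel is a dot product against the *original* pixel.
    let out = SIMD4<Float>(dot(s, red), dot(s, green), dot(s, blue), dot(s, alpha))
    return out + bias
}

// Opaque red, redVector = [0.5, 0, 0, 0], everything else identity / zero:
let result = colorMatrix(SIMD4(1, 0, 0, 1),
                         red:   SIMD4(0.5, 0, 0, 0),
                         green: SIMD4(0, 1, 0, 0),
                         blue:  SIMD4(0, 0, 1, 0),
                         alpha: SIMD4(0, 0, 0, 1),
                         bias:  SIMD4(0, 0, 0, 0))
// result == [0.5, 0, 0, 1], i.e. 127.5 on a 0-255 scale.
```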
With an input pixel of [255, 0, 0, 255] and a red vector of [0.5, 0, 0, 0], I therefore expect the output red channel to be (255*0.5)+(0*0)+(0*0)+(255*0) = 127.5, i.e. about 128 after rounding. Here's my code:
// myImage is an opaque red CIImage; its RGB values read back as [255, 0, 0].
let vec = CIVector(x: 0.5, y: 0, z: 0, w: 0)
let filter = CIFilter(name: "CIColorMatrix")!
filter.setValue(myImage, forKey: kCIInputImageKey)
filter.setValue(vec, forKey: "inputRVector")
// Default bias is [0, 0, 0, 0]; explicitly setting it as such doesn't change the result.
let output = filter.outputImage!
// After rendering `output`, the RGB values read back as [187, 0, 0], not ~[128, 0, 0].
All Core Image filters operate in the working color space of the CIContext executing them, and by default that working space is linear. Your matrix therefore multiplies linear values, and the result is converted back to gamma-encoded sRGB when it's rendered for display. Encoding a linear 0.5 with the sRGB transfer function gives 1.055 * 0.5^(1/2.4) - 0.055 ≈ 0.735, and 0.735 * 255 ≈ 187, which is exactly what you're seeing. Filter kernels also work with alpha-unpremultiplied pixel values, while filter output is premultiplied. If you're seeing results that aren't what you expect, check that your working and output color spaces are configured as you want them, and make sure nothing in your pipeline is changing alpha premultiplication.
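If the goal is to have the matrix applied to gamma-encoded sRGB values rather than linear ones, one option is to give the CIContext a non-linear sRGB working color space. A minimal, self-contained sketch (it generates its own 1x1 opaque-red image in place of the question's `myImage`):

```swift
import CoreImage

// Generate a 1x1 opaque red CIImage to stand in for the question's input.
let red = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropped(to: CGRect(x: 0, y: 0, width: 1, height: 1))

// Use gamma-encoded sRGB as the working space instead of the linear default.
let context = CIContext(options: [
    .workingColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
])

let filter = CIFilter(name: "CIColorMatrix")!
filter.setValue(red, forKey: kCIInputImageKey)
filter.setValue(CIVector(x: 0.5, y: 0, z: 0, w: 0), forKey: "inputRVector")

// Render a single RGBA8 pixel and read it back.
var pixel = [UInt8](repeating: 0, count: 4)
context.render(filter.outputImage!,
               toBitmap: &pixel, rowBytes: 4,
               bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
               format: .RGBA8,
               colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
// pixel[0] should now read back around 128 rather than 187.
```

Alternatively, you can keep the default context and wrap the filter between CISRGBToneCurveToLinear and CILinearToSRGBToneCurve to control exactly where the matrix is applied.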