I am running into an issue that I am sure many iOS developers have experienced, so I am hoping to get some help. I work at a company where the design specs for the iPhone app I am working on are made in Adobe Photoshop. I get these specs and am told to "make it happen".
To follow the design specs as closely as possible, I often use the DigitalColor Meter utility included in OS X. It is a powerful tool that has served me well. The problem is that it can display values on several different RGB scales.
So, for example, if I am looking at an image exported from Photoshop and the meter is set to the Generic RGB scale, I might read a gray value of 234/234/234. That's fine. I put that into my iOS app using UIColor and get a color that looks right, but when I measure it with DigitalColor Meter the value is 228/228/228!
How can I get a more consistent workflow? How can I make the values I measure in the PNG exported from Photoshop and the colors that show up in the Simulator, and even on the device, EXACTLY the same? Is that possible?
I am pretty sure that the various iOS devices out there (iPhones, iPads) have different display color profiles: if you used a different device, you would have gotten a different reading from DigitalColor Meter.
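To see where a shift like 234 → 228 can come from, here is a rough sketch. It assumes the PNG is effectively sRGB and that DigitalColor Meter is set to the Generic RGB scale, which Apple historically defined with a ~1.8 gamma; a real conversion also goes through the display profile's primaries and white point, so the exact numbers will differ:

```swift
import Foundation

// Decode an sRGB component (0.0-1.0) to linear light using the
// standard sRGB transfer function.
func srgbToLinear(_ c: Double) -> Double {
    c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

// Re-encode linear light with a simple gamma-1.8 curve, as an
// approximation of Apple's Generic RGB space.
func linearToGeneric(_ v: Double) -> Double {
    pow(v, 1.0 / 1.8)
}

// A gray authored as 234 in sRGB, read back on a gamma-1.8 scale:
let reading = linearToGeneric(srgbToLinear(234.0 / 255.0)) * 255.0
print(Int(reading.rounded())) // prints 229 under these assumptions
```

That lands at ~229, in the same neighborhood as the 228 you measured; the remaining difference is the part of the conversion that the actual display profile contributes.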
In principle, to solve this you should color-profile the device's display. Once you have a profile of the display, you can use it in Photoshop to get the values you should specify so that the output on the profiled display matches the original color. Creating a profile requires a dedicated measurement instrument (a colorimeter) and profiling software (there are many packages on the market).
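As an illustration of that idea, with simple transfer curves standing in for a real profile, you can invert the conversion to find what value to author so the output lands on your target. A sketch, again assuming Generic RGB ≈ gamma 1.8 and an sRGB output:

```swift
import Foundation

// Decode a Generic RGB component (gamma ~1.8) to linear light.
func genericToLinear(_ c: Double) -> Double {
    pow(c, 1.8)
}

// Encode linear light with the standard sRGB transfer function.
func linearToSRGB(_ v: Double) -> Double {
    v <= 0.0031308 ? 12.92 * v : 1.055 * pow(v, 1.0 / 2.4) - 0.055
}

// To make an sRGB output show the gray that Generic RGB calls 234,
// you would specify roughly this 8-bit value instead:
let target = linearToSRGB(genericToLinear(234.0 / 255.0)) * 255.0
print(Int(target.rounded())) // prints 238 under these assumptions
```

With a real profile, this inverse step is exactly what Photoshop's soft-proofing does for you, but per channel and with the measured display characteristics instead of an idealized curve.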
In practice, since, as I said, each device has its own characteristics, you would need a profile for each one of them, and then a different set of color values per device. Pretty unmanageable.