I'm trying to automatically show the useful part of a largely transparent PNG in an iPhone app. The image may be, say, 500x500, but it is mostly transparent. Somewhere within that image is a non-transparent part that I want to display to the user as large as possible, so I want to trim off as much as I can from each side (or make it appear that way by stretching and moving the image within the UIImageView). Any ideas?
Using Quartz, render the image into a bitmap context, then examine the alpha channel to find the bounds of the non-transparent part of the image.
Here is an Apple Tech Note: Getting the pixel data from a CGImage object. You can get a CGImage from a UIImage with:
CGImageRef imageRef = [uiImage CGImage];
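Once you have the pixel data (for example, from a bitmap context you created with `CGBitmapContextCreate` using an RGBA 8-bit-per-channel layout), the scan itself is straightforward. Here's a minimal sketch of that scan in plain C; the `alphaBounds` function name, the RGBA byte layout (alpha at offset 3), and the threshold parameter are my assumptions, not anything Apple provides:

```c
#include <stdint.h>

/* Find the bounding box of pixels whose alpha exceeds a threshold.
   Assumes an 8-bit-per-channel RGBA buffer with alpha at byte offset 3,
   which is one layout you can request when creating a bitmap context.
   Returns 0 if the image is fully transparent, 1 otherwise. */
static int alphaBounds(const uint8_t *rgba, int width, int height,
                       uint8_t threshold,
                       int *minX, int *minY, int *maxX, int *maxY) {
    int found = 0;
    *minX = width; *minY = height; *maxX = -1; *maxY = -1;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            uint8_t alpha = rgba[(y * width + x) * 4 + 3];
            if (alpha > threshold) {
                if (x < *minX) *minX = x;
                if (x > *maxX) *maxX = x;
                if (y < *minY) *minY = y;
                if (y > *maxY) *maxY = y;
                found = 1;
            }
        }
    }
    return found;
}
```

From the resulting rectangle you can either crop with `CGImageCreateWithImageInRect`, or compute a scale/translate for the UIImageView so only that region is visible. A nonzero threshold is useful if the transparent area contains faint antialiasing fringe you'd rather trim.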