iOS Question

How to render a font in iOS in a simple, one-line way?

I would like to render text in iOS to a texture so that I can draw it using OpenGL. I am using this code:

// Measure the attributed string and round the bitmap up to power-of-two dimensions.
CGSize textSize = [m_string sizeWithAttributes:m_attribs];
CGSize frameSize = CGSizeMake(NextPowerOf2((NSInteger)(MAX(textSize.width, textSize.height))), NextPowerOf2((NSInteger)textSize.height));

// Transparent bitmap context at scale 1.0.
UIGraphicsBeginImageContextWithOptions(frameSize, NO /*opaque*/ , 1.0 /*scale*/);

CGContextRef currentContext = UIGraphicsGetCurrentContext();
CGContextSetTextDrawingMode(currentContext, kCGTextFillStroke);

CGContextSetLineWidth(currentContext, 1);
[m_string drawAtPoint:CGPointMake(0, 0) withAttributes:m_attribs];
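The image is then grabbed out of the context and later uploaded as an OpenGL texture, roughly along these lines (variable name here is illustrative):

// Grab the rendered image and close the context before uploading it as a texture.
UIImage *renderedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();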


When I try to use kCGTextFillStroke or kCGTextStroke, I get this:

(screenshot: text rendered with fill and stroke)


When I try to use kCGTextFill, I get this:

(screenshot: text rendered with fill only)


Is there any way to get simple, clean, one-line text like this? (Taken from rendering on OS X.)

(screenshot: clean text rendered on OS X)

Answer

This looks like a resolution issue, but setting that aside...

Since you are using iOS, I suggest using a UI component, a UILabel for instance. Set whatever parameters you wish on the label: line break mode, number of lines, attributed text, fonts, and so on. You can call sizeToFit to get the minimum possible size of the label. You do not add the label to any other view; instead you create a UIImage from the view (there are quite a few answers for that on SO). Once you have the image, you simply copy the raw RGBA data to the texture (again, plenty of answers on how to get the RGBA data out of a UIImage). And that is it. You might want to account for the content scale on Retina @2x and @3x devices, or handle that manually by increasing the font sizes by the corresponding factors.
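A minimal sketch of the first part, assuming an NSAttributedString called attributedText (the names here are just for illustration), could look like this:

// Build a label, let it size itself, and snapshot it into a UIImage.
UILabel *label = [[UILabel alloc] init];
label.attributedText = attributedText;   // or set text/font/textColor directly
label.numberOfLines = 1;
[label sizeToFit];

// Render at the screen scale so Retina devices stay sharp.
UIGraphicsBeginImageContextWithOptions(label.bounds.size, NO, [UIScreen mainScreen].scale);
[label.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *labelImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();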

This procedure might seem like a workaround and much slower than using Core Graphics directly, but the truth is quite far from that:

  • Creating a context with a size and options allocates an RGBA buffer, just as for a CGImage (a UIImage merely wraps it).
  • Core Graphics is used to draw the view into the UIImage, so under the hood the procedure is essentially the same.
  • You still need to copy the data to the texture, but that applies in both cases. A small downside is that to access the raw RGBA data of the image you will need to copy (duplicate) it somewhere along the line, but that is a relatively quick operation and most likely happens in your procedure as well (see the sketch after this list).
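As a rough sketch of that last step, assuming OpenGL ES 2.0 and the labelImage from the snippet above, the copy into a texture could look like this:

#import <OpenGLES/ES2/gl.h>

// Redraw the CGImage into a plain RGBA buffer we own.
CGImageRef cgImage = labelImage.CGImage;
size_t width  = CGImageGetWidth(cgImage);
size_t height = CGImageGetHeight(cgImage);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
void *pixels = calloc(width * height * 4, 1);
CGContextRef bitmapContext = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                                   colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGContextDrawImage(bitmapContext, CGRectMake(0, 0, width, height), cgImage);

// Hand the buffer to OpenGL. Non-power-of-two sizes need clamp-to-edge wrapping on ES 2.0.
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

CGContextRelease(bitmapContext);
CGColorSpaceRelease(colorSpace);
free(pixels);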

So this procedure may consume a bit more resources (not much, and possibly even less), but you get unlimited power when it comes to drawing text.
