
OpenGL Fast Screen Recording of iOS App

I'm new to OpenGL ES. I'm trying to write code for screen recording of iOS apps, especially games.

I'm using the 'render to texture' method described with code in this answer to capture the screen and write the video for a cocos2d game. One modification I made was that

I'm using
[EAGLContext currentContext]
instead of
[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context]

It does record the video, but there are two issues:

  1. When recording starts, new drawing on the screen stops. I want the app to keep drawing to the screen as well. As I'm new to OpenGL ES, I don't have a deep understanding of frame buffer objects etc., so I'm having a hard time figuring out how to draw to the screen and capture it at the same time. I'd appreciate a code example in this regard.

  2. The recorded video is flipped upside down. How can I get it in the correct orientation?

Previously I considered a different method too, but that has performance drawbacks.

Update: a couple of ideas also came to mind. According to my limited understanding, I could simply draw my texture back to the screen, but I don't know how.

Main Draw

// ----- Display the keyframe -----
Texture* t = augmentationTexture[OBJECT_KEYFRAME_1 + playerIndex];
frameTextureID = [t textureID];
aspectRatio = (float)[t height] / (float)[t width];
texCoords = quadTexCoords;

// Get the current projection matrix
QCAR::Matrix44F projMatrix = vapp.projectionMatrix;

// If the current status is valid (not NOT_READY or ERROR), render the
// video quad with the texture we've just selected
if (NOT_READY != currentStatus) {
// Convert trackable pose to matrix for use with OpenGL
QCAR::Matrix44F modelViewMatrixVideo = QCAR::Tool::convertPose2GLMatrix(trackablePose);
QCAR::Matrix44F modelViewProjectionVideo;

// SampleApplicationUtils::translatePoseMatrix(0.0f, 0.0f, videoData[playerIndex][0], &modelViewMatrixVideo.data[0]);
SampleApplicationUtils::scalePoseMatrix(videoData[playerIndex][0], videoData[playerIndex][0] * aspectRatio, videoData[playerIndex][0], &modelViewMatrixVideo.data[0]);

SampleApplicationUtils::multiplyMatrix(projMatrix.data, &modelViewMatrixVideo.data[0], &modelViewProjectionVideo.data[0]);


glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0, quadVertices);
glVertexAttribPointer(normalHandle, 3, GL_FLOAT, GL_FALSE, 0, quadNormals);
glVertexAttribPointer(textureCoordHandle, 2, GL_FLOAT, GL_FALSE, 0, texCoords);


glBindTexture(GL_TEXTURE_2D, frameTextureID);
glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE, (GLfloat*)&modelViewProjectionVideo.data[0]);
glUniform1i(texSampler2DHandle, 0 /*GL_TEXTURE0*/);
glDrawElements(GL_TRIANGLES, kNumQuadIndices, GL_UNSIGNED_SHORT, quadIndices);
}



Add the video texture buffer to the frame

glBindTexture([videoWriter textureCacheTarget], [videoWriter textureCacheID]);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, [videoWriter textureCacheID], 0);


You seem to be missing some knowledge of how these things work, and it is impossible to know where your issue lies. Let me break down a few things for you so you may identify the issue and hopefully fix it yourself; otherwise, ask another question showing more specific parts of your code and indicating what is working and what is not.

So to begin with the frame buffer: this object is a container for other buffers, which in your case may be textures and render buffers. By binding a specific frame buffer you tell the GPU to draw into the buffers attached to that frame buffer. The most common attachments are color, depth and stencil buffers. Most commonly, the procedure for creating a new frame buffer looks something like this:

  • Generate the new frame buffer to get the ID and bind it
  • Generate a render buffer or a texture to get the ID and bind it
  • Set the data and the format of the buffer using either glRenderbufferStorage or glTexImage2D, or, on iOS, if you want to bind it to a view, use renderbufferStorage:fromDrawable: on the EAGLContext object (this last step might be done for you by a higher-level object such as GLKView)
  • Attach the render buffer or texture to the frame buffer using glFramebufferRenderbuffer or glFramebufferTexture2D

This way you may create any number of frame buffers to draw into. To choose which buffer to draw to, you simply need to bind the correct frame buffer.
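
To make those steps concrete, here is a minimal sketch for an offscreen frame buffer backed by a texture, assuming an OpenGL ES 2.0 context (fbo, fboTexture, fboWidth and fboHeight are placeholder names, not something from your code):

GLuint fbo, fboTexture;
const GLsizei fboWidth = 1024, fboHeight = 768;   // placeholder dimensions; use your view's pixel size

// Generate the new frame buffer and bind it
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// Generate a texture to use as the color attachment and bind it
glGenTextures(1, &fboTexture);
glBindTexture(GL_TEXTURE_2D, fboTexture);

// Set the data and format of the buffer (clamping is required for NPOT textures in ES 2.0)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, fboWidth, fboHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// Attach the texture to the frame buffer as its color attachment
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, fboTexture, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"Frame buffer is not complete");
}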

Generally we split frame buffers into the main frame buffer and frame buffer objects (FBOs). The main frame buffer is the one that represents your screen and has an ID of 0, so on most platforms you can simply bind buffer 0 to continue drawing to the main screen/view. BUT (and this is very important) on iOS there is no such thing as a main frame buffer from your perspective. You need to save the ID of the frame buffer whose render buffer is generated via renderbufferStorage:fromDrawable:, and if this is not in your code (you are using higher-level tools) you will need to ask the GPU for the currently bound buffer ID by calling:

GLint defaultFBO;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &defaultFBO);   // ID of the currently bound frame buffer

(found here) But you need to be sure you are calling this while your main buffer is bound, so preferably do it just before you start recording, so that you know it is the correct buffer.
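
Later, when you want to render to the screen again, you simply rebind that saved ID, for example:

// Switch rendering back to the on-screen frame buffer saved above
glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)defaultFBO);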

An FBO, then, is any frame buffer other than the main one; it is also called an offscreen frame buffer. These are generally used for post-processing or pre-processing. In your case you use one to draw the pixels into texture data, which can very easily be accessed by the CPU with the texture cache described in the link you provided. So now you have at least two buffers: the main buffer and the offscreen buffer. The offscreen buffer holds the texture, which can then be redrawn to the main buffer. Your basic per-frame pipeline should then look something like this:

  • Bind FBO
  • Draw the scene
  • Process the FBO for video recording
  • Bind main buffer
  • Bind FBO texture
  • Draw the bound texture to the main buffer
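
In code, that per-frame pipeline could look roughly like the sketch below. It is only an outline: fbo, fboTexture and defaultFBO are the objects created above, while drawScene, writeVideoFrame, drawFullscreenQuadWithTexture:, screenWidth, screenHeight and context are placeholders for your own rendering code, your texture-cache recording code from the linked answer, and your EAGLContext.

// 1. Bind the FBO and draw the whole scene into its texture
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, fboWidth, fboHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
[self drawScene];

// 2. Process the FBO for video recording; recorded frames often come out flipped
//    because OpenGL's origin is bottom-left, so this is typically where you
//    compensate, e.g. by flipping the frame as you write it
[self writeVideoFrame];

// 3. Bind the main (on-screen) buffer again
glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)defaultFBO);
glViewport(0, 0, screenWidth, screenHeight);
glClear(GL_COLOR_BUFFER_BIT);

// 4. Draw the FBO texture as a full-screen quad so the user still sees the frame
[self drawFullscreenQuadWithTexture:fboTexture];

// 5. Present as usual (or let GLKView handle presentation)
[context presentRenderbuffer:GL_RENDERBUFFER];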

There are other ways as well, such as:

  • Get data from media (camera, video...)
  • Bind FBO
  • Draw media data to FBO
  • Process FBO for video recording
  • Bind main buffer
  • Draw media data to main buffer
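
A sketch of this variant, reusing the placeholder names from the previous sketch (drawMediaData stands in for whatever draws your camera/video data):

// Draw the media data into the FBO and hand that frame to the video writer
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, fboWidth, fboHeight);
[self drawMediaData];
[self writeVideoFrame];

// Then draw the same media data a second time, directly to the main buffer
glBindFramebuffer(GL_FRAMEBUFFER, (GLuint)defaultFBO);
glViewport(0, 0, screenWidth, screenHeight);
[self drawMediaData];
[context presentRenderbuffer:GL_RENDERBUFFER];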

This will draw the scene twice instead of reusing the scene already rendered into the FBO texture. If you are not doing any heavy work when drawing, the two approaches perform about the same, but it is still best to use the first one.