C Question

4K screen capturing in Windows and directly save into a buffer

I know there are many posts across the web about screen capturing in Windows using either GDI or DirectX approaches. However, all the examples I found save the captured image to a bitmap file, whereas I want to keep it in a memory buffer instead. Here is my code to do so using the GDI approach:

#include <windows.h>
#include <cstdint>
#include <cstring>

HWND hwind = GetDesktopWindow();
HDC hdc = GetDC(hwind);

uint32_t resx = GetSystemMetrics(SM_CXSCREEN);
uint32_t resy = GetSystemMetrics(SM_CYSCREEN);
uint32_t BitsPerPixel = GetDeviceCaps(hdc, BITSPIXEL);
HDC hdc2 = CreateCompatibleDC(hdc);

BITMAPINFO info = {};  // zero-initialize so the fields we don't set are 0
info.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
info.bmiHeader.biWidth = resx;
info.bmiHeader.biHeight = resy;  // positive height -> bottom-up DIB
info.bmiHeader.biPlanes = 1;
info.bmiHeader.biBitCount = BitsPerPixel;
info.bmiHeader.biCompression = BI_RGB;

void *data;
static HBITMAP hbitmap = CreateDIBSection(hdc2, &info, DIB_RGB_COLORS,
                                          (void**)&data, 0, 0);
SelectObject(hdc2, hbitmap);
BitBlt(hdc2, 0, 0, resx, resy, hdc, 0, 0, SRCCOPY);

uint8_t *ptr = new uint8_t[4 * resx * resy];
uint32_t lineSizeSrc = 4 * resx; // not always correct
uint32_t lineSizeDst = 4 * resx;
for (uint32_t y = 0; y < resy; y++)
    memcpy(ptr + y * lineSizeDst,
           (uint8_t*)data + y * lineSizeSrc,
           lineSizeDst);

DeleteObject(hbitmap);
ReleaseDC(hwind, hdc);
if (hdc2) {
    DeleteDC(hdc2);
}


First, as far as I know, the value of lineSizeSrc in this code is not always correct, since depending on the screen resolution some zeros may be added to each line of data. Can anyone please explain when the zeros are added and how to get the correct value for lineSizeSrc?

Second, is it possible to get the captured image in 4K resolution regardless of the resolution of the monitor, for instance by forcing the graphics card to output in 4K resolution?

Answer

First, as far as I know, the value of lineSizeSrc in this code is not always correct since depending on the screen resolution, some zeros may be added to each line of data. Can anyone please explain when the zeros are added and how to get the correct value for lineSizeSrc?

The DIB format requires that each scanline begin on a 4-byte (DWORD) boundary. Often this just works out, because common image widths are multiples of 4 or because each pixel is 32 bits (4 bytes) wide.

But if you're representing an image with an unusual width (e.g., 31 pixels wide) using something like 24 bits (3 bytes) per pixel, then the end of each line is padded with zero bytes so that the next line starts on a multiple of 4. Those padding zeros are the extra bytes you're seeing.

A common way to do this is to round up the "stride":

lineSizeSrc = ((resx * BitsPerPixel + 31) / 32) * 4;

resx * BitsPerPixel is the number of bits needed to represent one line. Adding 31 and then doing integer division by 32 (which truncates any remainder) rounds up to the smallest number of 32-bit DWORDs that can hold the line, and multiplying by 4 converts DWORDs back to bytes. So lineSizeSrc is the number of bytes actually occupied by each row, padding included.
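
To check the arithmetic, here is a minimal sketch (the DibStride helper is my own name, not a Windows API) using the 31-pixel, 24-bpp example from above:

#include <cstdint>
#include <cassert>

// Bytes per DIB scanline: each row is padded up to a multiple of 4 bytes (one DWORD).
static uint32_t DibStride(uint32_t widthPixels, uint32_t bitsPerPixel) {
    return ((widthPixels * bitsPerPixel + 31) / 32) * 4;
}

int main() {
    assert(DibStride(31, 24) == 96);      // 31 * 24 = 744 bits = 93 bytes, padded to 96
    assert(DibStride(32, 24) == 96);      // 768 bits = 96 bytes, already aligned
    assert(DibStride(1920, 32) == 7680);  // at 32 bpp a row is always DWORD-aligned
}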

You should use lineSizeSrc instead of resx in the calculation of how many bytes you need.
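
Applied to the code from the question, a sketch of the corrected copy (same variable names as above; it copies whole padded rows so the buffer layout matches the DIB exactly):

// Padded stride of each source row, computed with the formula above.
uint32_t lineSizeSrc = ((resx * BitsPerPixel + 31) / 32) * 4;

// Size the destination from the stride, not from resx alone.
uint8_t *ptr = new uint8_t[lineSizeSrc * resy];
for (uint32_t y = 0; y < resy; y++)
    memcpy(ptr + y * lineSizeSrc,
           (uint8_t*)data + y * lineSizeSrc,
           lineSizeSrc);

If you only want the pixel payload without the padding, copy resx * BitsPerPixel / 8 bytes per row instead, into a tightly packed destination.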

Second, is it possible to get the captured image in 4K resolution regardless of the resolution of the monitor, for instance by forcing the graphics card to output in 4K resolution?

There's no simple, works-in-all-cases method. Your best bet is probably to ask the program to render into a window that's 4K-sized, even if the graphics card isn't in that mode. Some programs will support this, but others might not. Look at the documentation for the WM_PRINT and WM_PRINTCLIENT messages.
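
As a rough sketch of that approach (hwndTarget is a placeholder for the window you want to capture, and whether the window actually honors WM_PRINTCLIENT depends entirely on that application):

#include <windows.h>
#include <cstdint>
#include <cstring>

// Ask a window to paint its client area into an off-screen 4K DIB.
// Works only for windows that implement WM_PRINTCLIENT; the window would
// typically have to be resized to 4K first for the output to fill the frame.
uint8_t *CaptureClientAt4K(HWND hwndTarget) {
    const int w = 3840, h = 2160;

    HDC screenDc = GetDC(NULL);
    HDC memDc = CreateCompatibleDC(screenDc);

    BITMAPINFO info = {};
    info.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    info.bmiHeader.biWidth = w;
    info.bmiHeader.biHeight = h;
    info.bmiHeader.biPlanes = 1;
    info.bmiHeader.biBitCount = 32;  // 32 bpp, so rows need no padding
    info.bmiHeader.biCompression = BI_RGB;

    void *bits = NULL;
    HBITMAP dib = CreateDIBSection(memDc, &info, DIB_RGB_COLORS, &bits, NULL, 0);
    HGDIOBJ old = SelectObject(memDc, dib);

    // The window draws itself into our DC.
    SendMessage(hwndTarget, WM_PRINTCLIENT, (WPARAM)memDc, PRF_CLIENT);
    GdiFlush();  // make sure GDI writes to the DIB section have completed

    // Copy the pixels into a plain buffer (4 bytes per pixel, no padding).
    uint8_t *buffer = new uint8_t[4 * (size_t)w * h];
    memcpy(buffer, bits, 4 * (size_t)w * h);

    SelectObject(memDc, old);
    DeleteObject(dib);
    DeleteDC(memDc);
    ReleaseDC(NULL, screenDc);
    return buffer;  // caller owns this; delete[] when done
}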