Goal: I'm writing a very simple image viewer for the framebuffer device `/dev/fb0` (something like fbi).
Current state:
- My software reads the pixel resolution from `/sys/class/graphics/fb0/virtual_size` (such as `1920,1080`).
- Then, for each row, it writes 4 bytes (BGRA) for each of the 1920 pixels in that row (4 × 1920 = 7680 bytes in total) to `/dev/fb0`. This works perfectly fine on one of my laptops, which has a 1920x1080 resolution.
- More precisely, setting the pixel at row `y`, column `x` means writing to `arr[y * 1920 * 4 + x * 4 + channel]`, where `channel` is 0, 1, 2, or 3 (for B, G, R, and A, respectively); see the sketch after this list.
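For reference, here is a minimal C sketch of what the viewer currently does (error handling and the actual image decoding are simplified; the paths and the indexing formula are the ones described above):

```c
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
    /* Read the visible resolution, e.g. "1920,1080". */
    FILE *fs = fopen("/sys/class/graphics/fb0/virtual_size", "r");
    if (!fs) { perror("virtual_size"); return 1; }
    unsigned width, height;
    if (fscanf(fs, "%u,%u", &width, &height) != 2) { fclose(fs); return 1; }
    fclose(fs);

    /* One BGRA buffer for the whole screen: 4 bytes per pixel. */
    size_t size = (size_t)width * height * 4;
    uint8_t *arr = calloc(1, size);
    if (!arr) return 1;

    /* Set pixel (x, y) to opaque red, using the indexing scheme from
       above: arr[y * width * 4 + x * 4 + channel], channel 0..3 = B,G,R,A. */
    unsigned x = 10, y = 20;
    size_t base = (size_t)y * width * 4 + (size_t)x * 4;
    arr[base + 0] = 0x00; /* B */
    arr[base + 1] = 0x00; /* G */
    arr[base + 2] = 0xff; /* R */
    arr[base + 3] = 0xff; /* A */

    /* Dump the buffer to the framebuffer device, row by row. */
    int fb = open("/dev/fb0", O_WRONLY);
    if (fb < 0) { perror("/dev/fb0"); free(arr); return 1; }
    for (unsigned row = 0; row < height; row++) {
        if (write(fb, arr + (size_t)row * width * 4, (size_t)width * 4) < 0) {
            perror("write");
            break;
        }
    }
    close(fb);
    free(arr);
    return 0;
}
```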
Problem:
When I run the same software on my old laptop, where `/sys/.../virtual_size` reports `1366,768`, the image is not shown correctly (it is a bit skewed). So I played around with the pixel-width value and found that the value that works is 1376, not 1366.
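For illustration, this is roughly what the hand-tuned workaround looks like on the old laptop; the value 1376 is the row width I found by trial and error, not something the program derives itself:

```c
#include <stddef.h>
#include <stdint.h>

/* Old laptop: the visible width is 1366 pixels, but each row written to
   /dev/fb0 apparently has to be 1376 pixels (5504 bytes) long, otherwise
   the image comes out skewed. 1376 was found by manual tuning. */
#define VISIBLE_WIDTH 1366
#define ROW_PIXELS    1376

static void set_pixel(uint8_t *arr, unsigned x, unsigned y,
                      uint8_t b, uint8_t g, uint8_t r, uint8_t a)
{
    /* x ranges over 0..VISIBLE_WIDTH-1, but the row stride is ROW_PIXELS. */
    size_t base = (size_t)y * ROW_PIXELS * 4 + (size_t)x * 4;
    arr[base + 0] = b;
    arr[base + 1] = g;
    arr[base + 2] = r;
    arr[base + 3] = a;
}
```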
Questions:
- Where do these 10 extra pixels per row come from?
- How can I get this value automatically on different machines, rather than tuning it by hand?
- Why do some machines need the extra pixels when others don't?