
macOS scaling, HiDPI, LoDPI explanation

waydabber edited this page Dec 11, 2023 · 2 revisions

A brief explanation of macOS scaling, LoDPI and HiDPI resolutions.

What is LoDPI and HiDPI?

By default, macOS uses LoDPI rendering for screens that do not meet a certain pixel density (which is measured in PPI - Pixels Per Inch) and HiDPI rendering for high density screens. With LoDPI rendering, 1 framebuffer pixel is matched to 1 logical pixel (a logical pixel is an addressable unit in the coordinate system of the screen, referenced via integer coordinates). This means that for a display with a LoDPI resolution of 2560x1440 (QHD), the image will be rendered and stored in a video memory area (called the framebuffer) that holds 2560x1440 pixels. If the framebuffer resolution matches the display's physical resolution (this is called 1:1 mapping - in this example that would be a QHD 1440p display), each physical pixel shows one framebuffer pixel (corresponding to one logical pixel).
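The density that drives this decision can be computed from the pixel dimensions and the diagonal size. A minimal Python sketch (the `ppi` helper and the example sizes are just for illustration - the exact PPI cutoff macOS uses is an internal detail):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch, from the panel's pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" QHD (2560x1440) panel sits in LoDPI territory:
print(round(ppi(2560, 1440, 27)))   # 109
# A 27" 5K (5120x2880) panel is dense enough for HiDPI by default:
print(round(ppi(5120, 2880, 27)))   # 218
```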

Contrary to this, for high resolution displays HiDPI rendering is enabled. In HiDPI mode, 4 framebuffer pixels are matched to 1 logical pixel. A HiDPI resolution of 2560x1440 is backed by a 5120x2880 (5K) framebuffer, which is 4x as large as the corresponding LoDPI framebuffer would be. This greatly improves the clarity of the rendered image. If the display's resolution matches the framebuffer resolution (1:1 mapping - in this example it would be a 5K display), all this extra clarity directly appears on the screen.

macOS uses an additional technique, called bitmap (raster) scaling, that can resize (scale) the image in the framebuffer to the physical resolution of the display. This allows macOS to properly show (in our example) the 5K HiDPI framebuffer (2560x1440 logical resolution, 5120x2880 pixel resolution) on a display with a different - usually lower - resolution, for example a 4K display (which actually has 3840x2160 pixels). This way the 4 framebuffer pixels of a logical pixel will match up to about 2.25 pixels (on average) of the 4K screen. This scaled image - with a framebuffer resolution of 5120x2880, a scaled physical resolution of 3840x2160 and a logical resolution of 2560x1440 - will be displayed on screen, still giving sufficient clarity while keeping the size of the macOS desktop optimal (for a 27" or 32" 4K screen, for which a 2560x1440 logical resolution is recommended). In this case 1:1 mapping is not possible, but the screen still looks much sharper and vastly superior to a LoDPI (2560x1440 QHD) screen. This is called fractional scaling. Fractional scaling might be detrimental to some use cases (for example, pixel artists might not enjoy it and some LCD test patterns will look weird), but the vast majority of users will be happy with it.
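The pixel arithmetic above can be checked with a short Python sketch (the variable names are illustrative only, not anything macOS exposes):

```python
# HiDPI: 4 framebuffer pixels back 1 logical pixel (2x in each dimension).
logical = (2560, 1440)
framebuffer = (logical[0] * 2, logical[1] * 2)   # (5120, 2880) - a 5K framebuffer
physical = (3840, 2160)                          # a 4K panel

# Linear scale factor the raster-scaling engine applies to fit the panel:
scale = physical[0] / framebuffer[0]             # 0.75

# Physical pixels covered by each logical pixel, on average (area ratio):
px_per_logical = (physical[0] * physical[1]) / (logical[0] * logical[1])

print(scale, px_per_logical)   # 0.75 2.25
```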

Unlocking HiDPI resolutions, creating arbitrary scaled resolutions

BetterDisplay lets you unlock HiDPI rendering for LoDPI displays and create arbitrary resolution (framebuffer) sizes on both Intel and Apple Silicon Macs. This mostly makes sense if you are not satisfied with the default scaling options provided by macOS, or if you use a mid-PPI display (like a 24" QHD display) whose PPI is high enough that the default resolution (1440p for a QHD display) makes the GUI too small, but a fractionally scaled interim HiDPI resolution fixes this. For a display like this, 1920x1080 HiDPI (with a 4K framebuffer) will not only make the GUI the right size, it will improve clarity as well. BetterDisplay also has a related feature (native smooth scaling) that enables flexible HiDPI resolutions, so you can change the desktop scaling on-the-fly to any % of the screen, like it is possible on Windows or Linux (note: this still uses bitmap scaling and does not convert the desktop to be vector based).
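As a rough illustration of the framebuffer sizes involved (the `hidpi_framebuffer` helper is hypothetical, not a BetterDisplay API):

```python
def hidpi_framebuffer(logical_w: int, logical_h: int) -> tuple[int, int]:
    """Framebuffer backing a HiDPI logical resolution: 2x in each dimension."""
    return (logical_w * 2, logical_h * 2)

# A 1920x1080 HiDPI resolution on a 24" QHD panel is backed by a 4K
# framebuffer, which is then raster-scaled to the panel's 2560x1440 pixels:
print(hidpi_framebuffer(1920, 1080))   # (3840, 2160)
```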

HiDPI for low PPI displays?

Unlocking HiDPI has some (but limited) use for low PPI displays (like a 27" QHD display). For a 27" QHD display, HiDPI rendering with a 5K framebuffer only creates a kind of antialiasing/supersampling effect that makes everything look smoother (though it has real benefits like high resolution screenshots or accessibility zooming). This is because for a 27" QHD 1440p display, a HiDPI 5K framebuffer still results in 1 physical pixel mapped to 1 logical pixel - and although with HiDPI enabled each logical pixel contains 4 framebuffer pixels carrying all the extra detail and clarity, these 4 pixels are averaged back down to just 1 physical pixel when shown (as the scaling engine scales the 5K image back to the display's QHD resolution). The antialiasing/smoothing effect comes from the various smoothing and image enhancing algorithms used by the scaling engine (and for some scenarios it might actually be detrimental, creating only a semi-1:1 pixel mapped screen, as the scaling engine - due to its inherent use of averaging - lets adjacent pixels bleed over into each other somewhat). There is also the performance (and memory) impact of running a 4x framebuffer and doing 4x the rendering work just to get a slightly smoothed image.
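The averaging step can be mimicked with a naive box filter in Python (real scaling engines use more sophisticated filters; this sketch only illustrates why adjacent framebuffer pixels bleed into each other on screen):

```python
def downsample_2x2(fb: list[list[int]]) -> list[list[float]]:
    """Average each 2x2 block of a grid of gray levels down to one value,
    like scaling a 5K framebuffer back to a QHD panel."""
    h, w = len(fb), len(fb[0])
    return [[(fb[y][x] + fb[y][x + 1] + fb[y + 1][x] + fb[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white edge that is not aligned to a 2x2 block boundary
# in the framebuffer becomes a softened gray edge on the physical screen:
fb = [[0, 255, 255, 255],
      [0, 255, 255, 255]]
print(downsample_2x2(fb))   # [[127.5, 255.0]]
```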

Lack of subpixel rendering on macOS - why text looks bad on low PPI displays?

It is also important to note that HiDPI rendering for low PPI screens will not make text look any sharper on macOS - for that, another technique, subpixel rendering, would be needed (this takes into account the red, green and blue subpixel structure of an LCD screen to effectively triple the horizontal physical resolution and, with some clever tricks, render fonts in a way that looks more pleasing - and feels sharper - to the eye). This was removed from macOS a while back, after Apple transitioned its own product line to high PPI displays (subpixel rendering is fundamentally incompatible with fractional bitmap scaling, which is the cornerstone of Apple's current approach with macOS - subpixel rendered structures do not survive raster scaling - so the removal should be understood in this light, not as something sinister on Apple's part). This is why text does not look as nice on low PPI displays on macOS as it does on Windows or Linux. Modern Windows apps - and in large part the Windows desktop in general - are vector based, which is actually superior to how macOS does things, but in return it was not easy to implement with all that legacy code: it took Microsoft years and multiple iterations - riddled with horrible user experiences - to get to where Apple jumped in under a year with the raster scaling method in terms of resolution independence.
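The "tripled horizontal resolution" idea can be sketched conceptually in Python (`to_subpixel_columns` is a hypothetical illustration of RGB-stripe addressing, not how any OS implements font rendering):

```python
def to_subpixel_columns(row: list[tuple[int, int, int]]) -> list[int]:
    """Flatten a row of (r, g, b) pixels into its horizontal subpixel stripes.

    On an RGB-stripe LCD, each physical pixel is three side-by-side stripes,
    so a row of n pixels offers 3n horizontally addressable light sources."""
    return [channel for pixel in row for channel in pixel]

row = [(255, 0, 0), (0, 255, 0)]
print(len(row), len(to_subpixel_columns(row)))   # 2 6
```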

More information about subpixel rendering: https://en.wikipedia.org/wiki/Subpixel_rendering