I have been using only 4K monitors on Linux for at least a decade and I have never had any problem with fractional scaling.

I continue to be puzzled whenever I hear about this supposed problem. AFAIK, it is something specific to GNOME, which has a setting for enabling "fractional scaling", whatever that means. I do not use GNOME, so nothing has ever prevented me from using any fractional scaling factor that I liked for my 4K monitors (most frequently connected to NVIDIA cards), going back a decade, simply by setting whatever value I desired for the monitor DPI.
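
For the record, on a plain X11 setup this needs no desktop-environment support at all; a minimal sketch, assuming a target of 144 DPI (any value works, integer multiple of 96 or not):

    # Tell Xft-based toolkits the desired DPI; 144 DPI is a 1.5x
    # scale relative to the traditional 96 DPI baseline.
    echo "Xft.dpi: 144" >> ~/.Xresources
    xrdb -merge ~/.Xresources

    # Or set it for the running session:
    xrandr --dpi 144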

All GUI systems already had fractional scaling before 1990, including the X Window System and MS Windows, because all of them had a dots-per-inch setting for the connected monitors. Already around 1990, and probably much earlier, the recommendation for writing GUI applications was to specify fonts and all other graphic elements only in typographic points or in other display-independent units.

For any properly written GUI program, changing the DPI of the monitor has always provided fractional scaling without any problems. There have always been some incompetent programmers who have used dimensions in pixels, making their graphical interfaces unscalable, but that has been their fault, not the fault of the X Window System or of any other window system that they abused.
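
The arithmetic behind display-independent units is trivial; a minimal sketch in C of the conversion any DPI-aware toolkit performs internally (assuming the standard 72-points-per-inch typographic convention):

    /* Convert typographic points (1/72 inch) to device pixels for a
       monitor with the given DPI. Any DPI yields a valid scale factor;
       nothing restricts the result to integer multiples. */
    double points_to_pixels(double points, double dpi) {
        return points * dpi / 72.0;
    }

    /* A 12 pt font: 16 px at 96 DPI, 24 px at 144 DPI, 21 px at 126 DPI. */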

It would have been better if no window system had allowed dimensions given in pixels in any API function. Forty years ago there was the excuse that scaling graphic elements could sometimes be too slow, so dimensions in pixels could improve performance, but that excuse was already obsolete a quarter of a century ago.



> For any properly written GUI program, changing the DPI of the monitor has always provided fractional scaling without any problems.

The parent comment was talking about Wayland, and Wayland does not even have a concept of display DPI (IIRC, XWayland simply hardcodes it to 96).

You're correct - in theory. In practice, though, it's a complete mess and we're already waaay past the point of no return, unless, of course, somehow an entirely new stack emerges and gains traction.

> There have always been some incompetent programmers who have used dimensions in pixels

I don't have any hard numbers to back this up, but I have a very strong impression that most coders use pixels for some or all of their dimensions, and a lot of them mix units in weird ways. I mean... check this very website's CSS and see how it has an unhealthy mix of `pt`s and `px`es.
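
To illustrate the kind of mixing I mean, here's a made-up fragment (not the site's actual stylesheet) in the same spirit:

    /* font size in nominally resolution-independent points... */
    body { font-size: 10pt; }

    /* ...but spacing and borders in pixels, so the proportions drift
       as soon as the effective DPI changes */
    td  { padding: 2px; }
    img { border: 1px solid #828282; }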



