Hacker News

The Windows high resolution "support" is done in a very bad way, and I think Windows has the worst support for higher resolutions of any operating system. It only makes some items "larger" so they appear "normal" at big resolutions, but everything else is left alone. Plus, what are they going to do for 4K displays? Increase the scaling to 300%?

They should've done it like Apple did it, and it would've been much more streamlined and would make a lot more sense. Here's how they should've done it.

With resolutions higher than 1080p you shouldn't actually get more density in terms of content per screen real estate (what's the point of that? 1080p makes things small enough as it is). Instead they should only support resolutions beyond 1080p that are exactly "double" (4x the pixels) of a lower resolution. That way, those high resolution displays can use the "effective" lower resolution.

So 2732x1536 -> effective 1366x768

3200x1800 -> effective 1600x900

3840x2160 ("4k") -> effective 1920x1080

This is the best way to jump to higher resolutions and easiest way to support them at the OS level, instead of these icon scaling "hacks" that Microsoft is implementing.



Apple was able to do it the way they did because there are so few choices in Apple hardware. Sure, the "double-or-nothing" approach simplifies things, but it's not a practical approach for large ecosystems like Windows or Android where the resolutions vary a lot more.


Arguably if Windows only supported double-or-nothing, the PC vendors would just start shipping appropriate screens.


But that would only work on new hardware, not on the millions of existing machines people will upgrade. Windows would still have to support the old resolution model for those older machines, which reduces the incentive for hardware makers to ship higher resolution screens, since they could get away with older, crappy ones.


If an OS dropped my 30" display to 1280x800, I would never use it. And assuming everyone has a 16:9 display is a common mistake as well.

Telling people to double up or deal with shitty performance is a much worse proposition than what Microsoft is doing.


... except that Windows has supported resolutions higher than 1920x1080 for 15 years already. Should they remove support for 1920x1200 and 1600x1200 displays from future releases?


You obviously didn't even read the linked article since it has screenshots demonstrating that Windows has full support for DPI-based scaling of entire UIs, not 'icon scaling'.

Windows has had DPI-based scaling of user interfaces since Windows 95, where you would set your display DPI and all applications on the system would (theoretically) adapt. The problem is that app developers have historically been completely deficient in this regard; in practice they either hard-code pixel-perfect layouts (but don't lock the font sizes, so text gets cut off), or half-ass it and get the DPI scaling completely wrong by starting to implement it and then stopping. It has literally been possible for Win32 applications to do everything that an OS X/iOS Retina application does since 1995. This feature was even supported in Visual Basic!
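What "DPI aware" has meant since the Win32 days boils down to one multiplication: layouts are authored against a 96 DPI baseline, and a correctly written app scales every coordinate by dpi / 96 at render time. A minimal sketch of that arithmetic (function and variable names are illustrative, not a real Win32 API):

```python
BASELINE_DPI = 96  # the Windows logical baseline since Windows 95

def scale_rect(rect, dpi):
    """Scale a (left, top, width, height) rect from 96 DPI logical
    units to device pixels at the given display DPI."""
    factor = dpi / BASELINE_DPI
    return tuple(round(v * factor) for v in rect)

button = (10, 10, 80, 24)       # authored at 96 DPI
print(scale_rect(button, 96))   # (10, 10, 80, 24) -- unchanged
print(scale_rect(button, 144))  # (15, 15, 120, 36) -- 150% scaling
```

Apps that hard-code the 96 DPI coordinates but let the fonts scale are exactly the ones that end up with clipped text.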

In Windows Vista, Microsoft responded to this by adding a new system where unless an application explicitly told the window manager 'yes, I'm actually DPI aware', the window manager assumes that the app will completely muck up DPI scaling, and it renders to a lower-resolution window buffer and scales it up so that text/object sizes are appropriate for your display DPI. Despite this, there are still applications that tell the window manager 'I'm DPI aware!!!' when they're not. Note that this scaler uses an actual scaling algorithm, unlike Apple's nearest-neighbor, so text scaled up in this fashion remains perfectly readable (albeit blurry), unlike the complete mess Apple turns Cleartype text into.
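The Vista-era fallback described above is conceptually just a buffer stretch: the app draws into a 96 DPI-sized buffer, and the window manager scales that buffer up to the display DPI. A toy nearest-neighbor stretch shows the idea (per the comment above, Windows actually uses a filtered algorithm rather than nearest-neighbor, which is why text scaled this way stays readable but blurry; this sketch is for illustration only):

```python
def stretch(buffer, factor):
    """Upscale a 2D list of pixels by an integer factor, nearest-neighbor:
    each source pixel becomes a factor x factor block."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in buffer
        for _ in range(factor)
    ]

src = [[1, 2],
       [3, 4]]
print(stretch(src, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```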

In practice the problem here is ENTIRELY developers and consumers, not Microsoft. Consumers buy (and continue to buy) displays that have resolutions that are not an even integral multiple of some other display resolution, and continue to buy applications that are not correctly DPI aware. Developers respond to this by continuing to ship broken applications that don't respond correctly to display DPI.

Microsoft could do whatever they wanted, including directly mirroring Apple's approach, and none of this would change.

Apple's approach only works because they have a complete monopoly on their platform and they use it to force developers to waste resources on whatever changes they introduce - a new approach to DPI awareness and rendering that requires introducing 2x versions of all your UI bitmaps, a new sandboxing mechanism and app store that requires it, a new UI toolkit, new font rendering APIs, etc. Usually Apple at least uses this power to improve things for consumers, but it's naive to look at how Apple handled the Retina transition and say 'if only Microsoft had done that too' - the Retina transition was incredibly expensive for developers and continues to be expensive for end-users (by making shipped applications larger and potentially slower and definitely more complex).

Don't even get me started on the blatant stupidity Apple's approach to Retina introduced into HTML5/Canvas/WebGL. getImageDataHD and devicePixelRatio, hurray!
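The canvas wrinkle being complained about reduces to more arithmetic: on a high-DPI display the canvas backing store has to be devicePixelRatio times larger than its CSS size, and every drawing call has to account for the ratio. A sketch of that sizing rule (the function name is made up for illustration; in a real page this would be JavaScript against the canvas element):

```python
def backing_store_size(css_width, css_height, device_pixel_ratio):
    """Physical canvas buffer size needed for crisp output at the
    given devicePixelRatio."""
    return (round(css_width * device_pixel_ratio),
            round(css_height * device_pixel_ratio))

print(backing_store_size(300, 150, 2))    # (600, 300) on a 2x display
print(backing_store_size(300, 150, 1.5))  # (450, 225) at Windows 150%
```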


Let's review the evidence: Windows apps that claim to support high DPI are mostly broken, but Mac/iOS apps that claim to support retina actually do support retina. If the retina approach is "incredibly expensive for developers", then how expensive must the Windows 95 approach be?


The difference is that Apple forced developers to implement it, so they did. Microsoft didn't force developers, so they didn't implement it. That's it.

The fact that it's optional means that the cost is something developers (and indirectly, customers) can CHOOSE to pay if it is worthwhile. Compare this to Retina, which is basically non-optional because Apple ensured that non-retina applications are an eyesore with reduced text legibility.

I certainly won't argue that DPI-aware programming is easy on any platform. But Retina is not some superlative panacea: It's expensive too, and it has really significant, notable downsides. Like how it basically ruined the rendering model for Canvas/WebGL.


Somebody should make Apple stop forcing developers to do things. That Apple made the choices very simple and deployed hardware widely had nothing to do with it.

Apple forces us into the future while the Luddites kick and scream.


The problem of making things look properly sized across displays with different resolutions and densities was solved more than a decade ago by the game industry.

Is there a reason outside of legacy code base that everything in the OS is not vector based?


There are still enough situations where pixel alignment makes a noticeable improvement in performance and crispness.

Lots of displays out there are still 96 DPI.



