For me, Mathematica is much more akin to numpy+sympy+matplotlib+... with an absolutely crazy amount of batteries included in a single coherent package with an IDE and fantastic documentation. In a way the numpy ecosystem already "won" industry users over, yet the Wolfram stack is still appealing to me personally for small experiments.
Dimming reduces total brightness over time and shifts color balance away from neutral. The latest "Pro" displays from Apple now have built-in support for calibration, but only with high-end calibration equipment: https://support.apple.com/en-gb/guide/mac-help/mchl628f5edf/...
The implementation is great, since it doesn't add a fix-up profile on top of the factory calibration, but actually fully replaces the factory calibration's internal LUT. It worked wonders to completely fix my M1 MBP screen that had gotten noticeably tinted over time. I don't mind the brightness reduction since I almost never use it at more than 200 nits, usually around 100. A nominal 1600 nits leaves lots of buffer for decay over time.
I've had a similar issue with my OLED TV, with the same fix. Got my LG C1 calibrated as well and it looks fantastic again.
It's a shame there are no iOS or Android phones that support calibration out of the box. Some iPads support a subset of the Pro Display calibration software (called fine-tune calibration), but still lack full recalibration support.
Don't get me wrong, it's a fantastic product at a great price point, but the only thing it makes me think of is the complete failure of iPadOS. An ultra-portable MacBook with an A18 and 8 GB of RAM would be infinitely more useful to me (for non-pen input) than a full M4/M5 chip with more RAM that's completely wasted due to needless OS restrictions.
Does Apple make a profit on the iPad Pro and Air, do you think? Is it a "failure"?
Their mere existence screams "our iPads are a gazillion times better and more powerful than Android tablets". Remember, they NEED to have such a reputation to charge luxury-product prices (for tablets and otherwise).
Think about market segmentation. The iPad and Neo are for students and everyday/coffee-table computing. The iPad Pro, MacBook Pro with M5 Max, Mac Studio, Studio Display, Watch Ultra, etc. are chasing a completely different market (niche power users and vanity purchases).
I might be the only one, but to this date (and dating all the way back to 2014 with the first 5K iMac display), Apple is the only company that truly gets HiDPI desktop displays, with high-quality gloss and 200+ ppi at screens this large. In the meantime, popular and widely sold gaming screens with matte blur filters and mediocre ppi give me a headache and eye fatigue after a few hours of use. The prior-generation Studio Display is the only external display that truly worked for text-heavy work with my eyes (including software engineering), and I'm sure the latest generation is fantastic as well.
The hardware is great, but the software is lacking. macOS only supports resolution-based scaling, which makes anything but the default 200% pixel scaling mode look bad. For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry because macOS renders at a higher resolution and then downscales to the 4K resolution of the screen.
Both Windows and Linux (Wayland) support scaling the UI itself, and with their support for sub-pixel anti-aliasing (that macOS also lacks) this makes text look a lot more crisp.
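A back-of-the-envelope sketch of the macOS pipeline described above, for a 27" 4K panel (a rough model of the commonly documented "render at 2x, then downsample" behaviour, not Apple's actual code; the "looks like" sizes are illustrative):

    # macOS renders the desktop at 2x the chosen "looks like" size,
    # then downsamples that backing store to the panel's native pixels.
    PANEL = (3840, 2160)  # 27" 4K native resolution

    def macos_scaling(looks_like_w, looks_like_h):
        backing = (2 * looks_like_w, 2 * looks_like_h)
        ratio = backing[0] / PANEL[0]  # resample factor down to the panel
        return backing, ratio

    for looks_like in [(1920, 1080), (2560, 1440), (3008, 1692)]:
        backing, ratio = macos_scaling(*looks_like)
        note = "pixel-perfect" if ratio == 1.0 else "resampled"
        print(looks_like, "->", backing, f"{ratio:.2f}x panel,", note)

Only the "looks like 1920x1080" (200%) case maps 1:1 onto the panel; every other choice gets resampled, which is the blur being described.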
I would love to see examples of this. I have a MBP and a 24" 4K Dell monitor connected via HDMI. I use all kinds of scaled resolutions and I've never noticed anything being jagged or blurry.
Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.
And then Windows has serious problems with old apps - blurry as hell with a high DPI display.
Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently than on a conventional LCD.
[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.
I'm more surprised that you're using a 24" display at any resolution. Of course, everyone has different preferences, but that just seems ridiculously small considering how available larger displays are at the same ppi (and probably the same refresh rate).
I'm personally on the old 30" 16:10 2560x1600 form factor, and it's wildly better visually than the 27" 1440p screen by the same brand (all of them Dell) I use at the office.
> I'm more surprised that you're using a 24" display at any resolution
I have a 24" 4K Dell I bought when big 4K screens with good (measured) colors were still expensive. It's a very pleasant screen to use. Sure, it has less real estate than a bigger one, but this is somewhat mitigated by the fact that I can keep it closer to my eyes, so I can use smaller text.
I find it makes me more "focused" in a way. Can't have multiple windowfuls of crap visible at the same time. It's very practical for TWMs. It also works well in a dual screen scenario, for stronger separation when you need it, but I'm still not sure if a single bigger screen is better than two smaller ones for things like having docs up next to code for example.
I find I can't use two 27" or higher screens, they're just too big and I need to turn my head way too much for comfort. At work we have a 2x27" 4K setup, and I basically only use the screen in front of me. Lately I've been experimenting with pushing them very far away, but then I just need to increase text size and lose actual real estate.
> but that just seems ridiculously small considering how available larger displays are at the same ppi (and probably the same refresh rate)
I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI. Which screens are easily available with the PPI of a 4K 24"? I'd expect something like 5k 27" or 6k 32". These are very expensive (>1000 € for a crappy 27" Samsung, 2000 for a 32" Dell) and not that common, at least in France.
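For reference, a quick sanity check of the densities in question (the 6K numbers assume a 6016x3384 panel, as on the Pro Display XDR):

    from math import hypot

    def ppi(diag_inches, px_w, px_h):
        # pixels along the diagonal divided by inches along the diagonal
        return hypot(px_w, px_h) / diag_inches

    print('24" 4K:', round(ppi(24, 3840, 2160)), "ppi")  # ~184
    print('27" 5K:', round(ppi(27, 5120, 2880)), "ppi")  # ~218
    print('32" 6K:', round(ppi(32, 6016, 3384)), "ppi")  # ~216

So 5K 27" and 6K 32" are indeed the closest matches, and both are actually a notch denser than the 4K 24".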
> I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI.
I feel basically the same way, and I don't like excessively wide screens, or even 16:9. I've always preferred 16:10, and have wavered between 1, 2, and 3 screens over time. 16:9 27" 1440p is not a pleasant form factor, but it's fine in vertical mode.
I tend to prefer PPI, but not at the cost of screen real estate, and I tend to prefer 120hz, but not at the cost of PPI or picture quality. So the Dell Ultrasharp 30" series from years ago, with IPS 60hz and 2560x1600 is perfect for now, and it also lets me run games without investing substantially in brand new gaming PC hardware. The picture quality is great, the price on the used market is great, screen real estate is great, it's just not as sharp or fast as my Mac screen.
I've got my eyes on 32" 6K displays, but since they're so ungodly expensive, I'd really prefer them to have 120 Hz and good HDR, even though those aren't priority attributes for me. I'd keep one of the 30" displays next to it in vertical mode for documentation or log files.
> I'm personally on the old 30" 16:10 2560x1600 form factor
I sorta wish that form factor had taken off instead of 27" 1440p. The extra vertical space is really nice, and that seems to be the ideal PPI for 100% scaling IMHO.
I keep telling myself I'd like to get a 4K OLED display at the same PPI, but 40" seems to be conspicuously missing in every monitor lineup... at least at a price that will convince me to buy three of them, anyway.
Agree! I still have several (now discontinued) Philips 40-inch monitors, and that is the perfect size for programming work. Very little scrolling needed while you work. But I would love to have a 40-inch at 4K+ instead of 2560x1600. Why is no one making these? (I did get a Samsung 8K 50-inch, but that's too large for a multi-screen setup.)
Ya idk what people are getting from ultrawides tbh. They're not great for video, not great for my neck, not enough vertical space, and can be disorienting for gaming. I can certainly imagine scenarios that would make them effective, but I'd just rather have more vertical space
I took one of my dual 24" office monitors during Covid WFH and ended up keeping it when I quit that job. I use it as a second display alongside the MacBook which is on a stand.
I think the largest I would want at my current desk is 27". 30 is way too big for me. But more importantly I want something that matches the crispness of the MBP display, and 1440p and 1600p are too low res.
I have a Macbook pro and a Linux machine attached to my dual 4k monitors.
Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than on the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.
This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read. I ran the 5K Studio Display at 4K scaled for a bit but it was noticeably blurry.
This would've been easily solved with non-integer scaling, if Apple had implemented that.
(I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
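For the curious, the arithmetic behind that "works out similarly" claim, as a rough sketch (1.8 m chosen as a midpoint of the 1.5-2 m range):

    from math import hypot, tan, radians

    def pixels_per_degree(diag_in, px_w, px_h, distance_m):
        ppi = hypot(px_w, px_h) / diag_in             # panel density
        dist_in = distance_m / 0.0254                 # viewing distance in inches
        one_deg_in = 2 * dist_in * tan(radians(0.5))  # screen span of 1 degree
        return ppi * one_deg_in

    print('48" 4K at 1.8 m:', round(pixels_per_degree(48, 3840, 2160, 1.8)))  # ~114
    print('27" 4K at 1.0 m:', round(pixels_per_degree(27, 3840, 2160, 1.0)))  # ~112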
All through the 2000s, Apple developed non-integer scaling support in various versions of Mac OS X under the banner of “resolution independence” - the idea was to use vectors where possible rather than bitmaps, so the OS UI would look good at any resolution, including non-integer scaling factors.
Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Sadly, Apple gave up on the idea, and integer scaling is where we are.
Here’s a developer blog from 2006 playing with it:
There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.
Here’s a news post from all the way back in 2004 discussing the in-development feature in Mac OS X Tiger:
Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have permitted you to scale the UI to totally arbitrary sizes while maintaining sharpness etc.
Yep, I played with the User Interface Resolution app myself back then in uni. The impact of Apple's choice to skip non-integer scaling didn't hit me until a few years ago when my eyes started to fail...
> This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read.
> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?
Glasses would have been the "normal person" fix, but my eyes are great otherwise (better than 20/20 distance vision). So I could focus closer with glasses, but the lenses were worse quality than just sitting farther back.
If you can get used to using it (which really just requires some practice), the screen magnifier on Mac is fantastic and most importantly it’s extremely low latency (by this I mean, it reacts pretty much instantly when you want to zoom in or out).
Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.
As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.
> For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry
I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.
I just tested on my 4K display, and 150% and 175% were not blurry at all. I'm on a 32-inch 4K monitor. Is it possible this information is out of date and was fixed by more recent versions of macOS?
Interesting, maybe it just doesn't bother me, because I do not notice it at all. I was looking at black text on a white background. Maybe it's less of an impact on QD-OLEDs with their pixel layout? I just checked, and I actually run my ultra-wide monitor at 125% scaling and the text looks crisp. That one is a regular LED display, but it does have really high pixel density (5120 x 2160; I run it at 3360x1418).
4K pixels is not enough at 27" for Retina scaling.
Apple uses 5K panels (5120x2880, exactly 2x a 2560x1440 workspace) in their 27" displays for this reason.
There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.
There are also driver boards that let you convert 27" 5K iMacs into external monitors. It's not something I recommend lightly, because it's not an easy mod, but it's within reason for the motivated Hacker News audience.
If your Mac goes bad it can be worthwhile. My friend gave me his pre-Retina 27" iMac, part of the circa-2008 generation of Macs whose GPUs all failed.
I removed all the computing hardware but kept the Apple power supply, instead of using the cheapo one that came with the LCD driver board I bought. I was able to find the PWM specs for the panel, and installed a cheap PWM module with its own frequency & duty-cycle display to drive it and control brightness.
The result is my daily desktop monitor. Spent way too much time on it, but it works great!
Wayland supports it (and Chrome supports it very well) but GTK does not. I run my UI at 200% scaling because graphical Emacs uses GTK to draw text, and that text would be blurry if I ran at my preferred scaling factor of 150% or 175%.
GTK uses Pango/Harfbuzz and some other components to draw text, all of which are widely used in other Linux GUI stacks. GTK/GDK do not draw text themselves, so your complaints are not with them directly.
This works with GTK for me at least. I've been using Gnome+Wayland with 150% scaling for almost 4 years now, and I haven't noticed any issues with GTK. Actually, my experience is essentially backwards from yours—anything Electron/Chromium-based needed a bunch of command-line flags to work properly up until a few months ago, whereas GTK apps always just worked without any issues.
If you're using a high-DPI monitor, you might not notice the blurriness. I use a standard 110-DPI monitor (at 200% scaling in Gnome) and I notice it when the scaling factor is not an integer.
Or more precisely, I noticed it eventually as a result of my being primed to notice it after people on this site insisted that GTK cannot handle fractional scaling factors.
Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice (even on a standard 110-DPI monitor used at 150% and 175% scaling) any blurriness in those apps. The app I'm most conditioned to notice blurriness in is my browser, and Chrome's viewport is resolution-independent except when rendering certain image formats -- text is always non-blurry.
Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default (and for years before that could be configured to talk Wayland), so I don't consider that worth talking about. If Xwayland is not involved, the contents of Chrome's viewport are non-blurry at all scaling factors, except for the PNGs, JPGs, etc. For a long time, when run at a fractional scaling factor under Gnome (and configured to talk Wayland), the only part of Hacker News that was blurry was the "Y" logo in the top left corner; then about 2 years ago that logo's PNG file was replaced with an SVG file, and the final bit of blurriness on HN went away.
> If you're using a high-DPI monitor [...] I use a standard 110-DPI monitor (at 200% scaling in Gnome)
FWIW, I'm using a 184 DPI monitor with 150% scaling.
> you might not notice the blurriness. [...]
> Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice
I'm pretty sensitive to font rendering issues—to the point where I've complained to publishers about their PDFs having unhinted fonts—so I think that I would have noticed it, but if it's really as subtle as you say, then maybe I haven't.
I do have a somewhat unusual setup though: I'm currently using
$ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer','xwayland-native-scaling']"
although that might not be required any more with recent versions. I've also enabled full hinting and subpixel antialiasing with Gnome Tweaks, and I've set the following environment variables:
So maybe one of those settings would improve things for you? I've randomly accumulated most of these settings over the years, so I unfortunately can't really explain what (if anything) any of them do.
> Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default
Ah, good to hear that that's finally the default; that probably means that I can safely remove my custom wrapper scripts that forced those flags on.
Sorry, but I haven't ever used a Mac, so I unfortunately can't answer that. I've used Windows with fractional scaling, and most programs aren't blurry there, but the few that don't support fractional scaling are really blurry.
While the original OS X display model, Quartz, evolved from Display PostScript via NeXTSTEP, I believe that it shifted back to pixel rasterization to offload more of the display stack onto the GPU.
Quartz Extreme?
John Siracusa, Ars Technica:
It's possible that existing consumer video cards could be coerced into doing efficient vector drawing in hardware. Apple tried to do just that in Tiger [note], but then had to back off at the last minute and disable the feature in the shipping version of the OS. It remains disabled to this day.
Yeah this is correct, I don't know why you're being downvoted. The decisions Apple made when pivoting their software stack to high-DPI resulted in Macs requiring ultra-dense displays for optimal results - that's a limitation of macOS, not an indictment of less dense displays, which Windows and Linux accommodate much better.
I bought that original 5k iMac on release day in 2014. I was thrilled with that display, and stoked to see the entire display industry go the route of true quadruple-resolution just like smartphone displays did.
Sadly, it basically never happened. There was the LG display that came out a couple of years later. It didn't have great reviews, and it was like two thirds the cost of an entire 5k iMac.
It took Apple over 7 years to release their standalone 5k display, and there are a few other true 5k displays (1440p screen real estate with quadruple-resolution, not the ultrawide 2160p displays branded as "5k") on the market now with prices just starting to drop below 1,000 USD.
Unfortunately in that time I've gotten used to the screen real estate of the ultrawide 1440p monitors (which are now ubiquitous, and hitting ridiculous sub-$300 prices). As of now, my perfect display for office work (gaming, video/photo work, or heavy media playback are different topics) would be 21:9 with 1440p screen real estate with quadruple-resolution—essentially just a wider version of that original 5k iMac display.
I bought an LG UltraFine 5K at the time and felt kind of stupid for spending that much on it. But nearly 10 years later... it's still my daily driver. Best ROI of any tech equipment I've bought. It changed how I think about it: not just the monitor, but having the speakers / camera / mic built in, all over one cable. It's been such a joy, when I bounce around the house, to be able to plug in / unplug so easily, or when I swap from work to personal laptop. It's such a simple setup. I'm definitely considering the Apple one, basically regardless of what it costs, once it's time. It's simply been too convenient to have a one-plug solution for the laptop that has everything I need, never breaks (my LG may be the exception here lol), and that has somehow taken forever to be superseded by something better.
The only thing that holds back that thought lately is that I'm suddenly spending more and more time in multi-pane terminals, and my screen real estate needs have dropped. The only two things I greatly miss now on my laptop are keyboard quality and general comfort (monitor height, etc.).
The iMac Pro is nearly 9 years old at this point. At the time, there was no other option for a retina-quality 27" display, but you could get a 4k 27" for $400.
A decade later, it boggles my mind that it's so hard to find a retina-class desktop monitor. The successor to the Cinema Display is basically an iMac, and priced like it. There have very recently been releases from ASUS and BenQ, but it still feels like an underserved niche rather than a standard expectation.
I am begging anyone reading this to please thoroughly test anything that comes out of ASUS before committing. Maybe only purchase with a generous return policy and possibly insurance. They are decent panels, but everything around the panel is horrendous: random connection errors with different machines, poor UX for switching inputs, takes a millennium to boot up and connect to the screen, forget about any support, and if you have built-in speakers you'd be better off with a tin can connected to your computer.
I've frankly had worse experiences with Samsung and better experiences with LG. The model I have is pretty bare-bones, which is much better than the Samsung 27-inch 5K I had that just died on me after a couple of years. The LG 28-inch 4K is going on its 6th year. I think if I buy a 6K, I'll wait for the LG to come down in price a bit ($2k for the LG vs $1300 for the Asus on Amazon).
They all suck in their own ways. In my experience LG has random hardware failures (like one audio channel just dying; how? I don't know), still kinda slow booting (though this has gotten better), and their designs can be hit or miss (terrible stands, aesthetics over ergonomics, etc.). Samsung has been better for me but suffers from variations of the above.
These brands all have glowing fans online pushing their products (the flamewars about ASUS made me hesitate to even comment), but they burn their reputations customer by customer, and I guess enough have been burned that Apple is able to maintain enough sales.
It was also really disappointing to see 24" 4k displays disappear from the market instead of becoming the new standard resolution for that size. A few years ago, there were several options including a cheap LG that was usually around $300 or less. Those all seem to be gone, likely for good, even though there are still plenty of 24" displays with 1080p and even a fair number with 1440p.
I've been very pleased with my ViewSonic VP2488-4K. A little steep at $550, but if you spend any significant time in front of the screen I think it's very much worth it. I'm planning to buy a second one.
Indeed. I’m holding on to my 24” Dell P2415Q that I got like 10 years ago because it’s the perfect size for my desk and there just isn’t anything in that size to replace it with.
The LG UltraFines were garbage, but got better over time as either the firmware improved or macOS added drivers that worked around the nonsense. For a while I ran with two of them on an iMac Pro (with its own 5K screen), but eventually switched to a single Pro Display XDR with a laptop. I'm very sad to see the 6K/32" form factor disappear; it's by far the best screen I've ever used.
That's pretty good.
I think the sales of monitors have become slow overall, so now they can focus on higher-end stuff to make some money even if it's for niche products at first.
I just saw a brand new display for 70 bucks at a store the other day; the margins must be extremely low.
Because monitors aren't simple. There are dozens of axes along which they can be scaled.
They have resolution (1080p FHD, 1440p QHD, 4K, 5K, 6K, 8K), aspect ratio (16:9, 8:5, 4:3, 3:2, 21:9, 32:9), refresh rate (60 Hz, 75 Hz, 120 Hz, 144 Hz, 165 Hz, 240 Hz, 360 Hz, 480 Hz, 1 kHz, and of course adaptive refresh rate tech including G-Sync), colour quality (depth and accuracy), contrast ratios for HDR, panel technology (LCD-TN, LCD-IPS, LCD-VA, OLED, QD-OLED, WOLED, and now RGB stripe OLED), backlight technology (CCFL, edge-lit LED, miniLED, microLED), connectivity (HDMI/DP, USB-B, USB-C, DP alt mode, Thunderbolt, 3.5 mm, and KVMs).
It's very hard to stuff all this information in one neat model number.
On the consumer's part it makes sense to understand these features and what is necessary for one's use case, filter monitors by said features, and note down the model numbers that satisfy the requirements.
Simplifying their offerings for the sake of the model number doesn't make any sense. Simplifying their offerings for other reasons might make sense, but the companies themselves would be the best judge of whether or not it makes sense for them.
I feel like they do it deliberately, so that you can’t easily research their products and find if they are out of date. They can sell you a monitor from 2012 as if it’s brand new, because you have no idea what it is.
> So apple is just selling generic white labelled slop
There are only ~5 flat-panel manufacturers worldwide: AU Optronics, Innolux, LG Display, Samsung Display, Sharp Display, and recently BOE Display. Apple has to use one of these, even for its bespoke, notched, curved iPhone/iPad displays.
This new 5K 2304-zone panel was developed by LG Display, and is not 'generic white-labelled slop' by any means. It is an extremely good panel in its own right, probably the bleeding edge of LCD technology today achieving top-notch responsiveness, contrast, and colour depth and accuracy.
That MSI monitor will probably retail for ~£800 as will the Asus and LG equivalents, which is not a trivial amount for a monitor. Apple just marked it up 3×, as they are prone to do for anything.
The Apple monitor will likely have better speakers, and I'm not even sure the others will have microphones at all. Apple also does a better job with color accuracy/consistency, at least historically. There's still a sizeable markup, but it's not entirely for nothing.
Back in the day (~15 years ago), when 4K monitors were unheard of and even Apple's high-end displays were still 1440p, you could get a bottom-dollar monitor using one of their panels (e.g. Yamakasi Catleap Q270) for about a third of the price. However, it came with no amenities, a single connector (dual-link DVI only), a questionably legal power cable, and no built-in scaling. The vendors, presumably to prevent refunds, even asked for your graphics card model before selling it to you, because it wouldn't work with low-end cards. Oh, and there were very few in the U.S., so you were typically getting them shipped straight from abroad, customs duties and all.
I disagree; the software and excellent integration in the ecosystem has always differentiated Apple, and even years later, models from ASUS are still headaches when it comes to everything outside the panel. It's like when gamers used to compare Apple spec by spec (i.e. CPU, RAM, disk) and valued all the software they provide at $0.
These days they still value software at $0 but the specs have become quite competitive and many times exceed what the rest of the market offers.
There is another difference between Apple monitors and the rest: standard and peak brightness [1]. All of them are less bright than Apple's monitors. I'd really want to know why.
MSI - 1400 nits, LG - 1250 nits, Apple - 2000 nits. That's peak brightness; standard brightness isn't even mentioned, except for the Apple one. Is it just cooling or something more?
Those are down to backlight technology, which (usually) is independent of the LCD panel itself. With LCDs, though, it's a fine line as extremely bright backlights can lead to bad bloom.
There’s a solid use case for matte screens. I use an 800R curved monitor and there’s absolutely no way that would work for me if it wasn’t matte. I know this because when I glance over at my coworker’s 1200R glossy screen it’s like looking in a funhouse mirror.
Does gloss mean reflective? Like where I can see the room lights reflecting off my screen. I never considered the possibility that someone might consider that a good thing.
In an environment with little to no reflections, gloss looks so much better. It becomes truly transparent with no distraction. Matte displays always have a little frost to them.
If you do most of your computing in a prepared or controlled room, I can see the logic in that, although I think I'm not personally nearly sensitive enough to care.
For me, though, I am frequently working in different rooms with arbitrary lighting situations. The net effect of the gloss is unquestionably negative for me.
What kind of environment is that? Maybe if you're a black person wearing black clothes, no glasses (maybe contacts are ok?) in a room with closed curtains, no lights and nothing reflective, sure.
I used to daily-drive an Apple Thunderbolt Display (the last non-retina one, 2560x1440). That thing was atrocious. I could often see the reflections of my glasses, or a white glare if I was wearing a white shirt. At night, in a dark office (lights off, just whatever came in from the street).
I'm typing this on a matte "ips black" dell ultrasharp something-or-other at 10% brightness, wearing glasses, a white t-shirt, with an overhead light, and see no reflection or glare on my screen. There's no way in hell I'd go back to a shiny screen.
I understand "anti-glare" technology has improved. The most recent Apple screen I've tested is an M1 MBP. It seems somewhat better than my 2013 MBP, but still a worse experience than my 2015 (or thereabouts) 24"@4K Dell, which is pretty old technology. My 2025 Lenovo has a screen that's much more comfortable to use inside.
Paradoxically, I'd say the one environment where I prefer my macs to my matte screens is in bright sunlight. Sure, there are more reflections than you can shake a stick at, but there's always an angle where you can see the part of the screen you want. You have to move around, which is obviously annoying, but you can see. The matte screens just turn to mush. Luckily for me, I hate being out in the sun, so I never encounter this situation in practice.
I think the "frost" you're talking about depends a lot on the screen implementation. I tested once an HP model, 27"@4k, and it did have such an effect. Anecdotally, it didn't handle reflections all that well, either. So maybe it's just a question of lower quality product?
Not anymore, as long as you make sure that any RGB antialiasing is turned off. Linux defaults to disabling this and doing only grayscale antialiasing, so it looks great on an OLED out of the box. Windows can be configured to do this too.
I have no idea what you mean by "Linux defaults to" ... what possible Linux-wide global could there be for antialiasing? Apps are free to turn on different kinds of antialiasing for text rendering all by themselves.
Personally, I can't handle glossy displays, trying to read with reflections gives me a headache. Most other manufacturers offer both glossy and matte, except for Apple, because they know better.
I have an ASUS ProArt Display 27” 5K. And I somewhat regret it.
I love the pixel density. But I don’t love the matte finish. Which is apparently a controversial take. But I really don’t. I like the crisp pop of typography you get with a glossy display. And, for UI design, the matte finish just doesn’t “feel” like the average end-user experience. I am constantly pushing Figma between my laptop display and my monitor to better simulate what a design will look like on an average glossy LCD or OLED display.
LG used to, with the UltraFine 5K (I believe it's discontinued now?).
I got a deal on a used one last year and I love it. It's the only monitor I've used plugged into a MacBook that didn't look notably off (worse) compared to the MacBook's display sitting next to it. The only thing that's a bit jarring is that it's 60Hz, but I can live with it.
> In the meantime, popular and widely sold gaming screens with matte blur filters and mediocre ppi give me a headache and eye fatigue after a few hours of use.
I presume you also mean "when used for text heavy work" here, yes? Or do you mean that these displays tire out your eyes even when used "for what they're for", i.e. gaming? (Because that's a very interesting assertion if so, and I'd like to go into depth about it.)
I constantly see people saying Apple displays are a terrible value. Last Apple display I had was the Thunderbolt 27 but from now on I'm sticking with Apple.
I've had nothing but issues with non-Apple monitors as well. Customer service ime is non-existent if you need a repair. For something I rely on to get work done, I'm starting to think the premium is worth it.
> Apple is the only company that truly gets HiDPI desktop displays, with high-quality gloss and 200+ ppi at screens this large.
And somehow they completely forgot how to seamlessly work with displays in general. Connect multiple displays via Thunderbolt? Nope. Keep layouts when switching displays? No. Running any display at more than 60Hz? No. Remember monitor positions? No.
I have an Alienware AW2721D and my M series Macs have no problem driving it at 240hz. macOS picks up that it’s a GSync display and supports VRR on it too.
My MacBook M3 Air & Pro laptops can run two QHD displays, one at 240 Hz and the other at 120 Hz. What they can't do is run either above 60 Hz with HDR enabled. But for my use cases, I've never needed more than 60 Hz anyway.
How many 27” 5K 120 Hz+ high-PPI monitors are shipping right now? Reddit is particularly clowning on this for the refresh rate and completely ignoring the resolution.
This is a workstation-class monitor for people using these machines to make money. It's not a gamer toy monitor. People on Reddit don't get this. Apple's monitors are fantastic for those of us who use our computers to make money and need high quality. I am not playing video games on the same machine I use to make money.
(I think) what you are thinking of was something introduced around the Catalina>Big Sur transition, when the Pro Display XDR was introduced.
At the time, people were "marveling" at the magic of Apple, and wondering how they did the math to make that display work within bandwidth constraints.
The simple answer was "by completely fucking with DP 1.4 DSC".
I had at the time a 2019 (cheesegrater) Mac Pro. I had two Asus 27" 4K HDR 144Hz monitors, that the Mac had no problems driving under Catalina.
Install Big Sur. Nope. With the monitors advertising DP 1.4, my options were SDR@95Hz, HDR@60Hz. I wasn't the only one, hundreds of people complaining, different monitors, cards, cables.
I could downgrade to Catalina: HDR@144Hz sprung back to life.
Hell, I could even tell the monitors to advertise DP 1.2 support, which actually improved performance: I think I got SDR@120Hz, HDR@95Hz (IIRC).
So you don't deserve downvotes on this. Apple absolutely ignored standards and broke functionality for third party screens just to get the Pro Display XDR (which, ironically, I own, although now it's being driven by an M2 Studio, versus the space heater that was the Xeon cheesegrater).
Apart from prioritizing FFI (like Java/Scala, Erlang/Elixir), the other two easy ways to bootstrap an integration story for an obscure or relatively new programming language are to focus on RPC (FFI through the network) or file input/output (parse and produce well-known file formats to integrate with other tools at the Bash level).
I find it very surprising that nobody has tried to make something like gRPC the interop story for a new language, with an easy way to write impure "extensions" in other languages, letting your pure/formal/dependently typed language implement the rest purely through immutable message passing over the gRPC boundary. Want file I/O? Implement a gRPC endpoint in Go, and let your language send read/write messages to it without having to deal with the antiquated and memory-unsafe POSIX layer.
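As a toy sketch of the shape of this, using Python's stdlib xmlrpc as a stand-in for gRPC (real gRPC needs a .proto file and generated stubs; the endpoint names here are made up):

    # The impure "extension" process: owns all file I/O, exposes it as RPC.
    # A pure language would talk to it only via request/response messages.
    from xmlrpc.server import SimpleXMLRPCServer

    def read_file(path):
        # All the POSIX messiness stays on this side of the boundary.
        with open(path, "r", encoding="utf-8") as f:
            return f.read()

    def write_file(path, contents):
        with open(path, "w", encoding="utf-8") as f:
            f.write(contents)
        return True

    server = SimpleXMLRPCServer(("127.0.0.1", 8000), allow_none=True)
    server.register_function(read_file)
    server.register_function(write_file)
    server.serve_forever()

The client side is then pure message passing, e.g. xmlrpc.client.ServerProxy("http://127.0.0.1:8000").read_file("/etc/hostname") from any language with an XML-RPC client; the "pure" language never touches a file descriptor.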
Like many others, I too would very much like to hear about this.
I taught our entry-level calculus course a few years ago and had two blind students in the class. The technology available for supporting them was abysmal then -- the toolchain for typesetting math for screen readers was unreliable (and anyway very slow), the one for braille was non-existent, and translating figures into braille involved sending material out to a vendor and waiting weeks. I would love to hear how we may better support our students in subjects like math, chemistry, physics, etc, that depend so much on visualization.
The creator, https://www.reddit.com/user/Mrblindguardian/ has asked for help a few times in the past (I provided feedback when I could), but hasn't needed to as often of late, presumably due to using one or more LLMs.
I did a maths undergrad degree and the way my blind, mostly deaf friend and I communicated was using a stylized version of TeX markup. I typed on a terminal and he read / wrote on his braille terminal. It worked really well.
Yes, mostly raw TeX, just plain ascii - not specially coded for Braille. This was quite a long time ago, mid 1980's, so not long after TeX had started to spread in computer science and maths communities. My friend was using a "Versa Braille" terminal hooked via a serial port to a BBC Micro running a terminal program that I'd written. I cannot completely remember how we came to an understanding of the syntax to use. We did shorten some items because the Versa Braille only had 20 chars per "line".
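For readers who haven't seen raw TeX: it flattens even stacked two-dimensional notation into one linear run of ASCII, which is what made it workable over a character-based braille terminal. An illustrative example (not our actual convention):

    % the quadratic formula, typed as a single linear ASCII stream
    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}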
He is still active and online and has a contact page see https://www.foneware.net. I have been a poor correspondent with him - he will not know my HN username. I will try to reach out to him.
Now that I've been recalling more memories of this, I do remember there being encoding or "escaped" character issues - particularly with brackets and parentheses.
There was another device between the BBC Micro and the "Versa Braille" unit. The interposing unit was a matrix switch that could multiplex between different serial devices - I now suspect it might also have been doing some character escaping / translation.
For those not familiar with Braille, it uses a 2x3 array of dots (6 bits, so at most 64 patterns) to encode everything. The "standard" (ahem, per country) Braille encodings are super-sub-optimal for pretty much any programming language or mathematics.
After a bit of memory refresh on my part: in "standard" Braille you only get ( and ) - and they both encode to the same 2x3 pattern! So in Braille ()() and (()) would "read" as the same thing.
I now understand why you were asking about the software used. I do not recall how we completely worked this out. We had to have added some sort of convention for scoping.
I now also remember that the Braille terminal aggressively compressed whitespace. My friend liked to use (physical) touch to build a picture, but it was not easy to send spatial / line-by-line information to the Braille terminal.
Being able to rely on spatial information has always stuck with me. It is for this reason I've always had a bias against Python; it is one of the few languages that depends on precise whitespace for statement syntax / scope.
Thank you so much for all this detail. This is very interesting & quite helpful, and it's great you were able to communicate all this with your friend.
For anyone else interested: I wanted to be able to typeset mathematics (actual formulas) for the students that's as automated as possible. There are 1 or 2 commercial products that can typeset math in Braille (I can't remember the names but can look them up) but not priced for individual use. My university had a license to one of them but only for their own use (duh) and they did not have the staff to dedicate to my students (double duh).
My eventual solution was to compile latex to html, which the students could use with a screen reader. But screen readers were not fully reliable, and very, very slow to use (compared to Braille), making homework and exams take much longer than they need to. I also couldn't include figures this way. I looked around but did not find an easy open source solution for converting documents to Braille. It would be fantastic to be able to do this, formulas and figures included, but I would've been very happy with just the formulas. (This was single variable calculus; I shudder to think what teaching vector calc would have been like.)
FYI Our external vendor was able to convert figures to printed Braille, but I imagine that's a labor intensive process.
Partway through the term we found funding for dedicated "learning assistants" (undergraduate students who came to class and helped explain what was going on, and also met with the students outside of class). This, as much as or more than any tech, was probably the single most impactful thing.
Chris McCausland is great. A fair bit of his material _does_ reference his visual impairment, but it's genuinely witty and sharp, and it never feels like he's leaning on it for laughs/relying on sympathy.
He did a great skit with Lee Mack at the BAFTAs 2022[0], riffing on the autocue the speakers use for announcing awards.
I'm not a fan of his (nothing against him, just not my cup of tea when it comes to comedy and mostly not been interested in other stuff he's done), but the few times I have seen him as a guest on shows it's been clear that he's a generally clever person.
Honestly, that’s such a great example of how to share what you do on the interwebs. Right timing, helpful and on topic. Since I’ve listened to several episodes of the podcast, I can confirm it definitely delivers.
I suppose I should write about them. A good few will be about issues with the mobile apps and websites for AI, like Claude not even letting me know a response is available to read, let alone sending it to the screen reader to be read. It's a mess, but if we blind people want it, we have to push through inaccessibility to get it.
Mainly realtime processing. I play video games, and would love to play something like Legend of Zelda and just have the AI going, then ask it "read the menu options as I move between them," and it would speak each menu option as the cursor moves to it. Or when navigating a 3D environment, ask it to describe the surroundings, then ask it to tell me how to get to a place or object, then it guide me to it. That could be useful in real-world scenarios too.
Have any studies been done on the use of newer or less popular programming languages in the era of LLMs? I'd guess that the relatively low number of examples and the overall amount of code available publicly in a particular language means that LLM output is less likely to be good.
If the hypothesis is correct, it sets an incredibly high bar for starting a new programming language today. Not only does one need to develop compiler, runtime, libraries, and IDE support (which is a tall order by itself), but one must also provide enough data for LLMs to be trained on, or even provide a custom fine-tuned snapshot of one of the open models for the new language.
Research takes some time, both to do but also to publish. In my area (programming languages), we have 4 major conferences a year, each with like a 6-to-8-month lag-time between submission and publication, assuming the submission is accepted by a double-blind peer review process.
I don't work in this area (I have a very unfavorable view of LLMs broadly), but I have colleagues who are working on various aspects of what you ask about, e.g., developing testing frameworks to help ensure output is valid or having the LLMs generate easily-checkable tests for their own generated code, developing alternate means of constraining output (think of, like, a special kind of type system), using LLMs in a way similar to program synthesis, etc. If there is fruit to be borne from this, I would expect to start seeing more publications about it at high-profile venues in the next year or two (or next week, which is when ICFP and SPLASH and their colocated workshops will convene this year, but I haven't seen the publications list to know if there's anything LLM-related yet).
(I have a pretty unfavorable view of LLMs myself, but) a quick search for "LLM" does find four sessions of the colocated LMPL workshop that are explicitly about LLMs and AI agents, plus a spread of other work across the schedule. ("LMPL" stands for "Language Models and Programming Languages", so I guess that's no surprise.)
Just anecdotally, I'm more productive in languages that I know _and_ which have good LLM understanding, than in languages that I'm just experienced with.
As much as I dislike Go as a language, LLMs are very good at it. Java too somewhat, Python a fair amount but less (and LLMs write Python I don't like). Swift however, I love programming in, but LLMs are pretty bad at it. We also have an internal config language which our LLMs are trained on, but which is complex and not very ergonomic, and LLMs aren't good at it.
It's not only the amount of code but also the quality of the available code. If a language has a low barrier to entry (e.g. python, javascript), there will be a lot of beginner code. If a language has good static analysis and type checking, the available code is free of certain error classes (e.g. Rust, Scala, Haskell).
I see that difference in LLM-generated code when switching languages. Generated Rust code has a much higher quality than Python code, for example.
I know it's a meme project, but it's still impressive. And cc is at the point where you can take the repo of that language, ask it to "make it support emoji variables", and $5 later it works. So yeah... pretty impressive that we're already there.
For example, let's say I want to get to display settings from search. I enter 'monitor' in search since I forgot what it's called. First results: Accessibility, Privacy and Security, Control Center, and only the 4th category is Displays. It's the 8th line if you count sub-categories.
I usually Google where a particular setting is now, since I don't use the exact same words and the settings search is very literal.
Coq/Lean target very different use cases.