Hacker News

That's neat and impressive, but that's all they have. They don't have a high end. For $3000 you can get a top end gaming PC. There's no amount of money you can spend on a Mac to equal that.

I think you're missing the point that the parent (and Siracusa) made - Apple invests a significant amount into the software and graphics stack, only to fumble it at the last minute by not shipping high-end graphics hardware and not caring enough to court "triple A" game developers to its platforms, despite creating and maintaining Metal and this 20k-line WINE patch.

There's this weird mismatch of Apple dedicating a non-trivial amount of time in their keynote to "Mac Gaming", as if it's supposed to be impressive to finally play a four-year-old game on a Mac, when they don't ship high-end graphics hardware.



Apple silicon wins on performance per watt but not in performance outright and suddenly everyone cares about power consumption. Whichever spec everyone's favorite fruit company excels at gets put on a pedestal.


The PC industry is no longer driven by desktops; laptops took over long ago. There is a gaming PC crowd, but that is a small captive audience that wants performance, wattage be damned.

Apple is selling around 80% laptops versus desktops, and the rest of the industry is something like 77%. The fact Apple is winning the laptop GPU race doesn't mean it should automatically be entered into the desktop GPU race, where it is not winning.


> The fact Apple is winning the laptop GPU race

Fact? Which Apple chips are outperforming the laptop 3070, much less the current-gen mobile 4090?


I would take a guess that Apple is shipping (far) more TFlops of GPU power than Nvidia or anyone else in the mobile GPU market. Few people are buying laptops with 80-150w TDP GPUs, as those start to stretch the definition of both 'laptop' and 'battery powered'. Big gaming laptops with an hour of battery life are more akin to the luggables of yore.
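To make the "TFlops shipped" guess concrete, theoretical FP32 throughput is just ALU count × 2 FLOPs per cycle (one fused multiply-add) × clock. A quick sketch - the core counts and clocks below are approximate figures from public spec sheets, treated as assumptions, not measurements:

```python
# Back-of-envelope theoretical FP32 throughput.
# Each ALU does one FMA (2 FLOPs) per cycle at the given clock.
def tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000.0

# Assumed figures (public spec sheets, approximate):
m2_max = tflops(38 * 128, 1.398)   # 38-core M2 Max GPU, 128 ALUs per core
rtx_4090m = tflops(9728, 1.455)    # mobile RTX 4090 at its base clock

print(f"M2 Max      ~{m2_max:.1f} TFLOPS")      # ~13.6
print(f"4090 mobile ~{rtx_4090m:.1f} TFLOPS")   # ~28.3
```

The point isn't the per-chip comparison (the 4090 mobile wins handily on paper) but the volume: multiply the per-chip figure by units shipped, and Apple's default-in-every-laptop GPU plausibly dwarfs the total TFlops of the niche big-TDP gaming laptop market.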


That's fair. Nvidia has issues scaling their full systems down to laptop spec, and Apple almost has the opposite problem. They're both impressive in their own right, but right now Nvidia has both the performance and performance-per-watt crown in this space. The disparity in 3D applications (like gaming and Blender[0]) is so ugly it's not even close.

And in all fairness - Apple's products might not need more GPU power. Cyberpunk and Elden Ring appear to be CPU-bottlenecked; if people are comfortable with upscaling, they could get a pretty comfortable Retina experience. The 2D optimization and media accelerators are a good focus for mobile hardware. For more demanding applications, though, it looks like Apple's current approach is not scaling well.

[0] https://opendata.blender.org/benchmarks/query


Yeah I'm really curious what Apple's next-gen GPU (with raytracing and a bunch of other stuff) brings to fix some of these shortcomings. It was supposed to show up on last year's iPhone 14 followed presumably by inclusion in the M-series, and the 3nm process was supposed to be shipping this year, but everything got set back a year. In Mac-land the M2 wound up just being an overclocked M1, so we're left waiting for M3 to bring us a more competitive GPU.

The other half of the story is that a lot of software (including Blender, looking at these crazy results) just isn't well optimized, and Apple is still struggling to win over developers in certain sectors of the market. Nvidia's decade-plus investment in the software side has paid off so incredibly well for them; it's basically made the company.


shockingly, I think there might be more than one person on the internet and these people might have varying opinions

but yea you can say the same thing about tons of brands. Last summer all the AMD fans were talking about 1€/kWh electricity and saying they were going to buy whatever dGPU was most efficient... when that turned out to be Ada by a country mile, everybody pivoted to whining about price and bought RDNA2 GPUs with half of the perf/w.

During RDNA2 everyone insisted that a 10% perf/w advantage for AMD was a buying point, back during the Vega years they insisted that a 2x perf/w disadvantage didn't matter. Rinse and repeat.

I generally think power matters when it rises to the level of a tangible difference... 200W difference between 4070 and 6950XT means the latter is really a non-starter even if it's 10% faster (at a 5% higher price), especially considering the big-picture featureset (DLSS improves both perf and perf/w). And really it matters more in laptops. You're right that Mac Studio/Mac Pro are not really a place where it hugely matters, but, in a laptop, the next-best thing would be a Ryzen 6800U which is about GTX 1630 performance, so 3060 performance in the same envelope is a big step upwards!
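A quick sketch of the value math behind that 4070 vs. 6950XT call. The numbers here are the thread's rough framing, treated as assumptions (baseline perf and price for the 4070 at ~200 W; the 6950XT ~10% faster, ~5% pricier, and drawing ~200 W more, i.e. ~400 W):

```python
# Value math for the 4070 vs 6950XT comparison above.
# All figures are the thread's approximations, not benchmarks.
def perf_per_watt(rel_perf: float, watts: float) -> float:
    return rel_perf / watts

r4070 = perf_per_watt(1.00, 200)   # baseline perf at ~200 W
r6950 = perf_per_watt(1.10, 400)   # ~10% faster at ~400 W

ppd_4070 = 1.00 / 1.00             # perf per (relative) dollar
ppd_6950 = 1.10 / 1.05             # 10% more perf, 5% more money

print(f"4070 perf/W advantage:   {r4070 / r6950:.2f}x")   # 1.82x
print(f"6950XT perf/$ advantage: {ppd_6950 / ppd_4070:.2f}x")  # 1.05x
```

Which is the point: a ~5% perf-per-dollar edge is in "who cares" territory, while a ~1.8x perf-per-watt gap is the kind of tangible difference that actually decides the purchase.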

And really this "big differences matter, small ones don't" applies to most stuff in general. 5% this way or the other, who cares. That kind of thing is often less important than general UX/quality/features, I'll take a laptop that's 5% slower but way longer battery life or better screen/trackpad/whatever. When things start rising to the level of 25% or 30% difference in some spec, or in price... yeah that's immediately noticeable.

But yea I generally agree that desktops like Studio or outright workstations like Mac Pro are dGPU territory and people are generally not looking for a super efficient iGPU with 3060 performance. On the other hand, being able to talk to 192GB of VRAM is definitely novel, especially with large AI models being the talk of the town this year (and accessible to even the most casual of artists/developers), and the unified APU approach with uniform memory/zero-paging has other advantages for development too. AMD had a lot of this stuff hammered out 10 years ago, supposedly, and then... just never did anything with it, other than sell it to consoles. It's great for PS5 and Xbox, why can't I buy a PC laptop with 96GB of unified/uniform memory with 3060-level performance in a 25W envelope?

Really I think a lot of the people who have bought Macbooks recently are not "traditional" Apple customers. The MBP and even MBA are legitimately really nice laptops with a good screen, good keyboard, good trackpad, good sound, etc. I have said before that I really think a lot of MBP customers would be interested in a "Macbook Tough" toughbook if they ever did that, although of course that's the most un-Jony Ive product possible.

There is a clear demand for a high-quality AMD-based non-GPU ultrabook using a 6800U or 7040U or whatever. Framework is the first company to even try, and they're using crappy 13" hardware on the upcoming AMD model while the market clearly wants more like a 15" or 16" (and their 16" will not have AMD boards). Why didn't anybody else do it first? Apple is catching on because they're filling a market niche that everyone else is ignoring, and they're not even really exactly filling it squarely, they just happen to be vaguely closer than the rest of the market.

And now that the nerd crowd has the hardware... the software is following. It's the same reason that CUDA has taken off while AMD's GPGPU programme has spun its wheels for 15 years, and the same reason AMD has good Linux drivers now. Give the nerds the hardware and innovation will follow - when they tinker they'll be tinkering with your platform.

Big missed opportunity for AMD, yet again. Or Intel, but, they're so far behind on APUs/integration that I think disappointment is basically the baseline expectation at this point. AMD had all the pieces, and yet again just chose not to do anything with them.


https://store.steampowered.com/hwsurvey/videocard/?sort=pct

The most popular GPUs in use TODAY are the 1650, 1060, 3060, and 2060.

If Apple can get the M2 MacBook Air to run like one of these, it essentially makes the most popular laptop the equivalent of the most popular gaming rig.


Two days ago we could speculate that maybe the $6000+ Mac Pro would bring better graphics performance, but now we know it's a $7000 Mac Studio with PCIe slots. And as far as we know you can't put a GPU in those slots.

Not that it would've been in my price range anyway, but it could've indicated that thunderbolt eGPU support would make a return.

Lack of that is a weird omission if Apple is trying to act like they have a gaming platform.


> Lack of that is a weird omission if Apple is trying to act like they have a gaming platform.

Apple has a huge gaming platform, and it isn't the Mac.

https://www.ign.com/articles/apple-made-more-than-nintendo-s...

However, I imagine that they'd still like to sell more games in the Mac App Store (in addition to iOS ports, iPad games that can run on Apple Silicon, and Apple Arcade subscriptions) and this might help.

It might also make it easier to port games to Apple Arcade.


For $3000 you can get a top end GPU, not a whole gaming PC. Most gamers have mid-range GPUs like those in the M2 Max.


This is wildly false; see https://www.reddit.com/r/buildapc/comments/143ugg4/comment/j... for a real-world example of a gaming PC you can buy right now for $2,600 that's top of the crop - an absolutely wild gaming machine that's vastly better at gaming than a $5,000 M2 Ultra Mac Studio, even if the comparisons are limited to games that actually work, and work well, on the Mac.

If you were after an experience merely comparable to the fastest Macs on Earth, you could configure a PC that's another $1,000 cheaper than that ($1,600).


You’re right, and my knowledge of the prices was outdated.


Actually, you can get Nvidia's top consumer GPU today, the RTX 4090, for $1,500-1,600. Go back one generation and you can get an RTX 3090 for $750, which still packs a punch.

So it's quite possible to build a well-performing gaming PC for sub-$2,000 with an RTX 3090, which is still significantly ahead of Apple's latest Mac in terms of GPU throughput.

I snagged a gaming PC for $1,300 at last year's Thanksgiving sales; it came with an AMD Ryzen, an RTX 3080 (10 GB VRAM model), and 32 GB of DDR4 RAM. There's no way I could have gotten a Mac with that performance for anything close in price.


Yea, that’s true as of today in the US. There’s no way you could get such a deal where I live (Norway) due to our weak currency, and it used to be the other way around just a few years ago.




