
Keep in mind, it's getting 15fps at 1440x900. It's not saying that Cyberpunk 2077 at Ultra on an M1 is a great experience. It's merely pointing out that it's technically possible (which is a massive achievement).


I'm not sure - Apple's marketing claimed that the M1 was beating top end PC parts. This isn't even close.


CPU maybe, GPU definitely not. In the end, M1's GPU is still a mobile GPU with a few desktop features bolted on (like BCx compressed texture formats).


They put up a comparison of power against the Nvidia 3090, which made every Apple fan think it's comparable in performance too :D


That was for the M1 ultra, while TFA is on a standard M1.


Wasn't it the "fastest laptop ever"?


I mean, for the work I do that’s probably true. My work is 99% CPU/RAM/disk dependent.

GPU obviously not, but maybe that claim would hold water at some arbitrary wattage limit.

My takeaway is that the GPU “doesn’t completely suck” and that Apple are dedicating continuing resources to making their platform actually usable, which I was worried about. I mean, it seems difficult just to intentionally use the Apple Neural Engine, and impossible to explicitly use it, which makes testing aggravating. Any continued focus on improving developer experience for the coprocessors (GPU, ANE, R1, etc) is a good signal.


If it's disk dependent, then the base M2 SSDs are half as fast as the M1 SSDs because they're single-lane, unless you upgrade the storage.


Yeah I don't think laptops should be sold at all with 512GB -- I think that's as absurd of a product as a laptop with 128GB. Personally I spec out 2TB and judge price value based on that.

So the speed issue doesn't affect me personally. I just wish they wouldn't sell that model and then it wouldn't affect anyone.


Isn't it when anchored to energy consumption? Sure, you can put a powerful GPU in a laptop, but it's effectively tethered to a wall plug...


That's already true of the M1 Max. At maximum power from CPU+GPU it will barely last an hour.

Edit: I wrote Ultra, but I meant Max.


That may be true, but in practice I get faaar better battery life out of my M1 Max than I would out of a laptop with a mobile 4090.


Do you do AAA gaming on your M1 Max? If not, then the GPU is irrelevant, because a laptop with a mobile 4090 is going to power the GPU down completely.

If you are indeed doing AAA gaming, then you wouldn't have sufficient battery life without plugging in, or you wouldn't have sufficient performance.


I'm not talking about AAA gaming here. I'm talking about day-to-day work-related tasks, which is primarily what I use my MacBook for.


Yes, which is why I said that if you don't, the 4090 is irrelevant because it's just turned off.


People report playing Baldur's Gate 3 for an hour on M1 Max with 40% battery life left, is it AAA enough? https://www.reddit.com/r/macbookpro/comments/qogsov/battery_... (M1 Pro lasts longer)


Given that Baldur's Gate runs at Ultra 1080p60+ on a 3050Ti mobile, that's about what you can expect from a low end gaming laptop - and that GPU runs at 35-50W on most of those


If power consumption is about the same as M1 Max while running the same game at the same settings the only difference becomes battery life...


If the power consumption is the same and the battery capacity is the same size then the battery life is the same. It's a simple division.
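The division really is that simple. A minimal sketch, where the 16-inch MacBook Pro's ~100 Wh pack is a real figure and the power draws are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope battery life: capacity (Wh) / draw (W) = hours.
# capacity is the 16" MacBook Pro's real pack size; the draw figures
# below are assumed loads for the sake of illustration.

def battery_hours(capacity_wh: float, draw_w: float) -> float:
    """Hours of runtime at a constant power draw."""
    return capacity_wh / draw_w

capacity = 99.6        # Wh, 16" MacBook Pro battery
gaming_draw = 90.0     # W, assumed sustained CPU+GPU load in a game
light_draw = 10.0      # W, assumed light desktop use

print(f"gaming:    {battery_hours(capacity, gaming_draw):.1f} h")  # ~1.1 h
print(f"light use: {battery_hours(capacity, light_draw):.1f} h")   # ~10.0 h
```

Same capacity, same draw, same runtime, regardless of which machine is doing the drawing.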


Unless one OS consumes power more efficiently than another, sure...


> M1 Ultra

Do they put M1 Ultra in laptops tho?


No. There does not exist an M1 Ultra laptop of any kind. Nor an M2 Ultra laptop for that matter. The only machines with Mn Ultra are the Mac Studio and Mac Pro.

https://en.wikipedia.org/wiki/Apple_M1#Products_that_use_the...


Yes. The MBP with an M1 Max will, at max performance, use enough power that it would discharge its battery in less than an hour. I think Apple throttles it on battery, though.


The MBP with an M1 Ultra isn't a thing that exists.


You're right, I miswrote, it's the M1 Max. See https://www.anandtech.com/show/17024/apple-m1-max-performanc... - the M1 Max can draw over 100W.


They did, although the graph cuts off just before the 3090 takes the lead and goes beyond.

https://cdn.videocardz.com/1/2022/03/M1-vs-3090.jpg

Frustrating that they're being this misleading when the M1 is outstanding for its own reasons, but the 3090 eats it alive in the workflows it excels at.

Perfect machine would be both those chips in the same box tbh.


They've been straight up lying about the M series chips' performance from day one. They show insane graphs of the M chips beating top-end desktop parts, with an asterisk explaining the very specific BS benchmark they used that clearly favors their chip but won't generalize, and then public benchmarks never even come close.

People still parrot it.


I do not understand. The picture you link clearly does not cut off that way: if it did, the curves would be about to cross at the cut. Or maybe I am missing something?


To be fair, the M1 is now approaching 3 years of age. And the game is being emulated.


Many much cheaper 3 year old PCs would handle this fine


No, you won't find many cheaper 3-year-old laptops running Cyberpunk 2077 on Ultra this well… name one.


Honestly, I wouldn't use the phrase "this well" non-sarcastically for ~15fps at 1440x900 like shown in the video.


this is on Ultra graphics settings.. not a lot of laptops could even run it on Low

Cyberpunk is known to be a VERY demanding open world game

on lowest setting, this should probably be fine


Without RT it's actually not that demanding.

I mean, here's the game on Ultra on a Steam Deck: https://www.youtube.com/watch?v=gHeso2jc_L0


The comparison is not fair; the game porting toolkit also does x86 -> ARM, so it's losing perf on a lot of HW intrinsics.

Also this is the 1st gen M1, which was released in 2020, I wonder what's the performance like on the newest models?


Hey, you made the claim "not a lot of laptops could run it even on low". I just put that in perspective.


It says Ultra, but ray tracing is not supported so it is not really "ultra".


It's an impressive result, but I wouldn't ignore the power of modern low-end APUs. Here's a 3-generation-old, entry-level Ryzen laptop playing the game for comparison: https://youtu.be/Aqgm0zcV7Kw

"this well" is more or less equivalent to an older Ryzen 3's native performance. Apple is really banking on developers recompiling for ARM to reduce overhead here.


> name one.

At 900p? The Steam Deck does just fine (and its SoC is even older).


1440x900? Any laptop with a mobile 2060 would do it. At $1500 it's well within budget.


13-inch ultralights? I very much doubt that. You are basically relegated to iGPUs. I could see the Ryzen 7840U beating the M1, but 3 years ago the best available was the Ryzen 5800U.


Do keep in mind this toolkit was marketed as a way for game developers to judge the viability of porting to the native APIs and native ISA.

I would imagine that not running through this CodeWeavers patch and through Rosetta would give better performance.


>I'm not sure - Apple's marketing claimed that the M1 was beating top end PC parts. This isn't even close.

Your statement is quite misinformed.

First, this is the M1. Not M1 Pro. Not M1 Max. The M1 is almost 3 years old.

Second, this is being translated from DX12 to Metal and also x86 to ARM64. Yes, both the CPU and GPU layers are being translated.

Third, Apple claimed that the M1 Max was the most powerful GPU on laptops. It was probably true depending on what benchmarks.

Finally, this is Cyberpunk 2077 running at Ultra settings. It's one of the most demanding PC games ever made.


The M1 has 8 GPU cores, the M1 Pro has 16, and the M1 Max has 32. Apple says the GPU of the M1 Max is four times as fast as the M1's. So 30 FPS Ultra at 1080p should be possible?
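A quick sanity check of that estimate, assuming (optimistically) that frame rate scales linearly with the claimed GPU speedup and inversely with pixel count; real scaling is worse, so treat this as an upper bound:

```python
# Naive scaling estimate: observed M1 fps, Apple's claimed 4x GPU
# speedup for the M1 Max, and the pixel-count ratio between resolutions.

m1_fps = 15                # observed in the video at 1440x900, Ultra
gpu_speedup = 4            # Apple's claim: M1 Max GPU is ~4x the M1's

px_900p = 1440 * 900       # 1,296,000 pixels
px_1080p = 1920 * 1080     # 2,073,600 pixels

est_fps = m1_fps * gpu_speedup * px_900p / px_1080p
print(f"{est_fps:.1f}")    # → 37.5
```

Under those generous assumptions the M1 Max lands around 37 FPS at 1080p, so 30 FPS is plausible on paper; in practice, memory bandwidth and the CPU translation overhead would pull it lower.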


This is not the M1. You can’t configure a MacBook Pro 16 inch (stated in the tweet) with anything other than an M1 Pro / M1 Max, or M2 Pro / M2 Max on the latest models.


This is an M1 with 16 GB of RAM, not 16 inch.


You’re right, my apologies I misread the 16 GB as 16 inch!


It's being run through several layers of emulation. Of course it's going to be slow.


Wine + DXVK disagrees. Graphics API emulation doesn't necessarily provide worse performance, in fact DXVK often wins against raw DirectX 9/10 and sometimes even DirectX 11. VKD3D performance is pretty awesome.

Not sure how much of the bottleneck here is because of Rosetta though (i.e. cpu-bound) although I suspect not much really.


Unless it is running a JIT internally as part of the game engine, Rosetta should take the whole executable and rebuild it ahead-of-time.


Rosetta can only do that for an x86 macOS binary. Once it goes through WINE it's all JIT. Though I think it should get cached after a while.


What, in playing Windows games with translated DirectX and amd64 calls?


it translates x86 to ARM, and that's not free

High-end games make heavy use of SIMD instructions for a massive boost; I wonder if those are translated properly


I think it's a network effect thing as much as anything else. If it's good enough to get people playing games on their macs at all (even if at relatively sub-par settings), that builds the market, shows there are people willing to spend money on games to play them on their macs.

Then at that point games developers might be more inclined to give the platform explicit support.

Otherwise it's a bit of a chicken and egg situation: people aren't playing games on their macs because the library isn't there, the library isn't there because no developers will support a platform where there aren't gamers and so on.


Not as massive an achievement as Proton getting nearly every Windows game in existence running well from unmodified binaries on Linux.


Don't let perfect be the enemy of the good. Think about how many Steam Decks have been sold. Game devs are already actively targeting the Deck. If porting games to Mac can be made easier, we all need to actively encourage it. Unfortunately, the only way we get Apple to give us more game dev tools is by porting the games.

I have a Steam Deck and a Mac. I would love to play half the games from my Deck on my laptop.


Do you find this is a good solution for playing Steam games?

I have an older PC, and I'm thinking about replacing it with a macbook air and a steam deck. Does the steam deck feel more limiting than just having a windows PC with steam?


Only get a Steam Deck if you're really interested in portable play. You can build a better desktop from old parts. The Steam Deck is an interesting device that's doing a lot for gaming via Proton, but it makes tradeoffs for battery life and portability. If you're not interested in portability, just get a desktop. If you're interested in portability but need a large screen, get a laptop. If you want a Sega Game Gear form factor that can run Steam games, get a Steam Deck.


Proton isn't as massive an achievement as the sun! That thing pumps out around 3.8⋅10^26 joules a second!


Yeah, but that's only the first attempt. The game porting toolkit is designed to shorten the time it takes to get a game running on macOS for the first time by letting you take the ready Windows version and run it directly on macOS with translation. A finished macOS port would involve additional work after this step.

https://developer.apple.com/videos/play/wwdc2023/10123/


"Technically possible" is a non-statement. Of course it's technically possible. The question is how much money Apple is willing to throw at making games run well on their GPU.


all while being emulated through Rosetta


Ultra with 16 GB of memory shared between CPU and GPU, unlike a traditional laptop or desktop with separate memory for each.



