shockingly, I think there might be more than one person on the internet and these people might have varying opinions
but yea you can say the same thing about tons of brands. Last summer all the AMD fans were talking about 1€/kWh electricity and saying they were going to buy whatever dGPU was most efficient... when that turned out to be Ada by a country mile, everybody pivoted to whining about price and bought RDNA2 GPUs with half of the perf/w.
During RDNA2 everyone insisted that a 10% perf/w advantage for AMD was a buying point, back during the Vega years they insisted that a 2x perf/w disadvantage didn't matter. Rinse and repeat.
I generally think power matters when it rises to the level of a tangible difference... 200W difference between 4070 and 6950XT means the latter is really a non-starter even if it's 10% faster (at a 5% higher price), especially considering the big-picture featureset (DLSS improves both perf and perf/w). And really it matters more in laptops. You're right that Mac Studio/Mac Pro are not really a place where it hugely matters, but, in a laptop, the next-best thing would be a Ryzen 6800U which is about GTX 1630 performance, so 3060 performance in the same envelope is a big step upwards!
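To put a rough number on that 200W gap, a quick back-of-envelope sketch (the electricity price and daily hours are my assumptions for illustration, not figures from anywhere in the thread):

```python
# Back-of-envelope: yearly electricity cost of a 200 W power-draw difference.
# All inputs below are assumed figures, not measured data.

def annual_cost_eur(extra_watts: float, hours_per_day: float, eur_per_kwh: float) -> float:
    """Extra cost per year of drawing `extra_watts` more for `hours_per_day` each day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# 200 W gap, 3 h/day of load, €0.40/kWh (a plausible EU price, not the 1€/kWh panic figure)
print(round(annual_cost_eur(200, 3, 0.40), 2))  # → 87.6
```

So even at sane electricity prices the gap is real money over a card's lifetime, before you even count the extra heat and noise in the room.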
And really this "big differences matter, small ones don't" applies to most stuff in general. 5% one way or the other, who cares. That kind of thing is often less important than general UX/quality/features; I'll take a laptop that's 5% slower but with way longer battery life or a better screen/trackpad/whatever. When things start rising to the level of a 25% or 30% difference in some spec, or in price... yeah, that's immediately noticeable.
But yea I generally agree that desktops like Studio or outright workstations like Mac Pro are dGPU territory and people are generally not looking for a super efficient iGPU with 3060 performance. On the other hand, being able to talk to 192GB of VRAM is definitely novel, especially with large AI models being the talk of the town this year (and accessible to even the most casual of artists/developers), and the unified APU approach with uniform memory/zero-paging has other advantages for development too. AMD had a lot of this stuff hammered out 10 years ago, supposedly, and then... just never did anything with it, other than sell it to consoles. It's great for PS5 and Xbox, why can't I buy a PC laptop with 96GB of unified/uniform memory with 3060-level performance in a 25W envelope?
Really I think a lot of the people who have bought MacBooks recently are not "traditional" Apple customers. The MBP and even MBA are legitimately really nice laptops with a good screen, good keyboard, good trackpad, good sound, etc. I have said before that I really think a lot of MBP customers would be interested in a "MacBook Tough" toughbook if they ever did that, although of course that's the most un-Jony Ive product possible.
There is a clear demand for a high-quality AMD-based non-GPU ultrabook using a 6800U or 7040U or whatever. Framework is the first company to even try, and they're using crappy 13" hardware on the upcoming AMD model while the market clearly wants more like a 15" or 16" (and their 16" will not have AMD boards). Why didn't anybody else do it first? Apple is catching on because they're filling a market niche that everyone else is ignoring, and they're not even really exactly filling it squarely, they just happen to be vaguely closer than the rest of the market.
And now that the nerd crowd has the hardware... the software is following. It's the same reason that CUDA has taken off while AMD's GPGPU programme has spun its wheels for 15 years, and the same reason AMD has good Linux drivers now. Give the nerds the hardware and innovation will follow - when they tinker they'll be tinkering with your platform.
Big missed opportunity for AMD, yet again. Or Intel, but, they're so far behind on APUs/integration that I think disappointment is basically the baseline expectation at this point. AMD had all the pieces, and yet again just chose not to do anything with them.