They could have used lower-power Intel chips. Nobody forced them to put 45W chips in a paper-thin chassis, but their marketing guys really wanted to have the top-of-the-line Intel chips in them despite the wattage.
What did you expect would happen?
It would be like Apple putting the chip from the Mac Studio in the Air.
They did use low-power chips. They had Core M CPUs (later rebranded i5-i7 because OEMs). These were 4.5W TDP chips, even the best i7 in the 2017 model (i7-7Y75). The ONLY chips Intel had with lower power usage at the time were the significantly slower Atom and Atom derivatives.
There's seeing at a superficial level and seeing at a deeper level. Plenty of companies were designing, building and selling hardware using the same components and making many of the same tradeoffs. The point is Apple realised what this meant, saw the implications, and actually tried to address those issues.
I loved my 12" macbook and would buy an Mx one in a heartbeat. It was so small I'd forget if I had it in my bag and would have to look. I traveled all over the planet and it was super convenient. I even wrote code on it.
The 12" had an amazing form factor, but it was only great for native apps; web apps ran horribly on it.
But it was beautiful at the time; it was the first MacBook to come in various colors, wasn't it? Once a lady at a coffee shop even approached me and asked what version of MacBook it was. No no... that's the end of the story :)
I almost bought one - I would have, if Apple had had one in stock when I tried to buy one. A truly fascinating device. Probably the smallest "full size" laptop ever made, depending on how you count.
For sure, the weakest point of it was the processor, which had to sacrifice too much performance to stay in the thermal budget. I guess the basic design decisions for the 12" MacBook were made when Intel still seemed to be on track for their 10nm process. But it wasn't the only caveat with its design. It also had, of course, the horrible butterfly keyboard which would plague Apple laptops for years, and 12" is the absolute minimum screen size for doing work.
I think Apple has another sub-13" laptop design coming; it could be 12". The ARM processors pretty much solve the power/performance problem. The keyboard could actually be the biggest hurdle, as it increases the body thickness. But the keyboard of the new M2 Air is just wonderful, well worth whatever it adds to the device thickness. I wouldn't even be surprised if they don't make an ultra-thin 12" MacBook but rather a 12" MacBook Pro, which might be slightly thicker than the Air.
Same. It certainly had its flaws, but it was ahead of its time in many ways. At the time I was traveling a lot and didn't need much power so it was perfect for me. It seems I got extremely lucky as I never had keyboard issues with mine.
Nowadays you could stick an M1/M2 in there (maybe throttled, or with fewer cores, for even more battery life), add a second USB-C port, and it would be an excellent device without any of the downsides that the original version had.
I also never had keyboard issues, but mine was the 2015 Gen 1. I know a couple of people who got the 2016 or 2017 model, and their keyboards had issues with keys not registering.
That form factor with an M1/M2 would be awesome! Especially for use in bed, because the screen was so nice and it was so light.
The 12” MacBook (2015) was the last Apple laptop I bought before the current MacBook Air M2. I opted for a BTO with the i7 processor but it was still stultifying and slow, a constant exercise in frustration. Eventually the motherboard died requiring an out-of-warranty repair, which I conceded to, but shortly after that the batteries died… and then I set it aside. An absolutely stunning piece of design work but so unusably slow it put me off laptops for a full seven years. What did I use in the meantime? A 2012 Mac Mini upgraded to the hilt and more recently a Mac Studio, and various releases of iPads and iPad Pros throughout the years.
I’m talking about battery. If there’s a discrete GPU, it drains the battery so quickly because switching between integrated and discrete sucks on Windows.
>> On the Intel front, Apple saw how underpowered and hot the 12” MacBook was, and how short its battery life was.
Then they saw how performant, cool, and long-lasting on battery their (ARM-based) iPad was.
This became an easy decision for Apple to ditch Intel when they saw how much better their own iPad Pro was relative to the 12” Intel MacBook.
Those who have been using Macs since the PowerPC days have all used this argument against x86 chips. As you all know, Apple actually had to switch from RISC (PowerPC) chips to Intel x86 chips before switching back to a RISC ISA (ARM) now. Just saying, performance is not the main reason to switch chipsets.
IBM's PowerPC chips were high performance workstation chips, which means you're going to get performance and heat. In a desktop computer, you can add robust cooling, in an ultra-portable laptop, you cannot.
Apple abandoned PPC after it became obvious that laptops were going to become more popular than desktops, and the cheese grater Power Mac G5 required liquid cooling for the dual-CPU versions.
Apple moved from Intel to ARM for the exact same reason. Intel no longer cares about pursuing high performance with a low power draw. They have returned to the Pentium 4 strategy of performance via clock speed and power increases.
I believe the specific reason (as far as Apple disclosed) was that lower performance / higher efficiency PowerPC CPUs just weren't on IBM's roadmap and whatever quantity of CPUs Apple was buying and/or willing to commit to buying wasn't enough for IBM to consider it. Intel was focusing on power efficiency after the whole Netburst disaster.
Have they really gone back to the Pentium 4 strategy, or are they behind in process node tech and can only compete with AMD’s performance by pumping power into their CPUs? I think Intel’s top priority right now is to catch up to TSMC’s node tech, to get as close to performance-per-watt parity (or beyond) as they can against AMD (and Apple).
Stop feeding into hyperbole. The 13900K is for maybe 5% of the market, and the non-K will give crazy enough performance for most buyers (if they even exist at this point). Giving the 13900 a high-heat mode is just for the chance to keep up with/beat AMD for bragging rights.
Intel isn’t relying on an architecture that’s about to run them into a wall like the Pentium 4’s was; what’s keeping them in second place is being behind on their process and, beyond that, execution.
Sorry, but Intel is pretty obviously chasing performance via higher and higher clock speeds and ridiculously high power draws just like they did previously with their Pentium 4 strategy.
You can argue that it's not what they "want" to be doing, but it's certainly what they are doing.
> This became an easy decision [in 2015] for Apple to ditch Intel when they saw how much better their own iPad Pro was relative to the 12” Intel MacBook.
[1] implies that M1 development started around 2008 - although you could also read it as the M1 being 5 years in development, but that sounds a bit quick and also doesn't fit with [3] from 2014 - "Apple Testing ARM Based Mac Prototypes".
But there doesn't seem to be any other direct corroboration of the 2008 date that I can find.
2008 is probably about the time Apple started serious work on in-house CPU cores, rather than specifically the M1? The A6 chip was the first with an Apple-designed core, and was released in 2012, so they would have been working on it for several years before.
Also, it was around this time that the chips in iPad Pros were strangely getting close to Intel CPU performance in benchmarks while running in thin enclosures without fans.
Oh, I’m pretty sure that since the iPad Pro launched, or maybe the gen after, its scores in various benchmarks were getting scarily close to the Intel benchmarks. It might’ve started maybe 30% off, but with each generation the gap closed. It was increasingly clear that Apple’s ARM chips weren’t just toy chips limited to tablets, and tech media talked about how it was only a matter of time before there would be an ARM Mac or a Mac/iOS convergence.
Whenever Tim Cook was asked, pretty much up until maybe a few months before the Apple Silicon announcement, he said there weren’t any plans to switch the Mac to ARM.
I personally like the 12" form factor, and I own one, but I saw approximately zero of them in the wild. It's significantly lighter, I can still get work done on it, it's great for travel... and yet I can't remember seeing other people using them.
It's hilarious how HN is consistently wrong, even on tech-heavy subjects. No bright minds popping up here tbh. You can glean some interesting stuff from this. The hive mind was wrong on Dropbox, seemingly wrong on this, and it is likely wrong on blockchain today.
For most people this has little value. The centralised banking model is cost-efficient, and for most people trust isn't an issue. The modern world runs on trust.
Also, if I want to trade anonymously there is always cash (at least for now), which is a simple, well-understood, near-universally accepted mechanism.
There are niches where decentralized banking is useful - but they are niches - and many of them are associated with dubious activity.
On top of that, the current tech platforms simply don't scale - the cost of replacing simple trusted parties with technology is huge.
Not to mention NFTs, but it seems impossible to convince the HN crowd that there is a real art market in them, with real buyers and real artists using it. Any attempt to demonstrate this is met with wild leaps of logic - it's wash trading, it's not big enough, it's all a scam. I point to Beeple, Wes Cockx, DeeWay sales and all they tell me is it must all be a fraud. I show them markets with vibrant activity - Versum, FXhash, Foundation - and all I get is repetitions of memes around how NFTs are dead.
Successful scams have customers - the presence of people putting in money isn't a defining trait.
However, I do think it's a bit harsh to call these things a scam - I mean, take the art market itself - value is entirely subjective - that doesn't mean it's a scam (though scams do happen in the normal art market to try and inflate prices).
The question you have to ask yourself is:
Are you buying the NFT as an investment, because you think other people will value it, or are you spending the money because you are quite happy to own that thing forever and never sell, i.e. the value is to you?
I'd argue if you are doing the former you are more likely to be 'scammed' than the latter.
It shows that there is groupthink at play and that the overall HN commentary on subjects, even when they should know better (tech), is not particularly correct.
Being wrong once is one thing, but HN commentary seems consistently wrong on a whole litany of topics. The biggest source of bad hot takes seems to be anything new/different. HN seems to be consistently conservative.
It’s a shame because I used to think that I would gain some insight on future trends from the fact that a lot of the people who comment here are in tech, but now I’m not so convinced.
That's because it is hard to discern what is shilling and what is a real/expert opinion. Shilling happens here on HN, like it does on every social media platform and discussion forum. It doesn't help that we can't call out suspected shilling on HN, as it is against the rules. But that rule makes sense: suspicion is not proof, and it would just bring down the quality of discussion here if every one of us accused others of shilling :). (And as someone once pointed out to me, sometimes this shilling is not necessarily from the marketing team but from employees and shareholders here, who have a vested interest in seeing the company do well.)
I think that's harsh. Why would you even expect the aggregate sentiment of HN to indicate where to invest one's money? On the other hand, it might be worth pointing out that Y Combinator supports plenty of blockchain projects.
The 12” MacBook came out in 2015 and had terrible performance and a problem with overheating. Insider reports say that Intel had promised a lower power chip with better performance that Apple designed the MacBook for but then Intel killed that chip and Apple had to use another Intel mobile chip. Some people feel that was when the problem got real.
Not entirely true, people watching and in the field knew. Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift. Similar with this move by Apple. There are people in those rooms, making the decisions, sharpening their ideas on HN — if we care to listen.
>Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift.
I think you got really lucky. Zen 1 didn't ship until 2017, and it lagged severely behind in single-threaded performance. You had no idea what AMD was going to have.
Even AMD would tell you that they were surprised that Intel fell so far behind. They've been quoted saying this a few times.
Intel's 10nm node (equivalent to TSMC 7nm) was supposed to ship in 2016! Intel didn't ship anything on the desktop using 10nm until Alder Lake in 2021. A five-year delay.
Intel would have been well ahead of Zen 2 in node technology. Instead, it was around 1.5 nodes behind.
If you made your bet purely on what was said inside AMD in 2015, you just got lucky. No one knew that Intel would be stuck on 14nm for 7 years when they were planning for 2 years.
You can’t completely remove risk, but I invest in areas where insiders discuss their work publicly. It provides insight that fundamentals often lack, leaving massive potential upside.
They had already plateaued in 2014/2015 with Haswell/Broadwell. They've basically been releasing that same CPU with minor tweaks to power consumption and codec support for 8 years now.
It was hard to notice at the time, but reviews absolutely noted the minor CPU update (https://www.theverge.com/2015/4/9/8375735/apple-macbook-pro-..., search "Broadwell"). Another funny aspect of that review: it mentions 10+ hour battery life for the MBP as a nice, but hardly astonishing, spec. 9 hours 45 minutes with Chrome was the worst case. It's amazing to think how bad the 2016-2019 MBPs were in comparison, to the point where getting back to 10-hour battery life is an amazing Apple Silicon feature!
I didn't see this particular article but I would have agreed with it. I certainly have written many comments on similar articles (search the comments for alwillis ARM Mac for starters) explaining why such a switch to ARM was completely doable, having lived through the switch from PowerPC to Intel while working at MIT.
- Apple will have created two additional "multitasking" systems for iPadOS on top of the failed Stage Manager. People will still be asking for a traditional Mac-like window management system and Apple will still be like "lol no".
- Apple Arcade and Apple TV will still be around, but Apple will still have no plan/vision for gaming.
Apple will have an AR or VR headset, and we will start ditching traditional displays for VR.
Meta will have recovered from their current issues and will be their main competitor.
I think Apple could have a big advantage because their processors could allow more powerful things in a standalone VR headset, compared to the current generation where you need an external PC for most CPU- and GPU-intensive tasks.
I don’t know - mobile phones were a huge huge success because they are first and foremost, practical. They fit in your pocket, can take stellar images, have access to literally everything on the internet and are fully capable general purpose computers in your hands - a sci-fi product turned into reality. And on top of that, we control them with our hands, which are arguably our most capable part for this job.
I don’t see any practicality to VR outside some tiny niches. It makes a few games more fun, and some niche workloads can be done more efficiently, but headsets are a hassle to put on and, first and foremost, they block your interactions with the real world. Sure, some contact-lens futuresque thingy could improve on this as well, but how would you control that? Voice control is slow and troublesome.
So I don’t predict huge success for VR; it will be at most something akin to the Xbox’s Kinect or some Wii accessory.
I see a huge VR market in misguided companies with too much money to burn on "team building" projects who see VR as a good way to extend middle management BS to WFH employees. Likely subsidized by MetaFacebook desperately pushing their crappy VR projects to keep them on life support.
As far as I can tell, we have a lot of progress to make with display resolution and GPU quality before VR becomes competitive for work environments with a modern hiDPI dual display setup. Maybe it's more appealing to folks with crappy home desk setups, or people who live in cities who don't want a full desk? Ergonomics still feels like an obstacle, though.
Rumors of the Apple car have died down (I think ...) but they may come back to the transportation market, maybe out of left field. Repurposing the iPod trademark?
Apple has been focused on health lately. So we might see more sensors and apps related to that, to the point where it would monitor your overall health continuously.
Some key takeaways:
- A lot of people got it wrong with the bet on Intel.
- The 2 people who said it would take 5 years were right.
- Suggestion to use Rosetta was right.
- The indicator for fanless MacBooks was right.
It's easy to doubt but it actually takes effort to form educated guesses about the future.