Hacker News

I think it's more interesting to see HN's past discussion from 2015: https://news.ycombinator.com/item?id=9244240

Some key takeaways:

- A lot of people got it wrong with their bet on Intel.

- Two people who said it would take five years were right.

- The suggestion to use Rosetta was right.

- The hint about fanless MacBooks was right.

It's easy to doubt but it actually takes effort to form educated guesses about the future.



2015 was a pivotal year for indicating Apple's future direction. Here's why:

2015 is the year both the (Intel) 12” MacBook and the first iPad Pro were released.

These were also both entirely new form factors for Apple (and both roughly the same size).

On the Intel front, Apple saw how underpowered, short on battery life, and hot the 12” MacBook was.

Then they saw how performant, long-lasting on battery, and cool their ARM-based iPad was.

That made it an easy decision for Apple to ditch Intel: they could see how much better their own iPad Pro was relative to the 12” Intel MacBook.


>Apple saw how underperforming, short lasting battery and hot the 12” MacBook was

I sure hope they saw. They designed, built and sold the things.

Would be hard to miss those issues during their development and testing, wouldn't it?

It's not like Apple didn't know what they were selling.


And they could do what, given that Intel was their only option at the time? Not build it?

Not to mention tons of people love them...


They could have used lower power Intel chips. Nobody forced them to put 45W chips in a paper thin chassis, but their marketing guys really wanted to have the top of the line Intel chips in them despite the wattage.

What did you expect would happen?

It would be like Apple putting the chip from the Mac Studio in the Air.


They did use low-power chips. They had Core M CPUs (later rebranded i5/i7 because of OEMs). These were 4.5W TDP chips, even the best i7 in the 2017 model (i7-7Y75). The ONLY chips Intel had with lower power usage at the time were the significantly slower Atom and its derivatives.


>They could have used lower power Intel chips

They could, but those were crap.

>What did you expect would happen?

That it would be a niche model due to its size, but an otherwise much-beloved one, whose absence in 2022 is still lamented? (See comments below)


Arguably the customers forced Apple to use a reasonably performing chip vs one that didn’t perform even reasonably?


There's seeing at a superficial level and seeing at a deeper level. Plenty of companies were designing, building and selling hardware using the same components and making many of the same tradeoffs. The point is Apple realised what this meant, and saw what the implications would be to try and actually address those issues.


I loved my 12" macbook and would buy an Mx one in a heartbeat. It was so small I'd forget if I had it in my bag and would have to look. I traveled all over the planet and it was super convenient. I even wrote code on it.

Yes it was slow but really, not terrible.


The 12” MacBook was amazing?

It ran Windows with battery life better than macOS. It was a solid form factor. Never overheated. Great for traveling with.

I miss mine.


The 12” will be (in my personal memory) the best Apple product I have ever owned.


The 12” had an amazing form factor, but it was great only for native apps; web apps ran horribly on it. It was beautiful for its time, though. Was it the first MacBook to come in various colors? Once a lady at a coffee shop even approached me and asked what version of MacBook it was. No no... that's the end of the story :)


I almost bought one - I would have, if Apple had had one in stock when I tried to buy one. A truly fascinating device. Probably the smallest "full size" laptop ever made, depending how you count.

For sure, its weakest point was the processor, which had to sacrifice too much performance to stay in the thermal budget. I guess the basic design decisions for the 12" MacBook were made when Intel still seemed to be on track for their 10nm process. But that wasn't the only caveat with its design. It also had the horrible butterfly keyboard, which would plague Apple laptops for years, and 12" is the absolute minimum screen size for doing work.

I think Apple has another laptop design < 13" coming, which could be 12". The ARM processors pretty much solve the power/performance problem. The keyboard could actually be the biggest hurdle, as it increases the body thickness. But the keyboard of the new M2 Air is just wonderful, well worth whatever it adds to the device thickness. I wouldn't even be surprised if they don't make an ultra-thin 12" MacBook but rather a 12" MB Pro, which might be slightly thicker than the Air.


Same. It certainly had its flaws, but it was ahead of its time in many ways. At the time I was traveling a lot and didn't need much power so it was perfect for me. It seems I got extremely lucky as I never had keyboard issues with mine.

Nowadays you could stick a (maybe throttled or fewer cores to get even more battery life) M1/M2 in there, add a second USB-C port, and it would be an excellent device without any of the downsides that the original version had.


I also never had keyboard issues, but mine was the 2015 Gen 1. I know a couple of people who got the 2016 or 2017 model, and their keyboards had issues with keys not registering.

That form factor with an M1/M2 would be awesome! Especially to use in bed, since the screen was so nice and it was so light.


It’s also when Apple’s laptop speaker game left the earth. To this day no one has even come near Apple.


Yeah, the speakers on my 2015 12” are still better than those on my 2021 Lenovo Legion 7 and 2022 Dell XPS 15.

I kinda wanna get an M2 Pro but I personally dislike macOS. :(


The 12” MacBook (2015) was the last Apple laptop I bought before the current MacBook Air M2. I opted for a BTO with the i7 processor but it was still stultifying and slow, a constant exercise in frustration. Eventually the motherboard died requiring an out-of-warranty repair, which I conceded to, but shortly after that the batteries died… and then I set it aside. An absolutely stunning piece of design work but so unusably slow it put me off laptops for a full seven years. What did I use in the meantime? A 2012 Mac Mini upgraded to the hilt and more recently a Mac Studio, and various releases of iPads and iPad Pros throughout the years.


Macs always run Windows better than macOS, which I find hilarious


I’m talking about battery. If there’s a discrete GPU it drains the battery quickly, because switching between integrated and discrete graphics sucks on Windows.


>> On the Intel front, Apple saw how underperforming, short lasting battery and hot the 12” MacBook was.

>> Then they saw how performant, long battery lasting and cool their iPad (ARM chip) was.

>> This became an easy decision for Apple to ditch Intel when they saw how much better their own iPad Pro was relative to the 12” Intel MacBook.

Those who have been using Macs since the PowerPC days have all heard this argument against x86 chips. As you know, Apple actually had to switch from RISC chips to Intel x86 chips before switching back to a RISC ISA now. Just saying: performance is not the only reason to switch chip architectures.


IBM's PowerPC chips were high performance workstation chips, which means you're going to get performance and heat. In a desktop computer, you can add robust cooling, in an ultra-portable laptop, you cannot.

Apple abandoned PPC after it became obvious that laptops were going to become more popular than desktops, and the cheese-grater Power Mac G5 required liquid cooling for the dual-CPU versions.

Apple moved from Intel to ARM for the exact same reason. Intel no longer cares about pursuing high performance with a low power draw. They have returned to the Pentium 4 strategy of performance via clock speed and power increases.


I believe the specific reason (as far as Apple disclosed) was that lower performance / higher efficiency PowerPC CPUs just weren't on IBM's roadmap and whatever quantity of CPUs Apple was buying and/or willing to commit to buying wasn't enough for IBM to consider it. Intel was focusing on power efficiency after the whole Netburst disaster.


Have they really gone back to the Pentium 4 strategy, or are they behind in process node tech and can only compete with AMD’s performance by pumping power into their CPUs? I think Intel’s top priority right now is to catch up to TSMC’s node tech, to get as close to performance-per-watt parity (or beyond) as they can against AMD (and Apple).


>Raptor Lake to Offer ‘Unlimited Power’ Mode for Those Who Don’t Care About Heat, Electric Bills

https://www.extremetech.com/computing/338748-raptor-lake-to-...

If that isn't a return to the Pentium 4 strategy, I don't know what is.


Stop feeding into hyperbole. The 13900K is for maybe 5% of the market, and the non-K will give crazy enough performance for most buyers (if they even exist at this point). Giving the 13900 a high-heat mode is just a chance to keep up with/beat AMD for bragging rights.

Intel isn’t relying on an architecture that’s about to run them into a wall like the Pentium 4’s was; what’s keeping them in second is being behind on their process and, beyond that, execution.


Sorry, but Intel is pretty obviously chasing performance via higher and higher clock speeds and ridiculously high power draws just like they did previously with their Pentium 4 strategy.

You can argue that it's not what they "want" to be doing, but it's certainly what they are doing.


I'm obviously not going to change your mind so you do you


>Intel Raptor Lake boosts performance, but the [power] requirements are staggering

https://www.digitaltrends.com/computing/intel-raptor-lake-ma...

What you can't change is the reality of Intel's actions.


> This became an easy decision [in 2015] for Apple to ditch Intel when they saw how much better their own iPad Pro was relative to the 12” Intel MacBook.

[1] implies that the M1 development started around 2008 - although you could also read it that the M1 was 5 years in development but that sounds a bit quick and also doesn't fit with [3] in 2014 - "Apple Testing ARM Based Mac Prototypes".

But there doesn't seem to be any other direct corroboration of the 2008 date that I can find.

[1] https://www.youtube.com/watch?v=4oDZyOf6CW4 via [2] [2] https://news.ycombinator.com/item?id=31778257 [3] https://www.macrumors.com/2014/05/25/arm-mac-magic-trackpad/


2008 is probably about the time Apple started serious work on in house CPU cores, rather than specifically the M1? The A6 chip was the first with an Apple designed core, and was released in 2012, so they would have been working on it for several years before.


2008 is the year Apple bought PA Semi, an independent chip design house, which formed the core of their semiconductor design team for Apple Silicon.


You pretty much nailed it. They likely started that work when they hired Johny Srouji.

https://www.apple.com/leadership/johny-srouji/


Also, it was around this time that the chips in iPad Pros were strangely getting close to Intel CPU performance in benchmarks while running in thin enclosures without fans.


> Also, it was around this time that the chips in ipad pros were strangely getting close to intel cpu performance

Around what time? 2015?

Maybe I could have been clearer in my original post, but 2015 was the year the iPad Pro launched.

How could “iPad Pros were getting close to Intel performance” happen with the very first version?

The very first iPad Pro was already more performant than the 12” MacBook (which also launched that same year).


Oh, I'm pretty sure that since the iPad Pro launched, or maybe the generation after, its scores in various benchmarks were getting scarily close to Intel's. It might've started maybe 30% off, but with each generation the gap closed. It was increasingly clear that Apple's ARM chips weren't just toy chips limited to tablets, and the tech media talked about how it was only a matter of time before there would be an ARM Mac or a Mac/iOS convergence.

Yet whenever Tim Cook was asked - pretty much up until maybe a few months before the Apple Silicon announcement - he said there weren’t any plans to switch the Mac to ARM.


and yet today the wonderful 12" MacBook is dead


I personally like the 12" form factor, and I own one, but I saw approximately zero of them in the wild. It's significantly lighter, I can still get work done on it, it's great for travel... and yet I can't remember seeing other people using them.


I want a 12" netbook, but most options are garbage or overpriced. A 12" M1 macOS netbook would be a game changer.


you may not have seen them, but they are out there (raises hand)


It's hilarious how HN is consistently wrong, even on tech-heavy subjects. No bright minds popping up here, tbh. You can glean some interesting stuff from this. The hive mind was wrong on Dropbox, seemingly wrong on this, and they are likely wrong on blockchain today.


It's been over 10 years and we still don't have a decent use case for blockchain outside of crypto.


the global username by the Ethereum Name Service (ENS)

see fallon.eth


there has been a huge use case the whole time: decentralized accounting / banking

some people see value in this, some people perhaps not


For most people this has little value. The centralised banking model is cost-efficient, and for most people trust isn't an issue. The modern world runs on trust.

Also, if I want to trade anonymously there is always cash (at least for now), which is a simple, well understood, near-universally accepted mechanism.

There are niches where decentralized banking is useful - but they are niches - and many of them are associated with dubious activity.

On top of that, the current tech platforms simply don't scale - the cost of replacing simple trusted parties with technology is huge.


> For most people this has little value

for most people in the US, or in the world? i think banks everywhere rank amongst the most disliked institutions


Blockchain asks the question: what if you couldn't regulate the banks through democracy?

Which would imply that people love banks and want them to be even more powerful.


ok


That doesn't make being "not a bank" valuable.


Not to mention NFTs, but it seems impossible to convince the HN crowd that there is a real art market in them, with real buyers and real artists using it. Any attempt to demonstrate this is met with wild leaps of logic - it's wash trading, it's not big enough, it's all a scam. I point to Beeple, Wes Cockx, DeeWay sales and all they tell me is it must all be a fraud. I show them markets with vibrant activity - Versum, FXhash, Foundation - and all I get is repetitions of memes around how NFTs are dead.


Successful scams have customers - the presence of people putting in money isn't a defining trait.

However, I do think it's a bit harsh to call these things a scam - I mean, take the art market itself - value is entirely subjective - that doesn't mean it's a scam (though scammy things do happen in the normal art market to try and inflate prices).

The question you have to ask yourself is:

Are you buying the NFT as an investment - because you think other people will value it - or are you spending the money because you are quite happy to own that thing forever and never sell, i.e. the value is to you?

I'd argue if you are doing the former you are more likely to be 'scammed' than the latter.


That NFTs are fungible in practice is the root of the issue.


I buy art for my art collection. They happen to be NFTs. I also purchase physical art.


It just shows that predicting the future is really difficult. It also doesn't follow that HN being wrong in one case means it will be wrong in another.


It shows that there is groupthink at play and that the overall HN commentary on subjects, even when they should know better (tech), is not particularly correct.

Being wrong once is one thing, but HN commentary seems consistently wrong on a whole litany of topics. The biggest source of bad hot takes seems to be something new/different. HN seems to be consistently conservative.

It’s a shame because I used to think that i would gain some insight on future trends from the fact that a lot of the people who comment here are in tech, but now I’m not so convinced.


That's because it is hard to discern what is shilling and what is a real / expert opinion. Shilling happens here on HN, like it does on every social media platform / discussion forum. It doesn't help that we can't call out suspected shilling on HN, as it is against the rules. But that rule makes sense, because suspicion is not proof, and it would just bring down the quality of discussion here if every one of us accused the other of shilling :). (And as someone once pointed out to me, sometimes the shilling is not necessarily from the marketing team but from employees and shareholders here, who have a vested interest in seeing the company do well.)


> No bright minds popping up here tbh

i think that's harsh. why would you even expect the aggregate sentiment of HN to indicate where to invest one's money? on the other hand, it might be worth pointing out that Y Combinator supports plenty of blockchain projects


Also, up until 2015 Intel was on a great streak with their processors; they didn't show signs of stagnation until around 2017/2018.

i5/i7s were great at that time


The 12” MacBook came out in 2015 and had terrible performance and a problem with overheating. Insider reports say that Intel had promised a lower power chip with better performance that Apple designed the MacBook for but then Intel killed that chip and Apple had to use another Intel mobile chip. Some people feel that was when the problem got real.


Apple would still have a lot of people who had lived through the same situation during the IBM days.

If you're getting hints that your chip vendor is not aligned then you better have a backup plan.


Not entirely true, people watching and in the field knew. Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift. Similar with this move by Apple. There are people in those rooms, making the decisions, sharpening their ideas on HN — if we care to listen.


>Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift.

I think you got really lucky. Zen 1 didn't ship until 2017, and it lagged severely behind in single-thread performance. You had no idea what AMD was going to have.

Even AMD would tell you that they were surprised that Intel fell so far behind. They've been quoted saying this a few times.

Intel's 10nm node (equivalent to TSMC 7nm) was supposed to ship in 2016! They didn't ship anything on the desktop using 10nm until Alder Lake in 2021. A five-year delay.

Intel would have been well ahead of Zen 2 in node technology. Instead, it was around 1.5 nodes behind.

If you made your bet purely on what was said inside AMD in 2015, you just got lucky. No one knew that Intel would be stuck on 14nm for 7 years when they were planning for 2.


There were a ton of Intel engineers at the time complaining about management in a different thread.

I’m sure luck was involved (so many things could go wrong). But i tend to make money on bets based on what I hear on the fringe. AMD, Bitcoin, etc


The problem is: how do you separate the gold from the cruft? Seems impossible. Also, so much cruft makes you miss the gold as well. :/


I actually wrote software to do that lol

Built this: https://insideropinion.com/

But use it for investments.

You can’t completely remove risk, but I invest in areas where insiders discuss their work publicly. It provides insight that fundamentals often lack, leaving massive potential upside.


Stalking as a Service? How could that possibly go wrong?

I’m equal parts horrified and amazed. And curious what The Algorithm thinks about my ramblings.


From my understanding, it is way more costly to miss the gold than to get some cruft.


They had already plateaued in 2014/2015 with Haswell/Broadwell. They've basically been releasing that same CPU with minor tweaks to power consumption and codec support for 8 years now.

At the time, it was hard to notice, but reviews absolutely noticed the minor CPU update (https://www.theverge.com/2015/4/9/8375735/apple-macbook-pro-..., search "Broadwell"). Another funny aspect of that review: it mentions 10+ hour battery life for the MBP as a nice, but hardly astonishing, spec. 9 hours 45 minutes with Chrome was the worst case. It's amazing to think how bad the 2016-2019 MBPs were in comparison, to the point where getting back to 10-hour battery life is an amazing Apple Silicon feature!


I don't think my 2019 MBP has ever lasted more than 3 hours on battery.

My M1 Max is amazing by comparison.


They were bad for thin-and-light laptops with good battery life (mostly Atom shit and underpowered Core CPUs like in the 12-inch MacBook).


It’s funny to see how confident some OPs are in their claims, all while pointing out how the article is “speculating” the future.


Nice catch.

Phew. I’m glad I didn’t comment on that thread.


It's easy to doubt but it actually takes effort to form educated guesses about the future.

I didn't see this particular article but I would have agreed with it. I certainly have written many comments on similar articles (search the comments for alwillis ARM Mac for starters) explaining why such a switch to ARM was completely doable, having lived through the switch from PowerPC to Intel while working at MIT.


2030 predictions? Anyone?


- Apple will have created two additional "multitasking" systems for iPadOS on top of the failed Stage Manager. People will still ask for a traditional Mac-like window management system, and Apple will still be like "lol no".

- Apple Arcade and Apple TV will still be around, but Apple will have no plan/vision for gaming.


Apple will have an AR or VR headset, and we will start ditching traditional displays for VR.

Meta will have recovered from their current issues and will be their main competitor.

I think Apple could have a big advantage because their processors could allow more powerful things in a standalone VR headset, compared to the current generation where you need an external PC for most CPU- and GPU-intensive tasks.


I don’t know - mobile phones were a huge, huge success because they are, first and foremost, practical. They fit in your pocket, can take stellar images, have access to literally everything on the internet, and are fully capable general-purpose computers in your hands - a sci-fi product turned into reality. And on top of that, we control them with our hands, which are arguably our most capable body part for the job.

I don’t see any practicality to VR outside some tiny niches. It makes a few games more fun, and some niche workloads can be done more efficiently, but headsets are cumbersome to put on and, first and foremost, block your interactions with the real world. Sure, some futuristic contact-lens thingy could improve on this, but how would you control it? Voice control is slow and troublesome.

So I don’t predict huge success for VR; it will be at most something akin to the Xbox’s Kinect, or some Wii accessory.


I see a huge VR market in misguided companies with too much money to burn on "team building" projects who see VR as a good way to extend middle management BS to WFH employees. Likely subsidized by MetaFacebook desperately pushing their crappy VR projects to keep them on life support.

As far as I can tell, we have a lot of progress to make with display resolution and GPU quality before VR becomes competitive for work environments with a modern hiDPI dual display setup. Maybe it's more appealing to folks with crappy home desk setups, or people who live in cities who don't want a full desk? Ergonomics still feels like an obstacle, though.


Rumors of the Apple car have died down (I think ...) but they may come back to the transportation market, maybe out of left field. Repurposing the iPod trademark?

(E.g. this for the XXI century: https://en.wikipedia.org/wiki/Isetta )


Apple has been focused on health lately. So we might see more sensors and apps related to that, to the point where it would monitor your overall health continuously.


Funny to see an old comment of mine there. Looks like I wasn't wildly off base, fewf :)



