Hacker News

That's only part of it.

Intel had an epic, once-in-a-generation stumble: a combination of very bad, very high-impact choices all at once.

The NetBurst (Pentium 4) architecture was both hugely expensive to develop and manufacture and also extremely underwhelming. But that's only part of it. They also staked the next-generation advancement of the instruction set on a revolutionary and untested architecture (IA-64, a VLIW variant) with certain compromises, all of which turned out to be a huge and costly mistake. It took almost a decade for that architecture to fully mature, and even today it's still not very suitable for everyday computing. They hitched their system wagon to RAMBUS memory, which priced their systems out of the consumer-grade sweet spot, and their long-term contract with RAMBUS screwed them when DDR came around as an equally fast or faster alternative at a much lower cost. And they struggled with making decent chipsets for several years. One of the last pre-RAMBUS/pre-NetBurst chipsets was the legendary 440BX, which was hugely reliable, flexible, and high performance. After that, Intel chipsets had a long period of utter mediocrity.

In that window AMD made several strong moves. They produced high-performance, low-cost chips using a very solid architecture. They developed an excellent next-generation 64-bit instruction set (x86-64) which shared a great deal of heritage with the IA32 (x86) architecture and included mostly just very sensible changes and extensions. And they were able to create hardware which both ran existing 32-bit code extremely well and also ran new 64-bit code at competitive levels of performance. They also managed to put together a "whole system" in terms of CPU, chipset, and RAM which sat at an excellent intersection of affordability and performance.

Unfortunately for AMD, Intel very much learned from their mistakes: they adopted DDR RAM, vastly improved their chipsets, cut their losses on mistakes like IA64 and the early generation of EM64T systems, adopted the AMD64 / x86-64 architecture, and developed some excellent CPU technology in the form of the Core, Core 2, and Core i{3,5,7} architectures. Meanwhile, AMD ran into a few stumbling blocks and has had a tough time getting over them, let alone getting back to competing head to head with Intel again.



IIRC, there were several points during that period where AMD tried to become the default CPU for the large computer builders of the day (e.g. Gateway, Dell, etc.) due to far better price/performance ratios, yet frustratingly those builders stuck with Intel. I'm guessing Intel offered some sweet sweet long term deals to keep AMD out of the picture.

The landscape could have looked very different today had those deals gone through.

I think AMD actually ended up with the superior technology all the way back with the Athlon line (when Intel was still PIII) with a better memory architecture and a few other good odds and ends.

They tried to get back in the game with some bold moves, like buying ATI, but it just hasn't panned out for them.


Three things that have always been in Intel's favor have been their truly massive and always cutting edge fab capability (which no other cpu manufacturer has been able to even come close to), their extremely well-funded R&D teams, and their industry relationships (largely due to those other two factors).

Intel is able to field several different processor lines and architectures simultaneously, and also develop multiple new architectures at the same time. That's not something that most other companies can do. They've also been reasonably good at recovering from mistakes quickly (something that Microsoft has also been fairly good at).

That said, it'll be interesting to see how things pan out with the changes to the computing landscape that are in the works now.


> things that have always been in Intel's favor have been their truly massive and always cutting edge fab capability (which no other cpu manufacturer has been able to even come close to)

On a flight a while back I struck up a conversation with a PhD who specialized in chip design. He suggested that if AMD were able to manufacture their chip designs in Intel's fabs, they would beat Intel chips in terms of performance and power.

The idea was that because AMD uses fabs that are, basically, two generations behind Intel's, their designs have to be more innovative to work within the constraints of the older process.

I'm no electrical engineer, so I took him at his word, but it certainly points out that Intel's fabs are a distinct advantage.


I wouldn't go that far. Intel definitely has an advantage in process technology, but it's not as though they rely on it exclusively to stay ahead. For example, currently the AMD and Intel CPUs in the server market are both at the same level of process technology, 32nm, and Intel's CPUs are still overall superior.


Intel's Ivy Bridge processor has been out for almost a year now, and is on the 22nm fab process.


Ivy Bridge still hasn't been released in the server space. The server market currently has Sandy Bridge chips. The Ivy Bridge chips are due at the end of the year.


It's 22nm FinFET. The FinFET is equivalent to a 0.5-1 generation advantage in power or speed.


Wasn't AMD moving to have (x86) chips made at TSMC?

Maybe if they partner with IBM. Still, even if GlobalFoundries is two generations behind Intel, that's still one of the most modern fabs, and keeping a fab constantly updated is very expensive.


I've always felt that their fab division should have been forced to split off from their design division. It would be better for the industry if architectures had to compete on their merits and all designs had access to the same quality of fabs.


Very sweet long-term deals, yeah. Dell managed to meet Wall St's earnings expectations almost entirely through payments from Intel not to ship AMD systems; they even got in trouble because they fiddled the books to hide where the money was really coming from: http://www.pcpro.co.uk/news/359770/intel-sweeteners-made-up-...


> I'm guessing Intel offered some sweet sweet long term deals to keep AMD out of the picture.

I don't know whether Intel underbid AMD's pricing or not.

What we do know Intel did, though, is run a marketing blitz of Intel Inside ads, so that when consumers went to buy a computer and saw "AMD" instead of "Intel" they reacted negatively. Based purely on the marketing, customers would view the computer sporting the "Intel Inside" logo as the premium product.


Methinks they did a wee bit more than that.

http://www.nbcnews.com/id/33882559/ns/business-us_business/t...


AMD did some interesting things. Another point I'd add is taking RISC vs. CISC off the table with the x86-outside, RISC-inside K5. This opened up the field for a lot of innovation in the core.

What Intel did best was learn well from both AMD's successes and their own stumbles. I'm sure NetBurst and Itanium yielded valuable lessons about branch prediction.

I'm not sure if it was luck, but Intel also benefited from a shift to mobile. iirc, the original Intel Core line was for mobile, but ended up dominating the Intel roadmap. At the same time, AMD was trying to make inroads into the server market, mostly because it was lucrative.

Perhaps that stifled their innovation in some key ways. Anecdotally, all the "big iron" architectures are dying off. Certainly, if you wound back to 2000 and said ARM would be the upstart in the server industry, you'd have been laughed at.


Interestingly both Intel and AMD pulled the same trick at about the same time with regard to CISC vs. RISC. The Pentium Pro was also a RISC core that used micro-op translation to present an x86 exterior, and it was introduced about half a year prior to the K5, though it didn't hit the consumer market for a while.
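The x86-outside, RISC-inside trick can be illustrated with a toy decoder. This is purely a conceptual sketch, not how any real front end works; the instruction tuples and micro-op mnemonics here are invented for illustration. The key idea is that one complex read-modify-write x86 instruction gets cracked into several simple micro-ops the RISC-like core can schedule independently.

```python
# Toy sketch of CISC -> micro-op translation, in the spirit of the
# Pentium Pro / K5 front ends. All mnemonics are invented; a real
# decoder handles hundreds of instruction forms and encodings.

def decode(instr):
    """Crack one x86-style instruction into RISC-like micro-ops."""
    op, dst, src = instr
    if op == "ADD" and dst.startswith("["):
        # A memory-destination ADD is a read-modify-write: split it
        # into separate load, ALU, and store micro-ops.
        addr = dst.strip("[]")
        return [
            ("LOAD", "tmp", addr),
            ("ADD", "tmp", src),
            ("STORE", addr, "tmp"),
        ]
    # Simple register-to-register ops map to a single micro-op.
    return [(op, dst, src)]

program = [("ADD", "[x]", "eax"), ("ADD", "ebx", "ecx")]
micro_ops = [u for instr in program for u in decode(instr)]
print(micro_ops)
```

Once the instruction stream looks like this, the out-of-order machinery inside the core only ever deals with uniform, simple operations, which is what "opened up the field for a lot of innovation in the core."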

Also, as you point out, Intel has been good at cross-pollinating between different development lines and teams. The Core architecture was a ground-up redesign, but it was heavily influenced by the work on the Pentium M design done by the Israeli division. And previously there was a lot of influence on the Pentium III design from the work done on the low-budget Celeron line.



