Are all AMD fans idiots?

Yeah, I know, it sounds a bit like a blog I did on linux a while ago, doesn’t it? Well, that’s quite appropriate, because it’s a similar case of revisionist history by people probably not old enough to have witnessed the events at the time.

It’s a pet peeve of mine: the internet is filled with AMD fanboys who think they’re some kind of experts on CPU design. I can understand it up to a certain point… I mean, Intel is that big corporation that people love to hate, and they’re rooting for the underdog, which is AMD of course. I myself have used a long list of non-Intel processors over the years, so I know all about the alternatives to Intel processors, and how they can be better. But I can’t stand people who just don’t bother to check their facts, just like with the nonsense about linux/unix and how it was ‘secure by design from the start’, and all that.

I think the whole AMD fanboy movement started with the success of the Athlon. I bet most AMD fans had never even heard of AMD before the Athlon, or in fact hadn’t even owned a PC before that time. That is the only way I can explain their delusional idea that Intel and AMD are somehow each other’s equals in a technological sense, and how they leap-frog over each other, trading the performance crown back and forth.

Clearly, anyone who has bothered to study the history since the beginning of Intel’s 8086 range will know that AMD started as an independent seller of x86 processors with the Am386 (after having been a second source for Intel’s 8086 and 80286 for years), and that they did this in 1991. Put this in the proper perspective: Intel released the original 80386 in 1985(!), and released the 80486 in 1989. So from the get-go, AMD was about 6 years behind Intel, with a gap of more than a generation.

One often hears the fairytale that AMD sold much faster 486 derivatives than Intel, so AMD must have had a technological advantage over Intel. While it is true that AMD sold 486 derivatives at up to 133 MHz, while Intel’s fastest ran at only 100 MHz, this has to be put in the proper perspective as well: AMD’s first Am486 was introduced in 1993, actually a month AFTER Intel had introduced the Pentium. The Pentium may not have had higher clockspeeds at the time, but it had far higher performance per cycle; the FPU in particular was a great deal faster than the outdated design of the 487. In fact, AMD didn’t introduce those 100+ MHz 486s until 1995, while Intel released its last 486 in 1994.

So what really happened was that AMD was basically selling overclocked 486 processors as their high-end, while Intel had a much more advanced architecture which delivered much better performance, even at considerably lower clockspeeds. Clearly Intel wasn’t even interested in selling high-clockspeed 486 processors, as they would only threaten Pentium sales. And of course AMD was still a generation behind technologically, so the fact that they eventually had a 133 MHz 486 in 1995 doesn’t mean much: Intel offered 133 and 150 MHz versions of the Pentium by then. So not only could Intel match AMD’s clockspeeds, but Intel’s processors were MUCH faster at those clockspeeds. In fact, even the fastest processor that you could put in a 486 socket wasn’t AMD’s, it was still Intel’s. Intel offered a Pentium Overdrive processor for the 486 socket (although not all motherboards supported it). It was a true Pentium processor at 83 MHz, complete with the superscalar architecture with the U and V pipelines, the large caches and the massively improved FPU. I’ve actually used it in my home server for a few years, running FreeBSD.

That’s pretty much the story of AMD all around. Usually their CPUs were a generation behind, and also their manufacturing process was usually one node behind that of Intel. In fact, the entire success of the Athlon architecture is partly due to them being a generation behind. While Intel moved on to the Pentium 4 aka Netburst architecture, AMD was still working with an architecture that was closely related to Intel’s P6 architecture, as used in the Pentium Pro, Pentium II and Pentium III.

Netburst didn’t quite work out, and as a result, Intel had never killed off their P6 architecture completely. They used it in the Pentium M line for mobile devices, as the Pentium 4 just drew too much power (even the Pentium 4M derivative was useful for desktop replacements at best). Overclockers already knew it, and Intel must have known as well: If you overclock the Pentium M (or later the Core Duo), you get performance very similar to that of the high-end Athlons and Pentium 4s.

So, basically, Netburst was just an anomaly. If it had been ‘business as usual’ at Intel, AMD would never have been able to touch Intel’s high-end, as Intel’s CPU architecture would have been a generation ahead. And if Intel had stuck with a P6-like architecture, as AMD did with the K7 and K8, then Intel would have had performance and power consumption much closer to AMD’s than they did with Netburst.

And that’s where Core2 comes in. Intel took a bit of P6, a bit of Pentium 4, and a bit of ‘new’, and they went right back to where they always were: a generation ahead of AMD, and AMD unable to compete with Intel’s high-end performance. The most surprising part here is that Intel didn’t even make use of an onboard memory controller yet. They didn’t need that to outperform AMD’s processors, because the entire architecture was so good (except for multi-CPU systems).

As AMD struggled to get their quadcore answer out in the form of Phenom, Intel worked on a new architecture which finally did leverage an onboard memory controller, and also recycled some of the remaining Pentium 4-technology in the form of HyperThreading (as lackluster as Netburst may have been in many aspects, HyperThreading and SSE2 were very nice technologies and will likely be with us for a long time). As a result, Intel maintains its lead of an entire generation over AMD, and the performance gap became ever larger. Now the multi-CPU problem is also solved. This leaves AMD competing with 6 cores against Intel processors with ‘only’ 4 cores in the server/workstation market, because the combination of Intel’s more advanced architecture and the reintroduction of HyperThreading just delivers more performance per cycle per core.

Business as usual. AMD having higher-clocked or better-performing CPUs? Outside the Athlon/PIII/P4 era this pretty much never happened; the whole leap-frog thing is a myth. Today AMD is pretty much where they’ve always been, apart from that one anomaly. At this point it seems more likely that AMD will go bankrupt than that AMD will once again compete head-to-head with Intel in the high-end market. AMD was always about bargains, bang-for-the-buck, low-end to mainstream systems.

Makes me wonder: do those AMD fans even realize that AMD wasn’t the first, the only, or even the best Intel alternative most of the time? Me, I have nothing against AMD; I’ve used their processors from time to time. I actually had an early Am486DX2-66 back in 1994, and I’ve never had a Pentium 4 myself; I used Athlons through that era, before going back to Intel with Core2, for obvious reasons. But I’ve also had an IIT 387 coprocessor, which had some very nifty tricks over a regular Intel one, like having 4 stacks of registers rather than 1, which allowed you to fit whole matrix*vector operations on the stack, for example. I’ve also had a Cyrix 6×86 for a short while, which was a better Pentium alternative than AMD’s K5 at the time. In the end I have to admit I went back to the real Intel Pentium though, because the Cyrix and the AMD were both comparable to a Pentium only in integer operations. The FPU on those things was about as weak as a 486’s, nowhere near a real Pentium’s. I also had an IBM/Cyrix Blue Lightning 486 at some point. And of course there was the NexGen, which AMD eventually bought and reworked into their own K6 architecture (just like the Athlon was partly reworked Digital Alpha technology). And what about the NEC V20/V30? Those were some of the earliest x86 alternatives, way back in the 8088 era. There’s more, but the point is that AMD wasn’t even Intel’s main competitor until after all other competitors had given up. I have to give AMD credit for being so persistent though.

Don’t take my word for it though, it’s all on Wikipedia.

This entry was posted in Uncategorized. Bookmark the permalink.

47 Responses to Are all AMD fans idiots?

  1. Edward says:

    Scali, I do not disagree with your general feel about AMD fanboys, or for that matter Intel, nVidia and ATI fanboys. ALL of the major brands have these people. In your piece you mention AMD is usually behind. I will point you to the Athlon 64 and Athlon X2 processors, which were actually ahead of Intel for a while. At the end of the day AMD is just what it is: a good alternative to Intel chips. Right now, for the more budget-minded, AMD is the better alternative. Whether that will hold after the i3 releases remains to be seen.

  2. Scali says:

    Yes Edward, the operative word is that AMD is *usually* behind. I didn’t mention the Athlon 64/X2 specifically, but I did talk about the Pentium 4, the ‘anomaly’ that it was in Intel’s history, and obviously it was those Athlons that were up against the Pentium 4 at the time.

    As for fanboys… I think the AMD vs Intel (or AMD vs nVidia) situation is very similar to Linux vs Microsoft. Statistics clearly show that there are considerably more Intel users than AMD users (and the same for MS vs Linux, even more so)… However, the online discussions don’t reflect this relation at all. You don’t see a lot of people ‘championing’ Intel. Everyone was kicking Intel when it was down with the Pentium 4, but hardly anyone is kicking AMD now that it’s down with its Phenom II. Hardly anyone stood up for Intel’s Pentium 4, but people still excuse the lackluster Phenom II performance and its unhealthy power consumption at the high end… not to mention the idle hope that AMD will return with a superior architecture once more.

    AMD is the better budget option; it always has been and probably always will be. The problem AMD is facing is just that it’s getting pushed into an ever smaller market bracket as Intel introduces faster CPUs at the same price.

  3. Denys says:

    I read this article on http://www.brightsideofnews.com. Here is my reply.

    Perhaps the idiot who wrote the article should remember a few things.

    First: who was the first to release a 1 GHz CPU? http://www.zdnetasia.com/news/hardware/0,39042972,13026272,00.htm

    Second: do you remember the PIII 1133 MHz fiasco? http://www.tomshardware.co.uk/intel,review-214.html http://www.hardocp.com/article/2000/08/28/113_ghz_by_intel/ http://www.tomshardware.com/reviews/revisiting-intel,221.html

    Third: Intel’s idiotic sticking with expensive and crappy RDRAM? http://www.eetimes.com/news/semi/showArticle.jhtml;jsessionid=3W0DGB3HABQVJQE1GHPCKHWATMY32JVN?articleID=10813246

    Fourth: as a matter of fact, AMD had a clear performance lead from the launch of the K6 (http://www.tomshardware.co.uk/intel,review-21.html) and the K7 (Athlon) on June 23, 1999, up until the launch of Intel’s 3 GHz parts early in 2003. (Wiki) After that, the release of the Athlon 64 ruined it all for Intel, all the way up until the Core 2 Duo/Quad launch. (Wiki, Tom’s, Anandtech, you name it…)

    Fifth: socket strategies. With the release of the K7, AMD went to the Socket A (Slocket) system, which lasted until the Athlon 64 systems, and was used across the board for desktop, server, and mobile platforms until its phaseout in 2003. In the meantime, Intel went through S-370, S-423, S-495, S-478, S-479 for mobile, and S-603 and S-604 for servers.

    Sixth: current socket strategies. Intel makes a new socket for the i7 and i5, while AMD’s AM2 chips work just fine in AM3 sockets, and AM3 CPUs feel just fine in AM2/AM2+ boards, with obvious performance penalties, of course, but they WORK. Now go out, shell out $500 for an i7-965, and stick it in a $200 S-1156 mobo. You’ll fry both. http://en.wikipedia.org/wiki/CPU_socket

    Seventh: what about Intel’s greatest chipsets and their graphics capabilities? (Anyone actually bored enough to be bothered to review them?)

    Eighth: remember Intel’s stifling of the competition? Not just AMD, but VIA and Transmeta as well. http://money.cnn.com/2009/11/13/technology/intel_amd_settlement.fortune/

    Ninth: don’t even get me started on Larrabee’s (or is it Laughabee’s?) "ability" to compete with the Radeon 5870 anytime soon. Maybe by 2012, when the 7800s are out from AMD…

    I am not a fanboy. I have 2 computers, one built entirely with AMD parts (CPU+GPU+chipset), and one built by Intel (CPU+chipset w/ onboard GPU).

    So, to wrap this all up: in light of your omissions, forgetfulness, plain lies, and most typical fanboy commenting, and due to the number of facts that I have just provided, I call you, sir, an accomplished idiot. Please go back to your alternate reality and happily pay your $1000 for a 3.0 GHz Prescott CPU, which would have been your fastest option now if not for AMD.

  4. Scali says:

    Sounds like your idiotic fanboyism got the better of you, Denys. I clearly stated in my article that I used Athlons in the Pentium 4 era. I’ve never had either a Pentium III or a Pentium 4; I had two Athlons. You think I was ignorant of the Athlon’s performance and the fact that it was the first to reach 1 GHz? People like you are exactly why I wrote the article. You come here with your rants and insults, stating the obvious, living in the past. And by the way, my Core2 Duo didn’t even cost a third of $1000. Bye now.

  5. Scali says:

    "Just because Intel has higher-end CPUs doesn’t make them any more correct by default. All bigots/fanboys are idiots on both sides."

    Yes; however, considering the fact that only about 30% of all x86 customers are AMD customers, it’s remarkable that the majority of vocal fanboys on the internet are in the AMD camp.

    "Who was the first company to release an x86 processor on 64-bit architecture?"

    Who cares? Intel was the first company with a microprocessor PERIOD. Intel was also the first with an 8-bit, 16-bit and 32-bit x86 processor. It’s all in the past, since both companies use the same 64-bit x86 technology today, and have for quite a few years already.

    But I guess you totally misunderstood the article. You are EXACTLY the type of person that the article is about, and you are too blind to see it, just like the rest of them. Go ahead and continue insulting me, and continue to praise AMD’s achievements while putting down Intel’s. I hope it makes you happy.

  6. Scali says:

    "In one breath you use the past to validate your argument, and in the next breath, when it works against you, you dismiss it."

    I already told you: you didn’t get the article. It’s not about Intel or AMD, or who’s better. It’s about fanboys like you, who are obnoxious in spreading misinformation and lies, and who throw insults at everyone with a different opinion. They always seem to be on AMD’s side. Just look at the responses here and on BSN*. It’s nearly only AMD fans insulting everyone. There don’t seem to be many Intel fanboys responding to the article. If fanboys were the same on both sides, I’d expect 70% of the responses to be Intel fanboys agreeing with the article and insulting AMD fanboys. This clearly is not the case. QED.

  7. Edward says:

    Alan, Scali’s piece is a bit hard-hitting, but not completely off the mark. AMD does have its fanboys that overlook facts. I disagree with Scali in that I feel there are equal counterparts for this among Intel and nVidia fans. In fact, over the last few weeks I have been amazed at some of the BS Intel and nVidia fanboys put out.

    Scali, I understand you ducked the entire Athlon II era, but you seriously need to step back to that. AMD has led with innovation a few times, and in ways that Intel copied. The Athlon X2 was a much superior CPU to the Intel Pentium D. AMD was the first to put two cores on one die for the consumer, and the first to bring an on-chip memory controller. Both of which, I might point out, Intel at the time called useless innovations, but then quickly followed the example.

    As for who is the performance leader, I say: who cares? Even the lowly Athlon II X4 delivers a GREAT computing experience for the majority of people, and does so at a much better price than a similar Intel offering. At the end of the day the decision between these two chips comes down to what you want: the best performance in benchmarks, or a lower price that gets the job done.

  8. Scali says:

    You haven’t shown anything, except a bunch of insults and some rants about things that I *didn’t* put in my blog. If I don’t mention a certain thing, I cannot be incorrect about it either.

    In case you’re wondering why I don’t just delete your drivel… you only succeed in making a fool of yourself, and in providing live examples of what my blog is about.

    Another thing: I bought AMD for the money as well, it’s right there in the blog. Go read my other blogs as well, and figure out how stupid I REALLY am. And who were you again?

  9. Scali says:

    "Scali, I understand you ducked the entire Athlon II era, but you seriously need to step back to that. AMD has led with innovation a few times, and in ways that Intel copied. The Athlon X2 was a much superior CPU to the Intel Pentium D. AMD was the first to put two cores on one die for the consumer, and the first to bring an on-chip memory controller."

    That’s not entirely correct, Edward. The original Pentium D, the 8xx series, was a single-die solution. It wasn’t until the introduction of the 9xx series that Intel started using MCM. Needless to say, the 8xx series was on the market BEFORE the Athlon X2. Yes, the Athlon X2 was faster, but Intel swallowed its pride and, perhaps for the first time in its existence, actually undercut AMD’s prices, making the Pentium D the best bang-for-the-buck. A first for Intel.

    As for the integrated memory controller: Intel was simply right about that. They didn’t need an integrated controller in the Core2 series to outperform AMD’s offerings. In theory an integrated controller is better, but being first doesn’t mean much if it doesn’t result in a better product. Intel didn’t need an integrated controller to compete, and when they finally DID integrate the controller, the result was an architecture that is well out of reach of AMD, despite being remarkably similar on paper.

    And before anyone mistakes that for saying that faster is better, and that only the high-end matters… anyone who thinks that is just not getting the bigger picture. I always say "today’s high-end CPUs are tomorrow’s budget CPUs". Sure, Intel introduced the Nehalem architecture in the high-end first… but today you have very affordable Core i3 and i5 solutions as well.

    I am sick and tired of people who only look at prices. Yes, AMD doesn’t offer any CPU over ~$300… but that’s AMD’s fault, because they simply cannot deliver the performance. I have never considered a $300 CPU ‘expensive’ or ‘high-end’. It simply isn’t. I would say that anything in the range of $200-$500 is just ‘mainstream’, and anyone with a regular dayjob can afford that. Let’s not pretend that they are expensive enthusiast products, because they simply aren’t. Back in the day when I bought my Athlons, they were more than $300 as well, and the Athlon X2 went all the way up into the $1000 regions. Why? Because back then AMD could deliver the performance that needs to go with the pricetag.

    These days AMD cannot, simply because they are behind Intel on many levels. Intel has 32 nm, HyperThreading, CPUs with integrated PCI-e controllers and GPUs, etc. Things that just make CPUs perform better, more affordable and more energy-efficient. All AMD can do is lower the prices on its aging architecture time and time again. They cannot continue that indefinitely. They need a new architecture to stay in Intel’s tracks, otherwise they’ll just slide out of the market, like they have been sliding ever since the Core2 was introduced and AMD was wiped out of the $300-$1000(!) price range almost instantly. Now, you can make excuses for AMD all you want, but it just shows you don’t get what’s really happening here.

  10. Scali says:

    "My perspective is that Scali doesn’t know much more than what he has read on Wikipedia, which in itself is kind of a laugher."

    Firstly, I didn’t get my info from Wikipedia; I’ve lived through the entire x86 era from day 1. Secondly, as I said, read my other blogs, to get an idea of what I do and what I know. I doubt your knowledge and experience are anywhere near mine. You’re just a fanboy with a big mouth and no skills. I have actually posted plenty of things that back up that I know what I’m talking about.

  11. Scali says:

    How can I be a fanboy if I don’t prefer one brand over the other?

  12. Edward says:

    Scali I am curious do you trash Intel and nVidia fanboys on your blogs as well?

  13. Scali says:

    "Scali, I am curious: do you trash Intel and nVidia fanboys on your blogs as well?"

    Of course, I’m an equal-opportunity trasher.

    I’m not sure if you quite understood what this blog is (or the earlier one regarding linux, for that matter)… It isn’t meant to be some kind of historical overview. What I did was take a few of the myths that keep reverberating around the internet, and provide the relevant factual and historical context to explain why these myths are myths, and not reality. In other words, it’s about debunking revisionist history.

    I think you fell for it yourself with your earlier comment here. You’ve heard the myth that AMD was the first with a single-die dualcore so often that you actually thought it was the truth. Which it wasn’t. That’s the sort of thing this blog is about.

    Now, I would do exactly the same if it were any other brand or topic (as shown with the topic on linux/unix/opensource, while I’m a FreeBSD/opensource user myself, and actively write and maintain opensource code as well). Thing is, I wouldn’t know what to write about regarding Intel or nVidia. Do you know of any such myths that I could write about?

  14. Scali says:

    "He previously made, on one of his other blogs, a comparison between AMD and Nvidia. Said AMD would never compete with Nvidia. Well, as of late they have made great strides in market share and product. Yet it is his absolutism about AMD not competing with Nvidia that gives his own fanboyism away."

    I told you to read all my blogs, which apparently you haven’t. Firstly, I never said anything like that. Secondly, I actually bought an AMD Radeon 5770 card recently, as you could have read in my blogs (where I explain how it is the better product for me at this time), if you had bothered to read them. That doesn’t match at all with the picture you’re trying to paint of me.

  15. Scali says:

    "Scali, I have only read the BSN blog, so if you have this stuff on other blogs I haven’t read them."

    This is my blog. BSN just published one of my blogs on their website. You clearly said you read my blogs: "He made previously on one of his other blogs a comparison between AMD and Nvidia". If you only read BSN, you could only have read this particular entry, as nothing else of mine was published there. So stop lying.

    "Also, didn’t you basically make the statement that AMD always lags Intel? If so, why does this lag exist?"

    The exception that proves the rule. It’s ‘just’ a hard disk controller. Intel never had any chipsets with ATA133 either. It seems that the boards with third-party 6 Gbps SATA controllers on board are doing a good job, so you can still have the best of both worlds anyway.

  16. Scali says:

    "I guess you mean that, okay, AMD did get one, but it is an exception so it really shouldn’t count?"

    No, I mean that even a company as large and mighty as Intel cannot possibly be the first with EVERYTHING (but don’t forget, Intel is also the company responsible for developing standards such as PCI, AGP, PCI-e, USB and SATA). So yes, AMD and other companies (such as Marvell, with the controller on the SSD side) do get the occasional ‘scoop’. Fair enough; give credit where credit’s due. In the greater scheme of things, though, I don’t think they are very significant wins. In the case of AMD, you have a faster chipset, but a slower CPU; which is more important? In the case of Marvell… okay, so you have a 6 Gbps SSD controller… but the majority of chipsets out there are Intel’s, so you cannot use it (in fact, it may not entirely be a coincidence that Intel has no 6 Gbps chipsets AND no 6 Gbps SSDs).

    "I am hearing the Bulldozer core will be the first 128-bit CPU out there because of the 128-bit FPU."

    I think that’s a load of nonsense, to be honest. In the old days we had FPUs with higher precision, such as 80-bit for the x87, and up to 96-bit on Motorola chips, for example. These clunky and archaic formats were dropped in the RISC era, because you rarely needed anything more than 64-bit, and if you did, you might as well use a software solution instead of crippling the hardware with overly complex logic. Therefore, as a developer, I don’t see the need for more than 64-bit precision. We’ve been using 32-bit and 64-bit for decades without problems. I don’t think AMD is going that route; it doesn’t sound like a good technological decision. It’s not a good trade-off between performance and hardware cost, as software rarely needs 128-bit precision. I think BSN just got the information mixed up or something. It might just mean packed arithmetic like in SSE registers and such.

  17. Scali says:

    "My guess is it is for marketing, or patent reasons."

    Sometimes it’s just timing. You can’t introduce new CPUs and chipsets every time a new technology pops up. That would take too much development time, and not give enough return on investment. I have no doubt that Intel could rush out a 6 Gbps, PCI-e 2.0 chipset if they wanted to. But why would they? Their current platform performs fine, and is a good seller. They can let AMD have this advantage. Just like they didn’t rush out a CPU with an onboard memory controller just because AMD had one. Sometimes you just have to ride it out.

    "Wow, I had forgotten about the x87…"

    x87 is still with us; it’s in every x86-compatible CPU today. Most 32-bit software still makes extensive use of the x87. It just rarely uses the 80-bit floating-point datatype. In fact, in some programming languages the 80-bit type is not supported at all. Java is a good example of that. The .NET framework doesn’t support it either. Neither do Microsoft’s C/C++ compilers (unlike, for example, gcc, which has ‘long double’ to access it, though that obviously isn’t portable to architectures with only 64-bit floats).

  18. Scali says:

    "Yet they seem to know what buttons to push to loosen Intel’s grip and get people to buy AMD. Looks like Q4 ’09 was going AMD’s way. Small, but in the right direction."

    Still a far cry from the 30% marketshare that they had during the Athlon heyday. What is more disturbing is that AMD only gets this marketshare by constantly lowering its prices and cutting into its profit margins. If it wasn’t for the Arab investors and the Intel settlement, AMD would be in dire straits right now. Since there aren’t going to be investors and settlements every year, something has to happen for AMD to survive. The way it’s going now, it’s only a matter of time until they go bankrupt. It has been that way for years.

    Intel has recently introduced 32 nm, and CPUs with integrated PCI-e controllers and GPUs. These CPUs once again deliver more performance at lower cost and lower power consumption, and all AMD can do at this point is lower the prices on their existing products, which obviously isn’t good for the company. At this point AMD has to use triple cores and quadcores to compete with Intel’s dualcores, and six cores to compete with Intel’s quadcores. Intel is also pushing AMD out of the server/workstation market, which is one of the most profitable markets. Things just aren’t looking good for AMD. They haven’t looked good for AMD in years, not even during the Athlon heyday. Even then, AMD was struggling to make a decent profit.

    "Scali, you may be right, but I don’t think they are going to stop this crazy train. Talk of Windows going to 128-bit in the future. It is just going to keep getting bigger and bigger is my guess."

    Well, I can’t comment on that until I know what it is really about.

  19. Scali says:

    "30% of a multibillion dollar industry still has a lot of 0’s behind it."

    Only if you have profitable products. As I already said, Intel has 32 nm, better integration of CPU, GPU and chipset, and their stuff just performs better. So AMD needs to compete with 45 nm CPUs with more cores, and more expensive chipsets. If it were the other way around, sure, you could easily make the smaller marketshare work… but in this case, Intel is the one that calls the shots. If Intel drives the prices down too far, AMD won’t be profitable. And that’s what has been happening ever since Core2 was introduced.

    "Otherwise, I believe that with AMD gone, Intel won’t have the incentive to produce top-of-the-line products, and their product offerings will become stale."

    I keep hearing people repeat that, but I think it’s a very naive and shallow look at the situation. I will give two simple arguments to make you think about it:

    1) Intel had no competition from AMD from the beginning of the x86 up to AMD’s introduction of the Am386 in 1991. Without this competition, Intel still managed to produce the 8086/8088, 80186, 80286, 80386 and 80486, and was only months away from the introduction of the Pentium.

    2) CPUs don’t really ‘wear out’. Most of the CPU market is already saturated, with every desk in every office having a PC, and most people having at least one PC at home. Hence the majority of CPU/PC sales come from upgrades, not from new customers. If it wasn’t for upgrades every few years, people could easily use the same PC for 10 years or more, which would mean 0 sales for Intel. So Intel is always competing against itself. It needs to continue to step up the technology in order to sell products.

    In fact, one could argue that AMD is already irrelevant in this scenario. After all, about 80% of the market uses Intel CPUs, and Intel has to keep those people interested in Intel products. It doesn’t really matter whether those people buy AMD CPUs, or don’t upgrade at all… either way, it’s not an Intel sale. I don’t think AMD really plays much of a role here; firstly because of its small marketshare, and secondly because its products only cover one end of the market.

    When I look at my situation at work, for example: we generally buy Intel Core i7 development machines and laptops. AMD systems simply wouldn’t be an option. They don’t deliver the performance we want, and wouldn’t even be much of an upgrade from the systems we are replacing (which are mostly Core2 Duo/Quad). We don’t replace them because they’re old and broken. We replace them because there’s something better on offer, and if buying a newer, faster system saves development time, it’s worth the investment.

  20. Scali says:

    "How do you know AMD’s chipsets are more expensive? I don’t believe you do. You’re assuming they are."

    I gave you some fine arguments already; you might want to pay a bit more attention when reading my posts:

    - Intel puts the PCI-e controller in the CPU, not in the chipset
    - Intel puts the GPU in the CPU, not in the chipset

    Now, without even getting into the fact that Intel has the advantage of economies of scale… I think it’s pretty safe to say that Intel’s chipsets are cheaper because there’s a lot less logic in them, and as such their die area is smaller, which means more chipsets from the same wafer, a lower chance of defects, etc. You could of course have figured this out if you had bothered to think.

  21. Scali says:

    Oh, and another thing… AMD chipsets are made by TSMC, because of the fabless ATi legacy. They haven’t moved chipset production to GlobalFoundries yet. And I forgot to mention that Intel has integrated the southbridge in their latest chipsets. So, as I said before, Intel has the better integration.

  22. Scali says:

    "AMD, according to STEAM’s hardware survey AMD is at 30.51% of market share for CPU."On Steam yes. But that’s not representative for the entire market. In their heyday, AMD had almost 50% on Steam. The Steam survey obviously doesn’t include things like corporate desktop machines, servers/workstations and such, since nobody ever plays games there, let alone via Steam.AMD’s share has been dropping slightly over the past months, as you can see here: http://store.steampowered.com/hwsurvey/processormfg/"Therefore since you can’t sell product unless it is priced according to the market then it is easy to assume AMD will be closing the performance gap."That’s called wishful thinking.Just because AMD *needs* to close the performance gap, doesn’t guarantee that they *will*. History is full of hardware manufacturers that tried to close the performance gap and failed.Frankly I don’t see where AMD’s competition is supposed to be coming from. They won’t have 32 nm this year, and I don’t see a new architecture surfacing any time soon either… let alone that it will compete with Nehalem, or whatever Intel has on offer by that time.I think we’re heading for another Phenom… An AMD CPU that can only compete against Intel’s last-gen products, and doesn’t even do that all that well.It just seems like AMD can’t keep up Intel’s tick-tock pace. As long as Intel continues executing it as flawlessly as they have done so far, AMD doesn’t stand a chance.

  23. Scali says:

    Tried to look for some older Steam figures…

    Here’s one from June 2009: http://wearecolorblind.com/wp-content/uploads/2009/06/steam-hardware-survey-chart_01.jpg
    AMD was still at 33.5% there…

    December 2008: http://theovalich.files.wordpress.com/2008/12/steam_hw-survey.jpg
    AMD at 36.3%

    So as you see, it’s not going well at all. They’ve dropped considerably in 2009.

  24. Scali says:

    So first you come up with Steam… then I say Steam has only been dropping over the past time… And instead of responding to those simple facts, you just come up with speculation. I’m not going to bother responding to speculation, certainly not if you don’t even bother to respond to the trend of reduced marketshare in sources that you yourself bring up as a reference.

    I don’t feel like debating why Steam isn’t representative either… but I’ll just give a few simple facts:

    1) Steam is a platform for Windows games. This by itself leaves out a large part of Intel systems, such as all Apple computers, and things like Atom-powered netbooks. Those markets alone probably make up at least 10-15% of the whole x86 market, and they are Intel-only.

    2) As I said, in its heyday, AMD had almost 50% on Steam (in fact, they even went slightly OVER 50% at some point in 2006). I’ve found the data to back that up, as I found where Steam stores its archive:
    http://steamgames.com/status/survey_v1.html
    http://steamgames.com/status/survey_v2.html
    http://steamgames.com/status/survey_v3.html
    http://steamgames.com/status/survey_v4.html
    http://steamgames.com/status/survey_v5.html
    http://steamgames.com/status/survey_v6.html

    Now, since 2007, it’s basically been a free fall from ~50% down to the 30% of today, and the trend is still dropping. Now clearly NOBODY believes that the 50+% share of AMD systems was a reflection of the WHOLE market, right?

    3) Most other market surveys report lower AMD numbers than Steam, as their sample space is larger. In AMD’s heyday at around 2006, they were slightly over 30% according to most sources. These days they are around 20% according to most sources (which can be explained mostly by 1)). Some sources to demonstrate that:
    http://arstechnica.com/old/content/2006/01/6053.ars
    http://www.reghardware.co.uk/2007/01/31/mercury_x86_marketshare_2006/

  25. Scali says:

    Errors in my opinions? An opinion is subjective, and as such can’t be wrong. If I say "it doesn’t look good for AMD", that depends on my subjective interpretation of what ‘good’ means for AMD, so what my expectations are. So perhaps I have higher expectations of AMD than you do… Am I wrong in that? I don’t think so; those expectations are just my opinion, and we are all entitled to our own opinions. It would be rather boring (and not very constructive) if everyone thought the same about everything.

    As for AMD not being down-and-out… Some people analyze AMD’s results a bit deeper than just at face value:
    http://www.brightsideofnews.com/news/2010/4/16/amd-1st-quarter-2010-earnings-report-shows-hope-for-the-future.aspx

    They turned in a profit, but they were rather ‘creative’ with their GlobalFoundries construction. I wouldn’t be surprised if GlobalFoundries is the one taking the losses that used to be taken by AMD as a whole. Makes AMD look good, but it’s not the whole story. Don’t forget, no GlobalFoundries means no AMD, because who else has the same capacity and the same low production costs for AMD?

    As for good products coming out, I’ve already discussed that; we obviously don’t agree, and I don’t see any reason to go deeper into the matter, since you haven’t even commented on the arguments I gave.

  26. Scali says:

    Alan Sulzer, you have to realize that this is MY blog, a space where I can post MY opinions. Different opinions are fine, and I am willing to discuss them, as long as it remains on a respectable level, and with points worth discussing. Random insults just because you have a different opinion will be deleted. If you want to vent YOUR opinion, do it on YOUR blog.

    I have countered your previous arguments, but you simply ignore the evidence, and continue on a different tangent. And posting random cherry-picked reviews doesn’t exactly impress me either. It’s not like your legitreviews link is representative. Try this one for a different opinion on the Phenom II X6:
    http://www.bit-tech.net/hardware/cpus/2010/04/27/amd-phenom-ii-x6-1090t-black-edition/10

    "Despite being an astonishing £600 cheaper than the exorbitantly-priced Intel Core i7-980X Extreme Edition, the X6 1090T BE still isn’t a very good buy. That’s because despite being clocked at a respectable 3.2GHz and having a useful auto-overclocking feature in Turbo Core, it’s based on a comparatively old architecture – K10, which is in reality only a tweaked version of the ancient K8 architecture dating way back to 2003. As a result, the X6 1090T BE really struggles to keep up with the similarly priced Intel Core i7-930, which was noticeably faster in six of our eight benchmarks thanks to its far more modern Nehalem architecture. The only exception to this was our Cinebench and WPrime tests, indicating that the X6 1090T BE may be worth considering for a low cost graphics workstation. However, even then, the i7-930 retook pole position when both CPUs were overclocked to their air-cooled maximum frequency. Ultimately, despite being a good step forward for AMD, the i7-930 still remains our first choice CPU in the £200-£250 price range. Only if you have a compatible AMD motherboard and just want to upgrade the CPU should you look to buy the Phenom II X6 1090T Black Edition."

    So it looks like I’m not the only one who isn’t all that impressed with AMD’s X6. Why don’t you go complain over there, and leave my blog alone.

  27. Alan says:

    Reuters reports that AMD might be in more computers this Summer. People familiar with the matter who work for AMD said the company’s latest microprocessors are expected to be included in 109 mainstream laptop models in the coming months, the company’s best showing during the crucial back-to-school sales season. Last year, AMD’s chips were available in 40 laptop models. "This is the first time we’ve seen this much attention to our notebooks," the source said, referencing the company’s laptop…

    Things just keep getting better!! LOL Look at what happens when you have product and you don’t have to fight against unfair trade practices. Wow, what a novel idea!!

  28. Scali says:

    This blog was never about AMD itself. It was about its obnoxious and idiotic fanbase. You’ve proved my point, Alan Sulzer… posting random AMD-related news as a ‘response’ to this blog. This is not a place for promoting AMD.

  29. Scali says:

    Now you’ve disappointed me, Alan Sulzer… You always come here to post the latest business news on AMD… but you haven’t said anything about the recent news of Oracle/Sun abandoning AMD processors in their line of server/workstation products!

  30. Alan says:

    Didn’t see it. Thanks for the info.

  31. Alan says:

    Well, as you once said: you can’t win them all. And in AMD’s case, you can hardly win them at all. I will be surprised if AMD is ever anything more than a value prop, which is what I want anyway. It gets the job done for me, and AMD’s lineup is good enough for now. On the flip side, Intel has too many cards to play, and for the foreseeable future is untouchable with respect to AMD catching up. Yet let’s see what Bulldozer brings. I suspect Intel will already have the answer, though. I haven’t responded lately because I figured you would appreciate it if I left you alone. I try not to be totally unreasonable.

  32. Pedro says:

    ALL the money in the world, and poor little "honest" Intel (the company that coughed up $1.25 billion) can’t come up with a decent GPU :) And now they are advertising Knights Corner. I guess those "computers filled with graphics cards" are "biting" Intel’s heels, hmmm? Think Tianhe-1… Jaguar is interesting, isn’t it?

    Anyone who is an Intel fan should be really, but really, thankful that AMD exists, because the way Intel does business it’s clear that without AMD, Intel’s current desktop CPU flagship would be a 2.5GHz single-core PIV Celeron. So when you’re petting your i7, remember AMD ;)

    I buy AMD. It’s cheaper, does what I need, and hey, it’s the smart buy right now for the average desktop: what you save on the CPU/mobo, you can spend on an SSD (that’ll give desktop performance…) or a better graphics card if you’re into games! … And they need and deserve support (Intel spends on R&D the same as AMD’s total revenue!).

    And remember, on every side there are idiot fans, and fans that know what they are cheering for, and I’ll bet you a trillion dollars that you have more of the first ones cheering for Intel than AMD. Regards!

  33. Alan says:

    Pedro, Scali picks on AMD users, but you’re right, you have idiots on both sides. Yet I think Scali, whether he intended to or not, misses the main reason AMD has fans, which you touched on yourself. I think AMD has a fan base for many reasons, but I believe the prevailing reason is the decent low cost CPUs they produce. So in general AMD seems to be a better buy. I will take Scali’s side of the fence now and state that is not always the case. I might be in AMD’s camp, but I am not blind to the truth either. But when that occurs, AMD gets corrected by market factors and then lowers its price to regain the price/performance bench crowns, or at least get in line with respect to Intel.

    Although Scali has stated that AMD has no impact on Intel since it is not a real competitor, I believe that Scali is wrong on that point, and it shows in my opinion how much he seems to be in the Intel camp. Yet I agree with you that we can thank AMD for forcing Intel to get off their lazy butts and give us something good. This in turn forces AMD to keep in lock step with Intel to give us an alternative. You have to love market dynamics, because without it, I would call you comrade and tell you to pass the vodka.

  34. Scali says:

    "Scali picks on AMD users but your right you have idiots on both sides."You should read my latest blog, I pick on nVidia fanboys!And no, I’m quite sure AMD has more fans than Intel… you rarely see anyone cheering for Intel… and keep in mind that Intel has about three times as many users as AMD does.I’m not a fan of either. I think being a fan of a company is the sign of an idiot by default.

  35. Alan says:

    Scali, I appreciate that you share the love, so to speak, with reference to the Nvidia fans. So when are you going to hit on the ATI fans? They are just as crazy, but just like the color red more than green. You might be right about AMD having more fans, but if not, then they are more vocal. I think most people like to root for the little guy or the downtrodden. Which is AMD by far.

  36. Scali says:

    "So when are you going to hit on the ATI fans?"Geez, there’s just no pleasing you guys is there? Firstly, I consider AMD and ATi a single company… because it is. Secondly, I’ve given AMD and its fans plenty of flack regarding their GPU business already, in various blogs. Mostly related to the huge failure that is OpenCL and physics on AMD’s side." I think most people like to root for the little guy or the down troden."That’s what I said in my article. I think that’s idiotic. And I’ll tell you why. If you want to talk about competition, then competition is about one company one-upping the other with its products. And that company is rewarded by its customers for delivering those great products. It’s a survival of the fittest.AMD’s CPU division however hasn’t done anything interesting in years. They’re still using the same K10 core, which in essence is barely different from the K8 core that went before it. The only thing that keeps AMD going at this moment is the fact that they are selling their outdated and underperforming products at bargain prices. But why should I as a customer reward them for not having innovated anything significant in years?AMD needs to do something to deserve my money.With their GPU division, they did. They were the first to use GDDR5, the first to use 40 nm, and the first to deliver a DX11 part. So they deserved my money, and I bought a Radeon 5770.But in the CPU world, AMD just isn’t worth your money. They aren’t fit to survive. AMD had its chance when they had a good product in the original Athlon and then the Athlon64, and was able to expand their market and raise their prices… but they haven’t done anything with it.Intel on the other hand has long recovered from this period, and continues to develop new and innovative architectures, and pioneer new production processes.In fact, Intel is even working on improving their GPU performance. 
The integrated GPU in the new Core i3/i5 series is now pretty much on par with the latest AMD and nVidia IGPs. And Intel has claimed that Sandy Bridge’s IGP is going to perform as well as a mainstream discrete card. If they pull that off, they have pre-empted AMD’s Fusion… Another failure for AMD… just like physics… AMD has talked about Fusion for years, but never managed to get anything on the market so far. Its competitors actually deliver working products. So those competitors deserve to be rewarded. AMD just doesn’t deserve it.That’s how competition works.Now you could argue that AMD deserves your support anyway, because it gives them a better chance at competing… But let’s be realistic here. It’s been YEARS since AMD delivered a decent CPU (one that wasn’t just old technology at bargain prices, but actually something new and good). And AMD has actually HAD support from tons of fanboys and such… And STILL nothing substantial has come out of AMD’s CPU division. How long do you want to continue supporting AMD? It sounds like AMD is a company on life support, and it’s time to put it out of its misery. It has done nothing to justify the faith that its fans have put in the company in the past years.It never delivered a Core2-killer CPU, it never delivered a Core i7-killer CPU, and I hate to think what Sandy Bridge is going to do to AMD’s product line…

  37. Alan says:

    Scali, I agree with you on many of your points. Although ATI/AMD are all part of the same company, I just figured if you were slamming Nvidia you could turn around and do the same on ATI, but apparently you already have. I guess I need to pay better attention to your blogs. AMD is definitely riding out this K10 architecture for all it is worth. I am a little disappointed in the lack of architectural improvements by AMD on the CPU side as well, but I guess that is why they are getting pasted by the high end i7 CPU class. Yet AMD is trying to change the game with this new Fusion architecture, so I am looking forward to seeing what it brings. I don’t expect much at first. I think it will be much ado about nothing at first, but if the software vendors implement based on its architecture, I think it might get some traction in the market. I noticed some news out of the Intel camp being that they are attempting to go down that APU road as well. So AMD must have some good instincts on that front. I guess we will have to wait and see.

  38. Scali says:

    "AMD is definitely riding out this K10 architecture for all it is worth."Isn’t that ironic, though? AMD fanboys often go around claiming that Intel will stop innovating when AMD isn’t around to compete. But in reality AMD is the one that hasn’t innovated in years. Nobody ever criticizes AMD for that."Yet AMD is trying to change the game with this new fusion architecture so I am looking forward to seeing what it brings."Sadly the first generation will AGAIN be based on the K10 achitecture, and the existing Radeon architecture… Nothing new, other than that it will be integrated in one chip.I think at first it’s only going to be interesting as a cost-cutting feature, and lower power consumption… advantages that Intel is already reaping with the integrated IGP on the Core i3/i5 CPUs. These chips are very attractive for notebooks and low-cost desktops (eg office machines).I hope that Intel will also enable OpenCL and DirectCompute on their IGPs.

  39. Pingback: Yes, AMD fanboys *are* idiots | Scali's blog

  40. Lucian says:

    Scali, let us try a comparison:
    Build a machine with the newest hexacore Sandy Bridge 3960X, and build another machine with the newest octocore FX-8150.
    Put 16 GB of DDR3 memory in each machine and a good graphics card, like the 6870 or something similar.
    Also put a 256 GB SSD and a 3TB HDD in each computer.
    Then go out and shoot 8 small videoclips, no less than 10 minutes in length each.
    Render these 8 videoclips with Avid Media Composer at the same time.
    Observe then how the Intel machine is freezing and the AMD octocore is running fine.
    And AMD also has 16-core processors in the server market.
    Let us count the cores: 4 cores on 65 nm (Q6600), 8 cores on 45 nm (double the space, double the number of cores: Intel Nehalem EX), 16 cores on 32 nm (AMD Opteron).
    Intel has no 16-core processor right now, although they have been using the 32 nanometer process since 2009.
    And they should have produced the 22nm at the end of 2011 by the tick-tock schedule already.

    • Scali says:

      LOL!!!!! What a load of nonsense!!!
      Besides, I don’t think you quite get the point of the article…
      On the other hand, you demonstrate the point quite nicely…

      Hey, more cores is always better, right (unless the cores are HT, then they don’t count… although funnily enough they DO count the gimped cores of the gimped Bulldozer)? Lol, idiotic fanboys. The core myth is worse than the MHz myth.
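      The "core myth" quip has a well-known formalisation: Amdahl's law says the speedup from n cores is capped by the serial fraction of the workload. A minimal sketch (the 80% parallel fraction is an arbitrary assumption for illustration):

```python
# Amdahl's law: speedup of a workload with parallel fraction p on n cores.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# With 80% of the work parallelisable, doubling cores gives ever-smaller gains:
for n in (4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.8, n):.2f}x")
# prints 2.50x, 3.33x and 4.00x respectively
```

      Going from 8 to 16 cores buys only about 20% extra here, which is why raw core count is a poor proxy for performance.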

      • Lucian says:

        This is not a conscious reply… why does Intel need to invent processors with 6 or 10 cores, when the powers of 2 are so much nicer and consistent with the expectations of your clients?
        You clearly are in the business for Intel; you said you are a developer.
        Then it will be very nice to assemble a single quad-socket system from AMD with 64 cores in total; surely that will surpass any Intel 6-core single socket machine, despite all the HT differences which are not seen in the previous example.
        You realize, don’t you, that AMD can offer single socket, dual socket and quad socket motherboards to the consumer clients anytime they wish, and they also may sell 16-core processors as well?
        They are more interested in the server space at the moment.
        Or perhaps there are some nasty US Gov regulations…

      • Scali says:

        Wut?
        Geez man, give it up.
        Also, retarded insinuations about me working for Intel are retarded.
        You clearly have no idea who I am, and you clearly have not bothered to read any of my other blogs.
        It just shows how one-dimensional and stupid you fanboys are.
        If I am negative about the behaviour of AMD fanboys (hint: the fanboys, not the company itself, this article has nothing to do with the company, merely its retarded fanbase), that doesn’t say much about my view of AMD as a company, let alone whether I prefer AMD or one of its competitors… Or even more far-fetched: that I would be working for one of their competitors.
        I must have explained that dozens of times over the years. You AMDtards just aren’t capable of comprehending the matter.

      • Klimax says:

        I really don’t think the 16-core is really that much better than Intel’s 10-core/20-thread one…
        (exception: price, AFAIK)

        http://www.anandtech.com/show/5058/amds-opteron-interlagos-6200

        And they didn’t even use that Xeon, just the 6/12 (for a similar price point). Both are based on Westmere. Do you know what happens when Sandy Bridge-EP arrives?

        Hint: We already saw SB per-core performance and of 3960X / 3930k…

      • Scali says:

        Thanks for taking the trouble to reply, but I thought this was so obvious that it did not even need a reply. It was an obvious AMD troll.
        Just take his stuff on non-power-of-two core count for example… Why is it suddenly a problem when Intel does it? AMD was first, with their triple core harvested parts.
        Intel’s current Sandy Bridge E is also a harvested part. It is actually an 8-core die. So Intel will be introducing the full 8-core models at a later time.
        Not that I think there’s much wrong with non-power-of-two core count in the first place. Most modern OSes have schedulers that can deal with that quite adequately.

        Or what about his talk about AMD’s 16-core parts? They are actually two 8-core dies glued together (you know, that technology that was oh-so-bad when Intel did it, but now suddenly it’s da bomb!). Intel currently does not have such CPUs, because there is no need. As you point out, their single-dies already perform quite adequately, so at this point, a dual-die solution would just be over-the-top, both in terms of price and power consumption. AMD on the other hand needs the 16-core monster CPUs because otherwise they’d get too far behind in terms of performance. But as a result the performance-per-watt is rather underwhelming, which is a big issue for the server market these days.
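        The remark above about schedulers coping with non-power-of-two core counts extends to application code too: nothing in a typical threading API cares whether the machine has 4, 6 or 12 logical CPUs. A quick illustration using only the Python standard library (no assumptions beyond what the OS reports):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Size a worker pool to whatever logical CPU count the OS reports --
# power of two or not, it makes no difference to the programming model.
workers = os.cpu_count() or 1
with ThreadPoolExecutor(max_workers=workers) as pool:
    squares = list(pool.map(lambda x: x * x, range(10)))
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

        The pool and the OS scheduler between them spread the work over however many cores exist; the result is identical on any core count.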

  41. “The integrated GPU in the new Core i3/i5 series is now pretty much on par with the latest AMD and nVidia IGPs. And Intel has claimed that Sandy Bridge’s IGP is going to perform as well as a mainstream discrete card. If they pull that off, they have pre-empted AMD’s Fusion… Another failure for AMD”

    Full of shit, as always.

    You sir, are the most ignorant “developer” I’ve come across in the last six months.
    I also like how you have more comment responses than any of your ‘readers’. Did you get tired of spouting bullshit at anadtech? Perhaps if you actually did something useful and had a reason for owning anything more than a p4, you would understand why autodesk, adobe, e-on and many, many other actual x64 developers (that aren’t spending all their time writing silly child blogs) require HT to be bios-disabled or incur penalties. Maybe you’ve never heard of VMware, I really don’t know. However, you continually disregard facts (not the bleatings of someone who sounds like an ignorant gamer-child) like incredibly high tile latency. Know why? Because HT is a waste of die space. A small amount of space yes, yet still a waste.

    Not all of us spend our time running single-threaded legacy programs and writing shitty little default theme WordPress blogs.

    Don’t worry, I won’t be returning, because I’ve seen your “arguments” and your understanding of cpu architectures is laughably incomplete, as are your responses. You haven’t presented one argument that cannot be easily refuted by people with a far greater technical understanding (and financial infrastructural backing) than you, apparently, can ever hope to achieve.

    Have fun writing five or more responses, kid. I hope you find the expenditure of time on such idiocy fulfilling.

    • Scali says:

      Lol, frustrated much?
      You haven’t presented a single argument, just silly insults. If you bothered to read some of my development-related blogs, you’d know what kind of stuff I’m writing, and that it’s actually very much multi-core (and multi-GPU for that matter).
      I know my stuff… and if what I say could have been refuted easily… then it would have. But nobody has (nope, not you either).

      Heck, what exactly is your point regarding VMWare anyway?
      Just look here: http://www.anandtech.com/show/5058/amds-opteron-interlagos-6200/7
      I don’t see AMD dominating those benchmarks. And that is a dual-die Opteron with 8 modules/16 cores vs a single Xeon with 6 cores and HT.
      As I also covered here: https://scalibq.wordpress.com/2012/02/14/the-myth-of-cmt-cluster-based-multithreading/
      CMT just doesn’t work… No, not with VMWare either. So you insulted me for no good reason at all… what an idiot 🙂

Leave a comment