The myth of CMT (Cluster-based Multithreading)

The first time I heard someone use the term ‘CMT’, I was somewhat surprised. Was there a different kind of CPU multithreading technology that I had somehow missed? But when I looked it up, things became quite clear. If you google the term, you’ll mainly land on AMD marketing material explaining ‘cluster-based multithreading’ (or sometimes also ‘clustered multithreading’).

This in itself is strange, because another page you will find is this:

Triggered by the ever increasing advancements in processor and networking technology, a cluster of PCs connected by a high-speed network has become a viable and cost-effective platform for the execution of computation intensive parallel multithreaded applications.

So apparently the term ‘cluster-based multithreading’ was in use before AMD’s CMT, and in a far less confusing sense: it simply refers to conventional clustering of PCs to build a virtual supercomputer.

So CMT is just an ‘invention’ by AMD’s marketing department. They invented a term that sounds close to SMT (Simultaneous Multithreading), in an attempt to compete with Intel’s HyperThreading. Now clearly, HyperThreading is just a marketing term as well, but it is Intel’s term for their implementation of SMT, which is a commonly accepted term for a multithreading approach in CPU design, and has been in use long before Intel implemented HyperThreading (IBM started researching it in 1968, to give you an idea of the historical perspective here).

Now the problem I have with CMT is that people are actually buying it. They seem to think that CMT is just as valid a technology as SMT. And worse, they think that the two are closely related, or even equivalent. As a result, they are comparing CMT with SMT in benchmarks, as I found in this Anandtech review a few days ago:

AMD claimed more than once that Clustered Multi Threading (CMT) is a much more efficient way to crunch through server applications than Simultaneous Multi Threading (SMT), aka Hyper-Threading (HTT).

Now, I have a problem with comparisons like these… Let’s compare the benchmarked systems here:

Okay, so all systems have two CPUs. So let’s look at the CPUs themselves:

  • Opteron 6276: 8-module/16-thread, which has two Bulldozer dies of 1.2B transistors each, total 2.4B transistors
  • Opteron 6220: 4-module/8-thread, one Bulldozer die of 1.2B transistors
  • Opteron 6174: 12-core/12-thread, which has two dies of 0.9B transistors each, total 1.8B transistors
  • Xeon X5650: 6-core/12-thread, 1.17B transistors

Now, it’s obvious where things go wrong here, just by looking at the transistor count: the Opteron 6276 is more than twice as large as the Xeon. So how can you have a fair comparison of the merits of CMT vs SMT? If you throw twice as much hardware at the problem, it’s bound to be able to handle more threads better. The chip is already at an advantage anyway, since it can handle 16 simultaneous threads, where the Xeon can only handle 12.
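A quick sanity check on that size claim, using only the transistor counts quoted above (Python here purely as illustrative arithmetic):

```python
# Transistor counts as quoted above, in billions
transistors = {
    "Opteron 6276": 2.4,   # two Bulldozer dies of 1.2B each
    "Opteron 6220": 1.2,   # one Bulldozer die
    "Opteron 6174": 1.8,   # two dies of 0.9B each
    "Xeon X5650": 1.17,
}

# How much more silicon does the 6276 bring to the fight?
ratio = transistors["Opteron 6276"] / transistors["Xeon X5650"]
print(f"Opteron 6276 vs Xeon X5650: {ratio:.2f}x the transistors")  # ~2.05x
```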

But if we look at the actual benchmarks, we see that the reality is different: AMD actually NEEDS those two dies to keep up with Intel’s single die. And even then, Intel’s chip excels in keeping response times short. The new CMT-based Opterons are not all that convincing compared to the smaller, older Opteron 6174 either, which can handle only 12 threads instead of 16, and just uses vanilla SMP for multithreading.

Let’s inspect things even more closely… What are we benchmarking here? A series of database scenarios, with MySQL and MSSQL. This is integer code. Well, that *is* interesting. Because, what exactly was it that CMT did? Oh yes, it didn’t do anything special for integers! Each module simply has two dedicated integer cores. It is the FPU that is shared between two threads inside a module. But we are not using it here. Well, lucky AMD, best-case scenario for CMT.

But let’s put that in perspective… Let’s have a simplified look at the execution resources, looking at the integer ALUs in each CPU.

The Opteron 6276 with CMT disabled has:

  • 8 modules
  • 8 threads
  • 4 ALUs per module
  • 2 ALUs per thread (the ALUs cannot be shared between threads, so disabling CMT disables half the threads, and as a result also half the ALUs)
  • 16 ALUs in total

With CMT enabled, this becomes:

  • 8 modules
  • 16 threads
  • 4 ALUs per module
  • 2 ALUs per thread
  • 32 ALUs in total

So nothing happens, really. Since CMT doesn’t share the ALUs, it works exactly the same as the usual SMP approach. So you would expect the same scaling, since the execution units are dedicated per thread anyway. Enabling CMT just gives you more threads.

The Xeon X5650 with SMT disabled has:

  • 6 cores
  • 6 threads
  • 3 ALUs per core
  • 3 ALUs per thread
  • 18 ALUs in total

With SMT enabled, this becomes:

  • 6 cores
  • 12 threads
  • 3 ALUs per core
  • 3 ALUs per 2 threads, effectively ~1.5 ALUs per thread
  • 18 ALUs in total

So here the difference between CMT and SMT becomes quite clear: with a single thread, each thread has more ALUs with SMT than with CMT. With multithreading, each thread (effectively) has fewer ALUs than with CMT.
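The static-vs-dynamic split described above can be captured in a toy model (my own simplified sketch, using only the ALU counts already listed; not vendor data):

```python
def alus_per_thread(total_alus, active_threads, partitioned, hw_threads=2):
    """Effective integer ALUs available to one running thread.

    partitioned=True  models CMT: the ALUs are statically split between the
                      module's hardware threads, whether both run or not.
    partitioned=False models SMT: all ALUs are dynamically shared among the
                      threads that are actually running.
    """
    if partitioned:
        return total_alus / hw_threads
    return total_alus / active_threads

# Bulldozer module (CMT): 4 ALUs; Westmere core (SMT): 3 ALUs
assert alus_per_thread(4, 1, partitioned=True) == 2.0   # CMT, single thread
assert alus_per_thread(4, 2, partitioned=True) == 2.0   # CMT, both threads
assert alus_per_thread(3, 1, partitioned=False) == 3.0  # SMT, single thread
assert alus_per_thread(3, 2, partitioned=False) == 1.5  # SMT, both threads
```

The point falls straight out of the model: a lone thread on an SMT core gets the whole pool (3 ALUs), while a lone thread on a CMT module is stuck with its fixed half (2 ALUs).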

And that’s why SMT works, and CMT doesn’t: AMD’s previous CPUs also had 3 ALUs per thread. But in order to reduce the size of the modules, AMD chose to use only 2 ALUs per thread. It is a case of cutting off one’s nose to spite one’s face: CMT struggles in single-threaded scenarios, compared to both the previous-generation Opterons and the Xeons.

At the same time, CMT is not actually saving a lot of die-space: there are still 4 ALUs per module in total. And obviously, when you have more execution resources for two threads inside a module, and the single-threaded performance is poor anyway, one would expect it to scale better than SMT does.

But what does CMT bring, effectively? Nothing. Their chips are much larger than the competition’s, or even their own previous generation. And since the Xeon is so much better at single-threaded performance, it can stay ahead in heavily multithreaded scenarios, despite the fact that SMT does not scale as well as CMT or SMP. But the real advantage of SMT is that it is a very efficient solution: it takes up very little die-space. Intel could do the same as AMD does, and put two dies in a single package. But that would result in a chip with 12 cores, running 24 threads, and it would absolutely devour AMD’s CMT in terms of performance.

So I’m not sure where AMD gets the idea that CMT is ‘more efficient’, since they need a much larger chip, which also consumes more power, to get the same performance as a Xeon that is not even a high-end model. The Opteron 6276 tested by Anandtech is the top of the line. The Xeon X5650, on the other hand, is a midrange model clocked at 2.66 GHz. The top model of that series is the X5690, clocked at 3.46 GHz. Which shows another advantage of smaller chips: better clock-speed scaling.

So, let’s not pretend that CMT is a valid technology, comparable to SMT. Let’s just treat it as what it is: a hollow marketing term. I don’t take CMT seriously, or people who try to use the term in a serious context, for that matter.

This entry was posted in Hardware news. Bookmark the permalink.

29 Responses to The myth of CMT (Cluster-based Multithreading)

  1. NewImprovedjdwii says:

    Simple, they want to be like HP/Apple/Nintendo and be different. Now I will say SMT usually scales around 20-30% where CMT can be 55-80%, but I will agree with you and say it’s harder to do since it’s a bigger die, and it just means AMD doesn’t make as much money as Intel.

  2. Pingback: AMD Steamroller | Scali's OpenBlog™

  3. Pingback: Anonymous

  4. Pingback: AMD's New High Performance Processor Cores Coming Sometime in 2015 - Giving Up on Modular Architecture

  5. Pingback: AMD Confirms Development of High-Performance x86 Core With Completely New Architecture

  6. Pingback: AMD’s New High Performance Processor Cores Coming Sometime in 2015 … « Reviews Technology

  7. Ventisca says:

    so you’re saying that AMD’s CMT is nothing but a marketing gimmick?
    I’m no expert, but after reading your article, (maybe) I have a similar opinion. :D
    The module is actually two cores, but under a single instruction fetch and decode. So what AMD has done is not the same level of technology as SMT; instead, they just run more threads on more cores.
    The newer AMD core architecture, Steamroller, splits the decode unit for each core in the module, so each module has 2 instruction decoders, making it clear that they are actually two “separated” cores.

    • Randoms says:

      They are still sharing the branch predictor, fetch and the SIMD cluster.

      So they are still not fully separated cores. It is a step backwards from the original CMT design, but it is still a CMT design.

  8. Pingback: AMD FX Series Making a Comeback Within Two Years - APU 14 Conference Reveals Future Roadmaps

  9. Pingback: F.A.Q pertanyaan yang sering diajukan tentang Arsitektur AMD CMT yang ada di AMD APU dan FX - SutamatamasuSutamatamasu

  10. Lionel Alva says:

    Would you know of any tenable alternatives to SMT then?

    • Scali says:

      Well no… There is no alternative. Why should there be an alternative? That’s like asking “What is an alternative to cache?” or “What is an alternative to pipelining instructions?”
      There are no alternatives, they are just techniques to improve performance in a CPU design.

  11. Scali says:

    Yay, gets posted on Reddit for the umpteenth time… Cognitive dissonance ensues with posters there, trying hard to discredit this piece… with far-fetched and nonsensical arguments (actually going against the AMD marketing material that I put directly on here. If you have to argue against AMD’s own marketing material in order to discredit my article, you know you’ve completely lost it)… But nobody is man enough to comment here.
    The reason I can’t wait for AMD going bankrupt is that it is hopefully the end of these AMD fanboys.
    I am tired of their endless insults and backstabbing.

    • UIGoWild says:

      Do you think the CPU market will be better without competition? It doesn’t take a marketing degree to understand that without competition, prices would sky-rocket and innovation would go slower. Now I guess you’re thinking that I’m an AMD fan and all that, but that’s just childish. I’m not trying to defend the people who insulted you; being a fanboy of a company and never thinking twice is not clever at all.

      Although, by saying:

      The reason I can’t wait for AMD going bankrupt is that it is hopefully the end of these AMD fanboys.

      You kinda show that you’re just the opposite. An “Anti-AMD”. That’s not better than a fanboy. I hope AMD will get better and that we’ll see real competition now that they’ve announced that they’re going for SMT, not because I’m an AMD fan, but because I want the best for the customers.

      • UIGoWild says:

        Okay. Let’s say I haven’t been perfectly clear. And yeah, my comment may have looked like an attack or something, but I was just thinking that you were at risk of ruining your credibility by saying that you wished for AMD to go bankrupt.

        You said:
        Nice try, but I’m anti-fanboy, not anti-AMD.

        So okay, I might have been reacting a bit too quickly. Actually, I totally agree with you on that point. Being a fanboy of a company, any company, is not a clever choice. But I still hold to my point: I would rather keep AMD in the race just to be sure there’s a “tangible” competitor to Intel (or nVidia for that matter). I would be saying the same thing if Intel was the one lagging behind. I may be pessimistic, but I don’t like the idea of having only one company holding more than 70% of a market. (Which is already a huge chunk, and close to Intel’s actual share at the moment [ps. don’t quote me on that, but I’m pretty sure it’s close].)

        And even though the competition over performance wasn’t really strong (it’s been forever since AMD was close to Intel), I still think that this competition was good for the customers in the end.

      • Klimax says:

        You are still massively wrong. There is still competition: it is called Intel’s older chips. If there are no improvements and prices are higher, then the only new chips sold will be replacements and a trickle of new computers. And a massive second-hand market. Therefore no price change is to be expected. Look up monopoly pricing. It is not what you think it is. Not even remotely.

      • Scali says:

        Do you think the CPU market will be better without competition?

        This is the fallacy known as a ‘leading question’.

        It doesn’t take a marketing degree to understand that without competition, prices would sky-rocket and innovation would go slower.

        This is the fallacy known as ‘slippery slope’.

        You kinda show that you’re just the opposite. An “Anti-AMD”. That’s not better than a fanboy.

        Nice try, but I’m anti-fanboy, not anti-AMD.

        Anyway, if you take a glimpse at reality for a moment, you’ll see that we’ve effectively been without any real competition for many years in the CPU-market. Prices didn’t exactly skyrocket so far, and innovation didn’t exactly slow down. What we do see is that innovation has moved into other areas than just CPU-performance at all cost (such as the breakneck GHz-race in the Pentium3/4-era, which customers didn’t exactly benefit from. They received poor, immature products with a tendency to overheat, become unstable or just break down, from both sides).
        Currently there’s innovation in things like better power-efficiency, Intel scaling down their x86 architectures to also move into tablet/smartphone/embedded markets, and more focus on graphics acceleration and features (for the first time ever, Intel is actually the leader in terms of GPU features, with the most complete DX12 GPUs on the market).

  12. Justin Ayers says:

    “There is still competition. It is called Intel’s older chips.” But the key point you’re missing is that competition between businesses is essential.

    • Klimax says:

      Not necessary for some markets, like the CPU market. Because even five-year-old chips can be good enough for many people, they form effective competition to new chips: potential buyers don’t have a pressing need to upgrade, and if new chips were substantially more expensive, then even new buyers could skip them and get old chips.

      That is one of the reasons why monopolies are not illegal; only abuse of a dominant/monopoly position is. And you forgot that we are already there. AMD ceased to be a competitor to Intel about four to six years ago.

      • HowDoMagnetsWork says:

        Let’s assume that Intel actually will end up increasing their prices, believing they’d make more money. Then customers buy more older chips. Years pass, barely any new Intel CPUs are bought, most of the old ones are out of stock. What now? If AMD is in the race, people switch to AMD, even if their devices are half as good as Intel’s. If AMD is not in the race, customers will be forced to pay Intel tremendous prices or just not use their products. Of course, if the company is full of good people, they would never do that, rendering competition useless. But what company is full of good people? Competition is very important for any market.

      • Scali says:

        People aren’t forced to buy new CPUs. CPUs don’t really break down or wear out (in case you missed it, earlier this year, I was part of the team that released 8088 MPH, a demo that runs on the original IBM PC 5150 from 1981. We used plenty of original 80s PCs during development, with their original 8088 CPUs, and they still worked fine, 30+ years after they were made).
        There’s no point in buying older chips if you already have an older chip like that.
        Likewise, performance-per-dollar is a delicate balance. If Intel makes their CPUs too expensive, people simply will not upgrade, because they cannot justify the cost vs the extra performance (perhaps you youngsters don’t know this, but in the good old days when Intel was the only x86-supplier, it often took many years for a new CPU architecture to become mainstream. For example, the 386 was introduced in 1985, but didn’t become mainstream until around 1990. It was just too expensive for most people, so they bought 8088/286 systems instead).

        This means that Intel is always competing against itself, and has only limited room for increasing prices. At the same time they constantly need to improve performance at least a little, to keep upgrades attractive enough.
        If they don’t, they will price themselves out of their own market. If people don’t buy new CPUs, Intel has no income. Which is obviously a scenario that Intel needs to avoid at all costs.

        AMD is really completely irrelevant in most of today’s market already, because their fastest CPUs can barely keep up with mainstream Intel CPUs of a few generations ago. A lot of people have already upgraded to these CPUs or better, and have no interest in getting an AMD CPU at all, even if AMD would give them away for free.
        So we’ve already had the scenario of Intel competing against its older products for many years now. Not much will change if AMD disappears completely.

        It seems a lot of AMD fanboys think that the whole CPU market is in the sub-$200 price bracket where AMD operates. In reality most of it is above that.

  13. Reality Cop says:

    Scali, you’re damn blind. In those “good old days when Intel was the only x86 supplier”:

    1. x86 wasn’t the only option. You had PCs built with MOS, Motorola, and Zilog CPUs all over the place. You had Sun SPARC workstations.

    2. Intel was NOT the only x86 supplier. AMD, NEC, TI, and others were making x86 clones before 1990.

    • Scali says:

      Oh really now?

      1. x86 wasn’t the only option. You had PCs built with MOS, Motorola, and Zilog CPUs all over the place. You had Sun SPARC workstations.

      You think I didn’t know that? I suggest you read some of my ‘Just keeping it real’ articles. You could have figured it out anyway, since I explicitly said ‘x86 supplier’.

      2. Intel was NOT the only x86 supplier. AMD, NEC, TI, and others were making x86 clones before 1990.

      They were not clones, they were ‘second source’. These fabs made CPUs of Intel’s design, commissioned by Intel. That’s like saying TSMC makes ‘Radeon and GeForce clones’ because they build the actual GPUs that nVidia and AMD design.
      For all intents and purposes, these second source CPUs are Intel CPUs. Intel was the only one designing the x86 CPUs, even if other fabs also manufactured them (which was the point in that context anyway).

      What is your point?

      • k1net1cs says:

        “What is your point?”

        Likely trying to look overly smart.
        At least he tried…but IGN said “6/10 for looking up Wikipedia”.

        Funny how a “Reality Cop” who tried to call you out has to be directed to a collection of articles titled “Just Keeping It Real” for actual, real info on what you’ve done.

  14. OrgblanDemiser says:

    Sooo… who cares if AMD continues to exist? Does it hurt anyone? Personally, as long as my computer works fine and doesn’t cost me too much, I’m happy with that.

    • Scali says:

      It’s mostly AMD’s marketing and its fanboy following, which distort the truth, misleading/hurting customers.

      • OrgblanDemiser says:

        True. But isn’t that the case with most companies nowadays? I mean, just looking at some HDMI cable boxes makes me laugh sometimes (i.e. “High speed”, “1080P ready”, “Gold plated” and such). Internet providers displaying their speeds in megabits instead of megabytes. Apple showcasing a good old tablet pen, calling it an “innovation”. (I’ll be careful and not extrapolate on this.) And to be topical with recent news: (“recent”) Volkswagen. (No need to add more :P)

        At this point it seems like the customer is taken for a fool at every corner. Fanboy or not, I guess you have to be careful and seek the truth backed by facts and not by advertisement money.

        So again, with AMD, I think people have to admit that when you buy their chips, you buy subpar components. For budget builds I agree the price might be a valuable argument, but it’s subpar nonetheless.

      • Scali says:

        Fanboy or not, I guess you have to be careful and seek the truth backed by facts and not by advertisement money.

        That is what this blog is here for.
