Meet Big Sur, the astonishing server that can hold up to 8 GPUs.
[insert hyperbolic praises here]
So, after all the hype and marketing bullshit, I still don’t get it. What’s wrong with the SuperMicro 4028GR? Or, if you want to go back in time, with the 4027GR? How are they “less versatile”, really? It’s just a server you put in a rack!
The Big Sur can only take Tesla or Quadro cards. Why? Because that stupid power connector purposefully sits on the side of GTX cards, so the server’s regular top cover (or side panel) gets in the way.
The bottom-mounted power socket is reserved for Tesla and Quadro cards, which you can buy for four times the price. Nice move, huh?
The SuperMicro system, on the other hand, says fuck all that shit: put a modified cover on top and make them fit:
This server costs around 4,000 USD. With that dumb-looking cover you can put 8 GTX Titan Xs in there for another 8,000 USD. That’s 12,000 USD in total (excluding CPU, storage, and RAM, which are the same in both cases).
So let’s say you buy the cheapest Tesla cards for the Big Sur, at around 3,000 USD each; that’s 24,000 USD. Facebook didn’t announce pricing, but even if the Big Sur chassis costs you 0 (zero) dollars, the build is still twice as pricey as the SuperMicro solution.
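The back-of-the-envelope math above can be sketched in a few lines (prices are the rough USD figures from this post; `build_cost` is just an illustrative helper, not anyone’s official pricing tool):

```python
# Cost comparison sketch: chassis + 8 GPUs, prices in USD.
# CPU, storage, and RAM are excluded since they are the same in both builds.

def build_cost(chassis: int, gpu_price: int, gpus: int = 8) -> int:
    """Total cost of a chassis plus its GPUs."""
    return chassis + gpu_price * gpus

# SuperMicro 4028GR (~4,000 USD) with 8x GTX Titan X (~1,000 USD each)
supermicro = build_cost(chassis=4_000, gpu_price=1_000)

# Big Sur with 8x of the cheapest Tesla cards (~3,000 USD each),
# optimistically assuming the chassis itself is free
big_sur = build_cost(chassis=0, gpu_price=3_000)

print(supermicro)  # 12000
print(big_sur)     # 24000
```

Even with the Big Sur chassis priced at zero, the GPU bill alone doubles the total.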
I’m sorry, I’m not convinced.