If you’ve used, or are considering, AWS/Azure/GCloud for Machine Learning, you know how crazy expensive GPU time is. And turning machines on and off is a major disruption to your workflow. There’s a better way. Just build your own Deep Learning Computer. It’s 10x cheaper and also easier to use. Let’s take a closer look below.
Whoa—building your own is actually cheaper? I haven’t built my own computer since…college? The fascinating part of the article is why this is true:
There’s a reason why datacenters are expensive: they are not using the Geforce 1080 Ti. Nvidia contractually prohibits the use of GeForce and Titan cards in datacenters. So Amazon and other providers have to use the $8,500 datacenter version of the GPUs, and they have to charge a lot for renting it. This is customer segmentation at its finest, folks!
Huh. I’d be a little surprised—assuming this is in fact true—if this situation continued for long. Even if it actually does make economic sense to build (vs rent) for the moment, my guess is that won’t be true in 2 years.
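The build-vs-rent argument comes down to simple break-even arithmetic. A minimal sketch of that calculation, using hypothetical placeholder figures (the article doesn't give these exact numbers; plug in current prices and your own usage):

```python
# Rough build-vs-rent break-even sketch. All figures below are
# hypothetical assumptions, not from the article.

BUILD_COST = 3000.0    # one-time cost of a GeForce-based workstation (assumed)
CLOUD_RATE = 3.0       # $/hour for a comparable cloud GPU instance (assumed)
HOURS_PER_MONTH = 200  # GPU-hours of training per month (assumed)

# Months of cloud rental that would equal the one-time build cost
break_even_months = BUILD_COST / (CLOUD_RATE * HOURS_PER_MONTH)
print(f"Break-even after {break_even_months:.1f} months")  # → 5.0 at these rates
```

The point of the sketch is that the answer is dominated by utilization: at heavy, sustained GPU usage the build pays for itself quickly, while occasional use tilts the math back toward renting.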