Unpacking the GTX 1660 And 1660 Ti Release For Salad
If you follow gaming or tech news, you’re probably well aware that Team Green made big headlines recently with the NVIDIA GeForce® GTX 1660 Ti and GTX 1660, collectively known as the 16 series GPUs. If you haven’t checked out their specs yet, take a look at the numbers:
“Looking at their base models, both of these cards fit squarely in the under-$300 MSRP class, a spot that used to belong only to AMD’s Radeon RX 500 cards and Nvidia’s GeForce GTX 1050- and 1060-class GPUs. Now, with the release of the GeForce GTX 1660 and GTX 1660 Ti, “Turing”-based GPUs are established in that price range. It’s high time to see how the GeForce GTX 1660 fares against its most similarly priced (and best outfitted) AMD equivalent, the Radeon RX 590.”
Stobing further asserts that with these two cards NVIDIA “aims for the market’s heart,” a sentiment I wholly agree with. The RTX models had all the bells and whistles, ray tracing and all that jazz, but they were anything but affordable. The RTX 2080 Ti retails for roughly $1,200; for context, you can build an entire gaming PC for less than $1,000.
If you’re concerned that the lower price means a corresponding drop in performance, I wouldn’t worry too much. If you already have a GTX 1070 or better, you’re not really the target market for these cards anyway, but if you’re one of the many PC gamers still running a GTX 1050 Ti or GTX 1060, the performance gains are substantial. Just check out these side-by-side comparisons:
While this is undoubtedly great news for PC gamers who are ballin’ on a budget, the folks at Salad Technologies and I have to wonder: what does the 16 series mean for crypto mining and GPU machine learning? Luckily, a couple of intrepid writers did a lot of the thinking for us, and we’d like to share their work with our readers.
[Image: “Machine Learning & Artificial Intelligence” by Mike MacKenzie]
Are 16 Series GPUs Good For Machine Learning And Mining?
Tim Dettmers, whose tagline reads “making deep learning accessible” (our kinda guy), has a long-running article on which consumer-grade GPUs are best for deep learning (aka machine learning, AI algorithm training, etc.). You can find that piece on his blog, here:
A recent update added the 1660 Ti as a viable card for deep learning, if a limited one. Naturally, we’re pretty stoked about this, but how does the 16 series fare for mining? Again, somebody did a lot of legwork compiling the numbers, but this time on r/gpumining.
The conclusion? The 1660 Ti is an efficient card that runs cool while mining, but cost is a concern: it takes a while to earn back the ~$275 you drop on the card. Of course, that assumes you only intend to use the card to mine; much of the card’s value comes from its gaming performance.
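To make that payback intuition concrete, here’s a minimal back-of-envelope sketch. The formula is just standard break-even arithmetic; the specific figures (daily revenue, wattage, electricity price) are illustrative assumptions, not measured 1660 Ti numbers — actual earnings swing with coin prices and network difficulty.

```python
# Rough payback-period sketch for a GPU bought purely for mining.
# All numeric inputs below are hypothetical, for illustration only.

def payback_days(card_cost, daily_revenue, watts, electricity_per_kwh):
    """Days needed to recoup card_cost from mining, net of electricity."""
    daily_power_cost = (watts / 1000) * 24 * electricity_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # the card never pays for itself at these rates
    return card_cost / daily_profit

# Assumed example: a ~$275 card grossing $0.70/day,
# drawing 120 W, with electricity at $0.10/kWh.
days = payback_days(275, 0.70, 120, 0.10)
print(f"Estimated payback: {days:.0f} days")
```

Under those assumed numbers the card takes well over a year to break even on mining alone, which is why the gaming side of its value matters so much here.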
In conclusion, the 16 series is looking like great news for Salad, and we’re tentatively giddy about its possibilities. These cards satisfy and bolster our three main areas of interest:
- Machine Learning
- Crypto Mining
- Gaming
Not only are the specs sufficient to entice consumers new and old, but the price points virtually guarantee a significant expansion for each market. Each additional PC that can chop Salad is a little victory for us. What do you all think? Are you going to get a 1660 Ti, or are you saving up for that 2080 Ti in the Salad reward carousel? Let us know; we’d love to hear from you!