There are probably not many people, if any, out there mining crypto coins on the latest and fastest Nvidia GeForce GTX Titan X GPUs, but that does not mean that video cards based on this GPU are not good for mining. The GTX Titan X comes with 12 GB of video memory, something that is not very useful for crypto mining, but it also comes with 50% more CUDA cores compared to the GTX 980. So you might expect up to about a 50% performance increase in terms of hashrate, but that increase comes at twice or more the price of a single GTX 980. It seems that the GTX Titan X is not a very price/performance-efficient choice, and it might be better to go for two GTX 980s instead, not only for mining, but for gaming as well.
Nevertheless, we still ran some benchmarks to see what the actual hashrate difference is between a GTX 980 and a GTX Titan X GPU using the latest ccMiner 1.5.45 SP-MOD. Do note that the Titan X is still a Compute 5.2 GPU, just like the GTX 980, so it does not bring a new evolution in terms of compute capabilities, just more raw power. As you can see from the table with the results, we are seeing between 30% and almost 50% higher hashrates from the GTX Titan X as compared to what the GTX 980 currently offers. As we have already mentioned, given the price difference compared to the actual performance you get, the GTX 980 is the better choice for mining, and you can go for two of these instead of a single Titan X.
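To put the price/performance argument in more concrete terms, here is a quick back-of-the-envelope check in Python. All of the figures are illustrative assumptions rather than measured data: launch MSRPs of roughly $549 and $999, the 592 KHS Scrypt rate for the GTX 980 quoted in the comments below, and a hypothetical +40% for the Titan X (the middle of the 30-50% range above):

```python
# Back-of-the-envelope price/performance check. All figures are
# illustrative assumptions: ~$549 launch MSRP for the GTX 980, ~$999 for
# the Titan X, 592 KHS Scrypt on the 980, and a hypothetical +40% for
# the Titan X (the middle of the 30-50% range measured above).
cards = {
    "GTX 980": {"price_usd": 549, "scrypt_khs": 592},
    "Titan X": {"price_usd": 999, "scrypt_khs": 592 * 1.40},
}

for name, c in cards.items():
    print(f"{name}: {c['scrypt_khs'] / c['price_usd']:.2f} KHS per USD")

# Two GTX 980s at roughly the same total cost as one Titan X would give
# about 2 x 592 = 1184 KHS versus ~829 KHS, which is the article's point.
```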
7 Responses to Nvidia GeForce GTX 980 vs GTX Titan X Mining Performance
Steven
April 22nd, 2015 at 14:06
When you’re doing these GPU performance tests against various crypto algorithms, is there any chance you could also test them against oclvanitygen?
Scrypter
April 23rd, 2015 at 18:05
What I find interesting about this benchmark is that modern gfx cards can now compete with dedicated Scrypt miners like the Gridseed (350-500 KHS) and ZeusMiner Blizzard (1300-1400 KHS).
The GTX980 mines Scrypt (592 KHS) faster than a Gridseed silver or gold unit, and Lyra2 almost as fast as a ZeusMiner Blizzard unit.
For me this means that it has now become profitable again to mine altcoins with pc graphics hardware. :)
Gridseed and Zeusminer are now outdated in 2015.
Steven
April 24th, 2015 at 02:56
Scrypter: They may be faster, but they’re still nowhere near as efficient – something you’d have to take into account when throwing around words like ‘profitable’ :)
Scrypter
April 27th, 2015 at 13:22
@Steven: I assume that by your words “nowhere near as efficient” you mean that the GTX980 takes more electricity and thus the overall cost is higher than mining with overclocked Gridseed 500 KHS and Zeusminer 1400 KHS units. Right? ;)
Well, all I can tell you is that I was mining with Gridseed and Zeusminer Blizzard units for the whole of 2014. I am now mining with a Zeusminer Lightning X6 (42000 KHS) and have taken my Gridseeds and Zeusminer Blizzards offline. This year, 2015, I got my electricity bill for the previous year and it is pretty high!
Whether a single GTX980 eats more power and electricity than a Gridseed or Blizzard unit I can't really say, because I haven't tested that gfx card yet… but if you say it's less efficient, then it must be so. ;)
hashaholic
May 7th, 2015 at 19:12
Scrypter, your electric bill would have been MUCH higher if you were GPU mining. I had 5 Blizzards last year; the wattage at the wall was 250 watts for 6.6-7 MHS. Blizzards are about 26 KHS per watt, versus this card (GTX 980), which has a TDP of 165 watts and gets 3.59 KHS per watt.
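Hashaholic's arithmetic checks out; here is a minimal sketch of the calculation, using only the wattage and hashrate figures from the comment above:

```python
# The efficiency arithmetic from the comment above: 5 Blizzards pulling
# ~250 W at the wall for ~6.6 MHS, versus a GTX 980 at its 165 W TDP
# doing 592 KHS Scrypt.
blizzard_khs_per_watt = 6600 / 250  # ~26.4 KHS per watt
gtx980_khs_per_watt = 592 / 165     # ~3.59 KHS per watt

print(f"Blizzards: {blizzard_khs_per_watt:.1f} KHS/W")
print(f"GTX 980:   {gtx980_khs_per_watt:.2f} KHS/W")
print(f"Ratio:     {blizzard_khs_per_watt / gtx980_khs_per_watt:.1f}x")
```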
Lex
March 23rd, 2016 at 05:17
Something to note. I don't know anyone that has really touched on this before, and they should look at it. It's not just about the technology of said cards; in all due honesty, the amount of memory you have on these cards isn't even close to being fully utilized. ccMiner, while it's good, still only handles 4 GB of memory with ease; anything more and I've noticed the hashrate actually drops (which is bad). The efficiency of this miner needs to be shored up: not so much the algorithms and proof of work, just how the hardware threads and memory are used on the cards themselves.
The difference between the 980 and the Titan X really comes down to the clocking of the cards. Scrypt and its associated forks depend on memory bandwidth more than anything, although having more memory at a high bandwidth isn't necessarily better at this point in time, because the miner isn't optimized for anything past 4 GB on each card. While the miner app can handle more, it's not so great with the threading versus the extra memory present. (It gets sloppy!)
For example, if you had a card with the exact same technology as the 980 and Titan X, twice the RAM bandwidth with decent CAS timing on said memory, and only 4 GB of RAM, it would kick the crap out of the 980 and Titan X in these mining tests.
Note:
Having extra CUDA cores doesn't account for as much of an increase in processing speed as the overclocking of said cores does. Scrypt and the forks thereof are more of a memory bandwidth based process, as I've said before. It is unlike SHA-based coins, which use the cores more than they do the memory; with those, the number of cores and the clocking of said cores really make a difference.
Two other things that will skew your tests are whether you're using Stratum, let alone a miner that has Extra Nonce Subscribe (and is connected to a server that uses it), or whether you're using the old Getwork system with Long-Poll enabled. You should grab all that data and post it, too. That will give people a better idea of where the problem is and what can be done about it.
Also, not all pools use Slush's reference design of the Stratum protocol; some use a modified version that is compatible but not 100%. That can have an effect on your mining numbers too, so make sure you're mining from the same pool and have the same ping timing (within a +/- 8 ms difference is fine).
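To make the Extra Nonce Subscribe point concrete, here is a minimal sketch that probes a Stratum pool for the mining.extranonce.subscribe extension. The host and port are placeholder values and error handling is left out; a pool that supports the extension replies with a true result:

```python
# Minimal sketch: probe a Stratum pool for the mining.extranonce.subscribe
# extension. Host and port are placeholders, not a real pool; no error
# handling or reconnection logic, just the raw protocol exchange.
import json
import socket

HOST, PORT = "stratum.pool.example.com", 3333  # hypothetical pool

def stratum_call(f, msg_id, method, params):
    # Stratum is newline-delimited JSON-RPC over a plain TCP socket.
    f.write(json.dumps({"id": msg_id, "method": method, "params": params}) + "\n")
    f.flush()
    # Skip asynchronous notifications (mining.set_difficulty, mining.notify)
    # until the reply whose "id" matches ours comes back.
    while True:
        reply = json.loads(f.readline())
        if reply.get("id") == msg_id:
            return reply

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    f = sock.makefile("rw")
    print(stratum_call(f, 1, "mining.subscribe", []))
    # A pool that honors the extension answers with "result": true;
    # others reply false or with an error object.
    print(stratum_call(f, 2, "mining.extranonce.subscribe", []))
```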