Time for a quick look at the power usage of the GeForce GTX 1070, Nvidia's reference design with a manufacturer-set TDP of 151W, which makes it quite energy efficient for the performance it provides in crypto currency mining. We are checking the situation with the various algorithms apart from Ethereum, because we already know that the GTX 1070/1080 are far from a great choice for Ethereum mining under Windows; we are testing the other algorithms on Windows 7.
Looking at the results from the different algorithms, the Nvidia GeForce GTX 1070 Founders Edition seems to be about 30% slower on average (25%-33%) than its bigger brother, the Nvidia GeForce GTX 1080 Founders Edition, at stock settings. The power usage difference between the two cards is very similar to the performance difference in the tested algorithms. So the next interesting question is how well the GTX 1070 will overclock and how its overclocked performance will compare to both a stock and an overclocked GTX 1080. We will be posting our GTX 1070 overclocking results soon, so stay tuned.
While testing and comparing crypto mining performance, we noted that the comparison between the GTX 1070 and the GTX 980 Ti is actually the more interesting one, as you can see from the table with results. The GTX 1070 is slightly faster than the GTX 980 Ti, but much more efficient in terms of power usage. This gives a better idea of the evolution of performance and power usage between the previous and the current generation of Nvidia GPUs. The GTX 1070 is also a more attractive choice than the GTX 1080 for building multi-GPU mining rigs at the moment, though you might want to wait a bit for the non-reference designs to come out, as they will allow more serious overclocking and thus even better performance.
Time for a quick look at the power usage of the GeForce GTX 1080, Nvidia's reference design with a manufacturer-set TDP of 180W, which makes it quite energy efficient for the performance it provides in crypto currency mining. We are checking the situation with the various algorithms apart from Ethereum, because we already know that the GTX 1080 is far from a great choice for Ethereum mining.
The GTX 1080 Founders Edition has a power limiter that can raise the 180 Watt TDP limit by up to an extra 20%; moving the slider increases the maximum power available to the GPU. This can help you get some extra hashrate in the algorithms that max out the 180 Watt TDP, and as you can see, quite a few of them do. The non-reference versions of the GTX 1080 should come with higher TDP limits and even higher factory clocks, so they should be able to achieve even more performance by sustaining a higher boost frequency while mining. Then again, you might also want to lower the power limit in order to reduce power usage and get better hashrate per Watt, though for best results you may also have to play with the voltage and clocks of the GPU.
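The power-limit arithmetic described above can be sketched as follows; the 180 W TDP and the +20% slider range come from the article, while the helper name itself is just illustrative:

```python
# Sketch: effect of the power limit slider on the GTX 1080 FE board power.
# 180 W TDP and the +/-20% range are from the article; the function name
# is our own, not any driver/tool API.

def power_limit(tdp_watts: float, slider_percent: float) -> float:
    """Return the maximum board power for a given slider setting."""
    return tdp_watts * (1 + slider_percent / 100)

print(power_limit(180, 20))   # 216.0 W at the +20% maximum
print(power_limit(180, -20))  # 144.0 W if you lower the limit to save power
```

In practice you would move this slider in a tool such as MSI Afterburner rather than compute it yourself; the point is simply that the slider scales the 180 W baseline.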
In the table above you can see that NeoScrypt and Lyra2RE are far from maxing out the TDP of the GTX 1080; the reason is that they do not perform that well in terms of hashrate, so some optimizations might bring better performance there. Other than that, the power usage is high, but the hashrate you get is seriously increased compared to the previous generation of Nvidia GPUs, so performance per Watt on the GTX 1080 is actually great in most crypto mining algorithms.
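The performance-per-Watt comparison above boils down to a simple ratio. A minimal sketch, using made-up placeholder numbers rather than the measured values from the table:

```python
# Sketch: comparing mining efficiency as hashrate per Watt.
# The figures below are hypothetical placeholders, NOT measurements
# from the article's results table.

def hashrate_per_watt(hashrate: float, power_watts: float) -> float:
    return hashrate / power_watts

# Hypothetical: a new-gen card at 30 MH/s / 180 W vs. an older card
# at 25 MH/s / 250 W.
new_gen = hashrate_per_watt(30.0, 180.0)
old_gen = hashrate_per_watt(25.0, 250.0)
print(new_gen > old_gen)  # True: better efficiency even at similar hashrate
```

This is the metric to watch when electricity cost matters more than raw speed.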
We’ve read some comments from people complaining about the stock ZeusMiner Blizzard Scrypt ASICs, calling them crappy and saying they get too hot. From personal experience we can confirm that the 60W power adapters provided with the miners do get hot, but this is to be expected with a device power draw of about 44W at 300 MHz and 48W at 328 MHz. The unit we are currently testing is hot to the touch, but judging by touch alone can be misleading, as anything above body temperature is perceived as hot. Anyway, we decided to try running the Blizzard miner off a high-quality ATX power supply and measure the actual power draw we are getting…
We attached the ZeusMiner Blizzard to an 80Plus Platinum power supply and the result was a bit surprising: very low efficiency of the power supply due to the low load. Clearly the 1200W Corsair power supply is not designed to be efficient at a load of just about 50W (48-49W measured as drawn by the miner), so not much different from what the miner draws off its standard adapter. The difference, however, is that due to the low efficiency the PSU is running at, the miner's actual consumption at the power socket is about 64W. With the stock power adapter we measured 48W at the power socket, so it seems these 60W adapters are quite efficient at converting 220V mains power to 12V. Of course, connecting more miners to the power supply and raising the load to at least 10% or more should bring the efficiency back up, but there is no point in running just a single Blizzard off an ATX computer power supply; better stick to the power adapter supplied with the miner.
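The efficiency gap described above is easy to quantify from the article's own numbers: DC power delivered to the miner divided by power measured at the wall socket. A small sketch (the stock-adapter DC figure is our assumption, taken from the miner's ~44W draw at 300 MHz):

```python
# Sketch: PSU conversion efficiency = DC power delivered / wall-socket draw.
# 49 W delivered and 64 W at the socket are the article's figures for the
# 1200 W ATX PSU at very low load; pairing 44 W DC with the 48 W socket
# reading for the stock adapter is our assumption, not a stated measurement.

def psu_efficiency(dc_watts: float, wall_watts: float) -> float:
    return dc_watts / wall_watts

print(round(psu_efficiency(49, 64) * 100, 1))  # ~76.6% for the big ATX PSU at ~4% load
print(round(psu_efficiency(44, 48) * 100, 1))  # ~91.7% for the stock 60 W adapter
```

At roughly 50W the 1200W unit sits near 4% load, far below the 20% minimum load point at which 80Plus efficiency ratings are even specified, which is why the numbers look so poor.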