All About BTC, LTC, ETH mining as well as other alternative crypto currencies
Time for a quick look at the power usage of the GeForce GTX 1080, the reference design from Nvidia with a TDP of 180W set by the manufacturer, making it pretty energy efficient for the performance it provides for crypto currency mining. We are checking the situation with the different algorithms apart from Ethereum, because we already know that the GTX 1080 is far from a great choice for Ethereum mining.
The 180 Watt TDP limit of the GTX 1080 Founders Edition comes with a power limiter that can give you up to an extra 20% increase; moving the slider raises the maximum power available to the GPU. This can help you get some extra hashrate on the algorithms that max out the 180 Watt TDP of the GPU, and as you can see these are actually quite a few. The non-reference versions of the GTX 1080 should come with higher TDP limits and even higher factory clocks, so they should be able to achieve even more performance, as they will be able to keep a higher boost frequency of the GPU while mining. Then again, you might also want to lower the power limit in order to reduce the power usage and get an even better hashrate per Watt of power used, though for best results you might also have to play with the voltage and clocks of the GPU.
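As a rough sketch of how the +20% limit works out in practice (a hypothetical example, assuming Linux with the `nvidia-smi` tool available; the GPU index 0 is a placeholder for your card):

```shell
# Compute the maximum power limit with the slider at +20% over the 180 W TDP.
TDP=180
MAX=$((TDP * 120 / 100))   # 216 W
# The command that would apply it (printed here instead of executed, since it
# needs root privileges and an actual GTX 1080 at GPU index 0):
echo "nvidia-smi -i 0 -pl ${MAX}"
```

Lowering that same value below 180 W is how you would trade a bit of hashrate for better hashrate per Watt, as mentioned above.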
In the table above you can see that Neoscrypt and Lyra2RE are far from maxing out the TDP of the GTX 1080; the reason is that they do not perform that well in terms of hashrate, so some optimizations for them might help give you better performance. Other than that, the power usage is high, but the hashrate you get is seriously increased compared to the previous generation of Nvidia GPUs, so the performance per Watt of the GTX 1080 is actually great in most algorithms for crypto mining.
We continue our series of tests of the new Nvidia GeForce GTX 1080 Founders Edition (reference design) for crypto currency mining; yesterday we checked the situation for mining Ethereum with the new Pascal GPUs and saw the not so great results. Ethereum and the Dagger-Hashimoto algorithm it uses still do better on AMD GPUs, but Ethereum mining is set to end at some point with the altcoin switching to PoS only, and the forks that use the same algorithm are not yet providing a real alternative. So with the growing difficulty and the switch to PoS probably less than a year away, you should also be interested in how the GTX 1080 performs in other new GPU-mineable algorithms as well as in the more popular ones. This is the reason we are now going to compare the GeForce GTX 1080 Founders Edition with the GTX 980 Ti and GTX 970 to see the difference in terms of performance in other algorithms as well. As you will see, the GTX 1080 does provide a nice performance boost compared to the two older-generation alternatives, but with its currently high price even the power savings do not make it a great option for building multi-GPU mining rigs for the moment. The more interestingly priced GTX 1070 will most likely be the better choice for building 6-GPU mining rigs, succeeding the GTX 970 as the best Nvidia-based GPU for crypto mining, though we are yet to test the 1070 in order to confirm that.
The ccminer forks we have used for testing:
– Blakecoin (Blake256-8rounds), Decred (Blake256-14rounds), Vcash (Blake256-8rounds) ccminer
– Lyra2REv2 ccminer
– X11evo ccminer
– All other algorithms
Tests were done under Windows 7 with already existing ccMiner releases that are not compiled with CUDA 8.0 or with support for the Compute 6.1 capability that the GTX 1080 uses, though without optimizations specific to the GTX 1080 or the Pascal architecture in general we are probably not going to see much of a difference anyway. We do have additional testing planned to compare the performance of existing ccMiner forks compiled with CUDA 8.0 and Compute 6.1 against the results achieved here with the CUDA 6.5/7.5 and Compute 5.2 releases.
We had an almost problem-free experience testing the above algorithms on the GTX 1080, though there are some things that we need to note. For example, the default Nist5 intensity (21?) was crashing the miner, so we tested with 20, where it was working fine. The very low Neoscrypt performance on the GTX 1080 is probably because the algorithm needs special optimizations to take advantage of the GPU. The Lyra2RE performance was not that much faster on the GTX 1080 compared to the GTX 980 Ti, so here some optimizations will most likely also result in increased performance. Note that we have also tested X11, though now that it has moved to the ASIC phase you probably won't want to GPU mine it anyway.
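For reference, the intensity workaround is just a matter of passing `-i 20` to ccMiner; the command below is a sketch only, and the pool URL, wallet and password are placeholders, not the ones we used:

```shell
# Sketch of a Nist5 ccminer launch with intensity lowered to 20 to avoid the
# crash at the default intensity. Printed instead of executed, since it needs
# a GPU and a real pool; pool/wallet/password are placeholders.
cmd="ccminer -a nist5 -i 20 -o stratum+tcp://pool.example.com:3333 -u YOUR_WALLET -p x"
echo "$cmd"
```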
What is clearly seen from the comparison above is that the new GTX 1080 performs about twice as fast (on average) as a single GTX 970; note that we are using a Gigabyte WindForce GTX 970 OC card compared to a stock GTX 980 Ti and GTX 1080. The GTX 1080 uses about the same power as a single GTX 970 while offering twice the performance, but the real issue here is the price at the moment. If the GTX 1080 was about twice the price of the GTX 970 it might have been an interesting option for miners that are currently using Nvidia GTX 970 mining rigs, but it is unfortunately more like 2.5 times and not two times. Comparing the GTX 1080 to the GTX 980 Ti shows an average performance advantage of almost 50% in favor of the GTX 1080, with not that great a price difference between the two cards (the GTX 1080 is about 30% more expensive). So there is definitely no point in going for a GTX 980 Ti instead of a GTX 1080 for mining, though neither card is the best choice for multi-GPU mining rigs. Then again, if you are buying just a single GPU for gaming and want to also mine with it when not gaming, the GTX 1080 might be much more interesting than it is for use in multi-GPU mining rigs.
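To put those price/performance numbers together (using only the rough ratios from the text, not exact market prices):

```shell
# Illustrative hashrate-per-unit-of-price ratios from the rough numbers above:
# GTX 1080 is ~2x a GTX 970 at ~2.5x the price, and ~1.5x a GTX 980 Ti at
# ~1.3x the price.
r970=$(awk 'BEGIN { printf "%.2f", 2.0 / 2.5 }')
rti=$(awk 'BEGIN { printf "%.2f", 1.5 / 1.3 }')
echo "GTX 1080 vs GTX 970:    ${r970}x the hashrate per unit of price"
echo "GTX 1080 vs GTX 980 Ti: ${rti}x the hashrate per unit of price"
```

So against the GTX 970 the 1080 is currently worse value per purchase price, while against the GTX 980 Ti it comes out ahead, which matches the conclusion above.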
The new Nvidia GeForce GTX 1080 (Pascal-based) video cards have been available for about two weeks now and we have finally managed to get one GTX 1080 to play around with and see how well it performs for crypto currency mining. We are starting with Ethereum as the currently most popular altcoin for GPU mining, and unfortunately the GTX 1080 does not do great for ETH mining. You should already know that Ethereum mines better on AMD GPUs than on Nvidia ones, and the new Pascal GPUs such as the GTX 1080 don't do great either; there are also some issues with them on Windows for the moment. The GTX 1080 cards use GDDR5X video memory that is faster in terms of clock speed, which might do great for gaming, but apparently it does not do great for memory-intensive algorithms such as Ethereum. In fact it seems that because of the GDDR5X the GTX 1080 is slower than the GTX 1070, which uses regular GDDR5 video memory, and when you add in the high price of the 1080 it is most definitely not a good choice for Ethereum mining like it might be for gaming.
We have compiled a Windows binary of the latest pre-release of Genoil's ethminer 0.9.41 fork version 1.1.3 (source) with CUDA 8.0 and Compute 6.1, which is used by the new GTX 1080 and GTX 1070, to test with, and you can find a download link below. So let us get to the hashrates you can expect from the GTX 1080 when mining under Windows and then under Linux. If you are using Windows 7 or 8.x you will notice that with the default settings the miner will crash when trying to load the DAG file into the video memory of the GTX 1080, regardless of whether you are using OpenCL or CUDA mode. Other OpenCL-only miners such as qtminer will also fail with a driver crash; this is a driver issue, and even if you manage not to crash the driver you will get disappointingly low performance. You can run the Genoil CUDA fork of ethminer in CUDA mode with the -U option and add the parameters --cuda-grid-size 2048 --cuda-block-size 128 to prevent the driver crash, however you will be getting less than 1 MHS in terms of hashrate, so it is pointless.
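Put together, the workaround command line would look like this (a sketch only; it merely avoids the driver crash and does not give usable speed, and how you point the miner at a pool or local node is up to your setup):

```shell
# Genoil ethminer invocation in CUDA mode (-U) with the crash-workaround
# parameters; printed instead of executed, since it needs a GTX 1080.
cmd="ethminer -U --cuda-grid-size 2048 --cuda-block-size 128"
echo "$cmd"
```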
If you move to Windows 10 the situation is slightly better, but not by much. With the latest 368.39 video drivers for Windows 10 you will be able to mine Ethereum, unlike on Windows 7/8.x, but the hashrate you will get is still going to be disappointingly low at just about 4-5 MHS. Again a driver issue, however there is talk about an upcoming driver update that should fix the low hashrate problem, at least for Windows 10, expected sometime next month (we cannot confirm this however).
So the only thing left to do if you already have one or more GTX 1080 GPUs and want to mine Ethereum with them is to go for Linux. Under Linux people are reporting about 23 MHS on average as the hashrate for mining Ethereum on the GTX 1080, a speed that is a bit higher than what you can get from a GTX 970, GTX 980 or GTX 980 Ti, but still a bit disappointing compared to what you can get from high-end AMD GPUs. The GTX 1070, which we already mentioned is doing better for Ethereum, should be capable of around 27 MHS under Linux (in Windows it apparently has the same low-performance issues for the moment), though we have not yet been able to personally verify that. So even with the low power consumption these hashrates from the GTX 1080/1070 are not that great, and when you add in the high price of the GPUs at the moment and the driver issues on Windows, you can pretty much forget about being happy mining Ethereum with these video cards. They should be better at other altcoin algorithms that are not memory-intensive like Ethereum, and we are off to check that next, so stay tuned for more results.