GTX 10 series new miner help

Hello World.

I am totally new to Litecoin and mining. I happened to have a Linux machine (Ubuntu 16.04) with two GeForce GTX 10-series cards in it, for Hashcat purposes: a GTX 1070 and a 1080. Anyway, today I decided to start playing with Litecoin and set up a pool at . I am using cgminer 3.7.2 and don't know what to make of the results I am getting…

Anyway, I'm running cgminer from the terminal like so:

./cgminer --scrypt -o stratum+tcp:// -u {my.pool} -p {mypassword} -I 20 -w 512 --thread-concurrency=33792 --temp-cutoff 90

with the following results:

 [2017-09-10 00:15:05] Summary of per device statistics:
 [2017-09-10 00:15:05] GPU0                | (5s):324.2K (avg):336.7Kh/s | A:0 R:0 HW:0 WU:215.3/m                    
 [2017-09-10 00:15:05] GPU1                | (5s):396.2K (avg):406.0Kh/s | A:512 R:0 HW:0 WU:367.2/m 

GPU 0 = 1080
GPU 1 = 1070

Seems a bit low to me, or am I high? Also a little backwards; shouldn't I get better performance from the 1080? Anyway, being a total noob, I am assuming there is something I can do to get better performance. Please show me the way.

Seems about right for a 1080… (non-Ti, I assume?) Your drivers play a big role too… make sure you are using a ccminer built with the newest CUDA architecture (this one uses CUDA 8.0, which works way better than the 6.5 my 750 Ti came with).
This might give you a better rate:
However, your current speed is double a 750 Ti's… how much memory do your cards have? This plays a big factor in speed, as do your internet connection and the distance to the pool you are mining on… if the server is on the other side of the world, you'll get the work slower and send it back slower than if the server were running solo at your house, if that makes sense.
Hope this helps.
Just make sure that the miner's architecture matches the current driver you have installed… and use the most up-to-date ccminer (the successor to cudaminer).

I have to be honest, most of what you just said is Greek to me.

First, I am on Linux (Ubuntu 16.04), and the link you offered seems to be Windows-only.

On Linux I'm running cgminer 3.7.2 (I thought I heard it was the last cgminer version to offer GPU support).

What do you mean by how many GB of cache I have? RAM? RAM on each card? (Forgive me, I'm a total noob.)

Both my 1080 (NOT Ti) and 1070 are Founders Edition cards with custom water blocks… (FYI, I'm hovering under 40 degrees Celsius at those rates, running for hours.)

Find a version of ccminer for Ubuntu built with the CUDA 8.0 architecture, and then update your video card driver to match 8.0… if the miner you install uses CUDA 6.5, then install the 6.5 driver… architecture 5.0 = driver 5.0… etc… just make sure the two match… you are currently on a compute 6.5 setup, so you would want a version of ccminer that uses CUDA 6.5.

Here's a link on how to do it:

But install the driver you need first:

You should just about double your speed with this version of ccminer… you can also compile ccminer from source in Ubuntu: run configure, then make install, and you should be good…
Good luck… it's going to take some research and a lot of playing around until you get what you're doing, especially if you're not understanding the terminology.
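For Linux readers, the build-from-source route mentioned above looks roughly like this. This is a sketch only, assuming tpruvot's ccminer fork (the one that appears later in this thread) and an existing CUDA 8.0 install; package names are Ubuntu 16.04's.

```shell
# Sketch: build ccminer from source on Ubuntu 16.04.
# Assumes the CUDA 8.0 toolkit is already installed and nvcc is on the PATH.
sudo apt-get install -y build-essential automake libtool pkg-config \
    libcurl4-openssl-dev libssl-dev libjansson-dev
git clone https://github.com/tpruvot/ccminer.git
cd ccminer
./build.sh          # wrapper around autogen.sh/configure/make in this fork;
                    # otherwise run ./autogen.sh && ./configure && make
./ccminer --version # quick sanity check before pointing it at a pool
```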

Thanks… will look into all this… sounds “fun”

@bakd247 It was a struggle, but I finally got CUDA 8.0 running and my NVIDIA drivers working. However, I can't seem to make ccminer.

I get 100 errors and it terminates…

Sample error:
/usr/local/cuda/include/surface_functions.h(485): error: expected a ";"

Can you help? I am able to run Hashcat on my GPUs, and the CUDA deviceQuery sample sees all my cards… cgminer no longer works, but I assume that's because it doesn't support my new CUDA install.
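For what it's worth, nvcc header errors like the `surface_functions.h` one above are often a host-compiler mismatch rather than a bug in the miner itself (early CUDA 8.0 releases were picky about newer gcc versions). A hedged workaround, with illustrative version numbers, is to point nvcc at an older gcc:

```shell
# Assumption: the build is picking up a gcc this CUDA release dislikes.
# Install an older gcc and tell nvcc to use it as the host compiler.
sudo apt-get install -y gcc-5 g++-5
make clean
# Works if the build system honors an NVCC override; otherwise edit the
# Makefile to pass --compiler-bindir to nvcc directly:
./configure NVCC="nvcc --compiler-bindir=/usr/bin/gcc-5"
make
```

Check the CUDA release notes for the gcc versions your toolkit actually supports before pinning one.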

I rebooted, and as a sanity check I ran make again in ccminer, and it seems to be "making".

I would note to others that it's a good idea to have another device with internet access around, as you may find yourself needing Google if you get caught in a login loop with Ubuntu, or run into issues shutting down lightdm.

Did you install the 8.0 driver for your cards? It will connect and fail every time if your drivers don't match the miner's architecture… this was the one thing that took me a while to grasp when I started… you may just need to install the latest CUDA SDK from the terminal, which should have compute 8.0 included.

What driver would that be? I installed the latest NVIDIA driver on Ubuntu.

When I run ccminer pointed at my litecoinpool stratum+tcp:// URL, it shows my worker as active but 0.0 kH/s :frowning:

And the terminal output is just this, over and over:
[2017-09-10 20:00:43] x11 block 1275040
[2017-09-10 20:00:43] GPU #2: GeForce GTX 1070, 12727
[2017-09-10 20:00:43] GPU #1: GeForce GTX 1060 6GB, 8940
[2017-09-10 20:00:43] GPU #0: GeForce GTX 1080, 12757
[2017-09-10 20:00:57] x11 block 1275041
[2017-09-10 20:00:57] GPU #2: GeForce GTX 1070, 12689
[2017-09-10 20:00:57] GPU #0: GeForce GTX 1080, 12598
[2017-09-10 20:00:57] GPU #1: GeForce GTX 1060 6GB, 8911
[2017-09-10 20:00:59] x11 block 1275041
[2017-09-10 20:00:59] GPU #2: GeForce GTX 1070, 12521
[2017-09-10 20:00:59] GPU #1: GeForce GTX 1060 6GB, 8900
[2017-09-10 20:00:59] GPU #0: GeForce GTX 1080, 12933

$ ubuntu-drivers devices
== /sys/devices/pci0000:00/0000:00:01.0/0000:01:00.0 ==
vendor   : NVIDIA Corporation
modalias : pci:v000010DEd00001B80sv000010DEsd0000119Ebc03sc00i00
driver   : nvidia-375 - third-party free
driver   : nvidia-378 - third-party free
driver   : nvidia-384 - third-party free recommended
driver   : nvidia-381 - third-party free
driver   : xserver-xorg-video-nouveau - distro free builtin

== ==
driver   : intel-microcode - distro non-free

Turns out I was missing the algorithm (-a) option…
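For anyone following along, the fixed command line presumably looked something like this; the pool host, worker, and password are elided as elsewhere in the thread, and the placeholders are mine:

```shell
# Illustrative only -- substitute your own pool host, worker and password.
./ccminer -a scrypt \
    -o stratum+tcp://<pool-host>:<port> \
    -u <worker-name> \
    -p <password>
```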

[2017-09-10 22:08:29] GPU #0: GeForce GTX 1080, 625.75 kH/s
[2017-09-10 22:08:29] accepted: 199/199 (diff 0.018), 1965.27 kH/s yes!
[2017-09-10 22:08:34] GPU #1: GeForce GTX 1060 6GB, 562.16 kH/s
[2017-09-10 22:08:34] accepted: 200/200 (diff 0.009), 1965.05 kH/s yes!
[2017-09-10 22:08:35] GPU #2: GeForce GTX 1070, 780.21 kH/s
[2017-09-10 22:08:35] accepted: 201/201 (diff 0.020), 1965.03 kH/s yes!
[2017-09-10 22:08:38] accepted: 202/202 (diff 0.010), 1965.15 kH/s yes!
[2017-09-10 22:08:39] GPU #2: GeForce GTX 1070, 780.86 kH/s
[2017-09-10 22:08:39] accepted: 203/203 (diff 0.023), 1965.39 kH/s yes!
[2017-09-10 22:08:41] GPU #0: GeForce GTX 1080, 591.78 kH/s
[2017-09-10 22:08:41] accepted: 204/204 (diff 0.950), 1963.88 kH/s yes!
[2017-09-10 22:08:54] GPU #2: GeForce GTX 1070, 781.18 kH/s
[2017-09-10 22:08:54] accepted: 205/205 (diff 0.027), 1963.87 kH/s yes!
[2017-09-10 22:09:03] GPU #1: GeForce GTX 1060 6GB, 563.11 kH/s
[2017-09-10 22:09:03] accepted: 206/206 (diff 0.005), 1963.66 kH/s yes!

More like it… Looks like the 1070 is the best and the 1080 is just disappointing… I have heard people say they get 900 kH/s on 1080s, though…

I've heard not-so-good reviews of the 1080 for hashing power (the 1060 and 1070 Ti seem to blow it out of the water), but I am glad you got it working… I can't wait to see what the 1050 Ti does when I finally get one…
FYI, you might get better pay mining Dash or Monero with a GPU…
My 750 Ti does 170 H/s on Monero (a compute 5.0 miner is the only one I can find) and 2.5 MH/s on Dash (ccminer)… it's just finding the right pool that's the challenge.
The only reason I changed what I mine with my GPU is that I bought ASICs for Litecoin, though… otherwise I'd just be mining Litecoin with the GPU, because it doesn't take over a month to get a payout like Monero and Dash do.

Can you recommend a good litecoin pool?
Here's a list of most pools and their contributions to the network:

I like it myself, just because I don't run my miners 24/7, so I like to make sure I get paid for all my time, and the PPS pay system ensures that… plus they pay more than 100% because of merged mining…
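As a rough sketch of how PPS works (all figures below are made up, not this pool's actual numbers): every accepted share pays a fixed amount, block reward × share difficulty ÷ network difficulty, whether or not the pool finds a block:

```shell
# Illustrative pay-per-share arithmetic -- reward, share difficulty and
# network difficulty here are assumptions, not real pool figures.
reward=25.0       # block reward in LTC (assumed)
share_diff=0.02   # difficulty of one submitted share (assumed)
net_diff=250000   # network difficulty (assumed)
awk -v r="$reward" -v s="$share_diff" -v n="$net_diff" \
    'BEGIN { printf "%.8f LTC per share\n", r * s / n }'
```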

Guess I got somewhat lucky with my first pool pick :slight_smile:

I was just looking at other mining pools in the ccminer documentation and found MinerGate? What currency should one mine in a pool like that?

I quickly tested mining BTC there and saw slightly better hash rates on litecoinpool (1.2 kH/s vs. 1.9 kH/s), but the power consumption of my whole rig drops from 450 W to 250 W… (weird… guessing it has to do with the algorithm). FYI, when I went back to LTC, I suddenly noticed my GTX 1080 getting close to 700 kH/s for a bit… (interesting)

Litecoin takes less power to mine than Bitcoin does, so it is more efficient to mine… same with X11 coins like Dash. I mine Monero and it barely takes any power from my card, so I can still use my rig for other stuff and mine in the background; but when mining Bitcoin, Litecoin, or Ether I can't use my rig at all, because all the processing power is going to mining…

personally I didn’t like minergate because I like to see the shares being accepted…seemed to have a slightly higher hashrate with it though on dash anyway…because lack of transparency is a big issue with me…I will calculate every share to make sure they are paying me like they say they are

UGH!!! I'm stuck in a login loop… I tried to overclock my cards with the instructions here:

And now I'm caught in a login loop. I purged/uninstalled my NVIDIA drivers and tried starting from scratch, but I can't get past my login screen… if I SSH in, I can run ccminer, but my hashrate has dropped down to 500 H/s :frowning:
Can you help?

First, check that you're running the correct algorithm… those are X11 instructions, and Litecoin uses scrypt, not X11 (pretty sure that's how it goes, not exact on that fact).
I gotta admit I am also a little confused as to why you'd be able to log in over SSH and not as usual; your SSH setup might be your problem, but only if you don't have full admin rights applied over SSH.
Check your connection and make sure you're running full duplex when using SSH, maybe?
Or you can maybe try downloading the proxy from litecoinpool ( ) and running ccminer through it… this will give you some caching via the proxy…
Always check ./ccminer --help
You may need to disable long polling; since you're using SSH, ccminer might think you're trying to long-poll to the machine you're logging in from.

From the looks of those instructions, you may also try changing the amount of memory the cards are using… the settings seem like they would take a lot of playing with to get perfect.

I use the GPUTweak tool that came with my card and it works great for overclocking… it's not as tweakable as this would be, but it is an easy overclocking solution.

Make sure you have OpenSSL installed and updated on both machines… mainly the one you are connecting from. So you'd have SSH (Secure Shell) running on the machine with the miner and OpenSSL (Secure Sockets Layer) on the PC you're connecting from externally… pretty sure you need SSL installed on the host machine too.

One more thing to try in ccminer is --api-allow (IP address) or --api-bind (IP address), using the address of the machine you are connecting from externally.

Run ./ccminer --help to double-check.

Well, the real problem is that I am getting caught in a login loop on the Ubuntu login screen. I uninstalled my NVIDIA drivers and can now log in, but I can't seem to get them installed again, and my resolution is all messed up even though I'm not using my NVIDIA cards for display… I tried going through all the steps again, but no matter what, when I try to install the NVIDIA drivers it complains that the X server is running, even after calling lightdm stop or service lightdm stop.

So I reinstalled ubuntu-desktop and purged NVIDIA; this lets me log in normally, but with awful resolution. I reinstalled NVIDIA (via ubuntu-drivers autoinstall) and got caught in a login loop again, but I can SSH in. While SSHed in I am able to run Hashcat (which is what this machine was originally built for), and it sees all my cards and gives the expected rates. So that's good; at least I know my NVIDIA drivers and cards are working… I can't run ccminer, but that's because I have to reinstall CUDA… doing that now, and it's taking a while… I have googled the hell out of "login loop ubuntu 16.04" and tried everything… I had this problem before and the internet was a help… not this time… :frowning:
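For others hitting the same wall, the purge-and-reinstall cycle described above can be sketched as follows (Ubuntu 16.04 with lightdm assumed; exact package names vary by driver series):

```shell
# Sketch of the recovery cycle -- run from a TTY or over SSH, not inside X.
sudo systemctl stop lightdm        # stop the display manager so X is not running
sudo apt-get purge -y 'nvidia-*'   # remove the broken driver packages
sudo ubuntu-drivers autoinstall    # reinstall the recommended driver
sudo reboot
```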

OK, CUDA is installed, and from my SSH session I ran my script:

No protocol specified
Failed to connect to Mir: Failed to connect to server socket: No such file or directory
Unable to init server: Could not connect: Connection refused

ERROR: The control display is undefined; please run `nvidia-settings --help` for usage information.

No protocol specified
Failed to connect to Mir: Failed to connect to server socket: No such file or directory
Unable to init server: Could not connect: Connection refused

ERROR: The control display is undefined; please run `nvidia-settings --help` for usage information.

Power limit for GPU 00000000:01:00.0 was set to 100.00 W from 180.00 W.

Warning: persistence mode is disabled on this device. This settings will go back to default as soon as driver unloads (e.g. last application like nvidia-smi or cuda application terminates). Run with [--help | -h] switch to get more information on how to enable persistence mode.

All done.
*** ccminer 2.2.1 for nVidia GPUs by tpruvot@github ***
    Built with the nVidia CUDA Toolkit 8.0 64-bits

  Originally based on Christian Buchner and Christian H. project
  Include some algos from alexis78, djm34, sp, tsiv and klausT.

BTC donation address: 1AJdfCpLWPNoAMDfHF1wD5y8VgKSSTHxPo (tpruvot)

[2017-09-13 08:42:04] Using JSON-RPC 2.0
[2017-09-13 08:42:04] Starting on stratum+tcp://
[2017-09-13 08:42:04] 3 miner threads started, using 'cryptonight' algorithm.
[2017-09-13 08:42:06] Stratum difficulty set to 1063 (1.063)
[2017-09-13 08:42:07] GPU #1: GeForce GTX 1070, 8007 MB available, 15 SMX
[2017-09-13 08:42:07] GPU #1: 960 threads (9.875) with 60 blocks
[2017-09-13 08:42:07] GPU #0: GeForce GTX 1080, 7981 MB available, 20 SMX
[2017-09-13 08:42:07] GPU #0: 1280 threads (10.25) with 80 blocks
[2017-09-13 08:42:07] GPU #2: GeForce GTX 1070, 8007 MB available, 15 SMX
[2017-09-13 08:42:07] GPU #2: 960 threads (9.875) with 60 blocks
[2017-09-13 08:42:11] GPU #1: GeForce GTX 1070, 380.34 H/s
[2017-09-13 08:42:12] accepted: 1/1 (diff 1.604), 380.34 H/s yes!
[2017-09-13 08:42:12] accepted: 2/2 (diff 2.071), 380.34 H/s yes!
[2017-09-13 08:42:12] GPU #2: GeForce GTX 1070, 143.59 H/s
[2017-09-13 08:42:13] accepted: 3/3 (diff 5.416), 727.61 H/s yes!
[2017-09-13 08:42:13] accepted: 4/4 (diff 1.845), 727.61 H/s yes!
[2017-09-13 08:42:13] accepted: 5/5 (diff 2.587), 727.61 H/s yes!
[2017-09-13 08:42:13] GPU #0: GeForce GTX 1080, 172.27 H/s
[2017-09-13 08:42:13] accepted: 6/6 (diff 1.148), 899.89 H/s yes!
[2017-09-13 08:42:16] GPU #1: GeForce GTX 1070, 597.47 H/s
[2017-09-13 08:42:16] GPU #2: GeForce GTX 1070, 591.11 H/s
[2017-09-13 08:42:16] accepted: 7/7 (diff 1.606), 1691.42 H/s yes!
[2017-09-13 08:42:16] accepted: 8/8 (diff 1.290), 1691.42 H/s yes!
[2017-09-13 08:42:17] accepted: 9/9 (diff 1.728), 1691.42 H/s yes!
[2017-09-13 08:42:18] accepted: 10/10 (diff 3.287), 1691.42 H/s yes!
[2017-09-13 08:42:18] accepted: 11/11 (diff 1.178), 1691.42 H/s yes!
[2017-09-13 08:42:19] GPU #0: GeForce GTX 1080, 523.52 H/s
[2017-09-13 08:42:21] GPU #2: GeForce GTX 1070, 589.68 H/s
[2017-09-13 08:42:21] GPU #1: GeForce GTX 1070, 584.84 H/s
[2017-09-13 08:42:21] accepted: 12/12 (diff 2.204), 1693.22 H/s yes!
[2017-09-13 08:42:21] accepted: 13/13 (diff 1.318), 1693.22 H/s yes!
[2017-09-13 08:42:21] accepted: 14/14 (diff 1.354), 1693.22 H/s yes!
[2017-09-13 08:42:22] accepted: 15/15 (diff 2.176), 1693.22 H/s yes!
[2017-09-13 08:42:23] accepted: 16/16 (diff 9.434), 1693.22 H/s yes!

This is a MinerGate script (I know, but I was getting results).

As you can see, my rate is in H/s instead of kH/s; that's really bad…
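Side note on the persistence-mode warning in the log above: enabling persistence mode keeps settings such as the power limit applied while no CUDA application is running. Something like the following should address it (the GPU index and wattage are illustrative):

```shell
# Keep the driver loaded so power-limit settings persist between runs:
sudo nvidia-smi -pm 1
# Then re-apply the power limit if desired, e.g. for GPU 0:
sudo nvidia-smi -i 0 -pl 100
```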