
Topic: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.11.0 - page 346

legendary
Activity: 3583
Merit: 1094
Think for yourself
I have an issue with efficiency...

U:9.02/m and U:5.02/m: can I ignore this, or should I worry?

thx

U: or Utility is how many shares per minute.  Why would you ignore or worry about it?  It's good information.
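As a rough rule of thumb (standard difficulty-1 share arithmetic, not something stated in this thread): expected utility is about hashrate × 60 / 2^32 shares per minute, so a U of 9.02/m corresponds to roughly 9.02 × 2^32 / 60 ≈ 646 MH/s of effective hashrate on difficulty-1 shares.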
legendary
Activity: 3583
Merit: 1094
Think for yourself
Kano,
I thought ASICs were purpose-built for Bitcoin mining.  In the thread linked below, someone says people are mining altcoins with ASICs.  I was wondering if you had any insight on this.  Can CGMiner mine altcoins with ASICs?
Thanks,
Sam

https://bitcointalksearch.org/topic/m.2086549
cgminer can mine any double-SHA256 coin with the ASICs it supports (i.e. not scrypt coins like Litecoin), and Avalons have already been used to terrorise some of those altcoins.

OK, I had no idea.  Thanks for the verification.
Sam
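For anyone wanting to try the same thing, pointing cgminer at a SHA256d altcoin pool takes nothing more than the pool's URL and credentials. A minimal config sketch (the pool URL and worker name are placeholders, not real endpoints):

"pools" : [
    {
        "url" : "stratum+tcp://sha256d-altcoin.example.com:3333",
        "user" : "worker.1",
        "pass" : "x"
    }
]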
legendary
Activity: 1792
Merit: 1008
/dev/null
I'm totally for keeping CGMiner as simple as possible. I actually wonder if one day you will drop GPU support in favor of strictly ASIC support. However, the higher performance of native CUDA seems to be pushing the cards into the realm of slightly profitable. A 580 seems to push 280 kH/s, which is about $1.25/day profit.

I'm curious which miner they were able to get 280 out of, as I'm one of those poor bastards trying to make just 1 BTC on my 670 at a profit of ~$0.30-$0.50 a day.
I can confirm this since I have a 580 too (slightly OC'd).
If you're interested in which miner it is, it's called cudaminer.
Also, with cudaminer you only use RAM on your GPU; you don't need a lot of system RAM to mine. My cudaminer uses 35 kB of RAM and allocates 1252 MB on my GPU (this depends on cudaminer's launch config).
newbie
Activity: 13
Merit: 0
Seems I'm lucky when it comes to problems.
I got the message that the stratum connection was interrupted. Fine... but why did cgminer just sit there and not switch to another pool (I have 2 pools in the conf file)?
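For reference, a minimal two-pool section looks something like the sketch below (URLs and worker names are placeholders); cgminer's default strategy is failover, so with two live pools it should fall back to the second when the first connection dies:

"pools" : [
    {
        "url" : "stratum+tcp://primary-pool.example.com:3333",
        "user" : "worker.1",
        "pass" : "x"
    },
    {
        "url" : "stratum+tcp://backup-pool.example.com:3333",
        "user" : "worker.1",
        "pass" : "x"
    }
]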
sr. member
Activity: 448
Merit: 251
I have an issue with efficiency...

U:9.02/m and U:5.02/m: can I ignore this, or should I worry?

thx
newbie
Activity: 21
Merit: 0
I'm totally for keeping CGMiner as simple as possible. I actually wonder if one day you will drop GPU support in favor of strictly ASIC support. However, the higher performance of native CUDA seems to be pushing the cards into the realm of slightly profitable. A 580 seems to push 280 kH/s, which is about $1.25/day profit.

I'm curious which miner they were able to get 280 out of, as I'm one of those poor bastards trying to make just 1 BTC on my 670 at a profit of ~$0.30-$0.50 a day.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Kano,
I thought ASICs were purpose-built for Bitcoin mining.  In the thread linked below, someone says people are mining altcoins with ASICs.  I was wondering if you had any insight on this.  Can CGMiner mine altcoins with ASICs?
Thanks,
Sam

https://bitcointalksearch.org/topic/m.2086549
cgminer can mine any double-SHA256 coin with the ASICs it supports (i.e. not scrypt coins like Litecoin), and Avalons have already been used to terrorise some of those altcoins.
legendary
Activity: 3583
Merit: 1094
Think for yourself
Kano,
I thought ASICs were purpose-built for Bitcoin mining.  In the thread linked below, someone says people are mining altcoins with ASICs.  I was wondering if you had any insight on this.  Can CGMiner mine altcoins with ASICs?
Thanks,
Sam

https://bitcointalksearch.org/topic/m.2086549
legendary
Activity: 3583
Merit: 1094
Think for yourself

"kernel" : "scrypt,scrypt",

"gpu-fan" : "100,100",
"gpu-engine" : "800-930,800-930",

"temp-cutoff" : "92,92",
"temp-overheat" : "88,88",
"temp-target" : "85,85",

"scrypt" : true,

"auto-gpu" : true

Well, it looks like you're using scrypt, which I know nothing about.

You're using auto-gpu, which is good.  I don't know if your engine range is OK.

You're not using auto-fan, so I would enable that and set the fan range to something like 60-85 instead of running at 100.  It still seems odd that your GPU is even reaching 92.  Are you running with your case on?

Since you're not using auto-fan and are running the fans at 100%, the only variable left for controlling the temperature is lowering the engine clock, and its floor is 800.

Something is missing.
The case is open, but the cards are quite close to each other (no risers). So one of them runs really hot while the other is at 75 degrees. The hot one reaches 90 degrees and then gets cut off by cgminer, and even when the temperature drops below the target it shows 0 kh/s until I restart the GPU through the menu.

That part is by design.

From the top post
Quote
If the temperature goes over the
cutoff limit (95 degrees by default), cgminer will completely disable the GPU
from mining and it will not be re-enabled unless manually done so.

But I'm saying that your GPUs shouldn't even be reaching the temp cutoff, so you need to work on that somehow.
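In config-file form, the auto-fan suggestion above would look something like this sketch (two-card syntax as elsewhere in this thread; the exact range is illustrative, not a value given in the thread):

"auto-fan" : true,
"gpu-fan" : "60-85,60-85"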
newbie
Activity: 13
Merit: 0

"kernel" : "scrypt,scrypt",

"gpu-fan" : "100,100",
"gpu-engine" : "800-930,800-930",

"temp-cutoff" : "92,92",
"temp-overheat" : "88,88",
"temp-target" : "85,85",

"scrypt" : true,

"auto-gpu" : true

Well, it looks like you're using scrypt, which I know nothing about.

You're using auto-gpu, which is good.  I don't know if your engine range is OK.

You're not using auto-fan, so I would enable that and set the fan range to something like 60-85 instead of running at 100.  It still seems odd that your GPU is even reaching 92.  Are you running with your case on?

Since you're not using auto-fan and are running the fans at 100%, the only variable left for controlling the temperature is lowering the engine clock, and its floor is 800.

Something is missing.
The case is open, but the cards are quite close to each other (no risers). So one of them runs really hot while the other is at 75 degrees. The hot one reaches 90 degrees and then gets cut off by cgminer, and even when the temperature drops below the target it shows 0 kh/s until I restart the GPU through the menu.
legendary
Activity: 3583
Merit: 1094
Think for yourself

"kernel" : "scrypt,scrypt",

"gpu-fan" : "100,100",
"gpu-engine" : "800-930,800-930",

"temp-cutoff" : "92,92",
"temp-overheat" : "88,88",
"temp-target" : "85,85",

"scrypt" : true,

"auto-gpu" : true

Well, it looks like you're using scrypt, which I know nothing about.

You're using auto-gpu, which is good.  I don't know if your engine range is OK.

You're not using auto-fan, so I would enable that and set the fan range to something like 60-85 instead of running at 100.  It still seems odd that your GPU is even reaching 92.  Are you running with your case on?

Since you're not using auto-fan and are running the fans at 100%, the only variable left for controlling the temperature is lowering the engine clock, and its floor is 800.

Something is missing.
full member
Activity: 208
Merit: 100
Can someone explain to me what --gpu-powertune x (where x is a number like -10 or 10) does?

Does this try to undervolt or overvolt the card by x%?

Also, just wondering if my hash rate for litecoin is OK for a 6950+5770 system. I'm getting a combined rate of 615 kH/s; is that good?
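If memory serves (this is AMD PowerTune behaviour as exposed through ADL, not something confirmed in this thread, so treat it as an assumption): the value is a percentage adjustment of the card's power limit, typically in the -20 to 20 range, rather than a direct undervolt or overvolt.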
newbie
Activity: 13
Merit: 0
Can someone please explain to me what's happening? I'm using 2 x 7950.
In the config file I have the option "temp-cutoff" : "90,90".
So when the temperature goes over 90, cgminer turns off a card... but the wrong card. The "cold" card gets shut off (its temperature dropping), while the hot card keeps working with rising temperature, yet cgminer reports that card as off.
So cgminer is turning off the wrong video card in my case.
cgminer 3.1
Learn how to use --gpu-map
Thank you.

Set the correct option: "gpu-map" : "0:1,1:0"

Now I'm facing a situation where, after the cut-off and cool-down, the GPU doesn't want to get back to work until I restart it (restarting it from the menu works fine).

Are you using auto-gpu and auto-fan?
If so, do you have a range set for your engine clock and fan speed?
Also, do you have a temp target set?
It seems odd that your GPUs are even reaching 90C.

My config :

"intensity" : "20,20",
"vectors" : "1,1",
"worksize" : "256,256",
"kernel" : "scrypt,scrypt",
"lookup-gap" : "2,2",
"thread-concurrency" : "21712,21712",
"shaders" : "1792,1792",
"gpu-fan" : "100,100",
"gpu-engine" : "800-930,800-930",
"gpu-map" : "0:1,1:0",
"gpu-powertune" : "-11,-11",
"temp-cutoff" : "92,92",
"temp-overheat" : "88,88",
"temp-target" : "85,85",
"api-port" : "4028",
"expiry" : "120",
"gpu-dyninterval" : "7",
"gpu-platform" : "0",
"gpu-threads" : "1",
"hotplug" : "5",
"log" : "5",
"no-pool-disable" : true,
"queue" : "1",
"scan-time" : "60",
"scrypt" : true,
"temp-hysteresis" : "3",
"shares" : "0",
"auto-gpu" : true
legendary
Activity: 3583
Merit: 1094
Think for yourself
Can someone please explain to me what's happening? I'm using 2 x 7950.
In the config file I have the option "temp-cutoff" : "90,90".
So when the temperature goes over 90, cgminer turns off a card... but the wrong card. The "cold" card gets shut off (its temperature dropping), while the hot card keeps working with rising temperature, yet cgminer reports that card as off.
So cgminer is turning off the wrong video card in my case.
cgminer 3.1
Learn how to use --gpu-map
Thank you.

Set the correct option: "gpu-map" : "0:1,1:0"

Now I'm facing a situation where, after the cut-off and cool-down, the GPU doesn't want to get back to work until I restart it (restarting it from the menu works fine).

Are you using auto-gpu and auto-fan?
If so, do you have a range set for your engine clock and fan speed?
Also, do you have a temp target set?
It seems odd that your GPUs are even reaching 90C.
sr. member
Activity: 658
Merit: 250
Good to know, I guess I shouldn't expect every commit to work perfectly in marginal cases like cross-compiling.

EDIT: I saw some new commits and tried again. Now it works.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
My cross-compiling setup suddenly fails on the newest git master version. I get multiple errors about sys/socket.h not being found. I traced the problem back to commit 31aa4f6cebc51e26b349606fd78d71954bda87da. Commit 657e64477b75603bc9b08eed425bc47f606814cb and everything before that compiles correctly. Is there a new dependency that I'm now missing, or is this a bug?
It's just not complete yet for mingw, which I assume you're compiling for.
sr. member
Activity: 658
Merit: 250
My cross-compiling setup suddenly fails on the newest git master version. I get multiple errors about sys/socket.h not being found. I traced the problem back to commit 31aa4f6cebc51e26b349606fd78d71954bda87da. Commit 657e64477b75603bc9b08eed425bc47f606814cb and everything before that compiles correctly. Is there a new dependency that I'm now missing, or is this a bug? The only sys/socket.h I have (/usr/include/x86_64-linux-gnu/sys/socket.h) is not under my mingw toolchain, but that wasn't a problem when cross-compiling until now. For reference, here are the files in my toolchain: http://pastebin.com/sSEqFL63
newbie
Activity: 13
Merit: 0
Can someone please explain to me what's happening? I'm using 2 x 7950.
In the config file I have the option "temp-cutoff" : "90,90".
So when the temperature goes over 90, cgminer turns off a card... but the wrong card. The "cold" card gets shut off (its temperature dropping), while the hot card keeps working with rising temperature, yet cgminer reports that card as off.
So cgminer is turning off the wrong video card in my case.
cgminer 3.1
Learn how to use --gpu-map
Thank you.

Set the correct option: "gpu-map" : "0:1,1:0"

Now I'm facing a situation where, after the cut-off and cool-down, the GPU doesn't want to get back to work until I restart it (restarting it from the menu works fine).
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Can someone please explain to me what's happening? I'm using 2 x 7950.
In the config file I have the option "temp-cutoff" : "90,90".
So when the temperature goes over 90, cgminer turns off a card... but the wrong card. The "cold" card gets shut off (its temperature dropping), while the hot card keeps working with rising temperature, yet cgminer reports that card as off.
So cgminer is turning off the wrong video card in my case.
cgminer 3.1
Learn how to use --gpu-map
sr. member
Activity: 322
Merit: 250
HD 6950 problems

Tried different types of SDKs and drivers; now I can finally mine bitcoin through Java on BitMinter.
Still no bitcoin mining with cgminer.
I want to use it for LTC or FTC, but it only works with that Java app...

Can you help me please?