
Topic: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.11.0 - page 333. (Read 5805537 times)

-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
"gpu-engine" : "1100",
"gpu-memclock" : "0-750",
"gpu-memdiff" : "0",

With a memdiff of 0 you're telling your GPU to use the same memclock as your engine clock.

You can't specify a range for memclock; you have to use one value.

I think 7xxx series cards can only have an engine-to-memory difference of 150MHz.
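For example - values purely illustrative, reusing the engine clock from the posted config and assuming the 150MHz limit mentioned above - a single-value memclock entry would look like:

Code:
"gpu-engine" : "1100",
"gpu-memclock" : "1050",
"gpu-memdiff" : "0",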
This
sr. member
Activity: 448
Merit: 250
I was just commenting on his comments about setting the values and failing and using external programs.

Yeah, that looks weird (it doesn't make sense): memclock isn't a range, and calling atoi() on that range string would return 0

I said I'm not using external programs.

I have also tried to set it to a single value (700 from memory) but that had no effect either.
sr. member
Activity: 336
Merit: 250
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.
As per the README/GPU-README

cgminer uses the ATI ADL Library to adjust the clocks on the cards.

If ATI didn't put support in the library for changing to the settings you request, cgminer will attempt the change and report the new setting the library chose - usually the default setting for values it doesn't support.

If you use an external program, make sure you disable ADL in cgminer with --no-adl

Thanks for the tips; better performance with --no-adl using MSI Afterburner.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.
As per the README/GPU-README

cgminer uses the ATI ADL Library to adjust the clocks on the cards.

If ATI didn't put support in the library for changing to the settings you request, cgminer will attempt the change and report the new setting the library chose - usually the default setting for values it doesn't support.

If you use an external program, make sure you disable ADL in cgminer with --no-adl

So his settings are valid?  Using a range for memclock and memdiff of 0 at the same time, that is?
I was just commenting on his comments about setting the values and failing and using external programs.

Yeah, that looks weird (it doesn't make sense): memclock isn't a range, and calling atoi() on that range string would return 0
legendary
Activity: 3583
Merit: 1094
Think for yourself
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.
As per the README/GPU-README

cgminer uses the ATI ADL Library to adjust the clocks on the cards.

If ATI didn't put support in the library for changing to the settings you request, cgminer will attempt the change and report the new setting the library chose - usually the default setting for values it doesn't support.

If you use an external program, make sure you disable ADL in cgminer with --no-adl

So his settings are valid?  Using a range for memclock and memdiff of 0 at the same time, that is?
legendary
Activity: 3583
Merit: 1094
Think for yourself
"gpu-engine" : "1100",
"gpu-memclock" : "0-750",
"gpu-memdiff" : "0",

With a memdiff of 0 you're telling your GPU to use the same memclock as your engine clock.

You can't specify a range for memclock; you have to use one value.

I think 7xxx series cards can only have an engine-to-memory difference of 150MHz.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.
As per the README/GPU-README

cgminer uses the ATI ADL Library to adjust the clocks on the cards.

If ATI didn't put support in the library for changing to the settings you request, cgminer will attempt the change and report the new setting the library chose - usually the default setting for values it doesn't support.

If you use an external program, make sure you disable ADL in cgminer with --no-adl
sr. member
Activity: 336
Merit: 250
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.

MSI Afterburner is compatible with all cards.
sr. member
Activity: 448
Merit: 250
Are you trying to use MSI Afterburner to overclock your card? I personally use it with no issue.
Stock is 1250 for me too, and I raised it up to 1500MHz (Elpida chips are not so good).

No, I haven't installed any tools for overclocking (it's a Gigabyte card), only setting the main clock in cgminer.

I want to reduce the memory clock to reduce power usage and heat generation.
sr. member
Activity: 336
Merit: 250
sr. member
Activity: 448
Merit: 250
Is it normal for the memory clock on a 7950 to be locked and ignore the values I specify either in the config file or on the command line? With the following config file on Windows (I also have the same problem on Linux), the memory clock runs at 1250MHz

Code:
{
"pools" : [
        {
                "url" : "http://cryptominer.org:9332/",
                "user" : "username",
                "pass" : "password"
        }
]
,
"intensity" : "10",
"vectors" : "1",
"worksize" : "64",
"kernel" : "poclbm",
"lookup-gap" : "0",
"thread-concurrency" : "0",
"shaders" : "0",
"gpu-engine" : "1100",
"gpu-fan" : "0-85",
"gpu-memclock" : "0-750",
"gpu-memdiff" : "0",
"gpu-powertune" : "0",
"gpu-vddc" : "0.000",
"temp-cutoff" : "95",
"temp-overheat" : "85",
"temp-target" : "80",
"api-port" : "4028",
"auto-fan" : true,
"auto-gpu" : true,
"expiry" : "120",
"gpu-dyninterval" : "7",
"gpu-platform" : "0",
"gpu-threads" : "1",
"hotplug" : "5",
"log" : "5",
"no-pool-disable" : true,
"queue" : "0",
"scan-time" : "60",
"temp-hysteresis" : "3",
"shares" : "0",
"kernel-path" : "/usr/local/bin"
}

Do I need to disable auto-gpu or something different?
legendary
Activity: 3583
Merit: 1094
Think for yourself
My two PCs are still shutting down because of overheating. What can I do to prevent this? How do I make cgminer slow down mining when overheating?

If it is your GPUs overheating, then use the auto-fan and auto-gpu command line arguments.

There are examples in the executive summary in the top post of this thread.

I have the auto-fan flag, but not auto-gpu, because it overheats more with auto-gpu.


That makes no sense.  What's your target temp?
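For reference, the auto controls being discussed map to config entries like these - a sketch with illustrative temperatures, using the same option names that appear in the configs posted elsewhere in this thread:

Code:
"auto-fan" : true,
"auto-gpu" : true,
"temp-target" : "75",
"temp-overheat" : "85",
"temp-cutoff" : "95",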
member
Activity: 84
Merit: 10
Luke-Jr just posted this in the Cairnsmore FPGA thread:

"In brief, cg = original GPU miner bfg was based on, plus old bfgminer FPGA code with various things broken and a few minor things added

These days it usually only makes sense to use BFGMiner."

Anyone here beg to differ?
Ya, BFG is basically a ripoff of CG. LJR is always trying to discredit CGMiner at every turn and promote his miner instead. He was the one who originally worked on adding FPGA support to CGMiner, but the other devs couldn't work with him, so he made his own fork (now BFG). Both CG and BFG have now added support for different ASICs, but IMO Con and Kano do it better. See Here for the full story.

Ah - somehow I thought so - funny how I gathered that from Luke-Jr's tone.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
Luke-Jr just posted this in the Cairnsmore FPGA thread:

"In brief, cg = original GPU miner bfg was based on, plus old bfgminer FPGA code with various things broken and a few minor things added

These days it usually only makes sense to use BFGMiner."

Anyone here beg to differ?
I've now rewritten all the old serial drivers to use the new usbutils in cgminer - which all the new drivers do/will use.
Ztex is still the same, as it is also libusb - but it doesn't handle hotplug since it predates the new code (and I don't have one).

3.1.1 doesn't have the new Icarus driver yet - I've now written that and it's being tested in git.
For the new driver to come, Linux is OK, but Windows needs some work on it to solve some hotplug issues.

3.1.1 will auto-detect and correctly handle BFL, BAJ and MMQ devices.
The Icarus driver is still the old serial-USB driver in 3.1.1.

The next release will include usbutils/auto/hotplug for all the Icarus variants: ICA, BLT, LLT and AMU (not 100% sure about CMR yet though).
sr. member
Activity: 412
Merit: 250
My two PCs are still shutting down because of overheating. What can I do to prevent this? How do I make cgminer slow down mining when overheating?

If it is your GPUs overheating, then use the auto-fan and auto-gpu command line arguments.

There are examples in the executive summary in the top post of this thread.

I have the auto-fan flag, but not auto-gpu, because it overheats more with auto-gpu.
hero member
Activity: 896
Merit: 1000
Luke-Jr just posted this in the Cairnsmore FPGA thread:

"In brief, cg = original GPU miner bfg was based on, plus old bfgminer FPGA code with various things broken and a few minor things added

These days it usually only makes sense to use BFGMiner."

Anyone here beg to differ?

I tried bfgminer once because it advertised support for the dynamic clocking ability of the Cairnsmore1 FPGA boards with the hashvoodoo bitstream.
On p2pool, the dead-on-arrival shares went through the roof at ~50%. I never saw this kind of behaviour with cgminer, including with FPGAs (Icarus and Cairnsmore1 with fixed frequencies).

I use MPBM for my Cairnsmore1 (it supports dynamic clocking) and cgminer for my GPUs and Icarus boards (and hopefully for my Avalon when it finally is delivered) .
legendary
Activity: 952
Merit: 1000
Luke-Jr just posted this in the Cairnsmore FPGA thread:

"In brief, cg = original GPU miner bfg was based on, plus old bfgminer FPGA code with various things broken and a few minor things added

These days it usually only makes sense to use BFGMiner."

Anyone here beg to differ?
Ya, BFG is basically a ripoff of CG. LJR is always trying to discredit CGMiner at every turn and promote his miner instead. He was the one who originally worked on adding FPGA support to CGMiner, but the other devs couldn't work with him, so he made his own fork (now BFG). Both CG and BFG have now added support for different ASICs, but IMO Con and Kano do it better. See Here for the full story.
full member
Activity: 140
Merit: 100
STATUS=S,When=1369336525,Code=78,Msg=CGMiner coin,Description=cgminer 3.1.1|COIN,Hash Method=sha256,Current Block Time=1369336161.322788,Current Block Hash=00000000000000c8fb30dc785e322b76269315b15684575fabb06a6d9a1175b8,LP=false,Network Difficulty=18446744073709553000.00000000|

On OpenWrt (ar71xx, MIPS, WR703N) - why is the network diff ...? The command used is echo -n "coin" | nc IP 4028
member
Activity: 84
Merit: 10
Luke-Jr just posted this in the Cairnsmore FPGA thread:

"In brief, cg = original GPU miner bfg was based on, plus old bfgminer FPGA code with various things broken and a few minor things added

These days it usually only makes sense to use BFGMiner."

Anyone here beg to differ?
hero member
Activity: 497
Merit: 500
the readme doesn't help me so much :/

And it can't, unless you actually read it and spend the time doing research so that you can comprehend it as well.

If you're unwilling or incapable of moving toward the goal of understanding, then I would, kindly, suggest you find something else to spend your/our time on.
Sam

I've tried setting "thread-concurrency" : "",

But I have this error

GPU0: invalid nonce - HW error


Here is my setup after reading the README in depth:

Code:
{
"pools" : [
{
"url" : "stratum+tcp://eu.wemineltc.com:3333",
"user" : "xxx",
"pass" : "xxx"
}
]
,
"intensity" : "20",
"vectors" : "1",
"worksize" : "256",
"kernel" : "scrypt",
"lookup-gap" : "0",
"thread-concurrency" : "",
"shaders" : "1792",
"gpu-engine" : "0-0",
"gpu-fan" : "0-0",
"gpu-memclock" : "0",
"gpu-memdiff" : "0",
"gpu-powertune" : "0",
"gpu-vddc" : "0.000",
"temp-cutoff" : "95",
"temp-overheat" : "85",
"temp-target" : "75",
"api-port" : "4028",
"expiry" : "120",
"gpu-dyninterval" : "7",
"gpu-platform" : "0",
"gpu-threads" : "1",
"hotplug" : "5",
"log" : "5",
"no-pool-disable" : true,
"queue" : "1",
"scan-time" : "30",
"scrypt" : true,
"temp-hysteresis" : "3",
"shares" : "0",
"kernel-path" : "/usr/local/bin"
}

This is in the README word for word. Maybe someone can translate for you. If you do not wish to read then maybe you should be paying someone to do this for you.
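If the intent is to set thread-concurrency explicitly rather than leave it empty, the entry takes a single number; the value below is purely illustrative and card-dependent (an assumption for the sketch, not a recommendation from the README):

Code:
"thread-concurrency" : "8192",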