
Topic: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.11.0 - page 380. (Read 5805537 times)

legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
Looks like chrome has 2.11.4 listed as malicious?  I got a warning after I downloaded it.
Chrome? It now includes an AV? And it's as stupid as most of the Windows AV programs, detecting cgminer rather than the viruses that can put it there? Sigh.
hero member
Activity: 924
Merit: 1000
Watch out for the "Neg-Rep-Dogie-Police".....
Looks like chrome has 2.11.4 listed as malicious?  I got a warning after I downloaded it.

Chrome = Google = malicious  Cheesy
legendary
Activity: 3583
Merit: 1094
Think for yourself
Looks like chrome has 2.11.4 listed as malicious?  I got a warning after I downloaded it.

Never heard that one before.
hero member
Activity: 938
Merit: 1000
www.multipool.us
Looks like chrome has 2.11.4 listed as malicious?  I got a warning after I downloaded it.
sr. member
Activity: 658
Merit: 250
I'd like to request a feature: if you launch with -c filename.conf, then make that the default filename when going to Write Settings.

This is already possible using the --default-config option.
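A minimal sketch of that invocation (the filename mine.conf is just a placeholder):

```shell
# Load mine.conf at startup and make it the file that the
# "Write config file" menu option saves back to.
./cgminer --default-config mine.conf
```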
newbie
Activity: 30
Merit: 0
I need some help with my cgminer: https://bitcointalksearch.org/topic/error-all-devices-disabled-cannot-mine-169290 , thanks

./cgminer -n result:
Quote
CL Platform 0 vendor: Advanced Micro Devices, Inc.                   
CL Platform 0 name: AMD Accelerated Parallel Processing                   
CL Platform 0 version: OpenCL 1.2 AMD-APP (1016.4)                   
Error -1: Getting Device IDs (num)                   
clDevicesNum returned error, no GPUs usable                   
0 GPU devices max detected 

When I try to start cgminer:
Quote
Started cgminer 2.11.4                   
Error -1: Getting Device IDs (num)                   
clDevicesNum returned error, no GPUs usable                   
All devices disabled, cannot mine!
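Not an official fix, but a workaround often suggested for "Error -1: Getting Device IDs" on Linux with the fglrx driver: AMD's OpenCL runtime needs access to a running X server to enumerate GPUs, so exporting the display before launching sometimes helps (this sketch assumes X is running on display :0):

```shell
# fglrx's OpenCL enumerates GPUs via the X server, so point
# cgminer at the local display before probing devices.
export DISPLAY=:0
./cgminer -n
```

If -n still reports no usable GPUs after that, check that the AMD APP SDK / OpenCL ICD is actually installed.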
newbie
Activity: 47
Merit: 0
Both of them appear to fail on *nix, which I'd prefer to use... I actually gave up on Linux completely, because cgminer reported 1.250 and I thought the tools weren't setting voltages properly.
I then switched to Windows, and because Afterburner adjusts and shows voltages correctly, I thought I was OK, until I started seeing cgminer report 1.250 again... so I gave up entirely and just started setting configs for 1.250 and overclocking (to get as much hashing as possible to compensate), since it seemed undervolting wasn't possible with the cards I had.

That isn't the case; the value is simply being reported incorrectly by cgminer. At least I know now... 1.250 vs. 1.100 has made a significant difference on the utility bill.

Anyway, now that I know, there's a very good chance I'll get back to Linux (yay!).
I'm good, and glad to know that this is a known issue.
newbie
Activity: 47
Merit: 0
Anyone else noticing cgminer not correctly controlling and/or reporting vddc on 79xx cards?

[SNIP]
cgminer simply reports back whatever the ATI Display Library (ADL) says the voltage is, and can only modify whatever that ADL allows.

VERY old long running known issue ...

The ATI ADL library, written by ATI of course, decides what is allowed.
If you send a change, to a value it doesn't like, it will say OK, then change it back to the default.
Go discuss that with ATI ... if you think it's a problem Smiley

This might sound crazy... but is there a way to know when you're getting invalid information? Or is there a list of known drivers/GPUs that don't report correct info?
If so, showing "unknown" rather than incorrect information might be cleaner.

I was unaware of the issue (did I miss it in the "No one but sephtin ever reads me" README?).
It doesn't sound like a major overhaul... a feature request, perhaps?

hero member
Activity: 770
Merit: 502
I have no idea then, probably wait for someone else with knowledge. sorry.
full member
Activity: 223
Merit: 100
If I run cgminer -n, I get:
 [2013-04-07 15:14:09] CL Platform 0 vendor: Advanced Micro Devices, Inc.                   
 [2013-04-07 15:14:09] CL Platform 0 name: AMD Accelerated Parallel Processing                   
 [2013-04-07 15:14:09] CL Platform 0 version: OpenCL 1.1 AMD-APP-SDK-v2.4 (595.10)                   
 [2013-04-07 15:14:09] Platform 0 devices: 2                   
 [2013-04-07 15:14:09]    0   Cypress                   
 [2013-04-07 15:14:09]    1   Cypress                   
 [2013-04-07 15:14:09] 2 GPU devices max detected   

I think the GPUs are correctly configured.
hero member
Activity: 770
Merit: 502
Thanks, I solved it (I had to add "--enable-scrypt" after the "./configure" command),

but now with this line:
cgminer -o stratum+tcp://coinotron.com:3334 -u xxxx -p x --scrypt
i get a very low hash speed for 2x5870:
(5s):58.75K (avg):58.34Kh/s | A:2  R:0  HW:0  U:1.1/m  WU:144.2/m

It seems that cgminer is using the CPU rather than the GPUs. How can I fix this?

You need to find which platform your GPUs are on, e.g. 0, 1, 2 or 3, and set it with the gpu-platform option in cgminer.conf. Example:
Code:
"gpu-platform" : "0",

Consolidated Litecoin Mining Guide for 5xxx, 6xxx, and 7xxx GPUs
full member
Activity: 223
Merit: 100
Thanks, I solved it (I had to add "--enable-scrypt" after the "./configure" command),

but now with this line:
cgminer -o stratum+tcp://coinotron.com:3334 -u xxxx -p x --scrypt
i get a very low hash speed for 2x5870:
(5s):58.75K (avg):58.34Kh/s | A:2  R:0  HW:0  U:1.1/m  WU:144.2/m

It seems that cgminer is using the CPU rather than the GPUs. How can I fix this?
hero member
Activity: 770
Merit: 502
Have you tried using cgminer.conf?

Code:
"scrypt" : true,
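A minimal cgminer.conf sketch putting that together (the pool URL and credentials are taken from the post above; this is illustrative, not a tuned config):

```json
{
  "pools" : [
    {
      "url" : "stratum+tcp://coinotron.com:3334",
      "user" : "xxxx",
      "pass" : "x"
    }
  ],
  "scrypt" : true
}
```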
full member
Activity: 223
Merit: 100
Hi guys, I'm trying to mine LTC using 2x5870 with this line:
cgminer -o stratum+tcp://coinotron.com:3334 -u xxxx -p x --scrypt

The "--scrypt" option isn't accepted by cgminer!

the output is: [2013-04-07 12:51:40] cgminer: --scrypt: unrecognized option

Then I tried cgminer --enable-scrypt, but the output is:

cgminer: --enable-scrypt: unrecognized option

Even though "--enable-scrypt" is in the README file!
donator
Activity: 543
Merit: 500
How can I put the "--device|-d" parameter in a config file?
Say I only want to use devices 2 and 3 for mining.

I tried "device" : "2,3" but that didn't work.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
VERY old long running known issue ...

The ATI ADL library, written by ATI of course, decides what is allowed.
If you send a change, to a value it doesn't like, it will say OK, then change it back to the default.
Go discuss that with ATI ... if you think it's a problem Smiley
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Anyone else noticing cgminer not correctly controlling and/or reporting vddc on 79xx cards?

[SNIP]
cgminer simply reports back whatever the ATI Display Library (ADL) says the voltage is, and can only modify whatever that ADL allows.
newbie
Activity: 47
Merit: 0
Anyone else noticing cgminer not correctly controlling and/or reporting vddc on 79xx cards?

Example on 7950:
Afterburner - Set GPU to 1100core/1600mem @1100mV
CGMiner -- using conf file with GPU set to 1100core/1600mem @1100mV
Code:
...
"thread-concurrency" : "24000,24000,24000,24000",
"shaders" : "0,0,0,0",
"gpu-engine" : "850-1100,850-1100,850-1100,850-1100",
"gpu-fan" : "0-100,0-100,0-100,0-100",
"gpu-memclock" : "1600,1600,1600,1600",
"gpu-memdiff" : "0,0,0,0",
"gpu-powertune" : "20,20,20,20",
"gpu-vddc" : "1.100,1.100,1.100,1.100",
...

Start CGMiner, and it reports 1.250V under GPU Management:
Code:
GPU 0: 683.0 / 682.7 Kh/s | A:165  R:3  HW:0  U:2.62/m  I:20
67.0 C  F: 100% (4416 RPM)  E: 1100 MHz  M: 1600 Mhz  V: 1.250V  A: 99%  P: 20%
Last initialised: [2013-04-07 01:28:34]
Intensity: 20
Thread 0: 679.2 Kh/s Enabled ALIVE

GPU 1: 678.7 / 671.3 Kh/s | A:165  R:1  HW:0  U:2.62/m  I:20
58.0 C  F: 100% (3983 RPM)  E: 1100 MHz  M: 1600 Mhz  V: 1.250V  A: 99%  P: 20%
Last initialised: [2013-04-07 01:28:34]
Intensity: 20
Thread 1: 675.6 Kh/s Enabled ALIVE

GPU 2: 661.7 / 656.1 Kh/s | A:190  R:2  HW:0  U:3.02/m  I:20
58.0 C  F: 100% (4025 RPM)  E: 1100 MHz  M: 1600 Mhz  V: 1.250V  A: 99%  P: 20%
Last initialised: [2013-04-07 01:28:34]
Intensity: 20
Thread 2: 660.1 Kh/s Enabled ALIVE

GPU 3: 671.6 / 671.1 Kh/s | A:172  R:1  HW:0  U:2.73/m  I:20
55.0 C  F: 100% (4358 RPM)  E: 1100 MHz  M: 1600 Mhz  V: 1.250V  A: 99%  P: 20%
Last initialised: [2013-04-07 01:28:34]
Intensity: 20
Thread 3: 667.7 Kh/s Enabled ALIVE

[E]nable [D]isable [I]ntensity [R]estart GPU [C]hange settings
Or press any other key to continue

Afterburner and Trixx report that the voltage is set to 1100 mV, yet GPU-Z never shows it above 1.080 V. When I set it to 1.250 in Afterburner, it shows 1.250 in both Afterburner and Trixx, and GPU-Z shows roughly 1.180 V under load. There's also a HUGE difference when measured at the wall... so I'm convinced it's cgminer that's not correctly reporting the voltage.

I'm on the latest beta drivers currently (13.3B3), but have tried stable (13.1), same issue.

I'm mining an altcoin, but I'd be surprised if that was related.
If there's any additional info I can provide, let me know.

Edit (more detail):
On Win7 x64, using cgminer-2.11.4-windows from apps/cgminer on GitHub. (Same issue on 2.11.3.)
Confirmed with one other with 7950s, so it isn't just me.  Wink
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
USB does, yes. But not the devices in question.

If that's the case, then I suppose I see no real benefit to using libusb either.
cgminer already implements functionality that uses the advantages of libusb

...
...
Another is the usb API stats
All devices have statistics recorded about all I/O to them, including the initial control transfers that the serial-USB code doesn't even know about
Yes, this is made possible using libusb. Too bad it's completely useless.

Just thought I'd point out this little gem of utter stupidity by the retard Luke-Jr

So he's been hashing away on a BFL SC for how many days now? Edit: 6 or 7 days!

And ... the latest problem has been a supposed 500ms latency issue with I/O

Pity they chose the crappy miner code to test all this on, because cgminer (my USB statistics) already reports all of this through the API at a per-command level: Min, Max, Total, Count ... (average = Total/Count, of course)

So I guess they would have known about this how many days ago if they had been testing with cgminer instead of the 30-year-old crap code used in the crap clone?