
Topic: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.11.0

legendary
Activity: 1361
Merit: 1003
Don`t panic! Organize!
One question though: I am using cgminer to solo mine PPC and I also have an alternative pool (coinotron)
on the command line. localhost doesn't support LP but coinotron does...
Local mining NEVER has LP because you mine the ENTIRE block all the time Smiley
Same for every coin.
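The occasional accepted shares on pool 1 are just cgminer topping up work from the backup pool when localhost is slow to hand out work. You can drop coinotron from the command line entirely if you don't want any fallback, or keep it as a true failover with something like this (a rough sketch - the worker name and passwords are placeholders, and double-check coinotron's port for your coin):
Code:
./cgminer -o http://localhost:9902 -u rpcuser -p rpcpass \
          -o http://coinotron.com:8322 -u worker.1 -p workerpass \
          --failover-only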
newbie
Activity: 36
Merit: 0
OK, got myself up and running - put a dummy plug on the 5770 and it ran straight away. I thought the dummy plugs were only necessary for Windows - maybe you can get away without them on Ubuntu, but for the sake of 3 x 22c resistors and one of the plethora of DVI/VGA converters I have in the parts bin, it finished off 4 days of stuffing around quite quickly.

When I first ran cgminer I was getting a really bad hash rate. Dropping -g to 1 got the 5770 working but halved the 7950, and trying "-g 2,1" errored with "2,1 is not an integer". So I'm running two instances of cgminer - maybe per-card values could be supported in a future release?
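(For reference, the two-instance workaround looks roughly like this - the pool URL and credentials are placeholders, which card ends up as device 0 vs 1 depends on your system, and add --scrypt if you're mining litecoin:)
Code:
# instance 1: the 7950 with two GPU threads
./cgminer -o http://pool.example.com:8332 -u worker -p pass -d 0 -g 2
# instance 2: the 5770 with one GPU thread
./cgminer -o http://pool.example.com:8332 -u worker -p pass -d 1 -g 1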

Hash rates are disappointing compared to cgminer on Windows 7 with an identical 5770, so I'm going to abandon Ubuntu for now and try Windows 7...

Although I do read that I've been asking for trouble mixing 5xxx and 7xxx cards, which may not have helped either.
hero member
Activity: 607
Merit: 500
I prefer SHA-256 coins over the scrypt ones.
Version 2.11.2 is very nice, quick and stable.
One question though: I am using cgminer to solo mine PPC and I also have an alternative pool (coinotron)
on the command line. localhost doesn't support LP but coinotron does.
In the cgminer window I see:
"Long-polling activated for localhost:9902 via coinotron.com:8322/LP"
What does this mean? Is it good or bad?
May I use only localhost in the command line?
Once in a while I see some accepted shares from pool 1 (coinotron) - is that normal?
Thanks
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Another litecoin miner question.

I have a dual-boot computer.  It has a 6-core AMD processor, 2 GB of memory, and two 5850 cards.  It mines bitcoins well.  Under Ubuntu 11.04 it will not mine litecoins, but if I boot to Windows 7 and run the same version of cgminer it mines litecoins well.

The hardware, the cgminer version and the cgminer options are the same.  Only the operating system is different.

Windows is running AMD 13.1.  Ubuntu is running 12 something.

Is this the solution: Mine with cgminer 2.11.2 on Windows 7 and not on Ubuntu 11.04?

Sam
Driver and SDK versions matter, but in a different way than they do for bitcoin mining. There is no reason you can't mine LTC just as well on linux with 5xxx cards given the right driver/SDK combo. I'd recommend AMD driver 12.8 with SDK 2.7 for litecoin mining.
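(If you're not sure which OpenCL platform/SDK your Ubuntu install is actually picking up, you can ask cgminer to list what it detects and exit before hunting for drivers - the exact output varies with driver version:)
Code:
./cgminer -n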
member
Activity: 76
Merit: 10
Nvidia just sux for mining.
Read this: https://en.bitcoin.it/wiki/Mining_hardware_comparison

It should at least outperform my CPU though, right?
I think my setup is having a hard time switching between the integrated Intel graphics and the 680M. I've tried to tell the laptop to run cgminer using the 680M, but I think I'm falling short.

I also only see one graphics card in cgminer.

F'ing nVidia Optimus...
legendary
Activity: 1361
Merit: 1003
Don`t panic! Organize!
member
Activity: 76
Merit: 10
Hi all,

Thanks for creating and maintaining such a great program!

Does anyone mine Litecoin/scrypt on a GPU with cgminer and want to share their hashrate?  I'm using an nVidia 680M and getting about 7 Kh/s, but my CPU miner gets about 40 Kh/s.  I know nVidia cards aren't great for Bitcoin, but does my hashrate seem right for LTC?  I'm using Windows 7 Pro with an Intel i7 3720QM, 32 GB RAM.

Thanks,

_theJestre
sr. member
Activity: 451
Merit: 250
Another litecoin miner question.

I have a dual-boot computer.  It has a 6-core AMD processor, 2 GB of memory, and two 5850 cards.  It mines bitcoins well.  Under Ubuntu 11.04 it will not mine litecoins, but if I boot to Windows 7 and run the same version of cgminer it mines litecoins well.

The hardware, the cgminer version and the cgminer options are the same.  Only the operating system is different.

Windows is running AMD 13.1.  Ubuntu is running 12 something.

Is this the solution: Mine with cgminer 2.11.2 on Windows 7 and not on Ubuntu 11.04?

Sam
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Code:
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
Since the "GPU_MAX_ALLOC_PERCENT" and "GPU_USE_SYNC_OBJECTS" are linux only, what do us windows users do?
The first is not required on windows (it's on by default anyway). The second decreases CPU usage, so if you have that problem on windows, the only way to improve on it is to use linux.
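(On the linux side, the usual way to apply those is a small launch script so they're set every time cgminer starts - a minimal sketch, with the pool URL and credentials as placeholders:)
Code:
#!/bin/sh
# set before launching cgminer so the OpenCL driver picks them up
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
./cgminer --scrypt -o http://pool.example.com:9332 -u worker -p pass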
newbie
Activity: 57
Merit: 0

7970 @ 1135/1890, LG 2, TC 22392:
Code:
 GPU 0:  72.0C 3413RPM | 714.6K/715.7Kh/s | A:0 R:1 HW:0 U:0.00/m I:20

Uhm. With LG 2 and TC 22392 I get:

Code:
[2013-03-15 16:04:33] Maximum buffer memory device 0 supports says 805306368
[2013-03-15 16:04:33] Your scrypt settings come to 1467482112
[2013-03-15 16:04:33] Error -61: clCreateBuffer (padbuffer8), decrease CT or increase LG
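(Those two numbers line up with the scrypt scratchpad maths, assuming the usual N=1024 scrypt parameters - roughly 128 KiB per concurrent hash, divided by the lookup gap - while 805306368 bytes is just a 768 MiB per-allocation cap from the driver, which is what GPU_MAX_ALLOC_PERCENT=100 lifts:)
Code:
# rough check: padbuffer bytes ~= thread_concurrency * 131072 / lookup_gap
echo $(( 22392 * 131072 / 2 ))   # 1467482112 - matches the "scrypt settings" line above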

OK, the above error seems to go away using:
Code:
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
(I never ever needed this thing before).
As always YMMV - just because it works on mine doesn't mean it will work on yours; motherboard, CPU, and RAM actually matter with scrypt. However, -g 1 is now almost mandatory with these higher TCs. I'm making 1 GPU thread the default for scrypt in the next version (just made it into git).

Since the "GPU_MAX_ALLOC_PERCENT" and "GPU_USE_SYNC_OBJECTS" are linux only, what do us windows users do?
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
CGMiner keeps telling me "Disabling extra threads due to dynamic mode." How do I stop it from doing this?
Don't use dynamic mode. Smiley

Gee, thanks. I hadn't thought of that.  Roll Eyes

HOW?
... as it says in the README that no one reads ...
Code:
--intensity|-I  Intensity of GPU scanning (d or -10 -> 10, default: d to maintain desktop interactivity)
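In other words, pass a fixed intensity instead of the default d and the extra GPU threads stay enabled - something like this (pool details are placeholders):
Code:
./cgminer -o http://pool.example.com:8332 -u worker -p pass -I 9 -g 2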
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/

7970 @ 1135/1890, LG 2, TC 22392:
Code:
 GPU 0:  72.0C 3413RPM | 714.6K/715.7Kh/s | A:0 R:1 HW:0 U:0.00/m I:20

Uhm. With LG 2 and TC 22392 I get:

Code:
[2013-03-15 16:04:33] Maximum buffer memory device 0 supports says 805306368
[2013-03-15 16:04:33] Your scrypt settings come to 1467482112
[2013-03-15 16:04:33] Error -61: clCreateBuffer (padbuffer8), decrease CT or increase LG

OK, the above error seems to go away using:
Code:
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
(I never ever needed this thing before).
As always YMMV - just because it works on mine doesn't mean it will work on yours; motherboard, CPU, and RAM actually matter with scrypt. However, -g 1 is now almost mandatory with these higher TCs. I'm making 1 GPU thread the default for scrypt in the next version (just made it into git).
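(For anyone wanting to try those numbers, the settings above translate to roughly this command line - the pool URL and credentials are placeholders, and the clock flags need cgminer built with ADL support; otherwise set 1135/1890 with your usual overclocking tool:)
Code:
./cgminer --scrypt -o stratum+tcp://pool.example.com:3333 -u worker -p pass \
          -I 20 -g 1 --lookup-gap 2 --thread-concurrency 22392 \
          --gpu-engine 1135 --gpu-memclock 1890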
hero member
Activity: 896
Merit: 1000
CGMiner keeps telling me "Disabling extra threads due to dynamic mode." How do I stop it from doing this?
Don't use multiple threads (-g 1).
hero member
Activity: 591
Merit: 500
CGMiner keeps telling me "Disabling extra threads due to dynamic mode." How do I stop it from doing this?
Don't use dynamic mode. Smiley
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
I presume you mean a BFL ASIC (won't be much point running FPGAs once everyone has an ASIC)
So, unless we change the defaults between now and then, the first step (in the future) of building from a git clone would be:
./autogen.sh --disable-opencl --enable-bflsc

This is exactly what I was looking for. Thanks!


I just tried compiling with that flag turned on, and got an error:
Code:
usbutils.c:1842:16: error: 'bflsrc_drv' undeclared (first use in this function)

It looks like line 1842 has a typo:
Code:
	drv_count[bflsrc_drv.drv_id].limit = lim;

bflsrc_drv should be bflsc_drv, to match the definition at line 151.
... Smiley
member
Activity: 112
Merit: 10
I presume you mean a BFL ASIC (won't be much point running FPGAs once everyone has an ASIC)
So, unless we change the defaults between now and then, the first step (in the future) of building from a git clone would be:
./autogen.sh --disable-opencl --enable-bflsc

This is exactly what I was looking for. Thanks!


I just tried compiling with that flag turned on, and got an error:
Code:
usbutils.c:1842:16: error: 'bflsrc_drv' undeclared (first use in this function)

It looks like line 1842 has a typo:
Code:
	drv_count[bflsrc_drv.drv_id].limit = lim;

bflsrc_drv should be bflsc_drv, to match the definition at line 151.
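(Until that's corrected in git, a quick local workaround is to rename the typo'd identifier before building - this only touches that one name, nothing else:)
Code:
sed -i 's/bflsrc_drv/bflsc_drv/g' usbutils.c
./autogen.sh --disable-opencl --enable-bflsc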
member
Activity: 112
Merit: 10
I presume you mean a BFL ASIC (won't be much point running FPGAs once everyone has an ASIC)
So, unless we change the defaults between now and then, the first step (in the future) of building from a git clone would be:
./autogen.sh --disable-opencl --enable-bflsc

This is exactly what I was looking for. Thanks!
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
I was hoping by now I could have said something like "GPU mining is deprecated, only ASIC code is supported from here on"...

Speaking of which, I'm setting up a new Linux install on an old atom netbook that I plan to use as the driver for a BFL Single (hopefully some time this year... sigh).

For configuration testing purposes, I built it with cpu mining enabled, and it's rocking along at 1 Mh/s...

I assume all I'll need for autogen.sh flags when I want to rebuild cgminer to drive the Single is

Code:
autogen.sh --disable-opencl --disable-adl --enable-bitforce

Is there anything else that I should, or might want, to include?


I presume you mean a BFL ASIC (won't be much point running FPGAs once everyone has an ASIC)
So, unless we change the defaults between now and then, the first step (in the future) of building from a git clone would be:
./autogen.sh --disable-opencl --enable-bflsc
If you want both BFL ASIC and BFL FPGA
./autogen.sh --disable-opencl --enable-bflsc --enable-bitforce

If it's a source download, however, the first step would be:
./configure --disable-opencl --enable-bflsc
or
./configure --disable-opencl --enable-bflsc --enable-bitforce
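(For completeness, the full sequence from a fresh git clone usually looks like this - assuming the build dependencies, autotools, libcurl, libusb and so on, are already installed:)
Code:
git clone https://github.com/ckolivas/cgminer.git
cd cgminer
./autogen.sh --disable-opencl --enable-bflsc
make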
hero member
Activity: 924
Merit: 1000
Watch out for the "Neg-Rep-Dogie-Police".....
I'm getting about 20-25% cpu usage on gentoo (htop shows the main process accounts for all of this, but there are about 10 other cgminer entries) with an amd 8350 and two 7950s. I have another box with one 5770 and one 7950 and it showed similar behavior (about 13% cpu usage). If I disable the 7950 then the usage goes down to 3-4%. I am using -I 5,8 -w 256 -v 1 and I have tried all of the kernels. I have also tried cgminer 2.10.4 and 2.11.2. I have another couple boxes with 5830s and 5770s that have less than 2% cpu usage. Any thoughts as to why the 7950 boxes have higher cpu usage?

From what I can gather, mixing 7xxx series cards with 5xxx or 6xxx series cards causes issues, as they prefer different driver/SDK setups. There is plenty of info elsewhere on the forums about it; that's how I found out. I keep all my 7xxx series cards separate in their own rig. Here's one link that might help you:

https://bitcointalksearch.org/topic/7970-linux-xubuntu-guide-please-77950

Or, do a search for "7970 settings". Hope it helps a bit.......

Peace.
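(One more thing worth trying for the CPU usage specifically - as ck noted earlier in the thread, this environment variable decreases cgminer's CPU usage on linux; export it before launching cgminer:)
Code:
export GPU_USE_SYNC_OBJECTS=1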
member
Activity: 112
Merit: 10
I was hoping by now I could have said something like "GPU mining is deprecated, only ASIC code is supported from here on"...

Speaking of which, I'm setting up a new Linux install on an old atom netbook that I plan to use as the driver for a BFL Single (hopefully some time this year... sigh).

For configuration testing purposes, I built it with cpu mining enabled, and it's rocking along at 1 Mh/s...

I assume all I'll need for autogen.sh flags when I want to rebuild cgminer to drive the Single is

Code:
autogen.sh --disable-opencl --disable-adl --enable-bitforce

Is there anything else that I should, or might want, to include?