
Topic: OFFICIAL CGMINER mining software thread for linux/win/osx/mips/arm/r-pi 4.11.0 - page 383. (Read 5806057 times)

M3t
newbie
Activity: 42
Merit: 0
How do I --enable-cpu?

doesn't seem to work o.0
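For what it's worth, a build sketch, assuming the configure switch is actually `--enable-cpumining` rather than `--enable-cpu` (the flag name is an assumption; check `./configure --help` in your tree to confirm):

```shell
# Assumed flag name: --enable-cpumining (not --enable-cpu); run
# ./configure --help against your cgminer version to verify.
./autogen.sh
./configure --enable-cpumining
make
```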
newbie
Activity: 42
Merit: 0
Figured I'd test out scrypt mining, since I've never looked at it before today. Yes, yes, I'm late to the LTC party. Still haven't arrived.

Code:
 [2013-04-03 04:29:42] Error -11: Building Program (clBuildProgram)
 [2013-04-03 04:29:42] "/tmp/OCLpFoFQK.cl", line 762: error: identifier "LOOKUP_GAP" is undefined
        const uint ySIZE = (1024/LOOKUP_GAP+(1024%LOOKUP_GAP>0));
                                 ^

"/tmp/OCLpFoFQK.cl", line 763: error: identifier "CONCURRENT_THREADS" is
          undefined
        const uint xSIZE = CONCURRENT_THREADS;
                           ^

2 errors detected in the compilation of "/tmp/OCLpFoFQK.cl".

Internal error: clc compiler invocation failed.


Running a pair of 7970s, Catalyst 12.6 ("8.98.2" driver). Linux 2.6.32-5-amd64. cgminer 2.11.3, latest pull from git. I've tried poking different SDK versions in (2.6, 2.7, 2.8), but 2.8 makes my cards undetectable by cgminer. Have not messed with other catalyst versions yet; I need to pull an image of the USB stick before I try to break something that severely.

Searches on CONCURRENT_THREADS in this thread pointed to a post that referred to a different thread that didn't appear to contain an answer (it was the rally to raise funds to get scrypt included in cgminer), and that eventually referred back to this thread. I unfortunately didn't see much on LOOKUP_GAP errors, either.

Has anyone seen this behavior before? Based on the 'undefined' errors when compiling the kernel, I assume I'm missing the definitions in the headers shipped with the CL platform, which makes me think it's an SDK or driver version issue... the 2.7 SDK lib/x86_64 files are symlinked into /usr/lib and the CL header directory into /usr/include. cgminer compiles fine; the problem comes at execution time, and only if scrypt is selected as the kernel (except in the case of the 2.8 SDK, in which case it can't enumerate cards regardless of kernel).


EDIT: Fixed. Learned that it's not enough to set kernel:scrypt; you must also set scrypt:true.
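To spell that out, a minimal sketch of the config-file fragment (JSON field names as used by cgminer's config format; all other fields omitted). The likely mechanism, though this is an assumption, is that cgminer only passes the LOOKUP_GAP/CONCURRENT_THREADS -D defines to clBuildProgram when scrypt mode itself is enabled:

```shell
# Sketch: "kernel" : "scrypt" alone leaves scrypt mode disabled, so the
# kernel's LOOKUP_GAP/CONCURRENT_THREADS macros never get defined at
# build time; "scrypt" : true must be present as well.
cat > cgminer.conf <<'EOF'
{
  "scrypt" : true,
  "kernel" : "scrypt"
}
EOF
```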
member
Activity: 81
Merit: 1002
It was only the wind.
Wait, you just said that it was the current, supported, standard interface, and libusb is low level. Then you said libusb adds a lot of abstraction and does the same things as the serial I/O libs, which would make it higher level. Which is it?
Both. For the network analogy, libusb would be libpcap - it adds some programmer-friendly abstractions on top of a raw socket. It's still working with low-level raw sockets, but in an abstracted way.

If we're using that analogy, then the serial I/O libs would be the regular TCP/IP stack. And libpcap is still faster and offers more functionality than the TCP/IP stack; it doesn't reimplement too much.
full member
Activity: 247
Merit: 100
Hello guys,
I tried the latest cgminer today (2.11.3, win32) on my Win8 x64 / Radeon 5850 / Catalyst 13.3 beta setup, but it crashes on start.

Code:
cgminer -o stratum+tcp://coinotron.com:3334 -u blabla -p blabla --scrypt --thread-concurrency 8000 -I 18 -g 1 -w 256

Can you help me?
legendary
Activity: 4634
Merit: 1851
Linux since 1997 RedHat 4
osoverflow - careful with that, his code is known to brick Avalons, and who knows what his hacked up untested code will do to the BFL SC.
He released the BFLSC code before a working device even existed ...
Once we have one (soon) we'll complete the code and release a tested working version.
We don't like the idea of releasing untested code to brick people's hardware, so we don't do it.
This of course directly relates to what I said about the crappy clone way back when he copied cgminer almost a year ago ...
https://bitcointalk.org/index.php?topic=78192.40
member
Activity: 81
Merit: 1002
It was only the wind.
That's no reason to continue using deprecated technology. CGMiner could also be a 16-bit binary.
You missed the point. CGMiner is also deprecated software for deprecated technology (GPUs).
Well, if you consider the fact that shortly, when GPU mining is indeed deprecated, the crappy clone will ONLY be using the old termios serial I/O libraries that were designed around 30 or more years ago, while cgminer has been updated to use the libusb library to talk directly to the USB devices rather than via the old serial libraries that put an old interface in front of the USB devices and restrict access to most of the USB functionality ... yes, it's quite clear that the clone is old technology, written using the serial library because the guy who wrote it was not only fail in programming ability, he chose the simplest interface with the most restrictions coz he had no idea what he was doing ...
You mean the current, supported, standard interface, instead of bypassing it to use a low-level interface that has no benefit whatsoever.

It's like writing your own TCP/IP stack instead of using the one included in the OS. Not only is it stupidly redundant, it also means you've lost support, driver updates, ease of use, forward compatibility with new hardware, and regular-user-mode access.

Speed might be a reason to use libusb rather than 30 year old serial I/O libs... Just saying...
If Kano's lies were true, perhaps. But "30 year old serial I/O libs" is not quite right. While the interface may be 30 years old, the code behind it certainly isn't. Nor is there any need for a new interface. It's also a "library" built into the OS itself, so it has pretty much as little overhead as you can get.
On the other hand, libusb is designed for raw USB access, and is non-native on at least Windows. But it does add a lot of abstraction, which theoretically harms performance. It then goes and does the same things as the "30 year old serial I/O libs" using a non-standard interface. libusb is nice when there are no existing drivers, but totally the wrong tool for these specific devices. Unfortunately, libusb also lacks any support for asynchronous access on Windows, which makes some device API improvements impractical - before I can move BFGMiner to a completely asynchronous model, I would need to first make some major improvements to the underlying libusb library itself.

Edit: Disclosure... there is one reason I can see using libusb could be beneficial: unfixed bugs in the OS/official OS drivers, or workarounds for buggy hardware. This is the case with Windows's ACM driver (used by ModMiner) - but easily worked around in software (as BFGMiner has done for a while). The chip used in the Icarus also had a bug that prevented it from working with certain USB hosts - this too, was easily worked around in software. But those are the only reasons I can see using libusb would make sense for a device using a serial interface, and they're already managed just fine without it.

Wait, you just said that it was the current, supported, standard interface, and libusb is low level. Then you said libusb adds a lot of abstraction and does the same things as the serial I/O libs, which would make it higher level. Which is it?
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/

Hi! May I ask you where do I get the bflsc drivers?
Why, do you have one? The drivers are incomplete since the hardware doesn't exist outside of BFL labs yet so we have not posted any of the code.

I wish to have mine! But I was thinking about having the cgminer built and ready when my bfl arrives sometime in the future (I hope!)
It will definitely be available before you get yours.
full member
Activity: 547
Merit: 105
Bitcoin ya no es el futuro, es el presente

Hi! May I ask you where do I get the bflsc drivers?
Why, do you have one? The drivers are incomplete since the hardware doesn't exist outside of BFL labs yet so we have not posted any of the code.

I wish to have mine! But I was thinking about having the cgminer built and ready when my bfl arrives sometime in the future (I hope!)
member
Activity: 81
Merit: 1002
It was only the wind.
That's no reason to continue using deprecated technology. CGMiner could also be a 16-bit binary.
You missed the point. CGMiner is also deprecated software for deprecated technology (GPUs).
Well, if you consider the fact that shortly, when GPU mining is indeed deprecated, the crappy clone will ONLY be using the old termios serial I/O libraries that were designed around 30 or more years ago, while cgminer has been updated to use the libusb library to talk directly to the USB devices rather than via the old serial libraries that put an old interface in front of the USB devices and restrict access to most of the USB functionality ... yes, it's quite clear that the clone is old technology, written using the serial library because the guy who wrote it was not only fail in programming ability, he chose the simplest interface with the most restrictions coz he had no idea what he was doing ...
You mean the current, supported, standard interface, instead of bypassing it to use a low-level interface that has no benefit whatsoever.

It's like writing your own TCP/IP stack instead of using the one included in the OS. Not only is it stupidly redundant, it also means you've lost support, driver updates, ease of use, forward compatibility with new hardware, and regular-user-mode access.

Speed might be a reason to use libusb rather than 30 year old serial I/O libs... Just saying...
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Hi! May I ask you where do I get the bflsc drivers?
Why, do you have one? The drivers are incomplete since the hardware doesn't exist outside of BFL labs yet so we have not posted any of the code.
member
Activity: 81
Merit: 1002
It was only the wind.
That's no reason to continue using deprecated technology. CGMiner could also be a 16-bit binary.
You missed the point. CGMiner is also deprecated software for deprecated technology (GPUs).

Haha, as long as scrypt chains exist, I don't think GPUs are gonna be deprecated for a long time.
full member
Activity: 547
Merit: 105
Bitcoin ya no es el futuro, es el presente
Hi! May I ask you where do I get the bflsc drivers?


Thanks!
newbie
Activity: 61
Merit: 0
I have a 7950 and I just added a 7970 to my computer.  Is there a way to run two instances of cgminer so I can use different workers and different bin files?  I would think my 7970 would need a different bin file to get the most performance.  I guess I just need help figuring out how to set it up.  When I launched it, both cards started mining right away, but I did notice my 7950 was mining a little slower than it did as a single card.  Is that normal?

EDIT:
The khash is only a couple lower, not a big deal.  I do need to figure out what to do though.  After I overclocked the 7970 it is still going slower than the 7950, so I am obviously not doing something right.

EDIT:
Someone helped me.  The command is --device 0 or --device 1 etc.

EDIT:
Still getting very slow speeds.  Only about 450khash no matter what I do.  I did see that trixx reports the card is running @ x8 1.1 - don't know if that matters for LTC.

I tried removing and re-installing drivers, but that didn't make a difference.

EDIT:

Following this guide: https://bitcointalksearch.org/topic/m.1635964

After I delete bins and enter: setx GPU_MAX_ALLOC_PERCENT 100
Then simply run: cgminer --scrypt -I 13 --device 0 -o stratum+tcp://stratum.give-me-ltc.com:3333 -u myuser -p mypassword
I get about 550khash.  Then ANY changes I try to make (such as --worksize 256 or whatever) all give me lower speeds.  Here is what the bin file name is that it generated: scrypt130302Tahitiglg2tc22400w64l4.bin
Again, this is for my 7970.
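A sketch of the two-instance setup described above, one cgminer per card via --device so each gets its own worker and its own generated .bin kernel. Pool URL, worker names, and intensities are placeholders carried over from the commands already in this post:

```shell
# One instance per GPU; each generates/loads a scrypt kernel .bin tuned
# for that card. -T disables the curses UI so both can share one shell
# (check --help to confirm the flag on your build).
cgminer --scrypt -T --device 0 -I 13 \
  -o stratum+tcp://stratum.give-me-ltc.com:3333 -u worker.7950 -p pass &
cgminer --scrypt -T --device 1 -I 13 \
  -o stratum+tcp://stratum.give-me-ltc.com:3333 -u worker.7970 -p pass &
```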
hero member
Activity: 924
Merit: 501
Dual 6870s are not well supported and need a specific driver and sdk combo. What, I can't tell you offhand since they are so rare.

It worked before with 12.3 in conjunction with cgminer 2.7.5.  Any idea what you changed from 2.7.5 to 2.11.3?    ;-)   Or... how can I get back to 2.7.5?

Nevermind... Downgrading cgminer did not fix my problem.

But this guide did:  https://bitcointalksearch.org/topic/building-a-rock-solid-multi-gpu-linux-mining-rig-with-centos-60-170516
member
Activity: 81
Merit: 1002
It was only the wind.
CGMiner has nanosleep and sleep declared, which fucks the build for x86_64-w64-mingw32. Also, pthreads are not listed in the dependencies, and there is no option in configure to specify the prefix for pthreads.
We don't support building for w64 since it serves no useful advantage over 32 bit builds.
BFGMiner has w64-related bugs fixed and officially supported.



That's no reason to continue using deprecated technology. CGMiner could also be a 16-bit binary.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
Dual 6870s are not well supported and need a specific driver and sdk combo. What, I can't tell you offhand since they are so rare.
hero member
Activity: 924
Merit: 501
problem:

$ ./cgminer -n
 [2013-03-31 14:05:49] CL Platform 0 vendor: Advanced Micro Devices, Inc.
 [2013-03-31 14:05:49] CL Platform 0 name: AMD Accelerated Parallel Processing
 [2013-03-31 14:05:49] CL Platform 0 version: OpenCL 1.1 AMD-APP-SDK-v2.4 (595.10)
 [2013-03-31 14:05:49] Error -1: Getting Device IDs (num)
 [2013-03-31 14:05:49] clDevicesNum returned error, no GPUs usable
 [2013-03-31 14:05:49] 0 GPU devices max detected
$

Though

aticonfig --lsa
* 0. 06:00.0 AMD Radeon HD 6800 Series
  1. 07:00.0 AMD Radeon HD 6800 Series
* - Default adapter

thoughts?
Usual initial advice: Have you installed the AMD driver, configured Xorg for all your devices, started it, and exported the DISPLAY variable?

Yessir.  I've installed several versions of the AMD driver; at the moment I'm working with 12.3, which I think is v8.951.  I configure Xorg using the standard aticonfig -f --adapters=all --initial.  Yes, I exported DISPLAY=:0.

What I did is take two perfectly working machines and upgrade the kernel, and f'd myself in the process, so I'm trying to get back to where I was.  The big screw was that once I got the kernel upgraded I could no longer see one of my cards.  Not even with lspci, so I'm returning to the known working kernel after a reformat.

I'm happy to use any Linux if there is a better version.  Because my cards are 6870x2 (with 2 GPUs per card) I have problems... the only native drivers for the card are winblows-based.  So there are weird things that happen:

Code:
 [P]ool management [G]PU management [S]ettings [D]isplay options [Q]uit
 GPU 0:  88.0C 425531RPM | 252.0M/251.0Mh/s | A:5948 R:22 HW:0 U:3.50/m I: 4
 GPU 1: 100.0C 100%    | 280.1M/278.6Mh/s | A:6705 R:15 HW:0 U:3.94/m I: 4

see the speed of the first fan reported in actual RPM vs a percent... I don't know if others see that.  This is a single PCIe card reporting the two GPUs.  I did have 4 GPUs churning along fine before the "upgrade" (2 x 6870x2).

I went to
Code:
cd /usr/share/ati
./amd-uninstall.sh -force
on reboot enter init 3 by hitting "a" and adding a "3" to the end of the command line ...
yum install fglrx64_p_i_c... (which I had minutes before built upon this very machine)
on reboot enter init 3 by hitting "a" and adding a "3" to the end of the command line ...
aticonfig -f --adapters=all --initial  
init 5

All works as expected; I compile cgminer just fine, even showing access to GPUs, but on run:

Code:
$ ./cgminer -n
 [2013-03-31 14:05:49] CL Platform 0 vendor: Advanced Micro Devices, Inc.
 [2013-03-31 14:05:49] CL Platform 0 name: AMD Accelerated Parallel Processing
 [2013-03-31 14:05:49] CL Platform 0 version: OpenCL 1.1 AMD-APP-SDK-v2.4 (595.10)
 [2013-03-31 14:05:49] Error -1: Getting Device IDs (num)
 [2013-03-31 14:05:49] clDevicesNum returned error, no GPUs usable
 [2013-03-31 14:05:49] 0 GPU devices max detected
$

POW/BANG/CRASH
no love.  

I saw a blurb in bfgminer where a guy modified a line in ocl.c, but my ocl.c is slightly different from his.  Here's his ?fix? to a similar error.

Any and all advice welcome.  Thanks!

Initial post:
https://bitcointalksearch.org/topic/m.1711097

Solution post (step-by-step installation under CentOS):
https://bitcointalksearch.org/topic/building-a-rock-solid-multi-gpu-linux-mining-rig-with-centos-60-170516
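Condensing the advice in this exchange into a preflight checklist for the "clDevicesNum returned error, no GPUs usable" failure on an fglrx rig (commands as used earlier in the thread):

```shell
# Preflight for OpenCL GPU detection under fglrx:
aticonfig -f --adapters=all --initial   # write xorg.conf entries for every adapter
# ...start X (e.g. init 5), then in the mining shell:
export DISPLAY=:0                       # OpenCL on fglrx needs a reachable X display
./cgminer -n                            # should now enumerate the GPUs
```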
newbie
Activity: 12
Merit: 0
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
How can I use my computer while mining with cgminer?

I'm currently mining Bitcoins and Litecoins, (AMD 5850)
for Bitcoins I use 'GUIMiner' using 'OpenCL miner',
for Litecoins I use 'GUIMiner-scrypt' using 'cgminer'.

Now, for Bitcoins I just add the flags "-v -w128 -f 60" and it works extremely fast even when I play BF3, and nothing shatters (if I play heavy-graphics games it will drop to ~150Mhash/s tops, and when I finish playing it jumps back to ~300Mhash/s). The computer works flawlessly,

but for Litecoins, whenever I use the flag "-I d" it drops to 10% of its power (from 325khash/s to 20-30khash/s) even when I just look at the computer!

So again, my question is, is there any way to have the same experience with performance while using the computer with cgminer?
Or, alternatively, is there any way I can mine Litecoins with the GPU while using the computer at the same time, and still have the maximum performance I just demonstrated with the OpenCL miner?


thanks for the help, hope I gave all the details necessary!
Yes, that IS the experience with scrypt in dynamic mode. You have to sacrifice loads of hashing performance for the desktop to be usable. Alternatively find a static intensity where you can deal with the lag.
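In practice that means replacing dynamic intensity with a fixed value and tuning it until the desktop is responsive enough, e.g.:

```shell
# Static intensity instead of "-I d": start low and raise the value
# until the desktop lag becomes unacceptable. Pool URL and credentials
# are placeholders.
cgminer --scrypt -I 10 -o stratum+tcp://pool.example.com:3333 -u user -p pass
```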