
Topic: [ANN] cudaMiner & ccMiner CUDA based mining applications [Windows/Linux/MacOSX] - page 129. (Read 3426930 times)

legendary
Activity: 3164
Merit: 1003
How can you NOT know how to use AWS to mine...?
OK, I signed up. If you help me out and I make 5 BTC a day, I'll give you 1  Smiley
I looked into it further and there's no way I can do this unless it's copy and paste; you have to be an engineer. Gotta get back to I.V. chelation therapy to get the lead and mercury out. Going to delete the account. Oh well.

I did see in a video that it's $0.65/hr for one GPU; no way can you make money when it's not free.
I still think you should upgrade... You're asking whether anybody is willing to lose 2~3 hours compiling the source

Strange thing, I can't copy and paste cudamining.co.uk

OK, what if I try Display Driver Uninstaller? Do you think that might help?
legendary
Activity: 1400
Merit: 1050
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated

Code:
gcc -std=gnu99 -DHAVE_CONFIG_H -I.  -msse2  -fopenmp -pthread -fno-strict-aliasing  -DSCRYPT_KECCAK512 -DSCRYPT_CHACHA -DSCRYPT_CHOOSE_COMPILETIME   -g -O2 -MT ccminer-cpu-miner.o -MD -MP -MF .deps/ccminer-cpu-miner.Tpo -c -o ccminer-cpu-miner.o `test -f 'cpu-miner.c' || echo './'`cpu-miner.c
cpu-miner.c:22:17: fatal error: cmath: No such file or directory
 #include <cmath>

Already tried changing cmath to cmath.h, no luck :/ On Ubuntu 14.04 it looks like cmath is C++ only, so gcc doesn't work.
You need libstdc++6-...-dev, according to the build chain you've set up.
Can you use math.h? You just need to add -lm (I assume it's available on every Linux distro)
Actually, math.h works as well for me
legendary
Activity: 914
Merit: 1001
currently trying with math.h instead of cmath
legendary
Activity: 3164
Merit: 1003
djm34
bigjme helped me compile too, and to no end: loaded CUDA 5.5... error, needs CUDA 6.5... loaded CUDA 6.5... error, needs 5.5. OMG, I'm going insane at this point lol  Roll Eyes
legendary
Activity: 914
Merit: 1001
As far as I know, cmath is available under Ubuntu, but it only works for C++, not C (at least that's what I read on Stack Overflow). I'm currently trying to figure out if it's possible to get it to compile.

github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated

Code:
gcc -std=gnu99 -DHAVE_CONFIG_H -I.  -msse2  -fopenmp -pthread -fno-strict-aliasing  -DSCRYPT_KECCAK512 -DSCRYPT_CHACHA -DSCRYPT_CHOOSE_COMPILETIME   -g -O2 -MT ccminer-cpu-miner.o -MD -MP -MF .deps/ccminer-cpu-miner.Tpo -c -o ccminer-cpu-miner.o `test -f 'cpu-miner.c' || echo './'`cpu-miner.c
cpu-miner.c:22:17: fatal error: cmath: No such file or directory
 #include <cmath>

Already tried changing cmath to cmath.h, no luck :/ On Ubuntu 14.04 it looks like cmath is C++ only, so gcc doesn't work.
You need libstdc++6-...-dev, according to the build chain you've set up.

libstdc++ is already installed.
legendary
Activity: 3164
Merit: 1003
How can you NOT know how to use AWS to mine...?
OK, I signed up. If you help me out and I make 5 BTC a day, I'll give you 1  Smiley
I looked into it further and there's no way I can do this unless it's copy and paste; you have to be an engineer. Gotta get back to I.V. chelation therapy to get the lead and mercury out. Going to delete the account. Oh well.

I did see in a video that it's $0.65/hr for one GPU; no way can you make money when it's not free.
I still think you should upgrade... You're asking whether anybody is willing to lose 2~3 hours compiling the source

Strange thing, I can't copy and paste cudamining.co.uk
I tried to. Back and forth for 24 hrs, nothing worked. I would love to upgrade, but no older miners will work, and I don't understand anything on that AWS.
legendary
Activity: 1792
Merit: 1008
/dev/null
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated

Code:
gcc -std=gnu99 -DHAVE_CONFIG_H -I.  -msse2  -fopenmp -pthread -fno-strict-aliasing  -DSCRYPT_KECCAK512 -DSCRYPT_CHACHA -DSCRYPT_CHOOSE_COMPILETIME   -g -O2 -MT ccminer-cpu-miner.o -MD -MP -MF .deps/ccminer-cpu-miner.Tpo -c -o ccminer-cpu-miner.o `test -f 'cpu-miner.c' || echo './'`cpu-miner.c
cpu-miner.c:22:17: fatal error: cmath: No such file or directory
 #include <cmath>

Already tried changing cmath to cmath.h, no luck :/ On Ubuntu 14.04 it looks like cmath is C++ only, so gcc doesn't work.
You need libstdc++6-...-dev, according to the build chain you've set up.
legendary
Activity: 1400
Merit: 1050
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated

Code:
gcc -std=gnu99 -DHAVE_CONFIG_H -I.  -msse2  -fopenmp -pthread -fno-strict-aliasing  -DSCRYPT_KECCAK512 -DSCRYPT_CHACHA -DSCRYPT_CHOOSE_COMPILETIME   -g -O2 -MT ccminer-cpu-miner.o -MD -MP -MF .deps/ccminer-cpu-miner.Tpo -c -o ccminer-cpu-miner.o `test -f 'cpu-miner.c' || echo './'`cpu-miner.c
cpu-miner.c:22:17: fatal error: cmath: No such file or directory
 #include <cmath>

Already tried changing cmath to cmath.h, no luck :/ On Ubuntu 14.04 it looks like cmath is C++ only, so gcc doesn't work.
Can you give an equivalent of cmath for Ubuntu (I need to calculate a square root)?
Or do you get a result for the error without it?
legendary
Activity: 914
Merit: 1001
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated

Code:
gcc -std=gnu99 -DHAVE_CONFIG_H -I.  -msse2  -fopenmp -pthread -fno-strict-aliasing  -DSCRYPT_KECCAK512 -DSCRYPT_CHACHA -DSCRYPT_CHOOSE_COMPILETIME   -g -O2 -MT ccminer-cpu-miner.o -MD -MP -MF .deps/ccminer-cpu-miner.Tpo -c -o ccminer-cpu-miner.o `test -f 'cpu-miner.c' || echo './'`cpu-miner.c
cpu-miner.c:22:17: fatal error: cmath: No such file or directory
 #include <cmath>

Already tried changing cmath to cmath.h, no luck :/ On Ubuntu 14.04 it looks like cmath is C++ only, so gcc doesn't work.
sr. member
Activity: 330
Merit: 252
xdn rising? I must have 600k of that crap (assuming I can transfer them and synchronize as well)

...it was at 34 sat in the meantime... not bad.
Sold my 6-7 million between 5 and 15 sat; never thought it would go so far.

Btw, djm, thanks for updating ccminer again. Great to have you on board.
legendary
Activity: 1400
Merit: 1050
How can you NOT know how to use AWS to mine...?
OK, I signed up. If you help me out and I make 5 BTC a day, I'll give you 1  Smiley
I looked into it further and there's no way I can do this unless it's copy and paste; you have to be an engineer. Gotta get back to I.V. chelation therapy to get the lead and mercury out. Going to delete the account. Oh well.

I did see in a video that it's $0.65/hr for one GPU; no way can you make money when it's not free.
I still think you should upgrade... You're asking whether anybody is willing to lose 2~3 hours compiling the source

Strange thing, I can't copy and paste cudamining.co.uk
legendary
Activity: 1400
Merit: 1050
Highly interesting Boolberry price development lately. Too bad that difficulty has been rising the same way the price did.
But if you had stubbornly mined through the price depression (at a difficulty of around 100 G), you would have had good profitability. Now difficulty is up to 600 G

Christian

The time for solo mining seems to be over now. I was lucky mining BBR over the last few weeks.
But it's nearly the same as with XDN (DuckNote): you never know whether a coin will rise some day or not.
Hope you stay calm with your big BBR bag and wait until the coin has had time to reach a stable value.   Cool



xdn rising ? I must have 600k of that crap (assuming I can transfer them and synchronize as well)
sr. member
Activity: 330
Merit: 252
Highly interesting Boolberry price development lately. Too bad that difficulty has been rising the same way the price did.
But if you had stubbornly mined through the price depression (at a difficulty of around 100 G), you would have had good profitability. Now difficulty is up to 600 G

Christian

The time for solo mining seems to be over now. I was lucky mining BBR over the last few weeks.
But it's nearly the same as with XDN (DuckNote): you never know whether a coin will rise some day or not.
Hope you stay calm with your big BBR bag and wait until the coin has had time to reach a stable value.   Cool


legendary
Activity: 3164
Merit: 1003
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
Can someone compile that with 3788 drivers please  Smiley
legendary
Activity: 1400
Merit: 1050
github/djm34/ccminer updated with latest code:

* improved hashrate on m7
* fixed --help in cpu-miner.c (thanks to epsylon3 from irc #ccminer)
* added average + statistical uncertainty on the average (the statistical (Gaussian) error tells how significant the quoted average is and depends on the number of shares; both are pretty much useless in solo mining unless there is a huge instamine period) (*)
* some minor changes in keccak256
* added -F to m7 (please don't use it on 1gh, it increases the number of rejected shares)

(*) I don't think this is the best way to calculate the average hashrate, as it doesn't take into account periods of time without shares, only the recorded hashrate at the time a share is found. I need to figure out something a bit different.
The code is based on cayars' implementation, with some modifications and the addition of the standard deviation.

Regarding the standard deviation, it varies as 1/sqrt(#shares), so it should decrease with time, showing that the average is becoming more and more relevant (unless there are large variations, to which the standard deviation is sensitive).

still have to update cudamining (almost there  Grin)
cudamining updated
legendary
Activity: 3164
Merit: 1003
How can you NOT know how to use AWS to mine...?
OK, I signed up. If you help me out and I make 5 BTC a day, I'll give you 1  Smiley
I looked into it further and there's no way I can do this unless it's copy and paste; you have to be an engineer. Gotta get back to I.V. chelation therapy to get the lead and mercury out. Going to delete the account. Oh well.

I did see in a video that it's $0.65/hr for one GPU; no way can you make money when it's not free.
legendary
Activity: 1792
Merit: 1008
/dev/null
You are not allowed to use cudaminer/ccminer in your program, as it's GPL2/GPL3.
The GPL prevents you from taking source code from a GPL application and putting it in another application with a non-GPL license. As I explained already, Awesome Miner is NOT using any source code from open source projects.

Awesome Miner simply executes (or connects to the APIs provided by) ccMiner.exe and Sgminer.exe, and that is fully allowed and doesn't break any GPL licenses.

For example: a user is running ccMiner.exe on one PC and Sgminer.exe on another PC. Now the user wants all the features provided by Awesome Miner, so they go ahead and download the free version of the software. Then they simply point Awesome Miner at their existing ccMiner.exe and Sgminer.exe. In the case of Sgminer, they can also connect to the API over a TCP connection. This is not in conflict with the GPL.

ACK; in that case consider donating a small amount to the developers, unless you're that greedy  Wink
I've considered that. But keep in mind that 90-95% of Awesome Miner users run the free version, so I'm already giving away many hundreds of development hours to the community for free.
Still, the original GPL authors did 100% for free, so if you donate a few coins (it doesn't mean you have to donate continuously) it would only be fair.
legendary
Activity: 3164
Merit: 1003
Wouldn't building a new rig with another mobo actually be cheaper?

Also, would Windows tolerate more than 6 GPUs? I guess with Linux you wouldn't have this problem.

Not if you factor in that you get 6 cards per system max, at say $400 per system minimum (that's bottom-of-the-barrel stuff), so for 16 GPUs you would need 3 systems.

So that's $1200
Yes, buying one of those is more expensive in this case, coming to around $1600

But if you go above that, for the people it's aimed at, you get this:

32 GPUs - 6 systems - $2400
32 GPUs - 1 system - $2600

Then factor in the space savings, as you won't have 5 extra mobos around. It would work out as a saving, but not without high card counts.
Yes, but I didn't know the price. I asked for a quote. I was more interested in the splitters, but their price is too high, and problems may arise.  Wink
legendary
Activity: 1400
Merit: 1050
Honestly, that's not too bad. It's cheaper than buying 2 other systems.
And if you have a large number of cards (28) you could in theory run all of them off 1 system, and save money on buying 4 entire systems.
But not by far... and with the risk that the mobo doesn't support 16 GPUs...
Well... too expensive for the usage I had in mind (which didn't involve 16 GPUs and had more to do with cable management... too bad)
sr. member
Activity: 350
Merit: 250
You're building and mine is being ripped apart :p