Topic: Wolf's XMR/BCN/DSH CPUMiner - 2x speed compared to LucasJones' - NEW 06/20/2014 - page 24. (Read 547096 times)

hero member
Activity: 794
Merit: 1000
Monero (XMR) - secure, private, untraceable
^You may need libboost 1.55
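If that's what's missing, installing Boost through your package manager should be enough (just a sketch; package names vary by system):

Code:
# OS X (Homebrew)
brew install boost
# Debian/Ubuntu
sudo apt-get install libboost-all-dev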
member
Activity: 81
Merit: 1002
It was only the wind.
That's because you don't have the curl library installed, and if you do, then you forgot to re-run autogen after installing it.

I have curl; libcurl should be shipped with it.

Code:
brew list
autoconf cloog curl isl libtool openssl
automake cmake gcc jansson miniupnpc pkg-config
boost cpuminer-multi gmp libmpc mpfr

Do you have the headers and such, or just the binary shared libs?
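If it helps, here's a quick way to check whether the Homebrew curl actually ships its headers (a sketch; adjust the path if your prefix differs):

Code:
# Look for the curl headers in Homebrew's prefix
ls "$(brew --prefix curl)/include/curl/curl.h"
# If the header is missing, reinstall curl, then regenerate and re-run configure
brew reinstall curl && ./autogen.sh && ./configure CFLAGS="-march=native"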
legendary
Activity: 2156
Merit: 1131

I just tried to compile and I got this:

Code:
configure: error: in `/root/cpuminer/cpuminer-multi':
configure: error: could not find crypto
See `config.log' for more details


what's "crypto" ?
legendary
Activity: 1400
Merit: 1000
If you consider cpu mining, you should consider the whole PC consumption, not just CPU.
Building a "traditional" desktop computer with a 4770K will cost more than a GPU.

You're thinking from a single-minded perspective. You are actually seeing the INTENTIONAL limitation of this algorithm.

My kids have a 2500k each and they get 110H/s with the CPU at 50% the whole time they're on it. They use this miner in Windows. Measured AT THE WALL, the power consumption goes up by 30W when the miner starts if the PC was at idle; when I have Hearthstone running in windowed mode, it only goes up by 20W with the miner.

So effectively, the regular crappy $300 computers that I bought for my kids are getting me 110H/s for somewhere between 20W and 30W, depending on what they're doing. An R9 280X draws around 300W from the wall at full power; if Claymore's miner is only using half that power, it would be 150W.

To break even in H/s you'd need to be getting closer to 660H/s per card; your results show 460 per card.
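(That is, 110H/s for roughly 25W is about 4.4H/s per watt, so a card drawing 150W would need around 150 x 4.4 ≈ 660H/s to match the CPUs.)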

This means that people can't just buy a crap tonne of equipment and own the coin. It was intentionally made to be this way.

EDIT: Forgot to mention that the kids don't think it affects their gameplay. They play mostly Hearthstone, Path of Exile, Diablo 3, League of Legends and DotA 2.

The CryptoNight algo was not designed to be more CPU-friendly than GPU-friendly, although in practice it is more CPU-friendly.
I'm not complaining, I have some CPUs at home (a dual Xeon 2687W and a [email protected]) - I'm not for or against CPU mining. I have a few GPUs and some CPUs.
But just measuring the difference between when your kids' computers are mining and when they are not is not, well, a good measure.
Such a computer while mining should draw ~250W (measured at the wall). Maybe I'm wrong; I'll let you take the measurement.
A simple rig designed for GPU mining, with a small CPU (ga2016/2020), draws 80W at idle and 250W when mining XMR with one R9 280X.
With the 5 R9 280X: 2300H/s, 1000W measured at the wall.
OK, my dual Xeon gives me 960H/s for less power, but I think we will see a lot of optimization (for both, I hope) in the near future.

Quote
This means that people can't just buy a crap tonne of equipment and own the coin. It was intentionally made to be this way.
Why do you think it was intentionally designed this way? To be fair?
GPU-friendly coins bring GPU farms and multipools; CPU-only or CPU-friendly coins bring botnets and Amazon EC2 instances (see the Boolberry thread, where DGA talks about 200 EC2 instances for himself, and he is far from the biggest one). In both cases I'm still a very small miner.

EDIT:
GPU miner coming for Nvidia cards (not released yet)
https://bitcointalksearch.org/topic/m.7458872

First test, with 6 x 750Ti: 270W at the wall (something like 35W per card), ~160H/s per card

It's not released, and it's not going to be released. Trust me.


It was released early this morning:
https://github.com/tsiv/ccminer-cryptonight

I am getting 1,030 H/s with 5 750ti's
member
Activity: 81
Merit: 1002
It was only the wind.
I don't have a Mac and don't like them. However, I have retained compatibility if someone wishes to build for it.

I have one and want to build, but I can't. I get a libcurl error during configure.

That's because you don't have the curl library installed, and if you do, then you forgot to re-run autogen after installing it.
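In case it saves someone time, the usual sequence on OS X after installing curl through Homebrew should look roughly like this (untested on my end; the LDFLAGS/CPPFLAGS are only needed if Homebrew keeps curl keg-only on your system):

Code:
brew install curl
./autogen.sh
./configure CFLAGS="-march=native" LDFLAGS="-L$(brew --prefix curl)/lib" CPPFLAGS="-I$(brew --prefix curl)/include"
make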
newbie
Activity: 1
Merit: 0
I'm getting around 180H/s from my Xeon E5-2620 using this miner on Windows 7 64-bit. My hashrate went up from 150H/s by switching to this miner. Seems a bit low, since I'm under the impression this processor should be rather good? I'm new to CPU mining; I did some mining before with my Quadro K4000, but that was just a waste of time, so I decided to try the CPU.

OT: I guess you are the same Wolf that runs the pool. I'm only seeing 70H/s in the pool statistics, is that normal?
member
Activity: 81
Merit: 1002
It was only the wind.
Why still no OSX binaries?

I don't have a Mac and don't like them. However, I have retained compatibility if someone wishes to build for it.
sr. member
Activity: 364
Merit: 250
I think your estimate of the power draw is incorrect. I have 3 rigs plus my desktop attached to a single socket with one of those power usage meters... I have 1 x 280X, 3 x 7970, 7 x 7950, and 1 x 7950 with 1 x 750 Ti and 1 x 650 GTX, and I'm using my overclocked 2500k to mine; total power draw is 1550 to 1650W from the wall...
member
Activity: 90
Merit: 10
Already doing that.  Must be something on my end.
hero member
Activity: 644
Merit: 502
I keep getting "Stratum authentication failed... retry in 10 secs."  Anybody around??  On the XMR pool.

There are some pool(s) that got DDOS'd.

Use Wolf's pool; it is up and charges only a 1% fee.

http://pool.cryptoescrow.eu/

Sample command to run minerd:
Code:
minerd -o stratum+tcp://mine.cryptoescrow.eu:3333 -u YOUR-WALLET-ADDRESS -p x -t 4
member
Activity: 90
Merit: 10
I keep getting "Stratum authentication failed... retry in 10 secs."  Anybody around??  On the XMR pool.
full member
Activity: 154
Merit: 100
What is the stratum for moneropool.com?

How about mining on a different pool? They took it off the site for a reason.
sr. member
Activity: 364
Merit: 250
What is the stratum for moneropool.com?
member
Activity: 81
Merit: 1002
It was only the wind.
Anyone have a walkthrough for getting this running on an Amazon EC2 server? (And expected performance would be nice to know.)

At the market price, difficulty, and cost per hour of a spot instance, you are most likely going to lose money. If you are hell-bent on trying, I would wait for the net hashrate to drop back down to around 3 MH/s.

Do this:

Code:
sudo apt-get install git make autoconf automake libcurl4-openssl-dev libssl-dev libjansson-dev build-essential && git clone https://github.com/wolf9466/cpuminer-multi.git && cd cpuminer-multi && ./autogen.sh && CFLAGS="-march=native" ./configure && make

Didn't test it, but it should work.
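If you do try it on EC2, something like this should keep the miner running after you disconnect from SSH (again untested; substitute your own pool and wallet address, and set -t to the vCPU count):

Code:
nohup ./minerd -o stratum+tcp://mine.cryptoescrow.eu:3333 -u YOUR-WALLET-ADDRESS -p x -t $(nproc) > miner.log 2>&1 &
tail -f miner.log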
legendary
Activity: 1232
Merit: 1011
Monero Evangelist
How to compile under OS X.

1.) ./autogen.sh
2.) ./configure CFLAGS="-march=native -mno-avx"
3.) edit the Makefile: search for the string "-fuse-linker-plugin" and delete that option from the "AM_CFLAGS" setting
(e.g. "AM_CFLAGS = -Ofast -flto -fuse-linker-plugin -funroll-loops \" becomes "AM_CFLAGS = -Ofast -flto -funroll-loops \")
4.) make
5.) ???
6.) mine monero, take money, get bitches
7.) profit!
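Step 3 can also be done in one go from the shell if you don't want to edit the Makefile by hand (should work with the BSD sed that ships with OS X):

Code:
sed -i '' 's/-fuse-linker-plugin //g' Makefile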

legendary
Activity: 2968
Merit: 1198
Well said. As I have explained before, there is a role for GPU mining, which is why I am currently the largest individual contributor to the bounty for an open source GPU miner. However, GPU mining is not dominant for this algorithm the way it is for most others, merely competitive (as you correctly explain, by design).

member
Activity: 81
Merit: 1002
It was only the wind.
4770K at stock clocks: 250H/s at -t 4
2600 (non-K) at stock clocks: 234H/s at -t 4

It's nothing to do with a fan base; at present it's simply the more powerful miner, the one that supplies the highest output.

Nobody seems to be doing careful testing with complete environment data so we can't be sure, but there is a screen shot for this miner showing >300 on a 4770k. I have no idea if that is overclocked or anything else difficult to reproduce.




I dare say overclocked, or in Linux, which works better for CPU mining from what I have gathered.

Yeah I don't really understand why Linux would be faster, and I can't come up with a good reason for it, but I never use Windows any more so it isn't something I'm going to work on.

Because the Windows kernel is retarded and won't take hints.

No, seriously. I can give the linux kernel a hint about how I'm going to be using the memory and how it should handle it. It is, of course, free to ignore me, but it usually doesn't, resulting in faster accesses.

The access pattern hints only matter for virtual memory that gets paged, which shouldn't happen. The prefill hint might matter, but only during warmup; after that they should be the same. The hugepage stuff should result in the same speed on Windows as on Linux, if everything is working correctly, though I understand this is trickier to arrange on Windows. There might be other issues, such as how threading is being handled; I'm not sure.

I'm pretty sure a good Windows programmer could fix this (as claymore appears to have done) but I'm not one by choice so I can't directly help.





Nope, because of readahead. I'm using the memory for random access, and I tell the Linux kernel that with madvise(). It can be mitigated, but not entirely fixed. The reason I haven't done it is that I don't want to boot into Windows to test.

Kernel readahead only applies to I/O. There is also CPU prefetch, but that is automatically detected by the hardware.



But it is I/O, because it's at least sometimes getting paged out. That's why hugepages help - they can't be paged out.
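For anyone trying this on Linux, explicit huge pages can be reserved like so (a sketch; each CryptoNight thread uses a 2MB scratchpad, so reserve at least one huge page per mining thread - it only helps if the binary actually asks for them):

Code:
# Reserve 128 x 2MB huge pages (adjust to your thread count) and check the result
sudo sysctl -w vm.nr_hugepages=128
grep Huge /proc/meminfo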