
Topic: hundreds of CPUs online - Any worth? (Read 2805 times)

sr. member
Activity: 252
Merit: 251
June 17, 2011, 05:26:21 AM
#18
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

I never said that, nor implied anything of the sort!

I don't have to pay the electricity bill. Leased servers, you know.

But someone does.

And when consumption goes from 11 W idle to 95 W under load on hundreds of Xeons and Core 2 Quads, "someone" will notice the ~10x increase in electricity costs.
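A rough sketch of why "someone" notices (the 11 W / 95 W figures are from this post; the fleet size and electricity rate are assumptions for illustration):

Code:
# Idle vs. full-load electricity cost for a server fleet.
# Per-CPU wattages from the post above; fleet size and price are assumed.
servers = 200               # "hundreds" of boxes, assumed
idle_w, load_w = 11, 95     # watts per CPU
price_per_kwh = 0.10        # USD per kWh, assumed

hours = 24 * 30             # one month
extra_kwh = servers * (load_w - idle_w) * hours / 1000
print(f"Extra energy: {extra_kwh:,.0f} kWh/month")
print(f"Extra cost:   ${extra_kwh * price_per_kwh:,.0f}/month")
# 200 servers -> ~12,096 kWh -> ~$1,210/month on top of the idle baseline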
sr. member
Activity: 402
Merit: 250
June 17, 2011, 02:02:23 AM
#17
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

Douche move, guy.  Douche move.

You sound whiny, presumptuous, and contemptuous of everything you don't have.

Way to go.

Implying the "idiots" with access to these machines couldn't possibly be as clever as you?... "climbing a corporate ladder" to get access to a PC lab, leased servers or a school network?  How stupid can you be to make that claim?

Better yet - how stupid can you be to think that a lot of us don't have pre-existing infrastructure?

... I can see why you're not on the same corporate ladder rung as the rest of us idiots. :)


You're sure making a lot of assumptions based on one post... who's the douche?


And the reason I made that post is that I'm tired of seeing all these posts (at least a half-dozen) from people who have access to large farms of computers and don't realize that running a CPU miner on each box will suck down more juice than if they were just idling all night.  If the OP is not in this category, then I apologize to him.

There is a reason I have access to a large number of servers and you don't. It's called business sense, knowing what's what.
For some people it really doesn't cost more to run CPUs at 100% load than at 0%.
sr. member
Activity: 418
Merit: 250
June 16, 2011, 10:53:44 PM
#16
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

Douche move, guy.  Douche move.

You sound whiny, presumptuous, and contemptuous of everything you don't have.

Way to go.

Implying the "idiots" with access to these machines couldn't possibly be as clever as you?... "climbing a corporate ladder" to get access to a PC lab, leased servers or a school network?  How stupid can you be to make that claim?

Better yet - how stupid can you be to think that a lot of us don't have pre-existing infrastructure?

... I can see why you're not on the same corporate ladder rung as the rest of us idiots. :)


You're sure making a lot of assumptions based on one post... who's the douche?


And the reason I made that post is that I'm tired of seeing all these posts (at least a half-dozen) from people who have access to large farms of computers and don't realize that running a CPU miner on each box will suck down more juice than if they were just idling all night.  If the OP is not in this category, then I apologize to him.
member
Activity: 70
Merit: 10
June 12, 2011, 07:40:55 AM
#15
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

In the datacenters I am familiar with in the US, you contract for an amount of power/cooling rather than paying per kWh.
In my case, the company I work for is paying for 50 kW of power across 10 racks.  Right now we are only using 16 kW.
The datacenter has allocated the power and cooling for us to use the full 50 kW, so the extra 34 kW we are
paying for but not using is our loss; we pay the bill every month even if we don't use it.
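In that model the marginal cost of extra CPU load is zero up to the contracted ceiling. A quick sketch of the headroom (the 50 kW / 16 kW figures are from this post; the per-kW monthly rate is an assumed illustration):

Code:
# Contracted vs. used power; the rate per kW is an assumption, not a quoted figure.
contracted_kw = 50
used_kw = 16
rate_per_kw_month = 150.0   # USD per kW per month, assumed colo rate

headroom_kw = contracted_kw - used_kw
print(f"Unused headroom: {headroom_kw} kW")
print(f"Paid for regardless: ${headroom_kw * rate_per_kw_month:,.0f}/month")
# 34 kW of headroom -> $5,100/month paid whether the CPUs idle or mine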

copper member
Activity: 56
Merit: 0
June 12, 2011, 04:12:15 AM
#14
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

Douche move, guy.  Douche move.

You sound whiny, presumptuous, and contemptuous of everything you don't have.

Way to go.

Implying the "idiots" with access to these machines couldn't possibly be as clever as you?... "climbing a corporate ladder" to get access to a PC lab, leased servers or a school network?  How stupid can you be to make that claim?

Better yet - how stupid can you be to think that a lot of us don't have pre-existing infrastructure?

... I can see why you're not on the same corporate ladder rung as the rest of us idiots. :)
sr. member
Activity: 402
Merit: 250
June 12, 2011, 04:00:35 AM
#13
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.

I never said that, nor implied anything of the sort!

I don't have to pay the electricity bill. Leased servers, you know.
sr. member
Activity: 418
Merit: 250
June 11, 2011, 10:28:07 PM
#12
Why does everyone who has access to huge amounts of computing power seem to believe that processors at idle use the same amount of power as processors at full load?


Makes me wonder how they climbed up the ladder in their company to have been given access to them in the first place.
legendary
Activity: 2618
Merit: 1007
June 11, 2011, 09:17:07 PM
#11
Idle CPUs use far less electricity than CPUs running at full load.

So yes, it might actually cause quite a bump in the electricity consumption of both the cooling and the CPUs.
sr. member
Activity: 402
Merit: 250
June 11, 2011, 09:12:28 PM
#10
a single overclocked 5830 can get over 300 Mhash/s

So I'm guessing 300 Mhash/s takes a year to make even €200?


No, it'll be much less than a year. But yeah, CPU mining is pointless unless you have free electricity. And even then the returns are still small...

Well, in this case the CPU cycles would be "free"; it's just a matter of setting them up (does any pool have an API for adding new workers?).

What kind of revenue can one expect from 300 Mhash/s per month nowadays?

Roughly $150-$200 a month depending on market fluctuations.

Thank you :)

So that would mean about $1.50 per node per month, with luck.

I guess it's not really worth the effort to set up a miner on all the nodes, even though it doesn't add any extra cost for me to do so.
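(For the curious, the arithmetic behind that per-node figure, using only numbers quoted in this thread:)

Code:
# Per-node revenue estimate from the figures quoted above.
nodes = 100
node_mhash = 2.5                   # Core2Duo, from the OP's tests
ref_mhash, ref_usd = 300, 175.0    # midpoint of the ~$150-$200/month quote

total_mhash = nodes * node_mhash
monthly_usd = total_mhash / ref_mhash * ref_usd
print(f"Fleet:    {total_mhash:.0f} Mhash/s -> ~${monthly_usd:.0f}/month")
print(f"Per node: ~${monthly_usd / nodes:.2f}/month")
# 250 Mhash/s -> ~$146/month total, ~$1.46 per node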
newbie
Activity: 29
Merit: 0
June 11, 2011, 09:10:05 PM
#9
a single overclocked 5830 can get over 300 Mhash/s

So I'm guessing 300 Mhash/s takes a year to make even €200?


No, it'll be much less than a year. But yeah, CPU mining is pointless unless you have free electricity. And even then the returns are still small...

Well, in this case the CPU cycles would be "free"; it's just a matter of setting them up (does any pool have an API for adding new workers?).

What kind of revenue can one expect from 300 Mhash/s per month nowadays?

Roughly $150-$200 a month depending on market fluctuations.
sr. member
Activity: 402
Merit: 250
June 11, 2011, 09:02:24 PM
#8
It's basically a massive waste of electricity, between the power being sucked down by the processors and the air conditioning you need to cool the place.  Unless you live somewhere where it's freezing cold right now and could make some use of all that heat, I don't think it makes sense to run 100 computers at full throttle (likely ~20,000 watts) to generate the same amount of bitcoins as a single 300-watt video card.

BTW, people have been fired and arrested before for using work computers for distributed computing so definitely make sure your company is actually OK with this.
http://www.msnbc.msn.com/id/34241415/ns/technology_and_science-space/


These nodes are at a DC and used for hosting, but their idle CPU cycles are going 100% to waste :) Seriously, if I swapped all of them to Atoms no one would notice - that's how little CPU those nodes need. It's all network, RAM and storage I/O.

But you are right, it's probably not worth it even to set it all up.

I do, however, happen to have about 10 unused computers with at least one PCI-E x16 slot lying around ... But what is, say, a 6870 or 5870 worth in BTC per week in slush's pool right now?
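(To spell out the efficiency gap in the quoted post, using figures mentioned in this thread:)

Code:
# Hashes per watt: the 100-node CPU farm vs. a single GPU.
cpu_mhash, cpu_watts = 250, 20000   # 100 Core2Duos at full load (quoted above)
gpu_mhash, gpu_watts = 300, 300     # one overclocked 5830 (quoted above)

cpu_eff = cpu_mhash / cpu_watts
gpu_eff = gpu_mhash / gpu_watts
print(f"CPU farm: {cpu_eff:.4f} Mhash/s per watt")
print(f"GPU:      {gpu_eff:.4f} Mhash/s per watt")
print(f"The GPU is ~{gpu_eff / cpu_eff:.0f}x more power-efficient")
# 0.0125 vs. 1.0 Mhash/s per watt -> the GPU is ~80x more efficient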
hero member
Activity: 608
Merit: 500
June 11, 2011, 08:49:18 PM
#7
It's basically a massive waste of electricity, between the power being sucked down by the processors and the air conditioning you need to cool the place.  Unless you live somewhere where it's freezing cold right now and could make some use of all that heat, I don't think it makes sense to run 100 computers at full throttle (likely ~20,000 watts) to generate the same amount of bitcoins as a single 300-watt video card.

BTW, people have been fired and arrested before for using work computers for distributed computing so definitely make sure your company is actually OK with this.
http://www.msnbc.msn.com/id/34241415/ns/technology_and_science-space/
sr. member
Activity: 402
Merit: 250
June 11, 2011, 08:32:49 PM
#6
a single overclocked 5830 can get over 300 Mhash/s

So I'm guessing 300 Mhash/s takes a year to make even €200?


No, it'll be much less than a year. But yeah, CPU mining is pointless unless you have free electricity. And even then the returns are still small...

Well, in this case the CPU cycles would be "free"; it's just a matter of setting them up (does any pool have an API for adding new workers?).

What kind of revenue can one expect from 300 Mhash/s per month nowadays?
member
Activity: 69
Merit: 10
June 11, 2011, 08:30:17 PM
#5
a single overclocked 5830 can get over 300 Mhash/s

So I'm guessing 300 Mhash/s takes a year to make even €200?


No, it'll be much less than a year. But yeah, CPU mining is pointless unless you have free electricity. And even then the returns are still small...
sr. member
Activity: 402
Merit: 250
June 11, 2011, 08:26:50 PM
#4
a single overclocked 5830 can get over 300 Mhash/s

So I'm guessing 300 Mhash/s takes a year to make even €200?
newbie
Activity: 42
Merit: 0
June 11, 2011, 08:21:52 PM
#3
a single overclocked 5830 can get over 300 Mhash/s
legendary
Activity: 2618
Merit: 1007
June 11, 2011, 08:18:09 PM
#2
Yes, they are THAT slow on Bitcoin calculations.
sr. member
Activity: 402
Merit: 250
June 11, 2011, 07:52:51 PM
#1
We've got over 100 Core2Duo servers with CPUs >80% idle, and a few Core2Quad Xeons with idle CPUs too.

Are they worth anything for mining?

Quick estimates based on testing on a few servers:
 - Core2Quad: ~6000-8800 khash/s
 - Core2Duo: ~2500 khash/s

I've been considering this for a long time but never got around to it, until today, after reading up on pooled mining etc., I finally gave it a try :)

So if we put 100 Core2Duo nodes crunching, is that worth anything?
Quick calc: 100 * 2500 khash/s / 1000 = 250 Mhash/s

Are CPUs really THAT slow at this type of thing? Someone claimed 2x5830 can do 300 Mhash/s, so would putting these 100 nodes to work really just barely match two 5830s? :O

I'm using jgarzik's cpuminer, and I compared the 4way algorithm to the default one too.
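(In case anyone wants to repeat the comparison: a sketch of how jgarzik's cpuminer is typically pointed at a pool. The pool URL and worker credentials below are placeholders, and flags can vary between versions, so check minerd --help.)

Code:
# Placeholder pool URL and credentials; verify flags with `minerd --help`.
minerd --algo=4way --url=http://pool.example.com:8332/ \
       --userpass=worker:password --threads=2

# Same box, default scalar algorithm for comparison:
minerd --algo=c --url=http://pool.example.com:8332/ \
       --userpass=worker:password --threads=2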