
Topic: [ANN][DASH] Dash (dash.org) | First Self-Funding Self-Governing Crypto Currency - page 6836. (Read 9723597 times)

full member
Activity: 154
Merit: 100
Coinoholic
Why does this garbage GPU miner not work on my machine ?

I get error -11

How about posting your log and telling me what hardware you've got, instead of just trash-talking me and everybody else who has been slaving to get GPU miners out there.
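For context on the "-11" above: in the Khronos OpenCL headers that status code is CL_BUILD_PROGRAM_FAILURE, meaning the kernel source failed to compile for the device (commonly a driver/SDK mismatch). A small lookup table using constants from cl.h:

```python
# A few OpenCL status codes from the Khronos cl.h header; miners such as
# cgminer/sgminer report these raw numbers when an OpenCL call fails.
CL_ERRORS = {
    0:   "CL_SUCCESS",
    -1:  "CL_DEVICE_NOT_FOUND",
    -2:  "CL_DEVICE_NOT_AVAILABLE",
    -3:  "CL_COMPILER_NOT_AVAILABLE",
    -4:  "CL_MEM_OBJECT_ALLOCATION_FAILURE",
    -5:  "CL_OUT_OF_RESOURCES",
    -6:  "CL_OUT_OF_HOST_MEMORY",
    -11: "CL_BUILD_PROGRAM_FAILURE",   # the kernel failed to compile
    -30: "CL_INVALID_VALUE",
}

def explain(code: int) -> str:
    """Map a raw OpenCL status code to its symbolic name."""
    return CL_ERRORS.get(code, "unknown OpenCL status %d" % code)
```

So error -11 means the X11 kernel didn't build for that GPU/driver combination, which is why the build log and the driver version are the first things to check.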
full member
Activity: 280
Merit: 100
The Future Of Work
do we believe in 0.003 that is the question

oh ye of little faith!  LOL  I see no trouble getting to 0.003; in fact I expect it by next week, though I could very well be wearing rose-colored glasses (duck)
full member
Activity: 280
Merit: 100
The Future Of Work
Why does this garbage GPU miner not work on my machine ?

I get error -11

For AMD, use the latest stable driver, not a beta... 13.14 or something like that.

And the guys who are working constantly on these GPU miners for us are doing a great job, so I take offence at calling them garbage  Undecided  We have a difficult combo of 11 algorithms to deal with.  The coin was designed to make it hard to port to GPU so that the network could build slowly.  A lot of thought went into this coin; everything is there for a reason.
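The "11 algorithms" refers to X11's chained design: each hash function digests the previous one's output, so a GPU kernel has to implement all eleven stages. The real chain is blake, bmw, groestl, jh, keccak, skein, luffa, cubehash, shavite, simd, and echo; none of those are in Python's hashlib, so the sketch below substitutes common digests purely to illustrate the structure, not the actual X11 output:

```python
import hashlib

# Stand-in chain: these are NOT the real X11 algorithms (blake, bmw,
# groestl, jh, keccak, skein, luffa, cubehash, shavite, simd, echo),
# just eleven hashlib digests arranged the same way -- each stage
# hashes the previous stage's raw output.
CHAIN = ["sha256", "sha512", "sha3_256", "sha3_512", "blake2b", "blake2s",
         "sha384", "sha224", "sha1", "md5", "sha256"]  # 11 stages

def chained_hash(data: bytes) -> bytes:
    out = data
    for name in CHAIN:
        out = hashlib.new(name, out).digest()
    return out  # 32 bytes, since the final stand-in stage is sha256
```

The point of the structure is that an attacker (or a GPU port) can't shortcut any stage: every algorithm must be implemented and run in order, which is why porting took longer than for single-algorithm coins.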
full member
Activity: 294
Merit: 100
do we believe in 0.003 that is the question

No question about it... last night I suspected it wouldn't dip into 0.0016 like a couple of days ago, so I bought in at 0.0018... So glad I did.
Great buy support; the coin is outstanding, and that's before major exchanges etc.

Question is how long until 0.005. I don't think very long.
sr. member
Activity: 327
Merit: 250
it's a hardware thing!
member
Activity: 81
Merit: 100
do we believe in 0.003 that is the question
sr. member
Activity: 504
Merit: 254
Happy 5GH/s network hash rate day
hero member
Activity: 952
Merit: 500
Hey guys just wondering if anyone tried mining darkcoin with betarig? Is it even supported?

Thanks.
full member
Activity: 182
Merit: 100
Why does this garbage GPU miner not work on my machine ?

I get error -11

what gpu u got

HD5850, HD5870. Bad?
legendary
Activity: 1288
Merit: 1000
$100 a coin possible?

Sure... just send me $500 and I'll send you 5 coins  Grin
Hahahaha it's possible..
newbie
Activity: 43
Merit: 0
$100 a coin possible?

Sure... just send me $500 and I'll send you 5 coins  Grin
sr. member
Activity: 504
Merit: 254
Am I misunderstanding something about the diff adjustments, or is something strange going on? It's jumping from the 150s into the 260s while the network hash rate barely moves. Unless the pool display of network hash rate is off of course.

Edited to add: I understand KGW, but it shouldn't make diff jump around that much if the network hash rate goes from (e.g.) 4.3GH to 4.7GH. Should it?

Which pool are you mining on? When I was on smalltimeminer, it used to jump around like that. I think it's a configuration glitch with the MPOS web site... If you are GPU mining, you can see the network difficulty in the miner, and that seems to be more accurate. I think KGW might cause MPOS to spasm every now and then.

I'm on coinmine.pl at the moment. I'll check the CGMiner output next time it happens, though if it's an MPOS glitch it also extends to the displayed difficulty for solved blocks. Hmmmm.

Diff just jumped from 156 (block 21186) to 200 (block 21187) with no dramatic increase in network hash rate that I could see. Confirmed in CGMiner, so it doesn't seem to be an MPOS display issue. I understand that KGW doesn't cause linear diff increases, but should e.g. a 10% increase in network hash rate (which wasn't the case here) result in a ~28% diff increase?
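On the diff jumps: any retarget scheme that re-estimates difficulty every block from a short recent window (as KGW-style retargeting does) tracks block-solve-time *luck*, not just hash rate. Solve times are roughly exponentially distributed, so the sum over N recent blocks has a relative spread of about 1/sqrt(N); with a window of a couple dozen blocks, swings of 20% or more are normal even at constant hash rate. A toy simulation (this is not the actual KGW formula; the window size, clamp, and parameters are invented for illustration):

```python
import random

def simulate(blocks=300, window=24, target=150.0, seed=1):
    """Toy per-block retarget at a perfectly CONSTANT hash rate.

    Solve times are exponential with mean proportional to difficulty;
    difficulty is re-estimated each block from the last `window` solve
    times and clamped to +/-30% per step (an invented safety clamp).
    """
    random.seed(seed)
    hashrate = 1.0          # arbitrary constant units
    diff = hashrate         # calibrated so the expected solve time == target
    diffs, times = [], []
    for _ in range(blocks):
        expected = target * diff / hashrate
        times.append(random.expovariate(1.0 / expected))
        diffs.append(diff)
        recent = times[-window:]
        wanted = diff * target * len(recent) / sum(recent)
        diff = min(max(wanted, diff * 0.7), diff * 1.3)
    return diffs

diffs = simulate()
spread = max(diffs) / min(diffs)   # peak-to-trough ratio over the run
```

Under these assumptions the simulated difficulty swings by tens of percent peak-to-trough despite the hash rate never changing, so a 156 to 200 jump does not by itself prove the network hash rate moved.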
legendary
Activity: 1358
Merit: 1002

About a 10% increase for me...

Any chance of a windows compiled version?

I've removed the big optimization for now; it was too CPU-dependent... Try now, and PM me if you want that 10% back; I'll tell you what to change in the code...



Any chance of optimisations for the Xeon E5450? When I get back home I have 28 of these waiting to do some hashing.

Try this version. I got +10% with an i7. Some Xeons show the same improvement, some don't. Let me know the improvement with your E5450. Send me a PM when you try it.

OK, I will. I'm getting them Wednesday or Thursday.
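For anyone benchmarking these builds: the honest way to verify a claimed "+10%" is to average the reported hash rate over a reasonably long interval for each binary and compare the means, since the instantaneous readout bounces around. A minimal helper (the sample readings below are made up for illustration):

```python
def speedup_percent(old_khs, new_khs):
    """Percentage change in mean hash rate between two miner builds."""
    old_mean = sum(old_khs) / len(old_khs)
    new_mean = sum(new_khs) / len(new_khs)
    return (new_mean / old_mean - 1.0) * 100.0

# Hypothetical 5-minute average readings in kH/s from each binary:
old = [101.0, 99.5, 100.2, 100.8]
new = [110.3, 109.6, 110.9, 110.1]
gain = speedup_percent(old, new)   # roughly +10% for these samples
```

Averaging several intervals per binary on the same hardware and pool is what makes the comparison meaningful; a single snapshot can easily be off by more than the effect being measured.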
member
Activity: 119
Merit: 10

About a 10% increase for me...

Any chance of a windows compiled version?

I've removed the big optimization for now; it was too CPU-dependent... Try now, and PM me if you want that 10% back; I'll tell you what to change in the code...



Any chance of optimisations for the Xeon E5450? When I get back home I have 28 of these waiting to do some hashing.

Try this version. I got +10% with an i7. Some Xeons show the same improvement, some don't. Let me know the improvement with your E5450. Send me a PM when you try it.
legendary
Activity: 1358
Merit: 1002

About a 10% increase for me...

Any chance of a windows compiled version?

I've removed the big optimization for now; it was too CPU-dependent... Try now, and PM me if you want that 10% back; I'll tell you what to change in the code...



Any chance of optimisations for the Xeon E5450? When I get back home I have 28 of these waiting to do some hashing.
hero member
Activity: 658
Merit: 534
Why does this garbage GPU miner not work on my machine ?

I get error -11

what gpu u got
full member
Activity: 182
Merit: 100
Why does this garbage GPU miner not work on my machine ?

I get error -11
full member
Activity: 182
Merit: 100
Also, isn't heat (from a physics perspective) considered a by-product, or a mark of inefficiency? An incandescent lightbulb uses part of its energy input to make heat, which is inefficient. An LED makes much less heat and is much more efficient (measured as light output per energy input).

Indeed, but all of the electricity used by a computer is turned into heat; there is no other type of energy output (if you ignore light from the screen and sound from the speakers).

As a physics major I have to jump in here - not all of the energy used is turned into heat; in fact most isn't turned into heat at all. Heat is basically the wasted energy needed to run the computer. The rest of the wattage that isn't coming off as heat, the majority of the energy, is being used to do work: flipping a bunch of transistors (switches) many times per second, spinning hard drives and fans, etc. Heat is a waste by-product of doing work in imperfect electrical circuits (ones with resistance, i.e. not superconductors) - nothing more!

To the first question, yes it could be that it's a sign of inefficiency - but it could also be a sign that the mining software is not using the full potential of the cards. Wink

Sorry this is getting off topic, but I must disagree with that statement.  Exactly 100% of the electrical energy used by a computer is turned into heat.  All of it.  You can count the spinning hard drive as kinetic energy, but this too is converted to heat when it is powered off and spins down.  The useful work a computer performs is not a form of energy.  If the heat output does not equal the electrical energy input, then your computer has broken the law of conservation of energy.
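The spinning-drive point can be put into numbers. Assuming a made-up but plausible platter stack of 0.1 kg with a 4.75 cm radius spinning at 7200 rpm (illustrative figures, not measurements of any particular drive), the stored rotational energy is tens of joules, versus hundreds of kilojoules of electrical input per hour of mining:

```python
import math

# Assumed (illustrative) hard-drive parameters, not measured values:
mass = 0.1        # kg, platter stack
radius = 0.0475   # m (3.5" platter)
rpm = 7200.0

inertia = 0.5 * mass * radius ** 2        # solid disk: I = (1/2) m r^2
omega = rpm * 2.0 * math.pi / 60.0        # angular speed in rad/s
ke = 0.5 * inertia * omega ** 2           # rotational kinetic energy, joules

hour_of_mining = 100.0 * 3600.0           # a 100 W rig for one hour, in joules
fraction = ke / hour_of_mining            # how much of the input the KE term is
```

Under these assumptions the drive stores roughly 30 J, on the order of 0.01% of one hour's electrical input, so at steady state essentially all of the input power leaves as heat, as the post above argues.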

I second this notion: if you were to run a computer in a vacuum (or outer space) it would heat up continuously until it burned out. The energy all comes out as heat on the other end, and here on Earth that heat is dissipated into the surroundings (chip, to heat sink, to air or water). The lower temps we're seeing mining Dark are simply because the miner hasn't been optimized to use 100% of capacity. Nothing more, nothing less.



Heat, vibration, light (LEDs), sound - you missed a few out.
Also, the kinetic energy of a spinning disk is not turned into heat by the computer's own processes.

"if you were to run a computer in a vacuum (or outer space) it would heat up continuously until it burned out."
If it sparked or developed a heat glow you would also be producing light. Also, are there any other wavelengths of electromagnetic radiation produced other than visible and infra-red ? Radio frequency ? Beta radiation ?