- This miner doesn't adjust its work size. That's because of point 1 above...
Oops, I rearranged and forgot to renumber. Fixed.
So the old Radeons are slower than current Nvidia cards? I sold my Radeon a long time ago, so I can't test it.
Probably. See bullet points 3 and 4. Bullet point 3 is a reference to the Bitcoin wiki:
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F

Essentially, by the same measure that a modern Radeon has a 1.7x advantage over a modern GeForce, the older Radeon does not have that advantage, and in fact has a sloppily estimated 3.5x disadvantage. And that doesn't account for manufacturing process or runtime differences.
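To put rough numbers on that, here's the kind of back-of-envelope I mean (a sketch only: the shader counts and clocks are from spec sheets, the ops-per-hash figures are loose guesses, and none of it accounts for process or runtime differences):

# Back-of-envelope: MH/s ~= stream processors * core clock (MHz) / ALU ops per double SHA-256.
# Every number below is a rough assumption for illustration, not a benchmark.
def est_mhash(stream_procs, clock_mhz, ops_per_hash):
    return stream_procs * clock_mhz / float(ops_per_hash)

# Pre-HD 5000 Radeons lack the single-instruction 32-bit rotate that helps the
# modern ones, so I charge the old card the same ~5000 ops as the GeForce.
geforce_gtx580 = est_mhash(512, 1544, 5000)   # ~160 MH/s, a modern GeForce
radeon_hd3870  = est_mhash(320,  775, 5000)   # ~50 MH/s, the old Radeon

print(geforce_gtx580 / radeon_hd3870)         # ~3.2x, in the ballpark of that 3.5x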
Question of the decade: can you make it work on an IGP? I.e., an integrated HD 4200 (which people say is actually an HD 3000-series IGP).
I don't know. I don't have a Radeon IGP. AMD's spec sheet says the Radeon HD 4200 has 40 unified shaders supporting ATI Stream technology, so it should already work on 64-bit Windows. For the low, low price of 5 BTC, I might be convinced to try to make it work on other platforms, too. I doubt you could recover even the cost of electricity with that chip.
-- Low, low price of 5 BTC? My ass.
I think it'd take an eternity to mine that back...
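Back-of-envelope, since "an eternity" is easy to estimate (a sketch with made-up inputs: the difficulty and hash rate below are placeholder guesses for a chip this small, so plug in the current numbers yourself):

# Expected time for a small GPU to mine 5 BTC solo, with made-up inputs.
# A block takes difficulty * 2**32 hashes on average and pays 50 BTC.
difficulty = 1.6e6          # placeholder; look up the current value
hashrate   = 5e6            # 5 MH/s, a generous guess for a 40-shader IGP
reward     = 50.0           # BTC per block

btc_per_day = hashrate * 86400 / (difficulty * 2**32) * reward
print(btc_per_day)                 # ~0.003 BTC/day
print(5.0 / btc_per_day / 365)     # ~4-5 years to mine 5 BTC, before electricity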
If the miner runs faster on the 3870 than the ~20 Mhash/sec my horrifyingly over-powered (and now blown-out, evidently) 8800GTS produces, I'd love to play around with it. Even if it's stupidly inefficient, you really just have to step back and look at how bad the Nvidia cards are, given that people still try to mine with them.
I'm trying to weigh the value of my time here. And while the difficulty of mining increases, the exchange rate decreases, so I think even 5 BTC is only worth it for playing around. Instead, I give you my source code, which is everything you need to try it on your own, except for the links to the SDKs and however long it takes to learn how to work those tools.
KernelAnalyzer says the SHA-256 kernel should do 6M threads/sec on a Radeon HD 3870 (presumably one hash per thread, so roughly 6 Mhash/sec), so I doubt that it will do better than a GeForce 8800GTS. Unless I also made horrible mistakes in writing the kernel.
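If you do get the kernel building and want to sanity-check its output, a host-side reference hash is only a few lines of Python (a sketch: the 80-byte header below is an all-zeros placeholder, not real block data):

# Host-side reference for what the kernel computes: double SHA-256 of an
# 80-byte block header. The header fields here are placeholders.
import hashlib, struct

header76 = b'\x00' * 76                     # version, prev hash, merkle root, time, bits
nonce = 12345                               # the field the kernel iterates over

h = hashlib.sha256(hashlib.sha256(header76 + struct.pack('<I', nonce)).digest()).digest()
print(h[::-1].encode('hex'))                # Python 2.7; on 3.x use h[::-1].hex()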
Oh, yeah, for reference, I'm using ATI Stream SDK 1.4.0 beta, Visual Studio 2008, Python 2.7.2, NumPy 1.6.1, and Boost 1.47.0.