That would pretty much mean a total break of SHA-256. The purpose of a cryptographically secure hash is to make the outputs computationally pseudorandom with respect to the inputs. You imply some clever way to use a correlation between the inputs and outputs for other types of calculations. Even MD5 (or even MD2) isn’t that broken.
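As a quick illustration of that pseudorandomness, here is a minimal Python sketch of the avalanche effect: flipping a single input bit changes roughly half of SHA-256's output bits, leaving no usable correlation to exploit.

```python
# Hypothetical demo of the SHA-256 avalanche effect: flip one bit of the
# input and roughly half of the 256 output bits change, so the output
# carries no usable correlation with the input.
import hashlib

a = b"example input"
b = bytes([a[0] ^ 0x01]) + a[1:]  # same message with a single bit flipped

ha = int.from_bytes(hashlib.sha256(a).digest(), "big")
hb = int.from_bytes(hashlib.sha256(b).digest(), "big")

print(bin(ha ^ hb).count("1"), "of 256 output bits differ")  # typically ~128
```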
There are ways to successively approximate pi using random numbers. Now, I'm not saying it *can* be done or even suggesting how (I'm smart but not that kind of smart). I'm just saying it wouldn't surprise me. Pi is one of those kinds of fundamental constants.
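(The standard example of approximating pi from random numbers is the Monte Carlo method; a minimal Python sketch, for the curious:)

```python
# A minimal Monte Carlo sketch: sample points uniformly in the unit square;
# the fraction that lands inside the quarter circle of radius 1 tends to pi/4.
import random

def approximate_pi(samples: int = 1_000_000) -> float:
    inside = sum(
        1
        for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(approximate_pi())  # e.g. 3.1418..., varying run to run
```

It converges very slowly (the error shrinks like 1/sqrt(n)), so it is no route to many digits, whatever the source of randomness.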
That’s true, and it’s an interesting observation; but how does hashing help? Some handwavy scheme to produce pseudorandom numbers faster than your CPU can process them?
;-)
Well, to be honest, I wasn't talking about "ASICs", but only about the computational power of GPUs.
GPUs can’t be used to mine Bitcoin. It’s so infeasible that discussion of GPU mining is banned from the Bitcoin mining forum, per the stickied rules thread there:
3. Mining BITCOIN is done exclusively with dedicated BITCOIN mining hardware based on ASICs -
https://en.wikipedia.org/wiki/Application-specific_integrated_circuit . You CAN NOT meaningfully mine bitcoin today with CPU, GPU or even FPGAs. Bitcoin difficulty adapts to match the amount of mining done on the network and has reached levels trillions of times too high to mine meaningfully with PCs, laptops, tablets, phones, webpages, javascript, GPUs, and even generalised SHA hardware. You will not find software in this section to help you mine bitcoin in this absurdly inefficient manner in this subforum. It would cost you thousands of dollars in electricity per year to earn only a few cents in bitcoin. Even if you combined all the computers in the world, including all known supercomputers, you would not even approach 0.1% of the bitcoin hashrate today. Any discussion outside of ASIC related mining, except in the interests of academia, will be moved to the altcoin mining section. There isn't any point attempting to mine bitcoin with CPU or GPU even in the interests of learning as it shares almost nothing with how bitcoin is mined with ASICs and will not teach you anything.
TL/DR Summary:
- You CANNOT meaningfully mine bitcoin with your PC or laptop no matter how powerful it is.
- You CANNOT meaningfully mine bitcoin with your tablet or phone no matter how powerful it is.
- Mining apps for your phone or tablet that claim to mine bitcoin are almost certainly scams.
- You CANNOT find software here to mine bitcoin with your PC by itself.
- You MIGHT be able to do one of the above with altcoins, but such discussion goes into the altcoin mining section.
- You CANNOT find or post software here to mine on other peoples' PC without their permission.
I have seen some colourable claims that high-end FPGAs can still meaningfully mine Bitcoin. I do not know whether or not that is true; either way, that would be ridiculously expensive compared to buying an ASIC.
My idea was not about calculating the value of Pi, but about the fascination of how much computation power we are putting into calculating Bitcoin hashes.
It is indeed astounding. And that is what secures the network against double-spends! To do a so-called "51% attack", you need to do more computation than everybody else combined (so that you are doing more than half the total, and everybody else is doing less than half). (N.b. that even a 51% attacker can only double-spend and censor transactions. Even a 51% attacker cannot spend arbitrary coins owned by other people, or otherwise violate consensus rules.)
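To put a number on it: section 11 of the Bitcoin whitepaper analyzes the probability that an attacker with fraction q of the total hashrate, starting z blocks behind the honest chain, ever catches up. A small Python sketch of that closed form:

```python
# Sketch of the attacker "catch-up" probability from section 11 of the
# Bitcoin whitepaper: with fraction q of the hashrate (honest share
# p = 1 - q) and a deficit of z blocks, the attacker eventually catches
# up with probability (q/p)**z, and with certainty once q >= 0.5.
def catch_up_probability(q: float, z: int) -> float:
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority always catches up eventually
    return (q / p) ** z

for q in (0.10, 0.30, 0.45, 0.51):
    print(q, [round(catch_up_probability(q, z), 6) for z in (1, 6, 12)])
```

Below 50%, the attacker's chance dies off exponentially in z; at or above it, the attack eventually succeeds with certainty.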
When I read that computers on the Bitcoin network are calculating quintillions of hashes per second, I thought, for the sake of argument: suppose we combined all those GPUs just to calculate the value of Pi,
I should note, GPUs are not necessarily faster than CPUs. They are faster than CPUs for very specific types of operations: roughly speaking, tight loops that can be run in parallel. The core clock speed of a typical GPU is actually slower than that of a typical high-end CPU! But GPUs do a whole bunch of the same thing at once more slowly, instead of doing one thing at a time more quickly. (That's the most nontechnical way that I can explain it.)
I am not very familiar with pi algorithms. I don't know how well they parallelize. I do know that they typically require huge amounts of fast storage, as I have repeatedly mentioned. If the algorithm can't be parallelized, then a CPU will be faster than a GPU. If it's I/O bound, then the question is irrelevant, because your processor wastes time waiting for your storage media. The same goes for memory bandwidth (another big issue in high-performance computing).
GPUs beat CPUs at Bitcoin mining because Bitcoin mining is what’s called “embarrassingly parallel”: The algorithm is so easy to parallelize that it’s almost as if it was designed for that. Games use GPUs because many different graphics rendering algorithms (especially 3D) are embarrassingly parallel. AI also does a huge amount of parallel stuff.
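To make "embarrassingly parallel" concrete, here is a toy Python sketch of the mining inner loop. The dummy header and the absurdly easy target are assumptions so it finishes on a CPU; real difficulty is trillions of times higher.

```python
# Toy sketch of the mining inner loop: each nonce is an independent double
# SHA-256 of the 80-byte block header, so any number of workers can grind
# disjoint nonce ranges with zero coordination. The dummy header and the
# absurdly easy target below are assumptions for illustration only.
import hashlib

def mine(header_without_nonce: bytes, target: int, max_nonce: int = 2**32):
    for nonce in range(max_nonce):
        header = header_without_nonce + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None

# 76 bytes of header fields (version, prev hash, merkle root, time, bits):
print(mine(b"\x00" * 76, target=2**240))
```

Each trial touches only 80 bytes and shares nothing with its neighbours, which is exactly the shape of work a GPU (or an ASIC) eats for breakfast.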
Anything single-threaded will run much faster on a CPU. That is essentially why we have CPUs, and don't just redesign GPUs to run all of our programs. Most types of general-purpose computing run best on a CPU; a few types of computing run much, much better on a GPU. Different tools for different jobs.
we can calculate 10x60 quintillion digits of Pi that will be one followed by 20 zeros?
Not sure what you mean. I think that your terminology is off.
Anyway, however many digits you mean: How do you intend to store all of these digits? Really, really biiiiig disks?
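For scale, a back-of-the-envelope Python sketch, assuming "one followed by 20 zeros" means 10**20 decimal digits (my reading of the ambiguous figure above):

```python
# Back-of-the-envelope sketch, assuming "one followed by 20 zeros" means
# 10**20 decimal digits: each digit carries log2(10) ~ 3.32 bits, i.e.
# about 0.415 bytes per digit even when optimally packed.
import math

digits = 10**20
bytes_needed = digits * math.log2(10) / 8
print(f"{bytes_needed / 1e18:.1f} exabytes")  # ~41.5 EB
```

That is on the order of 40 exabytes before counting the working space the algorithm itself needs, which dwarfs any disk you can buy.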