
Topic: Towards a Semi-Canonical Encyclopedia of Bitcoin Mining?

An Introduction

As a former student of an electronics and computing program at a certain online university - one that is, ostensibly, not the same as Trump University, though perhaps semantically similar, namely DeVry University - candidly, I would like to believe that I now know enough about electrical engineering to at least begin to see how much DeVry has not covered. There are substantial gaps in their program, such as: an altogether absent treatment of historic computing architectures, viz. [Bell 1971], and a complete lack of coverage of the mathematical models developed primarily in the physical sciences. Not to only complain about these gaps, however: I understand that there is a substantial body of work on the electrical sciences available via open-access journals and public libraries, as well as via book subscription services such as [plug] Safari Books Online. Of course, beyond so much mathematical abstraction, and beyond so many logical models, there are the works produced of science and technology - applications of the principles of systems engineering that become commercial production systems long before there is ever any commercial branding by individual Original Equipment Manufacturer (OEM) institutions. Candidly, without the pith of the work of architects, designers, and engineers, there would be nothing substantial for a commercial production service to develop a brand with.

Not to shake the tree of canonical concepts, or of popular notions of Intellectual Property, but personally I believe there was an era in computing - roughly the period before the Personal Computer (PC), before so many advertising campaigns were developed around PCs as commercial products, alongside the corresponding works of operating systems development, from UNIX to CP/M to DOS, to BeOS, and so on - an era shaped as well by institutions such as the IETF, the ISO, the IEEE, and the ITU, working at the collective behest of international standards in computing and communications systems. In that time, commercial institutions may have been more inclined to share knowledge about the designs of computing systems; perhaps it did not yet seem like so competitive a field. Insofar as any number and manner of canonical works in the literature survived the competitive commercialization of the PC, perhaps that era is not in all ways lost to the contemporary market, or the contemporary design space.

The Bitcoin Era clearly represents a distinct era in the design and application of computing systems, and personally I don't believe one is well served by simply waiting for canonical academia to "catch up," as it were - ETA within approximately a decade or so? Maybe once IBM has made an industry of it - ye olde "Big Blue," as though?

So, proceeding to develop a manner of reference base with regards to existing designs in Bitcoin computing - that is, in Bitcoin mining - there is, of course, a lot of topical diversity even within that simple subject heading. There are not only the SHA-256 blockchain systems, but also Scrypt, Scrypt-N, Blake-256, and various other ways to build a blockchain. Then there are the extensions on the blockchain, such as the Ethereum Virtual Machine (EVM) and so on, and thirdly the contemporary services making up the effective Bitcoin infrastructure of the contemporary Internet.
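To make the SHA-256 case concrete, here is a minimal sketch in Python of the hash such systems compute: Bitcoin applies SHA-256 twice over an 80-byte block header. The header contents below are a zeroed placeholder, not real chain data - merely an illustration of the primitive, not of any particular machine's implementation.

Code:
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# A placeholder 80-byte block header; a real header packs the version,
# previous-block hash, merkle root, timestamp, target bits, and nonce
# into this fixed-size structure.
header = bytes(80)
print(double_sha256(header)[::-1].hex())  # displayed reversed (big-endian), by convention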

So, not to rush along to the topic - at the risk of leaving out a lot of the canonical content - but perhaps it's possible to develop some kind of formal encyclopedia about the state of the art in cryptocurrency systems? And it could be an open-source work, whether or not it conforms, per se, to the full service model of an anonymously updated Wiki.

Towards such a concept, here is a link to an article - albeit not here at the forum: Case Study: "Brute-Force Computing" with FPGA Arrays. The FPGA array itself: the AntMiner S5+.

Personally - in the context of web content development - I find it easier to work with the WYSIWYG editor in Evernote than with the BBCode markup used at bulletin-board forums. Not to gripe about it, though: of course one can develop web content on the Web and share it in a social context. Not to strike out on any new agenda with it, I simply thought it might be apropos to share that resource, and this commentary, if it could be of any use towards documenting the state of the art.

Candidly, I would wish to share a further comment, as an observation: Bitcoin miner architectures such as the BITMAIN AntMiner machines - no doubt in a manner similar to other canonically designed Bitcoin mining machines - appear to be designed to function in a manner of "brute force" computing. More specifically, the total TH/s rate of any single product appears to be the sum of the individual GH/s or MH/s rates of the individual FPGA or ASIC components in the same machine - an aggregate, rather than the throughput of any one chip. Thus, no single hash chip by itself could be expected to achieve the advertised hashrate.
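As a rough worked example of that aggregation - the per-chip rate below is an assumption for illustration, not a datasheet value:

Code:
# Hypothetical back-of-the-envelope aggregation for a 432-chip miner.
chips = 432
ghs_per_chip = 17.9             # assumed GH/s per hash chip (illustrative only)
total_ths = chips * ghs_per_chip / 1000.0
print(f"{total_ths:.2f} TH/s")  # ~7.73 TH/s aggregate across all chips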

The BITMAIN AntMiner S5+ for instance, is designed with 144 each of a certain FPGA chip, in each individual enclosure - to a total of 432 total of such "Hash Chips" in the complete three-part model of the S5+. Not as though to criticize the designers, I would estimate that each such "Hash Chip" may be operating independently of the others, such that the first one to come up with a "Winning Hash" - in any duration - is the one that results in the Proof of Work calculation, while the rest of the chips in the array continue to turn over some bits, similarly, all operating on the same blockchain and with the same fundamentally functional hashing methodology?
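A minimal sketch of that search pattern, in Python, under stated assumptions: the "chips" are simulated sequentially in software rather than in hardware, the header prefix is a placeholder, and the target is chosen artificially easy so the toy search terminates quickly.

Code:
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Artificially easy target so the toy search terminates quickly;
# real network targets are astronomically harder.
TARGET = int("00ffff" + "f" * 58, 16)
HEADER_PREFIX = b"toy-block-header"  # placeholder for the real header bytes

def search_range(nonce_start: int, nonce_end: int):
    """Brute-force one chip's disjoint slice of the nonce space."""
    for nonce in range(nonce_start, nonce_end):
        digest = double_sha256(HEADER_PREFIX + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < TARGET:
            return nonce, digest  # a "winning hash": digest below target
    return None

# Simulate 4 "chips," each assigned its own disjoint quarter of a small
# nonce space, so no candidate is ever tested twice.
SPACE, CHIPS = 1 << 20, 4
for chip in range(CHIPS):
    start = chip * (SPACE // CHIPS)
    result = search_range(start, start + SPACE // CHIPS)
    if result:
        nonce, digest = result
        print(f"chip {chip} found nonce {nonce}: {digest.hex()}")
        break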

Without getting into a lot of immediate depth about SHA-256, or about models for Markov Decision Processes and Partially Observable Markov Decision Processes, I believe it may serve to ask a question about the design of these ostensibly "brute force" computing architectures. I'm certain they've all been designed with a great amount of consideration towards parallelization in the compute-and-verify procedures. I'm sure they're not all simply applying a vanilla off-the-shelf SHA-256 VHDL module with a small series of logical switches for the checksum verification process - that would certainly be a naive thing to suppose. Certainly they're all making the most optimal use possible of the very nature of reconfigurable computing machines, as clock-oriented architectures juxtaposed to pulse-oriented architectures? I'm sure something as archaic as the old AI Memo 514 holds absolutely no relevance for these modern things.

I would not wish to present any too-bold claims with regards to this "State of the Art," though I believe it warrants a sense of skepticism - like many things in commercial or quasi-commercial products and services. Thus, I propose to document this "State of the Art" - I only hope it doesn't offend the community!

Why I'm not publishing this content as a Wiki - Synopsis: Personally, I'm not a huge fan of the anonymous nature of publishing at web-based Wikis, favoring instead some more classical frameworks of technical documentation, such as DocBook or DITA - vis-à-vis the FreeBSD Handbook, which I understand is developed in DocBook format. In lieu of DITA or DocBook, there are also web-based annotation tools such as Diigo and Evernote, which may also serve a role in content development.