I worked for Cadence Design Systems, and I need a bit of a push in the right direction to get my head around how the hardware ties together.
Are there any block diagrams?
I've been looking into FPGAs and ASICs and decided that any good GPU is going to keep up with the latest FPGAs, power consumption aside.
So ASIC cost is the problem, and then there's the IP core: is there any Verilog/VHDL around to look at?
And what about running OpenCL (http://www.solarflare.com/Content/userfiles/documents/Altera-AOE-Acceleration-and-OpenCL.pdf) on an Altera? Has anyone tried this? It looks great, but maybe the price/performance is crap.
There are heaps of IP cores around, and if Avalon gives specs on their device this will not be needed... right?
http://www.chipestimate.com/ has lots of SHA-256 IP cores.
We have Altium Designer, a SHA-256 core, Cadence InCyte, a soldering iron...
What I need is a block diagram...
Please help me shut down all the bad ideas in my head!
Look at the Icarus wiki/git to get an idea of how bitcoin mining is done in an FPGA.
Start with the Bitcoin wiki: understand how the block header is structured, how to get work via "getwork()", and what needs to be done to find the correct nonce.
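If it helps, here is a minimal Python sketch of how the 80-byte block header behind getwork() is laid out; the field values are made up, just to show the packing:

# Rough sketch of the 80-byte Bitcoin block header (field values are placeholders).
import struct

def build_header(version, prev_hash, merkle_root, timestamp, bits, nonce):
    # Integers are packed little-endian; the two hashes are raw 32-byte strings.
    return (struct.pack("<I", version) +
            prev_hash +                  # 32 bytes: hash of the previous block
            merkle_root +                # 32 bytes: merkle root of the transactions
            struct.pack("<I", timestamp) +
            struct.pack("<I", bits) +    # compact encoding of the target
            struct.pack("<I", nonce))    # the 32-bit field the miner scans

header = build_header(2, b"\x00" * 32, b"\x11" * 32, 1355839169, 0x1a05db8b, 0)
assert len(header) == 80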
Basically, your FPGA/ASIC board has to find a nonce for a given 64 bytes (32 bytes of midstate + 20 bytes of fill + 12 bytes of block header from the getwork() request). You cannot just use an off-the-shelf SHA-256 IP core; you have to modify it or write your own, so that your board "looks" for the right nonce by scanning/hashing the whole 32-bit range, or part of it if you decide to distribute the search ranges across more than one chip. The objective is to test one nonce per clock cycle.

Once the nonce is found, the host is notified. You don't need any elaborate host/board protocol. Use Icarus as an example; it works, unless you want temperature control or other control features such as cancelling work or restarting with different nonce ranges.
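Here is a little software reference model (Python, just a sketch under my own assumptions) of the search the hardware performs. Real hardware precomputes the midstate for the first 64 header bytes once and only re-hashes the 12-byte tail plus the nonce on every try; this model naively re-hashes all 80 bytes so the logic stays obvious:

import hashlib, struct

def scan_nonces(header76, target, start=0, end=2**32):
    # header76: the first 76 bytes of the block header (everything except the nonce).
    # target:   the 256-bit number the double SHA-256 hash must not exceed.
    for nonce in range(start, end):
        header = header76 + struct.pack("<I", nonce)
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        # Bitcoin compares the digest as a little-endian 256-bit integer.
        if int.from_bytes(digest, "little") <= target:
            return nonce   # in hardware this is where the "found" flag notifies the host
    return None

In the FPGA, the loop body becomes the pipeline: a counter feeds one new nonce in every clock, and a comparator at the end of the pipeline raises a flag when a hash meets the target.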
If you just hash one SHA-256 round per trip to the board, your host/board interaction will be extremely slow, and the overall bitcoin mining hash rate will be much lower than any GPU/CPU rate.
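To put rough numbers on that (both figures below are assumptions, just for scale):

clock_hz = 100e6          # assume one nonce per clock at 100 MHz -> 100 Mhash/s on-chip
round_trips_per_s = 1000  # assume ~1000 host/board round trips per second

print(clock_hz / 1e6, "Mhash/s if the board scans nonces itself")
print(2**32 / clock_hz, "seconds to sweep the whole 32-bit nonce range")  # about 43 s
print(round_trips_per_s, "hash/s if the host ships one hash per round trip")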
BTW, put away that soldering iron; most likely you won't need it.
Thanks for that, best tip I've had since I joined. I found trying to run Verilog on OS X a pain in the arse until I found the Eclipse Marketplace plugin.