lol, what you just said in the quote. A system that costs less and is 5x more efficient than a GPU. Let's see how you make a system that costs less when you buy at the retail level, among other things.
OK, enough arguing and plucking numbers out of the sky. How fast can you get the show on the road?
What is the lol about? Show me where I claimed that I can develop such a system. And please, don't try to change the subject.
I am not sure what you are arguing about; it seems very odd.
You have two systems: one costs less and is 5x more efficient than the other, and they both do the same amount of work. So going with the system that is cheaper to buy and operate doesn't make sense, you say? Can you please tell me again why miners freak out if they think an ASIC has been developed to mine their coin?
I will go back to work and take my crazy pills, and welcome to The Twilight Zone.
I am saying it makes no financial sense. The major GPU coins already have ASICs, with ETH being the biggest. And the coins that don't have an ASIC yet will have one before the FPGA comes close to ever breaking even.
And every time an ASIC takes over an altcoin, all the GPUs and FPGAs that were mining that coin move to a non-ASIC coin. That means the difficulty on the remaining non-ASIC coins skyrockets, and guess what, your GPUs and FPGAs make less money. Every time a new ASIC comes out, the FPGAs earn less. So there is no niche. There is a limited amount to be earned from mining, you see, and if ASICs keep appearing to take a slice of it, the switchers (FPGAs/GPUs) are left with less and less.
Why not calculate how much it would cost to build an FPGA setup that can match 504 MH/s like the L3+, or 14 TH/s like the S9?
Run your numbers and see the gap. While I don't have the numbers, I don't think it can work.
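If it helps, this is the shape of the calculation I mean. Treat it as a skeleton only: the 504 MH/s L3+ rating is the one figure from this thread, while the FPGA hashrate and both prices are placeholder assumptions you'd have to replace with real numbers.

```python
# Skeleton of the suggested comparison: how many FPGAs (and dollars) it takes
# to match one ASIC's hashrate. All figures below except the L3+ rating are
# placeholders, not benchmarks.

def cost_to_match(asic_hashrate, fpga_hashrate, fpga_unit_cost):
    """Units and total cost of FPGAs needed to equal one ASIC's hashrate."""
    units = asic_hashrate / fpga_hashrate
    return units, units * fpga_unit_cost

L3_PLUS_MH = 504        # Antminer L3+ scrypt hashrate, MH/s (from this thread)
ASIC_PRICE = 2_000      # placeholder ASIC retail price, USD
FPGA_MH    = 10         # placeholder scrypt hashrate per FPGA board, MH/s
FPGA_PRICE = 4_000      # placeholder FPGA board cost, USD

units, fpga_total = cost_to_match(L3_PLUS_MH, FPGA_MH, FPGA_PRICE)
print(f"FPGAs needed to match one L3+: {units:.0f}, costing ${fpga_total:,.0f}")
print(f"ASIC cost for the same 504 MH/s: ${ASIC_PRICE:,.0f}")
```

Plug in whatever FPGA figures you believe and see how wide the gap really is.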
Anyways, good luck in your endeavors. Maybe I am wrong.
Just my 2 cents
While your point of view is true in general, there are a couple of exceptions, mainly space and power constraints. Let's be very specific to avoid a meaningless comparison: XCVU9P ($4,000 FPGA) vs. GTX 1080 Ti ($800 GPU).
Now let's take the Phi1612 algorithm and give the FPGA a more realistic 5x performance advantage over the GPU, with each card drawing 150 watts (0.150 kW). I would rather run a 100-FPGA farm (17.5 kW) than a 500-GPU farm (87.5 kW) any day, and here's why: lower overall cost, if you believe the 5x advantage ascribed to the FPGA.
4x FPGAs and components, approx. $16K + $1.2K per server chassis, plus 100 watts of overhead per chassis.
4x GPUs and components, approx. $3.2K + $0.65K per frame, plus 100 watts of overhead per frame.
$430K for the 100-FPGA farm vs. $481.25K for the 500-GPU farm (see the quick arithmetic sketch below). Can you at least agree that this makes financial sense?
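For anyone who wants to check the totals, here is a small Python sketch that re-derives the $430K / $481.25K and 17.5 kW / 87.5 kW figures from the per-card prices and wattages quoted above. The 5x advantage, and therefore the 100 vs. 500 card counts, is taken on faith from the claim being debated.

```python
# Back-of-the-envelope re-derivation of the farm comparison above, using only
# the numbers quoted in this thread.

FPGA_PRICE     = 4_000   # XCVU9P, USD
GPU_PRICE      = 800     # GTX 1080 Ti, USD
CHASSIS_PARTS  = 1_200   # extra components per 4-FPGA server chassis, USD
FRAME_PARTS    = 650     # extra components per 4-GPU frame, USD
CARD_WATTS     = 150     # assumed draw per FPGA or GPU on Phi1612
OVERHEAD_WATTS = 100     # per chassis/frame overhead

def farm(devices, device_price, parts_per_group, group_size=4):
    """Total capex (USD) and power draw (kW) for a farm built in groups of 4."""
    groups = devices / group_size
    capex = groups * (group_size * device_price + parts_per_group)
    power_kw = groups * (group_size * CARD_WATTS + OVERHEAD_WATTS) / 1000
    return capex, power_kw

fpga_capex, fpga_kw = farm(100, FPGA_PRICE, CHASSIS_PARTS)  # 100 FPGAs
gpu_capex, gpu_kw   = farm(500, GPU_PRICE, FRAME_PARTS)     # 500 GPUs, 5x the count for equal work

print(f"FPGA farm: ${fpga_capex:,.0f}, {fpga_kw:.1f} kW")   # $430,000, 17.5 kW
print(f"GPU farm:  ${gpu_capex:,.0f}, {gpu_kw:.1f} kW")     # $481,250, 87.5 kW
```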
Really? Pretending? Your own words, bolded, in post 266, quoted above, lol. Just go build something that does what you say it will do. It won't be anywhere near cheaper, lol.
Anyway, no more arguing.
Do post updates on your FPGA setup when you build it, rather than talking nonsense.
Or just don't bother, because I don't believe you can do any of it, lol.