
Topic: ETH GPUs miners beware! - page 7. (Read 20730 times)

newbie
Activity: 70
Merit: 0
February 14, 2018, 01:48:21 PM
#89
More importantly, the main bottleneck in GPU production is memory availability; there just aren't enough chips

Where are you? Do you live on this planet? It doesn't look like it. GDDR3 has its own manufacturing plant; memory availability is not a problem, because few things use GDDR3 nowadays.

LoL this guy
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
February 14, 2018, 01:32:19 PM
#88
More importantly, the main bottleneck in GPU production is memory availability; there just aren't enough chips

Where are you? Do you live on this planet? It doesn't look like it. GDDR3 has its own manufacturing plant; memory availability is not a problem, because few things use GDDR3 nowadays.
newbie
Activity: 70
Merit: 0
February 14, 2018, 01:25:34 PM
#87
Ethash is memory-bandwidth bound... I don't think an ASIC can help much there...
and if Bitmain can solve that, they'd better start making graphics cards; they'll earn more money :)

More importantly, the main bottleneck in GPU production is memory availability; there just aren't enough chips. Even if these devices do materialize, they will simply be priced relative to their performance compared to GPUs, but they wouldn't take over the market, because they couldn't be produced in large enough volumes, nor would they be that much faster than GPUs.
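
A hedged illustration of the "priced relative to performance" point, with made-up numbers (the rig price, the rig hashrate, and the 2x ASIC advantage are assumptions for the sketch, not figures from this thread): if an ASIC is k times faster than a GPU rig, the seller has every incentive to price it near k times the rig, so the cost per MH/s barely moves.

```python
# Illustration only: assumed prices and hashrates, not real products.
gpu_rig_price_usd = 3000        # assumed price of a 6-card rig
gpu_rig_hashrate_mhs = 180      # assumed ~30 MH/s per card x 6 cards

asic_speedup = 2.0              # assume the ASIC does 2x the rig's hashrate
asic_hashrate_mhs = gpu_rig_hashrate_mhs * asic_speedup
implied_asic_price = gpu_rig_price_usd * asic_speedup   # priced at parity

print(f"GPU rig: ${gpu_rig_price_usd / gpu_rig_hashrate_mhs:.2f} per MH/s")
print(f"ASIC:    ${implied_asic_price / asic_hashrate_mhs:.2f} per MH/s "
      f"(~${implied_asic_price:.0f} for {asic_hashrate_mhs:.0f} MH/s)")
```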
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
February 14, 2018, 12:51:53 PM
#86
Good luck making an ASIC with 4 GB of ultra-fast memory attached to each processor.

The way you say it, it sounds like you're the top engineer at a multi-billion-dollar semiconductor company. The reality that somebody, or some company, can do it must be painful to you, but don't worry, troll: if one can't, maybe another one can.
member
Activity: 388
Merit: 13
February 14, 2018, 12:26:37 PM
#85
Good luck making an ASIC with 4 GB of ultra-fast memory attached to each processor.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
February 14, 2018, 12:24:40 PM
#84
Roughly 70% of the Monero network hashrate comes from large botnets and JavaScript web miners like Coinhive, etc. So there is no economic room for ASICs on CryptoNight, a memory-hard algorithm.

That is one of the reasons it is not on their ASIC roadmap; it's just not feasible, which translates to it not being worth creating an ASIC for it.
full member
Activity: 376
Merit: 103
February 14, 2018, 12:01:16 PM
#83
I wonder if this means an ASIC for CryptoNote coins like Monero is just on the horizon then? I don't know the exact details, but I think some of the reasons CryptoNote is ASIC-resistant are the same as for ETH.
Roughly 70% of the Monero network hashrate comes from large botnets and JavaScript web miners like Coinhive, etc. So there is no economic room for ASICs on CryptoNight, a memory-hard algorithm.
legendary
Activity: 1453
Merit: 1011
Bitcoin Talks Bullshit Walks
February 14, 2018, 11:24:46 AM
#82
Ethash is memory-bandwidth bound... I don't think an ASIC can help much there...
and if Bitmain can solve that, they'd better start making graphics cards; they'll earn more money :)

They did start making graphics cards, in a sense. Check out sophon.ai

BR
full member
Activity: 420
Merit: 182
February 14, 2018, 10:58:05 AM
#81
I design ASICs for a living. I implement them in FPGAs before we tape out. (Not for crypto.)

The sheer volume of memory bandwidth needed, and the costs, tell me that someone is smoking some seriously good shit. I get beat up over 64 KB of RAM...



Will you elaborate a bit more? Are you saying that the cost of equipment and manufacturing means it's not worth designing such an ASIC miner? Thanks.

If it were me, and I had the resources, I would make an ASIC for Ethash just to say I got it done when so many say it's impossible. When you make billions in profit a year and expect to do the same this year, what's a few hundred thousand to dabble with an ASIC for fun? Anyway, cost isn't the issue here. I'm sure it's well worth it to learn what an ASIC can and can't do. It's all about learning. Look at what we spend yearly on schools; this is just a drop in the bucket.

BR

@rem26 is right, and I'm an EE as well. To put this in analogous terms: an ASIC designed for Ethash would replace the GPU core, not the GPU memory, and as anyone who has tinkered with Ethash knows, the speed of the GPU core has very little to do with hashrate; it comes down entirely to memory size and bandwidth. THAT is why Ethash is considered "ASIC-resistant."
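
A rough sketch of that bandwidth argument, assuming the standard Ethash constants (64 random DAG reads of 128 bytes per hash) and approximate datasheet peak bandwidths; treat it as a back-of-the-envelope ceiling, not a benchmark:

```python
# Back-of-the-envelope Ethash ceiling: hashrate is capped by how many
# 128-byte DAG pages the memory bus can serve per second, not by core speed.
DAG_ACCESSES_PER_HASH = 64   # Ethash ACCESSES constant
BYTES_PER_ACCESS = 128       # Ethash MIX_BYTES constant (one DAG page)
BYTES_PER_HASH = DAG_ACCESSES_PER_HASH * BYTES_PER_ACCESS  # 8192 bytes

# Approximate peak memory bandwidth in GB/s (datasheet figures, assumed).
cards = {
    "GTX 1080 Ti (GDDR5X)": 484,
    "Titan V (HBM2)": 653,
}

for name, bandwidth_gbs in cards.items():
    ceiling_mhs = bandwidth_gbs * 1e9 / BYTES_PER_HASH / 1e6
    print(f"{name}: theoretical Ethash ceiling ~{ceiling_mhs:.0f} MH/s")
```

The Titan V numbers quoted elsewhere in this thread sit essentially at its ceiling, while GDDR5X cards land well below theirs (small random reads don't reach peak bandwidth); either way, the core is never the limiter, which is exactly the point above.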

full member
Activity: 259
Merit: 108
February 14, 2018, 10:57:14 AM
#80
Wondering how the "regular consumer grade or home user grade" Volta GPUs are going to work for mining.
Currently, with an Nvidia Volta GPU, ETH mining speed is around 60-68 MH/s out of the box; with a little overclocking, this can be pushed up to 80 MH/s at a 250 W TDP. I'm dual mining with one of my Nvidia Titan V cards, ETH/DCR: ETH is around 78+ MH/s stable, DCR is around 600 MH/s, and the card is pulling around 180-190 watts at the wall, so I'm sure there is more potential in those cards :)

I think it is more tailored for AI applications; as I remember, about 600 of the cores are for that. Other than that, there appear to be about 1,500 more cores. I can't comment much on memory and core speed/frequency; I didn't look closely.
For mining, the 600 AI cores are wasted, imo.

Card          Price, USD (OEMs)   Memory        Arch     Die      Cores
GTX 1080 Ti   700-800             11GB GDDR5X   Pascal   GP102    3584
GTX 1080                          8GB GDDR5     Pascal   GP104?   2560
Titan V                           12GB HBM2     Volta    GV100?   5120 + 640 tensor


Makes a lot of sense, given that Nvidia has invested heavily in AI and autonomous driving systems, so seeing that tech spill over into GPUs would make sense. They have a lot more riding on that realm than they do on video cards. There's no doubt Volta will be a "better" GPU, but how much better at mining it will be, we don't know yet. Clearly they are addressing the mining market by introducing the P1xx cards, and I can't see them introducing a product like Volta that uses less power and mines better than the P102s. It wouldn't make sense.

What a lot of people don't realize about "mining" (people like Metroid, who spread FUD every single waking day of their lives) is that with our rigs we've created a massive calculator. Right now, it profits on cryptocurrency. If Metroid's prophecy rings true and GPU mining dies, the next wave of applications will hit. There are so many business applications that would love to leverage this entire network of computational GPUs. GPU "mining" for crypto may eventually die, or rather evolve, but the network will also evolve to be used for other applications besides crypto, and those with the equipment will get paid.
legendary
Activity: 1453
Merit: 1011
Bitcoin Talks Bullshit Walks
February 14, 2018, 09:44:01 AM
#79
I design ASICs for a living. I implement them in FPGAs before we tape out. (Not for crypto.)

The sheer volume of memory bandwidth needed, and the costs, tell me that someone is smoking some seriously good shit. I get beat up over 64 KB of RAM...



Will you elaborate a bit more? Are you saying that the cost of equipment and manufacturing means it's not worth designing such an ASIC miner? Thanks.

If it were me, and I had the resources, I would make an ASIC for Ethash just to say I got it done when so many say it's impossible. When you make billions in profit a year and expect to do the same this year, what's a few hundred thousand to dabble with an ASIC for fun? Anyway, cost isn't the issue here. I'm sure it's well worth it to learn what an ASIC can and can't do. It's all about learning. Look at what we spend yearly on schools; this is just a drop in the bucket.

BR
sr. member
Activity: 2506
Merit: 319
February 14, 2018, 04:54:32 AM
#78
Ethash is memory-bandwidth bound... I don't think an ASIC can help much there...
and if Bitmain can solve that, they'd better start making graphics cards; they'll earn more money :)
full member
Activity: 394
Merit: 101
February 14, 2018, 12:56:40 AM
#77
Wondering how the "regular consumer grade or home user grade" Volta GPUs are going to work for mining.
Currently, with an Nvidia Volta GPU, ETH mining speed is around 60-68 MH/s out of the box; with a little overclocking, this can be pushed up to 80 MH/s at a 250 W TDP. I'm dual mining with one of my Nvidia Titan V cards, ETH/DCR: ETH is around 78+ MH/s stable, DCR is around 600 MH/s, and the card is pulling around 180-190 watts at the wall, so I'm sure there is more potential in those cards :)

I think it is more tailored for AI applications; as I remember, about 600 of the cores are for that. Other than that, there appear to be about 1,500 more cores. I can't comment much on memory and core speed/frequency; I didn't look closely.
For mining, the 600 AI cores are wasted, imo.

Card          Price, USD (OEMs)   Memory        Arch     Die      Cores
GTX 1080 Ti   700-800             11GB GDDR5X   Pascal   GP102    3584
GTX 1080                          8GB GDDR5     Pascal   GP104?   2560
Titan V                           12GB HBM2     Volta    GV100?   5120 + 640 tensor
full member
Activity: 394
Merit: 101
February 14, 2018, 12:53:41 AM
#76
I design ASICs for a living. I implement them in FPGAs before we tape out. (Not for crypto.)

The sheer volume of memory bandwidth needed, and the costs, tell me that someone is smoking some seriously good shit. I get beat up over 64 KB of RAM...



Will you elaborate a bit more? Are you saying that the cost of equipment and manufacturing means it's not worth designing such an ASIC miner? Thanks.
jr. member
Activity: 186
Merit: 4
February 13, 2018, 10:00:57 PM
#75
Wondering how the "regular consumer grade or home user grade" Volta GPUs are going to work for mining.
Currently, with an Nvidia Volta GPU, ETH mining speed is around 60-68 MH/s out of the box; with a little overclocking, this can be pushed up to 80 MH/s at a 250 W TDP. I'm dual mining with one of my Nvidia Titan V cards, ETH/DCR: ETH is around 78+ MH/s stable, DCR is around 600 MH/s, and the card is pulling around 180-190 watts at the wall, so I'm sure there is more potential in those cards :)

I remember the 1080 Ti doing 38 MH/s and 2.5 GH/s (or 2 GH/s?) in ETH/Decred; it also ate around 300 W doing that :). These new cards will be monsters; I just don't want to imagine the mark-up at launch :)
hero member
Activity: 1498
Merit: 597
February 13, 2018, 09:31:57 PM
#74
Wondering how the "regular consumer grade or home user grade" Volta GPUs are going to work for mining.
Currently, with an Nvidia Volta GPU, ETH mining speed is around 60-68 MH/s out of the box; with a little overclocking, this can be pushed up to 80 MH/s at a 250 W TDP. I'm dual mining with one of my Nvidia Titan V cards, ETH/DCR: ETH is around 78+ MH/s stable, DCR is around 600 MH/s, and the card is pulling around 180-190 watts at the wall, so I'm sure there is more potential in those cards :)
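
A quick efficiency check on those figures, taking 78 MH/s at roughly 185 W at the wall (the midpoint of the reported range) and an assumed $0.10/kWh electricity price; the power price is an assumption for illustration, not something reported here:

```python
# Efficiency check on the reported Titan V dual-mining figures.
eth_hashrate_mhs = 78     # MH/s, as reported above
wall_power_w = 185        # W, midpoint of the reported 180-190 W
power_price_kwh = 0.10    # USD per kWh, assumed for illustration

efficiency_mhs_per_w = eth_hashrate_mhs / wall_power_w
daily_kwh = wall_power_w * 24 / 1000
daily_power_cost = daily_kwh * power_price_kwh

print(f"Efficiency: {efficiency_mhs_per_w:.2f} MH/s per watt")
print(f"Energy: {daily_kwh:.1f} kWh/day -> ${daily_power_cost:.2f}/day "
      f"at ${power_price_kwh:.2f}/kWh")
```

Against the ~38 MH/s at ~300 W mentioned elsewhere in this thread for a dual-mining 1080 Ti, that is roughly a 3x gain in MH/s per watt, though at a far higher card price.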
newbie
Activity: 78
Merit: 0
February 13, 2018, 09:25:11 PM
#73
I design ASICs for a living. I implement them in FPGAs before we tape out. (Not for crypto.)

The sheer volume of memory bandwidth needed, and the costs, tell me that someone is smoking some seriously good shit. I get beat up over 64 KB of RAM...

sr. member
Activity: 661
Merit: 258
February 13, 2018, 09:10:55 PM
#72
In order to be faster than GPUs, they must build an ETH-algorithm-dedicated chip for their machine, so it will only hit ETH; or they will build it as the equivalent of 18 GPUs at the same price.
newbie
Activity: 65
Merit: 0
February 13, 2018, 08:44:37 PM
#71
Nvidia already said it will not focus on mining in 2018, and that tells us they know there is something they can't compete with.

Spinning the rumors here, are we? Nvidia is not focusing on mining because they don't need to; why would they? They sell all the cards they can produce, same as AMD, so why focus on something that is already profitable for them?

Mining is not going to die in 2018, and neither are ASICs coming to Ethash in 2018; even if they do come, they will not make any huge dent in the network hashrate, simply due to the way Ethash is constructed.

Anyone saying otherwise is not technically qualified to talk about this.

Mining at the moment is 4-5 times more profitable than it was a year ago; even if it drops to 1/5th of what it is today, I'm still making a decent profit from my machines. And if that happens, it will not be due to some fantasy ASIC that some random website decides to spread rumours about.

I feel your rage; you must be new to mining and must have paid a lot of money for GPUs. There are lots of desperate people around; they're even using integrated GPUs to get a few kilohashes, hehe. So to rephrase: Nvidia said it would focus on mining in 2018, and now they say they will not. See my point? They know something they can't compete with, and this is recent, about a month old; they probably have the same source as I have.

Actually, I have been mining since late 2014, and I basically paid off my gear three years ago.

I live in an area where power prices are amongst the highest in the world, and I'm still making in the area of US$250 a day in profit from my rigs. I have nothing against you, but your constant rant about how GPU mining is going to die is getting old. At the current time there is nothing indicating that it should die, other than more people joining in, which will cut per-card profits depending on the price of the coins, etc.

The only "rage" from my side is against you, for constantly spreading fake news and ranting about things you have no honest knowledge about.

Would I buy new hardware now? Most likely yes, if the prices are right. But you really need to stop being a dick.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
February 13, 2018, 06:48:47 PM
#70
If you hadn't made the same post about the death of GPU mining in 2015, 2016, and 2017, maybe it would mean something

I have been a member of this forum since 2013 and I have created only one thread. If you don't believe me, you can always check whether I'm telling the truth; it's very easy to find out, just a few clicks.