
Topic: Number 9! Ninth altcoin thread. Back to the moon Baby! - page 106. (Read 66416 times)

full member
Activity: 478
Merit: 125
For the US market, Nvidia announced that pricing starts at $499.99 for the 3070, $699.99 for the 3080 and, as in your link, $1,499.99 for the 3090.

The 3080 Founders Edition at Best Buy is $699.99:

https://www.bestbuy.com/site/nvidia-geforce-rtx-3080-10gb-gddr6x-pci-express-4-0-graphics-card-titanium-and-black/6429440.p?skuId=6429440

Non-FE cards will likely cost more.

Edit to add:  Agreed that the hash rates are still guesses.  NiceHash reported 80 MH/s from a 3080 they detected mining on their network.  The 115 MH/s pictures were doctored and came from an Nvidia partner lab, not a mining farm.  I haven't seen anything on the smaller coins or algos yet.
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'
I think the 3080 at a cost of $699 doing between 80 and 115 MH/s on ETH is more attractive than the 3090 at a cost of $1,499 doing 115 to 150 MH/s on ETH.

I personally don't see the 3090 doing 150 MH/s and feel the ROI is going to be significantly longer than the 3080's 10-12 months.  If the 3090 ends up at the lower end of ETH performance, you're looking at potentially 2-plus years before it's paid off.  For me, I can't justify the difference in time to ROI.
Well, we are still guessing a bit on prices and hash rates.
I put my name on the list for all models: the 3070, the 3080 and the 3090.

https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/
full member
Activity: 478
Merit: 125
I think the 3080 at a cost of $699 doing between 80 and 115 MH/s on ETH is more attractive than the 3090 at a cost of $1,499 doing 115 to 150 MH/s on ETH.

I personally don't see the 3090 doing 150 MH/s and feel the ROI is going to be significantly longer than the 3080's 10-12 months.  If the 3090 ends up at the lower end of ETH performance, you're looking at potentially 2-plus years before it's paid off.  For me, I can't justify the difference in time to ROI.
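As a rough sketch of that ROI comparison, here is the payback math in Python. The prices, hash rates and wattages below are the thread's guesses, and the revenue per MH/s per day and the power price are placeholder assumptions, not real figures.

Code:
# Rough payback sketch. Prices/hash rates are the guesses discussed above;
# revenue per MH/s per day and the electricity price are placeholder assumptions.
def payback_days(price_usd, hashrate_mh, power_w, rev_per_mh_day, usd_per_kwh):
    """Days until the card pays for itself, ignoring difficulty/price changes."""
    daily_revenue = hashrate_mh * rev_per_mh_day
    daily_power_cost = power_w / 1000 * 24 * usd_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    return float("inf") if daily_profit <= 0 else price_usd / daily_profit

# Placeholder example: $0.03 per MH/s per day and $0.10/kWh power.
for name, price, mh, watts in [("3080", 699, 80, 240), ("3090", 1499, 130, 300)]:
    print(f"{name}: ~{payback_days(price, mh, watts, 0.03, 0.10):.0f} days")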
legendary
Activity: 3444
Merit: 1061
we are the pilots of our own gpus, like pilots of fighter planes.

sometimes you can make a good killing with cheaper GPUs, sometimes with better ones.

my 390s and 480s were beaten in profit by my measly Core 2 Quad processors mining Pascal Coin.

I even remember Nvidia miners watching from the sidelines while AMD miners were happily mining ETH, since AMD had the edge at that time... 280Xs hehe

what you are all comparing is ETH, again... we all ended up mining many coins. Nvidia still has more options to mine; more options = more opportunity for profit, more opportunity for profit = more chances of ROI, and once you ROI you don't care if the GPU is expensive  Wink

I'm not saying the most powerful card is automatically the answer because of "density"; the most powerful and most expensive GPUs should still meet the "criteria" to be worth using for density (for example, the 280X was better to own than the 290X at the time, and the 1080 Ti was the best to own in its day).

and there's personal preference. I'll repeat: if you think you can ROI it, then why not?

here is another example: mining profits in 2019 were down, so who was buying tons of Radeon VIIs? hehe Now they refresh the Newegg page several times daily LOL.

if the bull market comes back in full force then we ROI all our GPUs, be it 5700 XT or 3090. Remember when home miners couldn't max out their mining rooms because of the RX 470 4GB? LOL.
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'

If a 5700 XT does 50-52 MH at 125-130 W at the wall
and the 3090 does 150 MH at 230 W at the wall,

it simply does not have a huge edge.  Also, I don't think it will do 150 at 230 watts.

More like 130 at 230 watts.  But I'd consider buying one as soon as possible just to benchmark on my Threadripper.

My Gigabyte 5700 XT does exactly 54 MH/s, consuming 135 W from the wall.

Of course there's room for improvement for the 3090 and other Nvidia cards as miners and drivers mature, but I don't see a huge leap that kills older generations of cards; since Polaris it's been almost the same story: a good gain in hashrate, more efficiency, but not overkill cards.
People still mine with Polaris cards.

In my country it will be difficult to buy any new-generation card at a reasonably good price.

Even here in the USA 🇺🇸 the 3090 is 1,500 in NJ, plus NJ tax of 7%, so I would pay

1,605 USD.  I can get four RX 5700 XTs for that with maybe 50 USD change, i.e.

200 MH for 1,605 - 50 = 1,555.

I can get six 5600s for 6 x 277 = 1,662, minus 60 in rebates, or 1,602 for 240 MH.

So the 3090 is maybe 150 MH for 1,605,
the 5700 XTs maybe 200 MH for 1,555,
the 5600s maybe 240 MH for 1,602.
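For what it's worth, the cost-per-hash behind that list works out like this (prices and guessed hash rates taken straight from the lines above):

Code:
# Dollars per MH/s for the three options above (figures from this post).
options = [
    ("1x 3090",    1605, 150),   # USD incl. NJ tax, guessed MH/s
    ("4x 5700 XT", 1555, 200),
    ("6x 5600",    1602, 240),
]
for name, usd, mh in options:
    print(f"{name}: ${usd / mh:.2f} per MH/s")

which prints roughly $10.70, $7.78 and $6.68 per MH/s.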
legendary
Activity: 3444
Merit: 1061
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

Umm... the 1080 was a GP104 with a 256-bit bus width. The 1080 Ti was a GP102 with a 352-bit bus width; that's where the hash increase comes from. That, and better-binned memory that can clock higher.



Yeah, and that means even though the 3080 and 3090 have the same type of VRAM, other things can make the 3090 faster than the 3080; they have different bus widths too.

I just said both have GDDR5X but one is 42% faster; the statement itself implies there are other factors too.

Well, you just have to read the English more carefully, buddy.




About "simple mining": I'm using Windows 7. I heard there are issues with Windows 10 reserving about 1GB of VRAM, so if your GPU is 8GB, mining only gets 7GB effective. With 24GB cards, or cards with 16GB and above (future AMD and Nvidia cards), that won't be an issue any more; maybe I'll do Windows 10 mining starting around 2023-25 hehe
legendary
Activity: 2366
Merit: 1408

If a 5700 XT does 50-52 MH at 125-130 W at the wall
and the 3090 does 150 MH at 230 W at the wall,

it simply does not have a huge edge.  Also, I don't think it will do 150 at 230 watts.

More like 130 at 230 watts.  But I'd consider buying one as soon as possible just to benchmark on my Threadripper.

My Gigabyte 5700 XT does exactly 54 MH/s, consuming 135 W from the wall.

Of course there's room for improvement for the 3090 and other Nvidia cards as miners and drivers mature, but I don't see a huge leap that kills older generations of cards; since Polaris it's been almost the same story: a good gain in hashrate, more efficiency, but not overkill cards.
People still mine with Polaris cards.

In my country it will be difficult to buy any new-generation card at a reasonably good price.
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

Umm... the 1080 was a GP104 with a 256-bit bus width. The 1080 Ti was a GP102 with a 352-bit bus width; that's where the hash increase comes from. That, and better-binned memory that can clock higher.

Quote

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


I don't think you understand anything about hardware, hash rate or how the two correlate. I recommend some more reading.

Even if the 3090 does 150 MH, it will need 220-230 watts to do it.
And it simply is not a lot better than an AMD 5700 XT for ETH mining.

If a 5700 XT does 50-52 MH at 125-130 W at the wall
and the 3090 does 150 MH at 230 W at the wall,

it simply does not have a huge edge.  Also, I don't think it will do 150 at 230 watts.

More like 130 at 230 watts.  But I'd consider buying one as soon as possible just to benchmark on my Threadripper.
hero member
Activity: 751
Merit: 517
Fail to plan, and you plan to fail.
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

Umm... the 1080 was a GP104 with a 256-bit bus width. The 1080 Ti was a GP102 with a 352-bit bus width; that's where the hash increase comes from. That, and better-binned memory that can clock higher.

Quote

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


I don't think you understand anything about hardware, hash rate or how the two correlate. I recommend some more reading.
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'

1390 more  to set up


I'll go along with your analogy.

$1,390 more...

For people who think a 4-card rig is a hell of a lot easier to manage than a 12-card rig?

For people who are thinking of higher resale value?

For people who think they'll have more coins to mine (Nvidia has more options than AMD)?

Is 1,390 more worth it?

For me, yes... I was able to travel in previous years while those rigs were mining hehe

One 4-card rig vs three 4-card rigs.  And I forgot to mention: if you use Simple Mining it is 2 bucks a month for your rig and 6 for my 3 rigs,

so that is 4 x 12 = 48 a year in savings.

Yeah a lot depends on how many rigs you want to do.

But a 1,390 price difference means it is not a killer of the 5700 XT.

In both cases you get 600 MH/s:

one case is one four-card rig at 1,000-1,100 watts, at a cost of 6,200-6,300 USD;
the other is three four-card rigs at 500-550 watts each, around 1,650 watts total, at a cost of 4,800-4,900 USD.

If rig density is an issue you may want the 3090.

If cards keep turning a profit, the 3090 is pretty good.
If it all crashes, you still have resale value on the 3090.

But my point is that the claim the 3090 crushes the 5700 XT simply doesn't hold.

Like I said, I may buy two of them for my Threadrippers.
legendary
Activity: 3444
Merit: 1061

1390 more  to set up


I'll go along with your analogy.

$1,390 more...

For people who think a 4-card rig is a hell of a lot easier to manage than a 12-card rig?

For people who are thinking of higher resale value?

For people who think they'll have more coins to mine (Nvidia has more options than AMD)?

Is 1,390 more worth it?

For me, yes... I was able to travel in previous years while those rigs were mining hehe
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'
If it does 150 at 180 watts and only costs 1000 it is a good card. Really good.


But if it does 150 at 230 watts  and costs 1500 it is less good.


Here is the USA price, 1,499.
Get notified for purchase of it:

https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/


Now, a 5700 can be found for 350 and it will do 52,

so three will do 156 at 375 watts at the wall for 1,050,

and this may do 150 at 230 watts at the wall for 1,499.

That's 449 USD in the hole for a little less hash.  Say the hash is equal to the three:

then you are saving 145 watts, or about 3.5 kWh a day.

That is, to recover the 449:

10.5 cents a day at 3 cent power, or about 4,276 days
21.0 cents a day at 6 cent power, or about 2,138 days
31.5 cents a day at 9 cent power, or about 1,425 days
42.0 cents a day at 12 cent power, or about 1,069 days
52.5 cents a day at 15 cent power, or about 855 days
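The table above is just this calculation, using the 449 premium and the ~3.5 kWh/day saving from this post:

Code:
# Days for the 3090's ~449 USD premium to pay back through power savings alone.
premium_usd = 449            # 1,499 (one 3090) - 1,050 (three 5700 XTs)
kwh_saved_per_day = 3.5      # 145 W less at the wall x 24 h, rounded as above

for cents_per_kwh in (3, 6, 9, 12, 15):
    daily_saving_usd = kwh_saved_per_day * cents_per_kwh / 100
    print(f"{cents_per_kwh} cent power: {premium_usd / daily_saving_usd:,.0f} days")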


Now granted, it may take 1 (or 1.45) slots of space vs 3 slots of space.  But with these boards below:

https://www.ebay.com/itm/biostar-TB250-BTC-D-Pro-Motherboard-Mining-w-CPU-RAM/124292825096?

running 4 of the 3090s vs 12 of the 5700 XTs, same hash rate:


so it's 155 vs 465 for boards; add a decent cooler and it's 175 vs 525; and 1 APW3+ vs 3 APW3+ is 30 vs 90,

so 205 to set up 4x 3090
or 615 to set up 12x 5700 XT.

Both do 600 MH.

The 3090 setup is 6,000 + 205 = 6,205 and uses about 1,000-1,100 watts.
The 5700 XT setup is 4,200 + 615 = 4,815 and uses about 1,500-1,600 watts.

So maybe you save 12 kWh a day:

3 cent power = 36 cents daily, 3,861 days
6 cent power = 72 cents daily, 1,930 days
9 cent power = 1.08 USD daily, 1,287 days
12 cent power = 1.44 USD daily, 965 days
15 cent power = 1.80 USD daily, 772 days

1390 more  to set up
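Those rig-level numbers reduce to a short sketch (card and setup costs as above; the 3090 hash rate is still a guess):

Code:
# 4x 3090 (one rig) vs 12x 5700 XT (three rigs): cost, hash, power, payback.
def rig(card_usd, n_cards, mh_per_card, total_watts, setup_usd):
    return {"usd": card_usd * n_cards + setup_usd,
            "mh": mh_per_card * n_cards,
            "watts": total_watts}

r3090 = rig(1500, 4, 150, 1050, 205)
r5700 = rig(350, 12, 50, 1550, 615)

premium_usd = r3090["usd"] - r5700["usd"]                    # 1,390
kwh_saved = (r5700["watts"] - r3090["watts"]) * 24 / 1000    # ~12 kWh/day
for cents in (3, 6, 9, 12, 15):
    print(f"{cents} cent power: {premium_usd / (kwh_saved * cents / 100):,.0f} days")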

So you can see it is not a big deal if it does 150 at 230 watts

Heck, to be a big deal at a 1,400 price it needs to do 150 at less than 150 watts,

because you would then save about 20 kWh a day if it did 150 watts a card vs 230,

or

3 cent = 60 cents, or 2,316 days
6 cent = 1.20, or 1,158 days
9 cent = 1.80, or 772 days
12 cent = 2.40, or 579 days
15 cent = 3.00, or 463 days

So even if it does 150 MH at 150 watts, the 1,499 price means it is not the best mining card.

I would buy one or two of them to use with my Threadrippers.
legendary
Activity: 3444
Merit: 1061
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


XD You cannot just add 42% and say "yes, that's the hashrate" XD
And a screenshot says nothing. Wait, I'll make a screenshot with a 2,000 MH hashrate for the 3070, will you believe that?
You will see, the 3080 will be nothing more than 70 MH.

In the past, many people said the AMD RX 5700 would reach 70 or 80 MH, but what was the reality?

LOL dude, the 1080 and 1080 Ti are old hehehe... screenshots are for the ones who haven't seen the hashrate yet  Cheesy

LOL dude, read my post again. I edited it. I know that, and what are you telling me with that sentence? You say there is a screenshot with 115 MH. Where? There is a screenshot of a table. So if I make a screenshot of a table where I write 2,000 MH per card, will you believe that?

https://tekdeeps.com/in-china-miners-have-already-slammed-down-on-the-new-geforces/

https://twitter.com/RedPandaMining/status/1301935848493993984

there are already two sources pointing to ~100mhs

anyway for your sake here is a 1080 and 1080ti comparison



Anyway, my point is about the "possibility" of 150 MH/s going from GDDR5X to GDDR6 to GDDR6X; that's two VRAM-generation leaps. Understand the analogy and you will understand why it is a "possibility".

unless you already bought a lot of 5700 XTs hehehe  Grin


Example:
A 2080 Ti at 616 GB/s can do 52 MH.

An RTX 3070 at 512 GB/s is around 83% of a 2080 Ti, so a hashrate of 43 MH, but I think the 30x0 is a bit faster so it can do 45 MH.

An RTX 3080 at 760 GB/s is around 123% of a 2080 Ti, so a hashrate of 63 MH, but I think the 30x0 is a bit faster so it can do 65 MH.


No wonder... you still think the 2080 Ti is 52 MH (probably taken from WhatToMine) while my 1080 Ti is 50.5 MH/s LOL. I can smell AMD cards around you hehehe  Grin

There is no need for fanboyism among miners... actually I really want the HBM2e from AMD for my next upgrade.
member
Activity: 1558
Merit: 69
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


XD You cannot just add 42% and say "yes, that's the hashrate" XD
And a screenshot says nothing. Wait, I'll make a screenshot with a 2,000 MH hashrate for the 3070, will you believe that?
You will see, the 3080 will be nothing more than 70 MH.
In the past, many people said the AMD RX 5700 would reach 70 or 80 MH, but what was the reality?

LOL dude, the 1080 and 1080 Ti are old hehehe... screenshots are for the ones who haven't seen the hashrate yet  Cheesy

LOL dude, read my post again. I edited it. I know that, and what are you telling me with that sentence? You say there is a screenshot with 115 MH. Where? There is a screenshot of a table. So if I make a screenshot of a table where I write 2,000 MH per card, will you believe that?
legendary
Activity: 3444
Merit: 1061
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


XD You cannot just add 42% and say "yes, that's the hashrate" XD
And a screenshot says nothing. Wait, I'll make a screenshot with a 2,000 MH hashrate for the 3070, will you believe that?
You will see, the 3080 will be nothing more than 70 MH.
In the past, many people said the AMD RX 5700 would reach 70 or 80 MH, but what was the reality?

LOL dude, the 1080 and 1080 Ti are old hehehe... screenshots are for the ones who haven't seen the hashrate yet  Cheesy
member
Activity: 1558
Merit: 69
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.


XD You cannot just add 42% and say "yes, that's the hashrate" XD
And a screenshot says nothing. Wait, I'll make a screenshot with a 2,000 MH hashrate for the 3070, will you believe that?
You will see, the 3080 will be nothing more than 70 MH.

Example:
A 2080 Ti at 616 GB/s can do 52 MH.

An RTX 3070 at 512 GB/s is around 83% of a 2080 Ti, so a hashrate of 43 MH, but I think the 30x0 is a bit faster so it can do 45 MH.

An RTX 3080 at 760 GB/s is around 123% of a 2080 Ti, so a hashrate of 63 MH, but I think the 30x0 is a bit faster so it can do 65 MH.
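The scaling rule in this example is just linear-with-bandwidth from a 2080 Ti anchor. A minimal sketch, using the 512 and 760 GB/s figures quoted here; the 936 GB/s for the 3090 is my own addition, and the 52 MH anchor is disputed elsewhere in the thread:

Code:
# Scale Ethash hashrate linearly with memory bandwidth from a 2080 Ti anchor.
anchor_bw_gbps, anchor_mh = 616, 52   # 2080 Ti figures used in this post

cards = [("RTX 3070", 512), ("RTX 3080", 760), ("RTX 3090", 936)]  # GB/s
for name, bw in cards:
    print(f"{name}: ~{anchor_mh * bw / anchor_bw_gbps:.0f} MH/s")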


In the past, many people said the AMD RX 5700 would reach 70 or 80 MH, but what was the reality?
legendary
Activity: 3444
Merit: 1061
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.


1080 is 35mhs
1080ti is 50mhs

that's 42.8% difference between 1080 and 1080ti, both gddr5x

The 3080 is 100 MH/s (there is a post that says 115 MH/s at 300 W); if you do the math and add 42% to the 3080's 100 MH/s, that's 142 MH/s.

Now, Nvidia positions the 3090 as Titan-like (they phased out the Titan; this is the new Titan), so if a 3080 Ti is not released yet, the 3090 is better than a 3080 Ti would be.

And if the 3090 is better than a 3080 Ti, then 150 MH/s is not impossible.
hero member
Activity: 751
Merit: 517
Fail to plan, and you plan to fail.
miner software optimization might get 3090 to 150mhs...hopefully  Grin

No, Eth is memory-bound; the 3080 and 3090 have the same memory type and speed, so the minor difference we will see will be due to the 320-bit vs 384-bit bus width. And this isn't on-die HBM, so don't expect 90+ on anything.

I pressed my Shenzhen source for a target selling price for the 3090, but no news yet; however, my local supplier hinted it's likely above 1k. Hmmmmm.....

No, prices will not be lower than the Founders Edition for almost anything. Not this year at least.
legendary
Activity: 3444
Merit: 1061
miner software optimization might get 3090 to 150mhs...hopefully  Grin

I pressed my Shenzhen source for a target selling price for the 3090, but no news yet; however, my local supplier hinted it's likely above 1k. Hmmmmm.....

The 3090 is the only GPU of the lineup that's also compelling for personal use (the best experience with the latest tech):

4K gaming at 120-144 fps, partnered with a 32-inch 4K 120/144 Hz monitor (fps and Hz in sync).

Huge monitors, around 43 inches more or less (the latest huge monitors currently run at 60 Hz), need 8K to be eye candy, and the 3090 does 60 fps at 8K.

For me, good mining density starts at 1-to-3... 3 low-end cards vs 1 high-end card, like 3x 3090 = 9x 3070.

1-to-2.5 is a harder choice; you might still get 3x or more on some coins hehe.
legendary
Activity: 1834
Merit: 1080
---- winter*juvia -----
miner software optimization might get 3090 to 150mhs...hopefully  Grin

I pressed my Shenzhen source for a target selling price for the 3090, but no news yet; however, my local supplier hinted it's likely above 1k. Hmmmmm.....