
Topic: New NVIDIA Geforce RTX 30 series GPUs - page 7. (Read 3947 times)

legendary
Activity: 3444
Merit: 1061
September 04, 2020, 08:48:16 PM
#45
Other games, if you Google them, can consume 7-9GB of VRAM, and that's 2017-2019 titles. Upcoming new games can easily double that, so 16GB is the sweet spot for GPU graphics.

I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and with a 1440p or 4K monitor they can still play many games at 4K with that amount of GDDR. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny, 24GB on the 3090 versus 10GB on the 3080; Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it, dead on arrival.

The 3080 may be good for mining if GDDR6X proves it can hash significantly better than GDDR6, GDDR5X, and GDDR5. We'll see.

Besides, there are gamers who can still stretch to a $700 GPU budget but not to a $1,500 one, and if results show it is better than the 3070, the extra $100 can justify the upgrade.

Then the 3070 Ti with 16GB of GDDR6 will be an upgrade because it can play games that require more than 10GB, so 3070 users will upgrade to the 3070 Ti and 3080 users with 10GB will upgrade to a 3080 Ti with 16GB.

I think the 8nm 3000 series will be a long-lived generation before the next big breakthrough comes, which is where Nvidia decides to pull off this "8-10GB now, upgrade to 16GB later" double-sell technique LOL.

Anyway, if you buy a 3080 10GB ($700) and then upgrade to a 3080 Ti 16GB ($800), that's $1,500 all in all, just like the 3090's price. Me? Fuck that, spare me the trouble; I'm getting the 3090 24GB for my main PC hehe.

Some will say, why not wait for the 3080 Ti then? Well, mining profit at 50% ROI (a modest estimate) will make your $1,500 purchase effectively a $750 card LOL.

The 3070 Ti will have to be a hell of a card. I mean, the 3070 is a disappointment, and maybe that is what Nvidia intended: 3070 raw performance is so far behind the 3080 that I think the 3070 is really a 3060 Super because of its 256-bit memory bus hehe. Nvidia launched an early 3060 Super and called it the 3070, and the 3070 Ti might be the real 3070. Looking back at 2016, there was a GTX 1070 at 256-bit and a 1080 at 256-bit too; the 1070 was clearly the only GPU to have, and Nvidia killed the 1080 on arrival. Now they've chosen to kill the 3070 on arrival too hehe. If it wasn't for the 320-bit memory and the amazing raw performance of the 3080, that one would be dead on arrival as well. You can still use it for a year or two, depending on the games you play and the monitor you have; I myself have a 4K monitor and I just can't go back to 1440p anymore.

Nvidia chose 8nm because they previously went 12nm, so 16nm, 12nm, 8nm, and likely 4nm next. They also chose Samsung because of favours between friends hehe; TSMC is more inclined toward AMD.

There is no point waiting for the 3080 Ti, and if it comes it will be next year, around March to August, which is too far away; AMD could have a 5nm Ampere killer by then. AMD is not like before. We still have to see this Big Navi, but AMD has changed, they are different; they could still lag behind Nvidia this time, but they are getting closer with every release. The 5700 was amazing, I did not think they could compete equally with the 12nm 2070 on price/performance, and they did. So I'm not counting AMD out at this time.

Just like in the past, Nvidia is competing with itself, a sign that AMD might again be trailing Nvidia.

I agree with not counting out AMD. The Fury with HBM VRAM did really well, and the Radeon VII did very well too (100MH/s ETH hash; BTW, the cores are the bottleneck LOL). If Big Navi uses new-generation HBM it really can compete, but if they use GDDR6... well, Nvidia wins again.

In the GPU arena, AMD fumbled its early HBM adoption, Threadripper-style. We will see if they can pull off a Ryzen 9 39xx moment with the Big Navi GPUs (watching HBM tech closely).


Those benches look very close to what I predicted: the 3090 around 20% faster than the 3080, and the 3080 50% faster than the 3070.

I predicted a 2x hashrate of 100MH/s as a possibility; people are so 2017 with their 50MH/s per card LOL. I also predicted up to 150MH/s as unlikely but not impossible, since driver optimizations and mining software tweaks might squeeze out more hashes.
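As a rough illustration of where those guesses come from, here is a minimal sketch in Python; the 50MH/s baseline, the 2x and 150MH/s guesses, and the 20%/50% performance gaps are all just the figures speculated in this thread, not measured numbers.

```python
# Back-of-the-envelope sketch of the hashrate guesses in this thread.
# Every figure here is forum speculation, not a measurement.

BASELINE_2017_MHS = 50                     # the "so 2017" per-card ETH hashrate
predicted_3080 = 2 * BASELINE_2017_MHS     # the "x2 hashrate" guess -> 100 MH/s
optimistic_cap = 150                       # "unlikely but not impossible" ceiling

# Chaining the relative-performance guesses from the leaked benches:
# 3090 ~20% faster than the 3080, 3080 ~50% faster than the 3070.
rel_3090_vs_3070 = 1.2 * 1.5               # ~1.8x

print(f"Guessed 3080 hashrate: {predicted_3080} MH/s (ceiling ~{optimistic_cap} MH/s)")
print(f"Implied 3090 vs 3070 performance ratio: {rel_3090_vs_3070:.2f}x")
```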
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 04, 2020, 03:26:22 PM
#44

Those benches look very close to what I predicted: the 3090 around 20% faster than the 3080, and the 3080 50% faster than the 3070.
hero member
Activity: 539
Merit: 517
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 04, 2020, 02:06:42 PM
#42
Other games, if you Google them, can consume 7-9GB of VRAM, and that's 2017-2019 titles. Upcoming new games can easily double that, so 16GB is the sweet spot for GPU graphics.

I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and with a 1440p or 4K monitor they can still play many games at 4K with that amount of GDDR. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny, 24GB on the 3090 versus 10GB on the 3080; Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it, dead on arrival.

The 3080 may be good for mining if GDDR6X proves it can hash significantly better than GDDR6, GDDR5X, and GDDR5. We'll see.

Besides, there are gamers who can still stretch to a $700 GPU budget but not to a $1,500 one, and if results show it is better than the 3070, the extra $100 can justify the upgrade.

Then the 3070 Ti with 16GB of GDDR6 will be an upgrade because it can play games that require more than 10GB, so 3070 users will upgrade to the 3070 Ti and 3080 users with 10GB will upgrade to a 3080 Ti with 16GB.

I think the 8nm 3000 series will be a long-lived generation before the next big breakthrough comes, which is where Nvidia decides to pull off this "8-10GB now, upgrade to 16GB later" double-sell technique LOL.

Anyway, if you buy a 3080 10GB ($700) and then upgrade to a 3080 Ti 16GB ($800), that's $1,500 all in all, just like the 3090's price. Me? Fuck that, spare me the trouble; I'm getting the 3090 24GB for my main PC hehe.

Some will say, why not wait for the 3080 Ti then? Well, mining profit at 50% ROI (a modest estimate) will make your $1,500 purchase effectively a $750 card LOL.

The 3070 Ti will have to be a hell of a card. I mean, the 3070 is a disappointment, and maybe that is what Nvidia intended: 3070 raw performance is so far behind the 3080 that I think the 3070 is really a 3060 Super because of its 256-bit memory bus hehe. Nvidia launched an early 3060 Super and called it the 3070, and the 3070 Ti might be the real 3070. Looking back at 2016, there was a GTX 1070 at 256-bit and a 1080 at 256-bit too; the 1070 was clearly the only GPU to have, and Nvidia killed the 1080 on arrival. Now they've chosen to kill the 3070 on arrival too hehe. If it wasn't for the 320-bit memory and the amazing raw performance of the 3080, that one would be dead on arrival as well. You can still use it for a year or two, depending on the games you play and the monitor you have; I myself have a 4K monitor and I just can't go back to 1440p anymore.

Nvidia chose 8nm because they previously went 12nm, so 16nm, 12nm, 8nm, and likely 4nm next. They also chose Samsung because of favours between friends hehe; TSMC is more inclined toward AMD.

There is no point waiting for the 3080 Ti, and if it comes it will be next year, around March to August, which is too far away; AMD could have a 5nm Ampere killer by then. AMD is not like before. We still have to see this Big Navi, but AMD has changed, they are different; they could still lag behind Nvidia this time, but they are getting closer with every release. The 5700 was amazing, I did not think they could compete equally with the 12nm 2070 on price/performance, and they did. So I'm not counting AMD out at this time.
legendary
Activity: 3444
Merit: 1061
September 04, 2020, 09:36:39 AM
#41
Other games, if you Google them, can consume 7-9GB of VRAM, and that's 2017-2019 titles. Upcoming new games can easily double that, so 16GB is the sweet spot for GPU graphics.

I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and with a 1440p or 4K monitor they can still play many games at 4K with that amount of GDDR. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny, 24GB on the 3090 versus 10GB on the 3080; Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it, dead on arrival.

The 3080 may be good for mining if GDDR6X proves it can hash significantly better than GDDR6, GDDR5X, and GDDR5. We'll see.

Besides, there are gamers who can still stretch to a $700 GPU budget but not to a $1,500 one, and if results show it is better than the 3070, the extra $100 can justify the upgrade.

Then the 3070 Ti with 16GB of GDDR6 will be an upgrade because it can play games that require more than 10GB, so 3070 users will upgrade to the 3070 Ti and 3080 users with 10GB will upgrade to a 3080 Ti with 16GB.

I think the 8nm 3000 series will be a long-lived generation before the next big breakthrough comes, which is where Nvidia decides to pull off this "8-10GB now, upgrade to 16GB later" double-sell technique LOL.

Anyway, if you buy a 3080 10GB ($700) and then upgrade to a 3080 Ti 16GB ($800), that's $1,500 all in all, just like the 3090's price. Me? Fuck that, spare me the trouble; I'm getting the 3090 24GB for my main PC hehe.

Some will say, why not wait for the 3080 Ti then? Well, mining profit at 50% ROI (a modest estimate) will make your $1,500 purchase effectively a $750 card LOL.
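For clarity, here is the arithmetic behind that claim as a minimal sketch; the prices and the 50% ROI figure are the assumptions made in this post, not guaranteed returns.

```python
# Sketch of the "$1,500 spend becomes a $750 card" math above.
# Prices and ROI are the post's own assumptions, purely illustrative.

card_3080 = 700        # USD, 3080 10GB launch price
card_3080ti = 800      # USD, speculated 3080 Ti 16GB price
total_spend = card_3080 + card_3080ti          # 1500, same as a 3090

roi = 0.50                                     # "modest estimate" from the post
mining_income = total_spend * roi              # 750
effective_cost = total_spend - mining_income   # 750

print(f"Total spend: ${total_spend}, effective cost after mining: ${effective_cost:.0f}")
```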
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 04, 2020, 08:08:20 AM
#40
Other games, if you Google them, can consume 7-9GB of VRAM, and that's 2017-2019 titles. Upcoming new games can easily double that, so 16GB is the sweet spot for GPU graphics.

I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and with a 1440p or 4K monitor they can still play many games at 4K with that amount of GDDR. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny, 24GB on the 3090 versus 10GB on the 3080; Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it, dead on arrival.
legendary
Activity: 3444
Merit: 1061
September 04, 2020, 01:59:07 AM
#39

I didn't specifically cite Far Cry for the VRAM size issue; Far Cry was about 1080p vs 1440p and 4K gaming.

The textures in Resident Evil 2 (one of the other game settings besides resolution that eat VRAM) are a good example LOL

https://linustechtips.com/main/topic/1141516-wtf-my-2080-ti-doesn%E2%80%99t-have-enough-vram-for-resident-evil-2-%3F/

Other games, if you Google them, can consume 7-9GB of VRAM, and that's 2017-2019 titles. Upcoming new games can easily double that, so 16GB is the sweet spot for GPU graphics.

The 3080 Ti or 3080 Super (16GB or 20GB) is supposed to be the best bang for the buck, but Nvidia must be saving that ace card for AMD's release, in case AMD manages to pull off a 3080 killer.
legendary
Activity: 3444
Merit: 1061
September 04, 2020, 12:10:04 AM
#37
Yeah it's always annoying, although you pay the price for that VRAM. But 10GB should be enough to play in 4K with all details, having a bit extra would be nice (12GB or 16GB), but I don't really see the point of 24GB, for gaming at least.

The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense; it's not bad to have, but there's no point in that amount for games right now. 16GB should be the norm for 2020 (2016 was 8GB), and only 16GB makes sense for a flagship like the RTX 3080 in this day and age. It seems Micron is having a hard time producing enough GDDR6X for it; they don't want to sacrifice the modules for the 3090 and would rather the 3080 take the hit. I guess the 3090 has a higher profit margin, so the good GDDR6X memory modules will all go to the 3090 and the leftovers to the 3080. I guess Nvidia will launch a 3080 Super later on with 16GB once Micron's factories reach full mass production.

Monitor size, screen resolution, and the other graphics quality settings in PC games tend to eat a lot of VRAM. My 1080 Ti's 11GB gets filled when I max out game settings on my 24-inch 1920x1200 monitor; a ~34-inch monitor will need at least 16GB. The 3080's 10GB is a teaser, and gamers will hate it later on.

1080p gaming is dead. When I played Far Cry New Dawn, in the hazy/hallucinogenic part of the game, the difference between 1080p and 1440p or 4K in enemy visibility was huge.

10GB of VRAM for mining, I think, will be fine for quite a long time, unless some mining software tweak, new algorithm, or change to an existing algorithm hashes a lot faster when more than 10GB of VRAM is utilized.
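One way to sanity-check that is a rough sketch assuming the standard Ethash DAG growth parameters (roughly 1 GiB at epoch 0, growing by about 8 MiB every 30,000 blocks) and an approximate block rate; it ignores the prime-size adjustment, so treat the result as a ballpark only.

```python
# Rough check of how long 10GB of VRAM lasts for Ethash (ETH) mining.
# Assumes the standard DAG growth parameters (2^30 bytes initially,
# +2^23 bytes per 30,000-block epoch) and ~6,500 blocks per day;
# the prime-size adjustment is ignored, so this is a ballpark figure.

DATASET_BYTES_INIT = 2**30       # ~1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23     # ~8 MiB per epoch
EPOCH_LENGTH = 30_000            # blocks per epoch
BLOCKS_PER_DAY = 6_500           # ~13 s block time, approximate

vram_limit = 10 * 2**30          # a 10 GiB card
epochs_left = (vram_limit - DATASET_BYTES_INIT) // DATASET_BYTES_GROWTH
days_left = epochs_left * EPOCH_LENGTH / BLOCKS_PER_DAY

print(f"Epochs until the DAG outgrows 10 GiB: {epochs_left}")
print(f"Roughly {days_left / 365:.1f} years at current block times")
```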

Besides, check this out: https://www.youtube.com/watch?v=EGzsYCRMVu4 Doom Eternal at 4K, where the FPS dips as low as 110. For gamers who use, or want, a 144Hz-or-above monitor with FPS that never dips below 144 (Hz and FPS in sync), the 3090 is the answer.
legendary
Activity: 3444
Merit: 1061
September 03, 2020, 11:55:13 PM
#36
Yeah it's always annoying, although you pay the price for that VRAM. But 10GB should be enough to play in 4K with all details, having a bit extra would be nice (12GB or 16GB), but I don't really see the point of 24GB, for gaming at least.

The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense; it's not bad to have, but there's no point in that amount for games right now. 16GB should be the norm for 2020 (2016 was 8GB), and only 16GB makes sense for a flagship like the RTX 3080 in this day and age. It seems Micron is having a hard time producing enough GDDR6X for it; they don't want to sacrifice the modules for the 3090 and would rather the 3080 take the hit. I guess the 3090 has a higher profit margin, so the good GDDR6X memory modules will all go to the 3090 and the leftovers to the 3080. I guess Nvidia will launch a 3080 Super later on with 16GB once Micron's factories reach full mass production.

Or... the 3080 Ti/Super is an ace up Nvidia's sleeve, to be released if AMD manages to pull off a 3080 killer hehe.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 03, 2020, 05:20:34 PM
#35
I thought the 3090 was targeted at 8K gaming. Maybe the big VRAM is related to SLI support; my understanding is that only one GPU's VRAM is used in SLI.

It's also interesting that the 3080 is marketed as the flagship, suggesting the 3090 is experimental or special.

Yes, only one GPU's VRAM is used in SLI, and the 3090 is the only GPU that can do SLI, which is why they advertised 8K gaming at 60fps. But the majority of these cards are for deep learning, which is also why they left the 3080 for dead; people would rather buy a 3080 with 16GB or more for deep learning. So basically what they did here is: want deep learning? The 3090 is the only card that can do it. Want 8K gaming? The 3090 is the only GPU that can, because of SLI support. And then they priced the 3090 at twice the 3080. I believe the 3090 is not even 50% faster than the 3080, more like 25% more performance, and even that is generous.

Looking back, the last x90 we had was the GeForce GTX 690, priced at $1,000 in 2012.
full member
Activity: 1424
Merit: 225
September 03, 2020, 05:04:20 PM
#34
Yeah it's always annoying, although you pay the price for that VRAM. But 10GB should be enough to play in 4K with all details, having a bit extra would be nice (12GB or 16GB), but I don't really see the point of 24GB, for gaming at least.

The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense; it's not bad to have, but there's no point in that amount for games right now. 16GB should be the norm for 2020 (2016 was 8GB), and only 16GB makes sense for a flagship like the RTX 3080 in this day and age. It seems Micron is having a hard time producing enough GDDR6X for it; they don't want to sacrifice the modules for the 3090 and would rather the 3080 take the hit. I guess the 3090 has a higher profit margin, so the good GDDR6X memory modules will all go to the 3090 and the leftovers to the 3080. I guess Nvidia will launch a 3080 Super later on with 16GB once Micron's factories reach full mass production.

I thought the 3090 was targeted at 8K gaming. Maybe the big VRAM is related to SLI support; my understanding is that only one GPU's VRAM is used in SLI.

It's also interesting that the 3080 is marketed as the flagship, suggesting the 3090 is experimental or special.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 03, 2020, 04:35:42 PM
#33
Yeah it's always annoying, although you pay the price for that VRAM. But 10GB should be enough to play in 4K with all details, having a bit extra would be nice (12GB or 16GB), but I don't really see the point of 24GB, for gaming at least.

The 3090 is targeted at AI, i.e. deep learning. For gaming, 24GB makes no sense; it's not bad to have, but there's no point in that amount for games right now. 16GB should be the norm for 2020 (2016 was 8GB), and only 16GB makes sense for a flagship like the RTX 3080 in this day and age. It seems Micron is having a hard time producing enough GDDR6X for it; they don't want to sacrifice the modules for the 3090 and would rather the 3080 take the hit. I guess the 3090 has a higher profit margin, so the good GDDR6X memory modules will all go to the 3090 and the leftovers to the 3080. I guess Nvidia will launch a 3080 Super later on with 16GB once Micron's factories reach full mass production.
hero member
Activity: 2548
Merit: 950
fly or die
September 03, 2020, 09:30:09 AM
#32
I think, even today, it's not very likely that ETH switches to PoS (much like assuming that Bitcoin switches to PoS).

Except that Bitcoin never said it would go PoS. Maybe it's a perpetual threat to keep miners interested.

Anyway, back on topic...

The VRAM size gap between the 3090 and 3080 is way out of line with the CUDA core count difference. I wonder why.

Yeah it's always annoying, although you pay the price for that VRAM. But 10GB should be enough to play in 4K with all details, having a bit extra would be nice (12GB or 16GB), but I don't really see the point of 24GB, for gaming at least.
full member
Activity: 826
Merit: 103
September 03, 2020, 06:37:05 AM
#31

I hope they will be on sale in stores soon; that way the prices of the 5700 will drop a lot. AMD WILL ALWAYS BE BETTER FOR MINING.

That will depend on what coin you want to mine. AMD is more efficient on some algos, while Nvidia GPUs are more efficient on others. I've already seen a lot of mining hardware come down in price on eBay and other sites. Right now you can buy a used 2080 Ti for around 500 dollars, which is a lot lower than just recently.
jr. member
Activity: 298
Merit: 3
September 02, 2020, 05:19:54 PM
#30
The RTX 3090 has 10496 NVIDIA CUDA® cores; maybe mining some other algo is better than ETH?
full member
Activity: 826
Merit: 103
September 02, 2020, 04:20:54 PM
#29
The 3080 seems reasonably priced for the performance. Everyone who sold their 2080 in the last few months made the right call; the 2080 just lost a lot of value. I'm talking gaming performance here; for mining we shall wait for some tests/optimizations.

These prices also make me think that Nvidia expects a strong showing from AMD, so that's a good thing, too.

Yes, I believe so as well; there was talk from AMD of an "Nvidia killer", so perhaps Nvidia is gearing up with top performance from the 3000 series to meet the threat from Big Navi. It will be exciting to see AMD's answer to this surprisingly big move in performance from the 3080. Competition is great.
member
Activity: 1558
Merit: 69
September 02, 2020, 02:43:58 PM
#28
I've jotted down some quick napkin math.  Hashrate isn't necessarily perfectly linear to total memory bandwidth, but it is a somewhat reasonable analogue to give round numbers.

RTX 3090* ~935-1008 GBps
RTX 3080* 760 GBps
RX 5700 XT - 480 GBps  -  58 mh/s (eth)
RTX 2080ti - 616 GBps - 52 mh/s (eth)
GTX 1660 super - 336 GBps - 30 mh/s (eth)
RX 580 - 256 GBps - 30 mh/s (eth)

I think it would be somewhat reasonable to take a guess at 3080 hashrates in the 60s or low 70s and 3090 in the 90s.  They are neither low tdp nor cheap, so to me this doesn't represent a value proposition.  But ymmv.

Source for the RTX 3000 card : https://wccftech.com/nvidia-geforce-rtx-3070-8-gb-official-launch-price-specs-performance/

RTX 3090          - 936 GBps - 75~80 mh/s (eth) ??
RTX 3080          - 760 GBps - 63~67 mh/s (eth) ??
RTX 3070          - 512 GBps - 40~45 mh/s (eth) ??
RX 5700 XT       - 480 GBps - 58 mh/s (eth)
RTX 2080ti        - 616 GBps - 52 mh/s (eth)
GTX 1660 super - 336 GBps - 30 mh/s (eth)
RX 580             - 256 GBps - 30 mh/s (eth)

If that's the case, it looks like sticking with RX 5700 XT would be better.
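To make the scaling explicit, here is a minimal sketch of that napkin math; the bandwidths and reference hashrates are the figures quoted above, and the linear scaling is only a rough analogue, as the next reply points out. Note how much the estimate moves depending on which reference card you scale from.

```python
# Minimal sketch of the napkin math above: scale a known card's ETH
# hashrate by the ratio of memory bandwidths. Figures are the ones
# quoted in this thread; linear scaling is only a rough analogue.

references = {                     # name: (GB/s, MH/s)
    "RTX 2080 Ti": (616, 52),
    "RX 5700 XT": (480, 58),
}
new_cards = {"RTX 3070": 512, "RTX 3080": 760, "RTX 3090": 936}   # GB/s

for card, bw in new_cards.items():
    estimates = [mh * bw / ref_bw for ref_bw, mh in references.values()]
    lo, hi = min(estimates), max(estimates)
    print(f"{card}: ~{lo:.0f}-{hi:.0f} MH/s by naive bandwidth scaling")
```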

We can't really just linearly correlate hashing power like that. Factors such as the new GDDR6X, compared to the previous GDDR5 generations, could offer a boost to hash power and much more, or not. If I had to guess, the hashing power boost should be much better.

Anyway, I'm convinced to get a 3080 at the new price. At least one.

GDDR6X is the same story as GDDR5X was in the past: it will not boost anything by itself, it can only clock higher than GDDR6, and that's it.
full member
Activity: 1424
Merit: 225
September 02, 2020, 02:10:19 PM
#27
I think, even today, it's not very likely that ETH switches to PoS (much like assuming that Bitcoin switches to PoS).

Except that Bitcoin never said it would go PoS. Maybe it's a perpetual threat to keep miners interested.

Anyway, back on topic...

The VRAM size gap between the 3090 and 3080 is way out of line with the CUDA core count difference. I wonder why.

Not an issue for mining, but the 3090 is the only one that supports SLI, and it's a 3-slot card that leaves no room in the case for ventilation.

Ampere uses CUDA 11 and compute capability 8.x, but I don't see anything that helps mining. Significant software improvements look unlikely.
newbie
Activity: 72
Merit: 0
September 02, 2020, 01:55:52 PM
#26

I hope they will be on sale in stores soon; that way the prices of the 5700 will drop a lot. AMD WILL ALWAYS BE BETTER FOR MINING.