I have a feeling that is exactly what Nvidia is trying to do. I mean, most people don't plan to upgrade from their GTX 1080 Ti because it has 11GB, and with a 1440p or 4K monitor they can still play many games at 4K with that amount of VRAM. 10GB for a 3080 in this day and age is low; I'd say 16GB is the minimum for a card like that, though even 12GB would be doable. It's funny, 24GB on the 3090 versus 10GB on the 3080: Nvidia really wants people to buy the 3090. Nvidia simply killed the 3080 even before launching it, dead on arrival.
The 3080 may be good for mining if GDDR6X proves it can hash significantly better than GDDR6, GDDR5X, and GDDR5. We'll see.
Besides, there are gamers who can still push their GPU budget up to $700 but not to $1,500, and if results show it's better than the 3070 by enough, an extra $100 can justify the upgrade.
Then a 3070 Ti with 16GB of GDDR6 will be an upgrade because it can play games that require more than 10GB, so 3070 users will upgrade to the 3070 Ti, and 3080 users with 10GB will upgrade to a 3080 Ti with 16GB.
I think the 8nm 3000-series cards will be a long-lifespan generation before the next big breakthrough comes, which is where Nvidia decides to pull off this "8-10GB to be upgraded to 16GB" double-sell technique, LOL.
Anyway, if you buy a 3080 10GB ($700) and then upgrade to a 3080 Ti 16GB ($800), that's $1,500 all in all, just like the 3090's price. Me? Fuck that, spare me the trouble; I'm getting the 3090 24GB for my main PC, hehe.
Some will say, why not just wait for the 3080 Ti then? Well, mining profit at 50% ROI (a modest estimate) will make your $1,500 purchase equal to a $750 card, LOL.
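Quick back-of-the-envelope for both points above; the prices and the 50% ROI figure are just the assumptions from this post, not real market or mining data:

```python
# Sketch of the "double purchase vs. 3090" and mining-ROI math above.
# All figures are the post's assumptions, not measured prices or mining returns.
price_3080 = 700      # assumed 3080 10GB launch price (USD)
price_3080ti = 800    # assumed future 3080 Ti 16GB price (USD)
price_3090 = 1500     # assumed 3090 24GB price (USD)
mining_roi = 0.50     # assumed mining return as a fraction of the purchase price

double_purchase = price_3080 + price_3080ti     # 1500: same outlay as a 3090
effective_3090 = price_3090 * (1 - mining_roi)  # 750: what the 3090 "really" costs after mining

print(f"3080 now + 3080 Ti later:  ${double_purchase}")
print(f"3090 after 50% mining ROI: ${effective_3090:.0f}")
```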
The 3070 Ti will have to be a hell of a card. I mean, the 3070 is a disappointment, and maybe that is what Nvidia intended: the 3070's raw performance is so far behind the 3080 that I think the 3070 is really a 3060 Super, because of the 256-bit memory bus, hehe. Nvidia launched an early 3060 Super, hehe, and that is the 3070; the 3070 Ti might be the real 3070. Looking back at 2016, there was a GTX 1070 with a 256-bit bus and a 1080 with a 256-bit bus too, and the 1070 was clearly the only GPU to have; Nvidia killed the 1080 on arrival, and now they've chosen to kill the 3070 on arrival too, hehe. If it weren't for the 320-bit memory and amazing raw performance of the 3080, that would be dead on arrival too. You can still use it for a year or two, depending on the games you play and the monitor you have. I myself have a 4K monitor and I just can't go back to 1440p anymore.
Nvidia chose 8nm because they previously went 12nm, so 16nm, 12nm, 8nm, and likely 4nm next, and they chose Samsung because of favours between friends, hehe; TSMC is more inclined toward AMD.
There is no point waiting for the 3080 Ti; if it comes, it will be next year, around March to August, and that is too far away. AMD could have a 5nm Ampere killer by then. AMD is not like before; we still have to see this Big Navi, but AMD has changed. They may still lag behind Nvidia this time, but they are getting closer with every release. The 5700 was amazing: I did not think they could compete equally with the 12nm 2070 on price/performance, and they did. So I'm not counting AMD out at this point.
Just like in the past, Nvidia is competing with itself, a sign that AMD might again end up under Nvidia.
I agree with not counting out AMD. The Fury with HBM VRAM did really well, and the Radeon VII did very well too (100 MH/s ETH hashrate; BTW, the cores are the bottleneck there, LOL). If Big Navi uses new-generation HBM it really can compete, but if they use GDDR6, well, Nvidia wins again.
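For context, Ethash is basically memory-bandwidth-bound, so here's a rough sketch of why the HBM cards hash so well. The bytes-per-hash constant comes from the Ethash algorithm (64 accesses of 128 bytes per hash), and the bandwidth figures are the cards' published specs; treat the outputs as theoretical ceilings, not real-world hashrates:

```python
# Rough Ethash hashrate ceiling from memory bandwidth alone.
# Ethash fetches 64 mix pages of 128 bytes from the DAG per hash -> 8192 bytes/hash.
BYTES_PER_HASH = 64 * 128

def ethash_ceiling_mhs(bandwidth_gbps: float) -> float:
    """Theoretical upper bound in MH/s given memory bandwidth in GB/s."""
    return bandwidth_gbps * 1e9 / BYTES_PER_HASH / 1e6

# Published memory bandwidths (GB/s); real hashrates land below these ceilings.
for card, bw in [("Radeon VII (HBM2)", 1024),
                 ("RTX 3080 (GDDR6X)", 760),
                 ("RX 5700 XT (GDDR6)", 448)]:
    print(f"{card:20s} ~{ethash_ceiling_mhs(bw):5.0f} MH/s ceiling")
```

The Radeon VII's ~125 MH/s bandwidth ceiling versus its real ~100 MH/s arguably fits the point above: the HBM2 gives it more bandwidth than the compute side can keep fed, so the cores end up as the limit on that card.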
amd in "gpu arena" did a thread ripper fuck up style with early HBM adaptions...we will see if they can pull a ryzen 9 39xx thing with big navi GPUs..(watching HBM tech closely)
Those benches look very close to what I predicted: the 3090 around 20% faster than the 3080, and the 3080 around 50% faster than the 3070.
I predicted a 2x hashrate of 100 MH/s as a possibility; people are so 2017 with their 50 MH/s per card, LOL. I also predicted up to 150 MH/s as unlikely but not impossible; driver optimizations and mining software tweaks might squeeze out more hashes.
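Sanity-checking those guesses against the same bandwidth-bound model as above (stock 3080 bandwidth of 760 GB/s is the published spec; everything else is just the post's assumptions):

```python
# Check the 100 MH/s and 150 MH/s guesses against a simple bandwidth-bound model.
BYTES_PER_HASH = 64 * 128  # Ethash DAG traffic per hash (8192 bytes)

def required_bandwidth_gbps(target_mhs: float) -> float:
    """Memory bandwidth (GB/s) needed to sustain a target Ethash rate."""
    return target_mhs * 1e6 * BYTES_PER_HASH / 1e9

for target in (100, 150):
    print(f"{target} MH/s needs ~{required_bandwidth_gbps(target):.0f} GB/s of memory bandwidth")

# 100 MH/s -> ~819 GB/s: plausibly within reach of a 3080 (760 GB/s stock) with a memory overclock.
# 150 MH/s -> ~1229 GB/s: far beyond GDDR6X on a 320-bit bus, so driver and miner tweaks alone
#             probably won't get there.
```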