$2,500 RTX 5090 (60% faster than 4090)

Eye tracking would be another one of those technologies (especially if the trend of monitors getting bigger for gamers continues, and it's obviously already there for VR). Only about 1-2 degrees of our vision needs to be high resolution (plus some margin of error around it), so if the system is fast enough, say 1000 fps, to react to quick eye movements, we could render 300 dpi on a very small portion of the screen and everything else at some old-school 720p density and quality, I would imagine, since outside that region our eyes only really see movement and bright changes in light.
Yea that too.
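
As a back-of-the-envelope sketch of what that buys (purely hypothetical numbers: a 4K panel covering roughly 40 degrees of horizontal FOV, a ~2 degree fovea plus a 3 degree margin for tracking error, and the periphery at about a third of the resolution per axis), the pixel budget drops dramatically:

```python
# Rough foveated-rendering pixel budget (illustrative numbers only, not a real renderer).
panel_w, panel_h = 3840, 2160        # 4K panel
fov_deg = 40.0                       # assumed horizontal field of view the monitor covers
fovea_deg = 2.0                      # high-acuity region of the eye (~1-2 degrees)
margin_deg = 3.0                     # safety margin for eye-tracking error and latency

full_pixels = panel_w * panel_h

# Approximate fraction of the screen area covered by the full-resolution foveal window.
fovea_frac = ((fovea_deg + margin_deg) / fov_deg) ** 2

fovea_pixels = full_pixels * fovea_frac                  # rendered at full density
periphery_pixels = full_pixels * (1 - fovea_frac) / 9.0  # periphery at ~1/3 resolution per axis

print(f"full-resolution render: {full_pixels / 1e6:.1f} Mpx")
print(f"foveated render:        {(fovea_pixels + periphery_pixels) / 1e6:.1f} Mpx")
```

With those made-up numbers it works out to something like an 8x reduction in shaded pixels, which is the whole appeal, provided the tracking and refresh are fast enough that you never notice the low-res periphery.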

I believe the Varjo VR headsets already have a functional implementation of that, with some caveats of course (I haven't personally tested one).
 
This has been floating around on Reddit today.
[Image: table of rumored RTX 5090 specifications]


If those specs are true, the 5090 will be ~60% faster: 50% more SMs/CUDA cores/RT cores/Tensor cores, a 15% clock increase, 77% more L2 cache, and 50% more memory bandwidth.
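
As a rough sanity check on that 60% figure (assuming performance scales with SM count times clock, which real games never do perfectly), the rumored numbers land in that neighborhood:

```python
# Naive uplift estimate from the rumored specs (not a real performance model).
sm_ratio = 1.50     # 50% more SMs / CUDA / RT / Tensor cores
clock_ratio = 1.15  # 15% higher clocks

theoretical = sm_ratio * clock_ratio  # ~1.7x raw shader throughput
print(f"theoretical compute uplift: +{(theoretical - 1) * 100:.0f}%")

# Games scale sub-linearly with extra SMs; an assumed ~0.8 scaling factor on the
# added cores lands right around the quoted ~60%.
effective = (1 + (sm_ratio - 1) * 0.8) * clock_ratio
print(f"rough effective estimate:   +{(effective - 1) * 100:.0f}%")
```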
 
This has been floating around on Reddit today.
[Image: table of rumored RTX 5090 specifications]

If those specs are true, the 5090 will be ~60% faster: 50% more SMs/CUDA cores/RT cores/Tensor cores, a 15% clock increase, 77% more L2 cache, and 50% more memory bandwidth.
Looks like a 24GB card then, relying on the increased bandwidth from GDDR7 over G6X and the larger L2.

If these end up being true.
 
AI reconstruction and frame gen are the future, not pure raster improvements. By the time a 4x stronger GPU releases, we'll also have higher refresh rates, higher resolutions, and more complex games, so suddenly you'll need 10x more power again or whatever (random number, but you get the idea).

I'm not saying we have hit the limits of raster, but when you see how much more power-hungry GPUs have become, you could actually argue that we have reached some limits already.

And AI reconstruction and frame gen make perfect sense when you think about it for a minute: a horrifying amount of GPU/CPU cycles is wasted on things humans cannot see or perceive whatsoever (which is why those techniques work so well, even if they're not perfect yet), so it's essentially just more software optimization, and it makes the dream of 1000 fps / 1000 Hz plus ultra-high resolution actually imaginable in our lifetime.
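
Purely as a toy illustration of the frame-gen idea (naive linear blending between two already-rendered frames, nothing like the motion-vector/optical-flow interpolation DLSS 3 actually uses):

```python
# Toy "frame generation": present a blended frame between two rendered frames,
# doubling presented frames without rendering more. Real frame gen uses motion
# vectors / optical flow instead of a plain blend.
import numpy as np

def blend(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linearly interpolate two HxWx3 uint8 frames."""
    mixed = frame_a.astype(np.float32) * (1.0 - t) + frame_b.astype(np.float32) * t
    return mixed.astype(np.uint8)

# Two fake 1080p "rendered" frames.
rendered = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8) for _ in range(2)]
presented = [rendered[0], blend(rendered[0], rendered[1]), rendered[1]]
print(f"rendered {len(rendered)} frames, presented {len(presented)}")
```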
Yes, as you point out, the hardware will be catching up for a long time, since it's a moving target. Upscaling, frame gen, and eye tracking are each good for roughly a 30% effective performance improvement before they start to noticeably degrade image quality, assuming developers have the time and resources to implement them properly. That's very good, but it still leaves the hardware playing catch-up for a long time to come. It makes me reluctant to spend a lot on a GPU in this climate. The 3090 got clobbered by the 4090, which is looking to get wrecked by the 5090, and even a ~$2,500 5090 probably won't be able to max out my display hardware in some of the games I play. Even at the best of times, buying GPUs is like investing in a melting ice cube, but in this climate it's 100 degrees outside, so I'm not willing to pay too much for my ice.
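
For what it's worth, if those ~30% figures held and stacked multiplicatively (an optimistic assumption, since the techniques partly overlap), the combined effective uplift looks like this:

```python
# Combined effective uplift if each technique gave ~30% and they stacked
# multiplicatively (optimistic; in practice the gains overlap).
gains = {"upscaling": 1.30, "frame generation": 1.30, "eye-tracked foveation": 1.30}

combined = 1.0
for factor in gains.values():
    combined *= factor

print(f"combined uplift: ~{(combined - 1) * 100:.0f}%  (~{combined:.1f}x)")
```

Call it a bit over 2x in the best case, roughly one hardware generation's worth of headroom, which is exactly why the hardware still ends up playing catch-up.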
 
but in this climate it’s 100 degrees outside,
Not sure GPUs have ever had a longer lifetime, better ability to keep their value, and a longer period before a refresh than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, around a decade after someone bought it; try doing that with a 1996 GPU in 2004...
 
Not sure GPUs have ever had a longer lifetime, better ability to keep their value, and a longer period before a refresh than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, around a decade after someone bought it; try doing that with a 1996 GPU in 2004...
I agree. GPUs have more staying power than ever before; you don't need to upgrade nearly as often.
 
Not sure GPUs have ever had a longer lifetime, better ability to keep their value, and a longer period before a refresh than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, around a decade after someone bought it; try doing that with a 1996 GPU in 2004...
Funnily enough, I gamed on a GeForce 3 Ti200 for years until a good deal on a 9800 Pro came along. This goes back to the split market I was talking about in an earlier post. Some people don't see the point of cards being any faster because they can already max out their displays with room to spare, while the VR and 4K 120+ Hz crowd can't get a fast enough card to run max settings at any price, and even the 5090 won't change that.
 
Not sure GPUs have ever had a longer lifetime, better ability to keep their value, and a longer period before a refresh than they do now.

Buying a GeForce 3 was much more of a melting ice cube. A six-year-old 2070 Super plays every game quite well and will continue to do so for popular AAA games until PS6-only games show up, around a decade after someone bought it; try doing that with a 1996 GPU in 2004...

Although I agree that this is the case, it's not so simple.

In the GeForce 3 era, the jump in tech/quality was MASSIVE. I remember the jump from Quake 3 to Doom 3. Doom 3 was unbelievably heavy, like 3-4 fps even with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.

Today? The evolution is much more constant, but it relies on existing tech and incremental updates (better shaders, larger shadow maps, etc.).

It's like ray-traced versus raster shadows. RT today is the main drag on GPU evolution.

Raster shadows look nearly as good as RT at almost no cost, but they're not as trustworthy/correct as ray-traced shadows, which are VERY costly.

Man, I'm old. I remember the jumps (maybe not entirely in chronological order):
Doom -> Duke3D - HOLY S***
Duke3D -> Quake 1 - HOLY S***
Quake 1 -> Unreal - HOLY S***
Unreal -> Doom 3 - HOLY S***
Doom 3 -> Far Cry - HOLY S***
Far Cry -> BF1 - HOLY S***
BF1 -> BF2 - HOLY S***
BF2 -> Crysis - HOLY S***
Then something happened. Maybe it's me getting older, but maybe it's the "diminishing returns" becoming apparent!?
Crysis to COD MW2 - Pretty good
MW2 to BF3 - Pretty good
BF3 to modern Unreal Engine - Pretty good
UE4/5 to Cyberfunky - Pretty good

And then VR came and..... broke me?! Monitor/2D/flat gaming is not as exciting as it was.

Playing FO4VR (or any open-world game that's "VRable" enough) gives a whole new sensory deluge that no image quality stuck on a screen can match.

In Half-Life: Alyx I spent hours just appraising the fine details of the weapons, gloves, and the world itself. It's hard to describe, really.

Anyway, the 5090 can't come soon enough; I hope it brings an uplift of at least 80% in VR performance.
 
Doom 3 was unbelievably heavy, like 3-4 fps even with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.
That's not how I remember it at all. The 9700 Pro (August 2002) and 5950 Ultra (October 2003) had launched before Doom 3 and could play it at launch (I did play Doom 3 at launch on my PC); Doom 3 was August 2004, so maybe you played some leaked pre-release build? Carmack and id Software are not really known for making hard-to-run titles; Doom 3 was running on the original 2001 Xbox with its custom GeForce 3.

More mid-level GPUs like the 6600 GT, a year later in 2005, could play it perfectly fine:

[Image: Doom 3 benchmark chart]


A year probably felt much longer back then than it does now.

But yes, that's mostly the reason: the exponential growth of GPUs and video games slowed down, all the easy fruit was grabbed, and now it is incredibly hard and costly to make anything better, with diminishing returns.

GPUs now have more use cases beyond playing the latest games than before: the library of old games is obviously much bigger, plus video decoding/encoding, crypto for a while, and now AI workloads of many kinds...
 
That's not how I remember it at all. The 9700 Pro (August 2002) and 5950 Ultra (October 2003) had launched before Doom 3 and could play it at launch (I did play Doom 3 at launch on my PC); Doom 3 was August 2004, so maybe you played some leaked pre-release build? Carmack and id Software are not really known for making hard-to-run titles; Doom 3 was running on the original 2001 Xbox with its custom GeForce 3.

More mid-level GPUs like the 6600 GT, a year later in 2005, could play it perfectly fine:

[Image: Doom 3 benchmark chart]

A year probably felt much longer back then than it does now.

But yes, that's mostly the reason: the exponential growth of GPUs and video games slowed down, all the easy fruit was grabbed, and now it is incredibly hard and costly to make anything better, with diminishing returns.

GPUs now have more use cases beyond playing the latest games than before: the library of old games is obviously much bigger, plus video decoding/encoding, crypto for a while, and now AI workloads of many kinds...

You are correct about the cards, but for me, those were impossible to get at the time.

IIRC I had a GeForce 2 GTS at the time!
 
Although I agree that this is the case, it's not so simple.

In the GeForce 3 era, the jump in tech/quality was MASSIVE. I remember the jump from Quake 3 to Doom 3. Doom 3 was unbelievably heavy, like 3-4 fps even with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.

Today? The evolution is much more constant, but it relies on existing tech and incremental updates (better shaders, larger shadow maps, etc.).

It's like ray-traced versus raster shadows. RT today is the main drag on GPU evolution.

Raster shadows look nearly as good as RT at almost no cost, but they're not as trustworthy/correct as ray-traced shadows, which are VERY costly.

Man, I'm old. I remember the jumps (maybe not entirely in chronological order):
Doom -> Duke3D - HOLY S***
Duke3D -> Quake 1 - HOLY S***
Quake 1 -> Unreal - HOLY S***
Unreal -> Doom 3 - HOLY S***
Doom 3 -> Far Cry - HOLY S***
Far Cry -> BF1 - HOLY S***
BF1 -> BF2 - HOLY S***
BF2 -> Crysis - HOLY S***
Then something happened. Maybe it's me getting older, but maybe it's the "diminishing returns" becoming apparent!?
Crysis to COD MW2 - Pretty good
MW2 to BF3 - Pretty good
BF3 to modern Unreal Engine - Pretty good
UE4/5 to Cyberfunky - Pretty good

And then VR came and..... broke me?! Monitor/2D/flat gaming is not as exciting as it was.

Playing FO4VR (or any open-world game that's "VRable" enough) gives a whole new sensory deluge that no image quality stuck on a screen can match.

In Half-Life: Alyx I spent hours just appraising the fine details of the weapons, gloves, and the world itself. It's hard to describe, really.

Anyway, the 5090 can't come soon enough; I hope it brings an uplift of at least 80% in VR performance.
I'd drain my bank account to play through Half-Life: Alyx at max settings on a VR headset that's 4K per eye with MicroLED.
 
Although I agree that this is the case, it's not so simple.

In the GeForce 3 era, the jump in tech/quality was MASSIVE. I remember the jump from Quake 3 to Doom 3. Doom 3 was unbelievably heavy, like 3-4 fps even with an overclock.

It simply erected a wall that no GPU at the time could surpass. It took YEARS for it to be playable on a home PC.

Today? The evolution is much more constant, but it relies on existing tech and incremental updates (better shaders, larger shadow maps, etc.).

It's like ray-traced versus raster shadows. RT today is the main drag on GPU evolution.

Raster shadows look nearly as good as RT at almost no cost, but they're not as trustworthy/correct as ray-traced shadows, which are VERY costly.

Man, I'm old. I remember the jumps (maybe not entirely in chronological order):
Doom -> Duke3D - HOLY S***
Duke3D -> Quake 1 - HOLY S***
Quake 1 -> Unreal - HOLY S***
Unreal -> Doom 3 - HOLY S***
Doom 3 -> Far Cry - HOLY S***
Far Cry -> BF1 - HOLY S***
BF1 -> BF2 - HOLY S***
BF2 -> Crysis - HOLY S***
Then something happened. Maybe it's me getting older, but maybe it's the "diminishing returns" becoming apparent!?
Crysis to COD MW2 - Pretty good
MW2 to BF3 - Pretty good
BF3 to modern Unreal Engine - Pretty good
UE4/5 to Cyberfunky - Pretty good

And then VR came and..... broke me?! Monitor/2D/flat gaming is not as exciting as it was.

Playing FO4VR (or any open-world game that's "VRable" enough) gives a whole new sensory deluge that no image quality stuck on a screen can match.

In Half-Life: Alyx I spent hours just appraising the fine details of the weapons, gloves, and the world itself. It's hard to describe, really.

Anyway, the 5090 can't come soon enough; I hope it brings an uplift of at least 80% in VR performance.

I had an ATI Radeon 9600 XT back then, and it really struggled with Doom 3 and Far Cry at 640x480 on a CRT monitor. Only Half-Life 2 ran decently.

It was only when I got a GeForce 7600 GT a few years later that I was finally able to play Doom 3 and Far Cry properly, even at a higher resolution on a 1024x768 LCD.
 
I'd drain my bank account to play through Half-Life: Alyx at max settings on a VR headset that's 4K per eye with MicroLED.

I believe it's coming before 2030. The Pimax Crystal Super is looking VERY interesting at 3840x3840 (QLED, alas).
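
To put that in perspective (assuming that 3840x3840 figure is per eye, and a typical 90 Hz VR refresh versus a 120 Hz 4K monitor), the raw pixel throughput is brutal:

```python
# Rough pixel-throughput comparison (illustrative; VR also adds overdraw margin
# and distortion-corrected render targets larger than the panel itself).
monitor_px = 3840 * 2160        # ~8.3 Mpx 4K monitor
headset_px = 2 * 3840 * 3840    # two eyes at 3840x3840 each, ~29.5 Mpx

print(f"monitor: {monitor_px / 1e6:.1f} Mpx @ 120 Hz -> {monitor_px * 120 / 1e9:.2f} Gpx/s")
print(f"headset: {headset_px / 1e6:.1f} Mpx @  90 Hz -> {headset_px * 90 / 1e9:.2f} Gpx/s")
```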

I had an ATI Radeon 9600 XT back then, and it really struggled with Doom 3 and Far Cry at 640x480 on a CRT monitor. Only Half-Life 2 ran decently.

It was only when I got a GeForce 7600 GT a few years later that I was finally able to play Doom 3 and Far Cry properly, even at a higher resolution on a 1024x768 LCD.
I know, right!? Doom 3 was a brick to run.
 
That's not how I remember it at all. The 9700 Pro (August 2002) and 5950 Ultra (October 2003) had launched before Doom 3 and could play it at launch (I did play Doom 3 at launch on my PC); Doom 3 was August 2004, so maybe you played some leaked pre-release build? Carmack and id Software are not really known for making hard-to-run titles; Doom 3 was running on the original 2001 Xbox with its custom GeForce 3.

More mid-level GPUs like the 6600 GT, a year later in 2005, could play it perfectly fine:

[Image: Doom 3 benchmark chart]

A year probably felt much longer back then than it does now.

But yes, that's mostly the reason: the exponential growth of GPUs and video games slowed down, all the easy fruit was grabbed, and now it is incredibly hard and costly to make anything better, with diminishing returns.

GPUs now have more use cases beyond playing the latest games than before: the library of old games is obviously much bigger, plus video decoding/encoding, crypto for a while, and now AI workloads of many kinds...
Doom 3 was playable on my 9700 Pro at 1024x768 with medium or low IQ settings... not super great, but I finished the game and enjoyed it on that card. My buddy upgraded to a 9800 Pro when that came out, and Doom 3 ran flat-out well on that card at 1024x768 with high IQ settings. I then played through it again on a 6800 GT and really enjoyed it with everything cranked. I also modded Doom 3 (shaders and more) later on, when I ran GTX 470s in Surround, and wow, I think I brought it to the apex of its IQ potential at 5760x1080 on those cards. I spent a bunch of time modding it in the summer of 2010.
 
I call it quits at $1k. I like PC gaming, but I don't like it that much. Hiking and camping equipment can go a long way with $2.5k. Fuck this shit. Really? Two thousand five hundred dollars for a freaking video card? I will hang onto my 40-series for a while.

If this is what video card prices are going to be, I'll just quit PC gaming entirely. No way in hell would I ever pay that for a video card. Hells no. And the crazy thing is people will line up in droves to spend 1/4 of $10k to buy this card. No way in hell for me.
 
I call it quits at $1k. I like PC gaming, but I don't like it that much. Hiking and camping equipment can go a long way with $2.5k. Fuck this shit. Really? Two thousand five hundred dollars for a freaking video card? I will hang onto my 40-series for a while.

If this is what video card prices are going to be, I'll just quit PC gaming entirely. No way in hell would I ever pay that for a video card. Hells no. And the crazy thing is people will line up in droves to spend 1/4 of $10k to buy this card. No way in hell for me.
Definitely with you on this sentiment for the most part, especially about how far that kind of money will go for hiking, camping, and my newer hobby, photography. The only way I justified the $1,500 4090 FE was the perception of a great deal (i.e., using the 10% Best Buy credit card code that no longer works, sadly), and committing to keeping it until the 6000-series. I don't think I could rationalize $2,500 on a GPU unless I were highly confident it would be good for 5ish years and viable for 8ish years.
 
I call it quits at $1k. I like PC gaming, but I don't like it that much. Hiking and camping equipment can go a long way with $2.5k. Fuck this shit. Really? Two thousand five hundred dollars for a freaking video card? I will hang onto my 40-series for a while.

If this is what video card prices are going to be, I'll just quit PC gaming entirely. No way in hell would I ever pay that for a video card. Hells no. And the crazy thing is people will line up in droves to spend 1/4 of $10k to buy this card. No way in hell for me.
That's not what "video cards" cost; that's the (presumed) cost of that video card. I'm not quite sure how the price of the flagship card has any bearing on your enjoying video games as a hobby, given there is an entire lineup of GPUs from the current and past generations at every price point you could imagine.

Unless you're saying you cannot enjoy video games without the top-end GPU available?
 
the price of the flagship card
True enough. Outside of chasing very high settings at 4K, say the classic high-settings 1440p affair, how many games worth playing would not run perfectly fine on a 6800 XT / 7800 XT?

Maybe Hellblade 2 will be one, we'll see (if 30 fps doesn't play well). At the end of the day, all the big new games have to run on 2020 mid-tier PC hardware at at least 30 fps. Buying twice that hardware (a 4070 Ti / 7900 GRE with a 7700X) to have a lot of headroom is obviously not cheap, but it's far from $2,500... for a single item.
 
That's not what "video cards" cost; that's the (presumed) cost of that video card. I'm not quite sure how the price of the flagship card has any bearing on your enjoying video games as a hobby, given there is an entire lineup of GPUs from the current and past generations at every price point you could imagine.

Unless you're saying you cannot enjoy video games without the top-end GPU available?

It's like people forget that cards below the flagship tier actually exist :ROFLMAO: :ROFLMAO:
 
they have always been poorly priced
There were windows when it did not feel like that was the case: 9700 Pro, 1080 Ti, 8800 GTX, 680.

Take the 7800X3D, arguably the best CPU for gaming in existence, better than $10,000 CPUs; people will say it is reasonably priced (even though its price per mm² is way higher than a 4090's; the 4090 has something like 8.5x the die size on a high-quality node, plus a little computer with 24GB of really fast VRAM and the giant cooler that comes with it).
 
There were windows when it did not feel like that was the case: 9700 Pro, 1080 Ti, 8800 GTX, 680.

Take the 7800X3D, arguably the best CPU for gaming in existence, better than $10,000 CPUs; people will say it is reasonably priced (even though its price per mm² is way higher than a 4090's; the 4090 has something like 8.5x the die size on a high-quality node, plus a little computer with 24GB of really fast VRAM and the giant cooler that comes with it).

I wasn't talking about CPUs. You could game just fine on a $200 CPU for like the past decade. And I guess I'll have to revise my original statement: flagship GPUs have been poorly priced for about 10 years now, starting with the original GTX Titan. $1,000 in 2013 is about $1,345 in today's money, which isn't that far off from the $1,600 4090.
 
That's not what "video cards" cost; that's the (presumed) cost of that video card. I'm not quite sure how the price of the flagship card has any bearing on your enjoying video games as a hobby, given there is an entire lineup of GPUs from the current and past generations at every price point you could imagine.

Unless you're saying you cannot enjoy video games without the top-end GPU available?
If NVIDIA has its way and AMD stops competing, then prices on all NVIDIA cards will keep going up, to the point where it will be necessary to spend over $1k to play newer/recent games at native resolution on current-gen 1440p and 4K TVs and monitors.
 
If NVIDIA has its way and AMD stops competing, then prices on all NVIDIA cards will keep going up, to the point where it will be necessary to spend over $1k to play newer/recent games at native resolution on current-gen 1440p and 4K TVs and monitors.

To a certain degree, yes. The 4070 Ti was $800, so almost $1k for an xx70-series GPU, which is pretty nuts. But I don't think we are going to get to the point where an xx50-tier GPU, like say an RTX 6050, will cost $1k. Not anytime soon at least, but sure, I can see that happening if AMD is out of the picture entirely.
 
If NVIDIA has its way and AMD stops competing, then prices on all NVIDIA cards will keep going up, to the point where it will be necessary to spend over $1k to play newer/recent games at native resolution on current-gen 1440p and 4K TVs and monitors.
Sure, if you chase native 4K it can be quite something (it always has been). But at reasonable resolutions and settings, as long as cheap game consoles are popular and game studios make games that can run on them, it is hard to imagine this ever being the case.

For special PC-edition niche titles, yes, maybe; for mainstream titles... doubtful. Big games cost a lot, they need a large install base, and GPUs that expensive will probably stay somewhat rare.

This is probably as hard as studios can push a game that still has to run on the PS5 and Xbox Series X:

[Image: Senua's Saga: Hellblade II PC requirements chart]


Until PS6-only games launch in 2028... a 3080/6800 XT should do fine, and a 6700/2070 Super should be able to run them upscaled.
 
That's not what "video cards" cost; that's the (presumed) cost of that video card. I'm not quite sure how the price of the flagship card has any bearing on your enjoying video games as a hobby, given there is an entire lineup of GPUs from the current and past generations at every price point you could imagine.

Unless you're saying you cannot enjoy video games without the top-end GPU available?
Well, one can argue that what used to be the mainstream price points have completely stagnated in gen-on-gen uplift. Look at the 4060 vs. 3060, the 4060 Ti vs. 3060 Ti, etc. Sure, there are feature differences, but the overall gen-over-gen gain was minimal while the cost went up. So... yes. You kind of do need to be a high-end buyer to get any worthwhile upgrade on the NVIDIA side.
 
was minimal while the cost went up
It depends where in the stack: the 4060 launched with a lower MSRP ($300) than the 3060 ($329), so sometimes it's minimal change while the cost stayed about the same, instead of minimal change plus price creep like the 4060 Ti and 4070.
 
I wasn't talking about CPUs. You could game just fine on a $200 CPU for like the past decade. And I guess I'll have to revise my original statement: flagship GPUs have been poorly priced for about 10 years now, starting with the original GTX Titan. $1,000 in 2013 is about $1,345 in today's money, which isn't that far off from the $1,600 4090.
And massively different from the $499 GTX 580 that came before the $1,000 GTX Titan. I can buy that top-end large silicon is a lot more expensive to make these days, but back then they sure did a great job of completely shifting the concept/branding/pricing of top-end silicon. $499 in 2010 is about $718 in today's money.
 
And massively different from the $499 GTX 580 that came before the $1,000 GTX Titan. I can buy that top-end large silicon is a lot more expensive to make these days, but back then they sure did a great job of completely shifting the concept/branding/pricing of top-end silicon. $499 in 2010 is about $718 in today's money.

Of course, but I'm just saying that overpriced flagship GPUs have been a thing for over a decade already; this isn't anything new. If one wanted to "call it quits at $1k", they should have quit a decade ago when the Titan launched for $1,000. And if you don't want to count the Titan because it's a Titan, then you still would have quit all the way back in 2018 when the 2080 Ti launched for $1,200.
 