
4080s are Overpriced and not selling


AbsolutSurgen


Apologies for making this thread my personal tech help page. 

 

That being said...

 

My card should be here this week, whereas my processor will still be a few weeks out. 

 

I'm curious - is it optimal to wait to install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it gets here? 

 

It seems like I shouldn't have to do anything on the OS / driver side, just pray everything works? 


3 minutes ago, Paperclyp said:

I'm curious - is it optimal to wait to install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it gets here? 


There’s no reason to wait. In fact, it’s better to do the GPU first so you can see the difference the new CPU adds.

 

4 minutes ago, Paperclyp said:

It seems like I shouldn't have to do anything on the OS / driver side, just pray everything works?


You’ll want to reinstall your GPU drivers (and do a clean install).
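
If you want a quick sanity check once the new drivers are in, here's a minimal sketch (Python, assuming an Nvidia card, since nvidia-smi ships with the driver) that just confirms the card and driver are being picked up:

```python
import subprocess

def check_gpu() -> None:
    # nvidia-smi is installed alongside the Nvidia driver; a clean install should leave it on PATH
    try:
        result = subprocess.run(["nvidia-smi"], capture_output=True, text=True, check=True)
        print(result.stdout)
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("GPU/driver not detected - re-run the clean driver install")

if __name__ == "__main__":
    check_gpu()
```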


3 hours ago, Paperclyp said:

Apologies for making this thread my personal tech help page. 

 

That being said...

 

My card should be here this week, whereas my processor will still be a few weeks out. 

 

I'm curious - is it optimal to wait to install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it gets here? 

 

It seems like I shouldn't have to do anything on the OS / driver side, just pray everything works? 

 

Use that week to get some before/after benchmarks, and if you want to post them here, I'd be curious to see the difference. What games do you plan to play?

 

You shouldn't have to do any software/driver changes when changing the CPU. I'd suggest overclocking it out of the box; it should do 4.2-4.4 GHz without any fine tuning or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5.0 GHz max.
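
If you do collect before/after numbers, a rough sketch like this can summarize them. It assumes you've dumped average FPS per game into two hypothetical CSVs (before.csv and after.csv, with game,avg_fps rows) - adjust to whatever your benchmarking tool actually exports:

```python
import csv

def load(path: str) -> dict[str, float]:
    """Read game,avg_fps rows from a hypothetical benchmark export."""
    with open(path, newline="") as f:
        return {row["game"]: float(row["avg_fps"]) for row in csv.DictReader(f)}

before, after = load("before.csv"), load("after.csv")
for game in sorted(before.keys() & after.keys()):
    gain = after[game] / before[game] - 1
    print(f"{game}: {before[game]:.0f} -> {after[game]:.0f} fps ({gain:+.0%})")
```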


1 hour ago, cusideabelincoln said:

 

Use that week to get some before/after benchmarks, and if you want to post them here, I'd be curious to see the difference. What games do you plan to play?

 

You shouldn't have to do any software/driver changes when changing the CPU. I'd suggest overclocking it out of the box; it should do 4.2-4.4 GHz without any fine tuning or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5.0 GHz max.

Cool. 

 

Do I just go into the bios to overclock? Never done it before...

 

I'm not sure what I'll play. I'll def boot up the RE Village demo because it has some pretty cool modular settings, so I just want to compare framerates and whatnot with certain things turned on. Otherwise, what I've been playing recently is pretty low-end. I have been toying around with starting a new Valheim game. I could potentially start Nier: Automata, or I'm thinking about re-starting Doom Eternal.


Hope this keeps up. I'm hoping AMD closes the gap and makes GPUs more affordable in performance per dollar, so I can buy a 5090 or 5090 Ti {most likely a 5080 Ti, but I can dream} for around $1,500 and be done with it again for the next 3 generations at least, maybe even 4. I've gone from my 2080 Ti till now, and that's the longest EVER, so I'm excited about the "NEXT" big GPU from Nvidia. By that time RTX with DLSS 2/3 should be ironed out, and everything will run locked at 60-120 fps with Vsync on. That's the dream. Though with the LG CX or LG C1, Vsync isn't needed anyway!


I think they're going to be at an interesting point. With the "next gen" console generation really just starting to get regular "current gen only" titles, the consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high frame rates at 4K... 

As a 3080 owner, I have zero interest in a 40X0 card for gaming purposes. Additionally, now that crypto is struggling the artificial demand for gaming cards isn't there either.

 

The biggest thing, at this point, is where does Nvidia go from here beyond efficiency? Better graphics have darn near hit the point of diminishing returns on more photo-realism vs animation, IMHO.


11 hours ago, Dexterryu said:

I think they're going to be at an interesting point. With the "next gen" console generation really just starting to get regular "current gen only" titles, the consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high frame rates at 4K... 

As a 3080 owner, I have zero interest in a 40X0 card for gaming purposes. Additionally, now that crypto is struggling the artificial demand for gaming cards isn't there either.

 

The biggest thing, at this point, is where does Nvidia go from here beyond efficiency? Better graphics have darn near hit the point of diminishing returns on more photo-realism vs animation, IMHO.

 

Probably better depth rendering so environments can be more open with less restrictions on sightlines or better/complex VFX.


17 hours ago, Dexterryu said:

I think they're going to be at an interesting point. With the "next gen" console generation really just starting to get regular "current gen only" titles, the consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high frame rates at 4K... 

As a 3080 owner, I have zero interest in a 40X0 card for gaming purposes. Additionally, now that crypto is struggling the artificial demand for gaming cards isn't there either.

 

The biggest thing, at this point, is where does Nvidia go from here beyond efficiency? Better graphics have darn near hit the point of diminishing returns on more photo-realism vs animation, IMHO.

 

We need higher resolution still (12k), better RT performance so games can be 100% ray traced via path tracing, and increases in overall performance so games can run at the aforementioned 12k at 120-360fps. There's still quite a bit of room for growth, but the main things are certainly resolution and framerate. If you're gaming below 4k, I can understand not wanting to upgrade your 3080, but for me, the 3080 was nowhere near cutting it for 4k 60fps most of the time, let alone 4k 120fps.


23 minutes ago, Spork3245 said:

 

We need higher resolution still (12k), better RT performance so games can be 100% ray traced via path tracing, and increases in overall performance so games can run at the aforementioned 12k at 120-360fps. There's still quite a bit of room for growth, but the main things are certainly resolution and framerate. If you're gaming below 4k, I can understand not wanting to upgrade your 3080, but for me, the 3080 was nowhere near cutting it for 4k 60fps most of the time, let alone 4k 120fps.

 

I do game at QHD. Having had a 4k monitor in the past, I just couldn't really tell the difference vs being able to go 144hz & 1000nit HDR. Again, diminishing returns on higher resolutions.

 

6 hours ago, SuperSpreader said:

 

Probably better depth rendering so environments can be more open with less restrictions on sightlines or better/complex VFX.

 

And this is where it's going to be a tougher sell because games won't necessarily "look" better. So there's not going to be that visual jump like you'd expect.

 

Not saying that higher frames or more depth is bad. Just that for most people it's not going to make them do a $1000+ upgrade.


8 minutes ago, Dexterryu said:

Again, diminishing returns on higher resolutions.

 

Strong disagree. I have a 1440p ultrawide next to my 4k 144hz 1000nit HDR monitor and the difference is super apparent to me. 12k is when aliasing is no longer perceptible without AA.
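
For anyone who wants to play with the numbers, here's a back-of-the-envelope pixels-per-degree calculation using the common ~60 px/degree acuity rule of thumb. The 27" panel width, 24" viewing distance, and the 11520-wide "12k" figure are just assumptions for illustration:

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    """Horizontal pixels packed into one degree of the viewer's field of view."""
    fov_deg = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return h_pixels / fov_deg

# A ~27" 16:9 panel is about 23.5" wide; viewed from ~24"; "12k" assumed to be 11520 pixels wide
for name, h_px in (("1440p", 2560), ("4K", 3840), ("8K", 7680), ("12K", 11520)):
    print(f"{name}: ~{pixels_per_degree(h_px, 23.5, 24):.0f} px/deg (acuity rule of thumb: ~60)")
```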


13 minutes ago, Spork3245 said:

 

Strong disagree. I have a 1440p ultrawide next to my 4k 144hz 1000nit HDR monitor and the difference is super apparent to me. 12k is when aliasing is no longer perceptible without AA.

To each their own then. The biggest visual impact for me has been HDR vs resolution. Again, not saying they are bad things, just not worth the asking price.


I'm building my new PC this week, and I am sticking with 1440p (165hz) for the next 5-year cycle. 4k is impressive, but for me I'd much rather have high, stable refresh rates. 

 

My build (assembling this weekend):

  • 5800x3D
  • 6950 XT
  • 32GB DDR4-3600
  • MSI Mag B550 Tomahawk mobo
  • LG 27GP83B-B monitor (27" 1440p 165hz)
  • Bunch of M.2 and other SSDs

1 minute ago, CitizenVectron said:

I'm building my new PC this week, and I am sticking with 1440p (165hz) for the next 5-year cycle. 4k is impressive, but for me I'd much rather have high, stable refresh rates. 

 

My build (assembling this weekend):

  • 5800x3D
  • 6950 XT
  • 32GB DDR4-3600
  • MSI Mag B550 Tomahawk mobo
  • LG 27GP83B-B monitor (27" 1440p 165hz)
  • Bunch of M.2 and other SSDs

 

Woot fellow AMD bro. 

 

I am waiting for CES to see what new monitors will come out. I'm at 120Hz 5120x1440 and would like to stay the same, but maybe with an OLED display. I work from home, though, so I'm concerned about burn-in. Lol, gonna have to turn off my monitor on breaks and, like, make sure I move windows around a bit.


2 minutes ago, Spork3245 said:

 

It's not particularly great, but it seems to OC very well like the 4080. Biggest issue is the price compared to previous xx70 Ti offerings. It's a better value than the 7900 XT, IMO.

 

Ya, the 7900 XT should be less in and of itself. With the 4070, it needs at least a $100 drop.


4 minutes ago, Zaku3 said:

 

Ya, the 7900 XT should be less in and of itself. With the 4070, it needs at least a $100 drop.

 

Yea, the XT is $100 more than the 4070 Ti. Without an OC and with no RT, the XT is around 6-9% faster (OC'd, they're pretty much 1:1, but that's with the XT not OC'd, and I don't know how well it OCs), but with RT enabled the 4070 Ti comes out on top by like +25%.
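
Rough way to weigh that trade-off, using the ballpark numbers above (thread figures, not benchmarks - swap in whatever prices and deltas you trust):

```python
# 7900 XT performance relative to the 4070 Ti per workload (thread ballpark, not measured)
XT_VS_TI = {"raster": 1.075, "rt": 1 / 1.25}   # ~6-9% raster lead vs ~25% RT deficit
XT_PRICE, TI_PRICE = 899, 799                  # assumed prices in USD

def blended_ratio(rt_share: float) -> float:
    """7900 XT perf relative to the 4070 Ti, weighted by how much of your play time uses RT."""
    return (1 - rt_share) * XT_VS_TI["raster"] + rt_share * XT_VS_TI["rt"]

for rt_share in (0.0, 0.25, 0.5):
    perf = blended_ratio(rt_share)
    value = perf / (XT_PRICE / TI_PRICE)   # perf-per-dollar relative to the 4070 Ti
    print(f"RT share {rt_share:.0%}: XT = {perf:.2f}x the perf, {value:.2f}x the perf-per-dollar")
```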


6 minutes ago, Zaku3 said:

 

Woot fellow AMD bro. 

 

I am waiting for CES to see what new monitors will come out. I'm at 120Hz 5120x1440 and would like to stay the same, but maybe with an OLED display. I work from home, though, so I'm concerned about burn-in. Lol, gonna have to turn off my monitor on breaks and, like, make sure I move windows around a bit.

 

I work from home 60% of the time, and with this new monitor I will have:

 

  • 1 x 27" 1440p (Freesync)
  • 1 x 27" 1440p (Gsync)

 

So, I'll be using the new monitor for gaming, but both monitors for productivity. I considered going ultrawide, but I prefer to just have two monitors.


16 minutes ago, Spork3245 said:

 

It's not particularly great, but it seems to OC very well like the 4080. Biggest issue is the price compared to previous xx70 Ti offerings. It's a better value than the 7900 XT, IMO.

 

If I were to buy a card soon (I probably won't), I wouldn't consider that labeling relevant at all. There are plenty of cards that can be ignored if you don't take the gen-on-gen comparisons too literally.

 

I'd sooner think of this card as a better 3080 with DLSS3.  Seems like a good card for someone like me primarily interested in 4k60 gaming, and a much better option for price vs. performance than the 4080.

 

$800 is still way too much for me to stomach though.  It's only a 'good' MSRP in a terrible time for GPU pricing.


19 minutes ago, Spork3245 said:

 

I don't disagree, but that's been the main argument people make against the 4080, despite the performance-per-dollar being better than the 4090.


I thought the argument was more that the 4080 was a shit deal compared to 30xx cards on sale, performance per dollar.

 

I suppose now you could also say it’s a shit deal compared to a 4070 Ti. There’s no way a 4080 is worth $400 more.


I honestly have no idea what Nvidia and AMD are doing this gen. It seems like the options are to go sell a kidney and buy a 4090, or say fuck off and buy a PS5; the performance bump on the 4080/4070 vs the previous gen is not worth the cost. AMD is a bit better, but still overpriced by at least 25%.


17 minutes ago, crispy4000 said:


I thought the argument was more that the 4080 was a shit deal compared to 30xx cards on sale, performance per dollar.

 

No, it has to do with the pricing vs previous xx80 cards. That's the main sticking point people are complaining about. The performance-per-dollar works out to about 1:1 with current 3080 and 3080 Ti prices, and is superior to 3090/3090 Ti prices.

The 7900 XT/XTX honestly aren't better values by comparison, either.

 

17 minutes ago, crispy4000 said:

There’s no way a 4080 is worth $400 more.

 

That largely depends on your usage case (preferred resolution, settings, and framerate).


I think the main thing here with all cards is whether there is value in upgrading vs the asking price. Looking at the industry as a whole, I think the crypto rush for GPUs inflated prices, and the manufacturers saw this and adjusted MSRP. That said, with crypto crashing and cards readily available, people aren't willing to overpay.

 

There are still going to be enthusiasts that have to max out/have the best (I realize a lot of people posting here are in that club), but those aren't the everyday gamers that previously had to overpay to have ANY card. It'll be interesting to see what happens now that supply seems to exceed demand.


Ok, got off the phone with AMD. They are offering refunds or replacements. I'm gonna just wait for a replacement cuz I don't want to jump through hoops for an AIB card or have to hunt down a 4080. Even like this, the card is a good upgrade over my 6800 XT; it's more that performance is being left on the table. The only negative is it doesn't like Steel Division 2 and crashes every once in a while. 

 

Guess I'll just play Warno. I feel bad joining MP games and dropping. I 


16 minutes ago, Spork3245 said:

That largely depends on your usage case (preferred resolution, settings, and framerate).

 

I have a really hard time envisioning the sort of PC gamer whose needs a 4070 Ti wouldn’t meet but a 4080 would.

 

Maybe some years in the future?  But then it might be best to save that $400 off the top for a new card.


4 minutes ago, crispy4000 said:

 

I have a really hard time envisioning the sort of PC gamer whose needs a 4070 Ti wouldn’t meet but a 4080 would.

 

Maybe some years in the future?  But then it might be best to save that $400 off the top for a new card.

 

At 4k there's a 25-30% performance improvement on the 4080 for 33% more money. When RT is enabled, the 4070 Ti struggles at 4k, whereas the 4080 is mostly able to maintain 60fps. If your goal is 4k max settings but you don't want to spend beyond $1,200, the 4080 is the better solution. At 1440p, the 4070 Ti shines. 
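
As a quick sanity check on whether that trade is worth it, here's a minimal sketch using just the rough percentages above - plug in your own figures:

```python
def value_ratio(perf_uplift: float, price_uplift: float) -> float:
    """>1.0 means performance grows faster than price; <1.0 means you pay more per frame."""
    return (1 + perf_uplift) / (1 + price_uplift)

# ~25-30% more 4K performance for ~33% more money (figures quoted above)
for perf in (0.25, 0.30):
    print(f"+{perf:.0%} perf for +33% price -> value ratio {value_ratio(perf, 0.33):.2f}")
```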


3 minutes ago, Spork3245 said:

 

At 4k there's a 25-30% performance improvement on the 4080 for 33% more money. When RT is enabled, the 4070 Ti struggles at 4k, whereas the 4080 is mostly able to maintain 60fps. If your goal is 4k max settings but you don't want to spend beyond $1,200, the 4080 is the better solution. At 1440p, the 4070 Ti shines. 

 

We talking DLSS2 here?  Because if not, it negates so much of that talking point.


1 minute ago, Spork3245 said:

 

Why would you not have DLSS or FSR enabled at 4k?

 

I find FSR1 and FSR2 worse than native. It's close with FSR2, but something is still off to me when playing Darktide. Based on my 1600p 16-inch laptop, I want to say DLSS looks about as good as native. It does feel off at times, but it generally felt as if I was playing at native, only at 100 FPS. 

 

Damn. I think Asus is gonna make the Flow line Intel-only for this gen. Guess a 2023 Lenovo Legion 7 is gonna be my next laptop. 


Just now, Zaku3 said:

 

I find FSR1 and FSR2 worse than native. It's close with FSR2, but something is still off to me when playing Darktide. Based on my 1600p 16-inch laptop, I want to say DLSS looks about as good as native. It does feel off at times, but it generally felt as if I was playing at native, only at 100 FPS. 

 

Damn. I think Asus is gonna make the Flow line Intel-only for this gen. Guess a 2023 Lenovo Legion 7 is gonna be my next laptop. 

 

At 4k I'd argue that DLSS Quality and Balanced are superior to native due to the removal of aliasing. With FSR(2) I'd only use the "quality" setting, personally. Below 4k I have no idea how FSR fares, but DLSS still looks better than native in most cases when set to quality mode, even down to 1080p.
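
For reference, here's roughly what those modes render internally before upscaling; the scale factors are the commonly cited DLSS 2 ones and are approximate, not an official spec:

```python
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.33}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate resolution the GPU renders before DLSS upscales to the output resolution."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)   # 4K output
    print(f"4K {mode}: renders ~{w}x{h}")
```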


1 minute ago, Spork3245 said:

 

At 4k I'd argue that DLSS Quality and Balanced are superior to native due to the removal of aliasing. With FSR(2) I'd only use the "quality" setting, personally. Below 4k I have no idea how FSR fares, but DLSS still looks better than native in most cases when set to quality mode, even down to 1080p.

 

I do use FSR at times on my Onexplayer. It is just fsr1.0 for Elden Ring. If you use the highest res option it doesn't look that bad. 


Just now, Zaku3 said:

 

I do use FSR at times on my Onexplayer. It is just fsr1.0 for Elden Ring. If you use the highest res option it doesn't look that bad. 

 

I don't know if I ever used FSR1 tbh. I don't think there were any games (that I played, at least :p ) that offered FSR1 without also offering DLSS. I know Far Cry 6 only has FSR, but I'm 99% sure it's v2. FSR1 was a bit worse than OG DLSS from what I saw.


Just now, Spork3245 said:

 

I don't know if I ever used FSR1 tbh. I don't think there were any games (that I played, at least :p ) that offered FSR1 without also offering DLSS. I know Far Cry 6 only has FSR, but I'm 99% sure it's v2. FSR1 was a bit worse than OG DLSS from what I saw.

FSR 1 made textures look soft; ver. 2 is much improved, and its quality, while lower than DLSS 2's, is not off by much. Both are attractive propositions now. 

