submitted 1 year ago* (last edited 1 year ago) by MHcharLEE to c/pcmasterrace
 

EDIT: A few days ago I bought an HP X34. It's a 3440x1440 ultrawide with a 165 Hz refresh rate. Obviously my setup can't hit a constant 165 fps in every game, but I'm comfortably getting 100+ in most games and 120+ in Forza Horizon 5 or Doom Eternal. Can't complain :)

I'm looking to buy a new monitor, switching from two 16:9 monitors to a single 21:9. Everywhere I read, the opinion is that on an ultrawide it doesn't make sense to go lower than 1440p, which I guess holds true for 34" monitors.

However, I'm worried my 3060 Ti won't be enough for that many pixels. Right now I'm enjoying uninterrupted framerates playing at 1920x1080.

How much should I really worry about making this switch? My other option is to go for a 2560x1080 monitor that's smaller (29").
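
For reference, here's the rough pixel math I'm weighing (just a back-of-the-envelope sketch; actual performance obviously depends on the game and settings):

```python
# Rough pixel-count comparison between the resolutions being considered.
# Treating GPU load as proportional to pixel count is only a crude rule of thumb.
resolutions = {
    "1920x1080 (current)": (1920, 1080),
    "2560x1080 (29in ultrawide)": (2560, 1080),
    "3440x1440 (34in ultrawide)": (3440, 1440),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px, {pixels / base:.2f}x the pixels of 1080p")
```

That works out to roughly 1.33x the pixels for 2560x1080 and roughly 2.39x for 3440x1440, which is why I expect a noticeable framerate hit on the 34" option.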

[–] [email protected] 4 points 1 year ago (3 children)

I run a 3070 with the Gigabyte G34WQC (21:9 1440p). The 3070 is basically the same card as the 3060 Ti.

I would say the major limitation is VRAM. The card has enough horsepower to comfortably run most games. With DLSS I expect >100 fps.

However, it's definitely showing its age. Current consoles have effectively 12 GB of VRAM, so 8 GB is teetering on the edge. If it were a 16 GB card, it would be fine for years to come.

If all you want is no stuttering, it will be great; I never see below 80 fps in demanding games. However, driving a full 144 fps will be an issue.
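
For a sense of scale, the frame-time budget shrinks quickly at higher refresh rates (simple arithmetic, nothing monitor-specific):

```python
# Frame-time budget at a few common refresh rates.
for hz in (60, 100, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```

At 144 Hz the GPU has under 7 ms per frame, versus almost 17 ms at 60 Hz, which is why consistently maxing out the refresh rate is a much taller order than just avoiding stutter.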

[–] MHcharLEE 1 points 1 year ago (2 children)

That's what I needed, thank you. I'll upgrade the GPU sooner or later (probably later); I knew what I was getting myself into with 8 GB of VRAM.

On the note of the G34WQC, I'm eyeing that monitor among a few others. Is the anti-glare coating really as grainy as some people claim? Supposedly it makes text look fuzzy on bright backgrounds.

[–] [email protected] 1 points 1 year ago (1 children)

I feel like a lot of the complaints about the G34WQC come from monitor snobs. I would argue it's one of the best bang-for-your-buck monitors on the market, and certainly the best budget ultrawide. Personally, I've yet to run into these issues: the grainy coating is the first I'm hearing of it, and I haven't experienced that with mine. I code and read on it without problems. As for text, I've seen people complain about the curve, but I handle that by snapping windows to one side, so it essentially serves as two monitors.

Another big complaint is black smearing/ghosting. I haven't noticed this at all, and I run dark mode on Windows. I assume it's real, but I just don't have an eye for it. I use an IPS monitor for work because I need color accuracy, and switching back to the G34WQC I notice that blacks look better, but I don't see any smearing or inconsistency.

Out of the box the monitor doesn't look its best, but that's true of all cheap-o monitors, and it can be fixed in software: you have to calibrate it. I use the ICC profile from rtings.

According to their tests, this brings color accuracy up substantially. That isn't unique to this monitor, though; they really should all be calibrated.

I got it for $350 on sale a few years ago and it was a great purchase.

[–] MHcharLEE 1 points 1 year ago

Oh, I'm by no means a monitor snob. It took me two years to realize my IPS monitor wasn't true 8-bit but used FRC instead, haha.

I checked again, and that anti-glare complaint is about the M34WQ, the flat IPS version of your monitor. I mixed them up.