
Anyone here know anything about ultrawide monitors?

TheGerman
    I remember a while back when they first came out, reports of dead pixels and stuttering refresh rates were a thing. Haven't really looked at them since, but I was looking for something like a 34-inch.

    I have a GeForce GTX 1070 video card if that matters, as I don't want to get into a new monitor at higher resolutions and then find out it's my system/card that now needs an upgrade.

    Was looking at these:


     
    I have the ASUS ROG 34 ultrawide. It's 3440x1440 or 2560-something like that lol. Anyway, it's a 1440p monitor and I have a 1080 Ti. It caps at 100 FPS but my 1080 Ti struggles to even reach that.

    Playing Modern Warfare I have to turn graphics down to medium to even get close.

    The large monitors seem to take a lot of GPU to push.

    Mine is also curved, which I really like, BUT you have to sit directly behind it or you see the white backlight bleeding through. You only notice it during black screens while watching movies. Never once noticed it while gaming.
     


    What's the use case for the monitor?
     
    I've used several of the ultrawide monitors and you just have to decide if that's what you want.
    Most I've used tend to be reliable.
    For me personally I prefer 4K monitors due to my workflow; the lost vertical resolution makes the ultrawides not work for me.

    Depending on what you play, game-wise (if it's for gaming), you'll probably want to upgrade your video card a fair bit to handle the higher resolution.
     
    Work (spreadsheets), internet cruising, and I do play some video games at times.

    The video games are what have me concerned. Picked up the new Star Wars game and am running it maxed out with 0 graphics/GPU issues, but I'm at 1920x1080 and have no idea how that translates into stuff like 3440x1440.
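    For a rough sense of how it translates: GPU load scales roughly with raw pixel count. A quick back-of-envelope sketch (real-world scaling varies by game and settings, so treat the ratio as a ballpark, not a promise):

```python
# Rough pixel-count comparison: GPU load scales roughly linearly
# with the number of pixels rendered per frame.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_uw    = 3440 * 1440   # 4,953,600 pixels

ratio = pixels_uw / pixels_1080p
print(f"3440x1440 pushes {ratio:.2f}x the pixels of 1920x1080")
```

    So a game that runs ~100 fps maxed out at 1080p might land somewhere around 40 fps at 3440x1440 on the same card, all else being equal.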
    That 1070 is for the poors. Get a 2080ti or go be poor somewhere else.
     
    I personally prefer 4 (multiple monitors) vs 1 big one for my uses. Owning an IT company, I'm typically remoted into 2 different clients on 2 of them, company email/Outlook and File Explorer split-screen on the 3rd, and my main/4th monitor for web browsing/SH and daily tasks.

    I do have a 46" Samsung 4K on the other side of my office with ESPN playing, but I have a VANCO 4K HDBaseT HDMI-over-Cat6 running from my main rig to HDMI2 on that TV for playing videos, etc. on the larger screen when needed.
     
    Depending on what you're playing, stuttering can be the vsync; sometimes it's better to turn it off. Depending on the resolution you're looking to run and the game/program, a beefier GPU with 4-8 GB of VRAM might be needed, also a beefier CPU with, I'd say, no less than 16 GB of main RAM. Dead pixels and other issues are with the monitor, not the GPU; Asus, Sammy, and LG all make good ones.

    Go with higher refresh rate monitors. 144 Hz is MUCH better than 60 Hz. 4K/8K is fine, just make sure to have a GPU that can handle it, with VRAM that can back it, and also a CPU with RAM that can handle those resolutions. Also don't skimp on cooling. The bigger the screen + resolution, the more horsepower it'll take to maintain 50-60+ fps without lag.

    Granted, it all depends on what you're doing. Even if your system isn't up to snuff for 4K maxed out on the newest game/program, you can still set a resolution the computer can comfortably handle, then bump it up to the monitor max when you do have the hardware.
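    To put numbers on the 144 Hz vs 60 Hz point, frame time is just the reciprocal of the refresh rate; a quick sketch:

```python
# Frame time in milliseconds for a few common refresh rates.
for hz in (60, 100, 144):
    print(f"{hz:3d} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz leaves ~16.7 ms per frame; 144 Hz cuts that to ~6.9 ms,
# which is most of why it feels so much smoother.
```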
     
    I also have that Asus ROG 3440 curved monitor mentioned earlier. I absolutely love it. My graphics card is the GTX 1080 FTW3. I play Modern Warfare, BF5, Division 2, Breakpoint.

    I would buy this monitor again without hesitation.
     
    Forgot to mention, I don't have an ultrawide so I personally can't attest, but my buddy has an LG one that handles Metro Exodus maxed out at 4K with no issues; can't see any dead pixels, tearing, nor any stuttering. (Core i7, 32 GB, RTX 2080 Max-Q 8 GB)
     
    Not a big fan of the ultra-wide aspect ratios myself. 16:9 is such a "standard", and it’s nice to have the vertical resolution. I run a 32" Asus 4K with G-Sync as primary and a 1080p 120 Hz secondary.

    Driving 4K above 60 Hz right now is pretty much unrealistic at decent graphics settings. For gaming alone, 1440p at 120+ Hz is where it's at. Either way it will require DisplayPort; HDMI 2.0/DVI isn't going to cut it.
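    A back-of-envelope bandwidth check shows why: raw video needs roughly width × height × refresh × bits per pixel. This sketch ignores blanking intervals and link encoding overhead, so real requirements run somewhat higher:

```python
# Raw (uncompressed) video bandwidth estimate, ignoring blanking
# intervals and link-encoding overhead.
def gbit_per_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K @ 60 Hz : {gbit_per_s(3840, 2160, 60):.1f} Gbit/s")   # ~11.9
print(f"4K @ 120 Hz: {gbit_per_s(3840, 2160, 120):.1f} Gbit/s")  # ~23.9
```

    HDMI 2.0's usable data rate is around 14.4 Gbit/s, so 4K tops out at 60 Hz there, while DisplayPort 1.4 carries roughly 25.9 Gbit/s, which is why it's the connector for anything faster.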

    I don't game really (not in a long time anyhow) but I do like to build monster rigs, and even this beast will only drive 4K at 90 fps on most titles at nearly maximum graphics. 2080 Ti's might do it, with the right title that properly supports SLI.

    Dedicated dual-loop, dual pump, dual radiator water-cooled, with dual 1080 Ti’s - heavily overclocked... including all 14 CPU cores.

    Draws near 13 amps at max CPU/GPU load.
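    Assuming a standard 120 V US circuit (an assumption; the post doesn't say), that draw works out to:

```python
# 13 A on a 120 V circuit - most of a typical 15 A breaker.
amps, volts = 13, 120
watts = amps * volts
print(f"~{watts} W at the wall")  # ~1560 W
```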

     
    You need to put an Edelbrock sticker on that thing.

    Hahaha, and Flowmaster! That would make sense too given that it holds a little under a gallon of 80/20 clear ethylene glycol.

    Believe it or not, this thing is nearly dead silent too. 12 very low RPM fans are quieter than even a single high-RPM fan, logically.
     

    If you jump up to a 2080 Ti or a Titan RTX you can probably get to your 4K just fine.
    But at this point in the lifecycle it's probably better to just wait for the launches next year.
     

    Very true. Even the 2080 Ti's are getting some age on them now. RTX turned out to be a joke really, since there are like a massive 2 titles out there that actually support it. Need more horsepower and less marketing wank, Nvidia!
     
    Get a 43" 4K monitor and you'll probably be happier with it for the mix of uses you have.

    This is what I'd probably do. I currently have this setup on a 50" LG plasma TV with Bose/Miller & Kreisel/HK sound and a PS4 for movies, games, streaming and shit (the goddamn PS4 literally does it all; even if you don't play games on it, they're still worth it!). I prefer LED though, FWIW; between plasma, LCD and LED, LED has by far the best picture and color, even better than some that cost a lot more.

    Now I'd consider a 4K monitor, but then I'd need to start using 4K service... As it stands I'm already using way more than 1 TB per month, so I always go over the limit. 4K data streaming would put that near 2-3 TB. Something to keep in mind as you increase performance.
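    As a rough sketch of that data math (the ~20 Mbit/s bitrate is an assumption; streaming services typically quote somewhere in the 15-25 Mbit/s range for 4K):

```python
# Convert a streaming bitrate to data consumed per hour.
def gb_per_hour(mbit_per_s):
    return mbit_per_s * 3600 / 8 / 1000  # Mbit/s -> GB per hour

rate = 20  # assumed mid-range 4K bitrate, Mbit/s
print(f"{gb_per_hour(rate):.0f} GB per hour of 4K")          # 9 GB
print(f"1 TB cap = ~{1000 / gb_per_hour(rate):.0f} hours")   # ~111 hours
```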

    Whatever you do, I'd shy away from the latest and greatest unless you just like to throw cash away. Whatever is newest this year will be half price or less in a year or two and be a better product by then. Plus you know which ones work and which ones failed during that period. Biggest waste of money ever was when I bought that Sharp Aquos when they FIRST came out, like in 2004 or 2005. $5k, 100 lbs, lags like a motherfucker, missing a couple pixels, and then there's a fucked-up spot in the top left corner... This free plasma and my cheap LED both work much better, are much thinner and weigh a fuckton less. All I had to do was just wait another year, but no, couldn't manage that.

    Have gotten KILLER deals on TVs though: when one company misprints an ad, another is bound to honor that misprinted ad because the first company does too... well, let's just say Walmart was fucking giving 'em away one year. It gets easier and there are more opportunities for stuff like this near Thanksgiving too.
     
    Very true. Even the 2080 Ti's are getting some age on them now. RTX turned out to be a joke really, since there are like a massive 2 titles out there that actually support it. Need more horsepower and less marketing wank, Nvidia!

    This isn't actually 100% true. If you're a gamer and talking strictly gaming performance, then maybe. But these cards aren't built and used 100% for gamers. I've owned a commercial IT company for 14 years now. I specialize in high-performance builds for industry-specific needs. Small part of my business, but something a lot of companies in my region come to me for over the Dells, HPs, etc. Engineering, CAD, 3D modeling, multi-output video. None of that is gaming. The RTX 2080 Ti is the king of the roost by a long shot in a lot of these applications designed around the GeForce architecture. Now SolidWorks, Surfcam, AutoCAD, they prefer and run better on Quadro. No others come close; next best is the RTX 2080. Especially when I'm building a custom video management server with video outputs to 14, 16, 18 4K (4096x2160) at 60 fps ultra short throw projectors running different content on every projector, utilizing DeckLink 4K Extreme distribution hardware.

    When rendering video, which is what my daily office rig is built for, that rendering is offloaded to the GPU (RTX 2080 Ti) from the CPU, and the rendering times are drastically shorter when this NVIDIA GPU hardware acceleration is utilized. It's not even close, rendering on the CPU vs a high-end GPU with 1080p and 4K rendering. Computer gaming is a small market these days when it comes to video card design; some call it dying due to consoles. Quadro designs and architecture trickle down to the GeForce lines, which are utilized for mining, cloud computing and rendering more than gaming these days.

    I've been watercooling high-end workstations for 10+ years. Closed-loop watercooling has come a long way, and there is very little advantage temp-wise these days running custom cooling loops vs high-end closed-loop systems, unless you're cooling SLI 2080's and an 8-14 core CPU and all have heavy overclocks.


    I run an i9-9900KF @ 5 GHz 24/7 all cores with AVX, 32 GB DDR4-4000 memory, RTX 2080 Ti, Asus ROG Maximus XI Code, Corsair 1200 watt 80+ Platinum, Samsung NVMe 970 Pro 1 TB M.2 SSD and a Corsair H150i Pro, and have no problems keeping load temps under TJ Max. Now I'm also not gaming for hours a day, every day.


    Here is a quick video I took of a job I flew out to MN and did back in February, utilizing a custom video management server I designed and built for a chapel. 11 4K ultra short throw projectors playing the same content or independent content on every screen. The RTX 2080 Ti/DeckLink gets it done. Anything else falls on its face; a 1080 couldn't even get it done without stuttering.






    Here's another client I did north of Denver in Greeley, CO last December. 2 separate rooms, 2 independent systems.


     
    Very true. Even the 2080 Ti's are getting some age on them now. RTX turned out to be a joke really, since there are like a massive 2 titles out there that actually support it. Need more horsepower and less marketing wank, Nvidia!
    I rather like RTX in Metro; it works well and adds to the creepy atmosphere nicely. Of course I can't say much, as I'm too enthralled with FO4. I would bet FO5 and ES6 would use it more and to better potential, and in those kinds of games it would be put to better use (granted Bethesda doesn't f it up). RTX is in its infancy; it will develop and benefit games and other software nicely. Then again, any new GPU tech takes off slowly and gets scoffed at. I remember when unified vertex and pixel shaders were scoffed at. Same with hardware-accelerated physics.
     
    Depending on what you're doing, I find that the 4k monitors with a rotation sensor built in are pretty badass.

    You can run landscape for graphics/video/whatever, then flip to portrait for reading contracts and things like that.
     
    You can run landscape for graphics/video/whatever, then flip to portrait for reading contracts and things like that.

    Now if you haven't ever tried "adult entertainment" of the still photography variety on a large screen monitor in portrait mode.... well it's an experience worth trying at least once......
     
    That’s awesome! Totally agree. I meant if someone was looking to make a big purchase they might wait to see what was right around the corner so to speak.

    I’ve gone down the WC route mainly for the quiet builds, but the aesthetics are very nice I feel. Definitely something to pursue with quality parts I have learned over the years! The AIO (all-in-one) coolers have definitely come a long way for sure.

    Asus Rampage VI Extreme, i9 i7940X @ 4.5 GHz all-core, 32 GB Corsair Extreme @ 4 GHz, 2x 1080 Ti's with a slight OC. 2x Samsung 980 Pro nVME's. Same PS as yours. Corsair 900D case, modified. Can run the cores harder, but AVX instruction get the temps on up there if I do. I've got it rock-solid at full load using the least possible voltage I could - a lot of time invested in that discovery process!

    Primarily I run SolidWorks and a few other CAD/CAM apps on it these days.
     
    So what video card do I need to run all this bullshit on a 34" or larger screen at the higher resolutions?

    My concern with the screens larger than 34" is that it's like 2-3 feet away from my face.
     
    I work on a 43" monitor about 3 to 4 feet away on both my home and work desks and don't have any problem.

    If you have the money and want something now, get the RTX 2080 Ti.
    There is also the Titan RTX, but I don't think the small performance bump for games is worth the cost increase unless you plan to use its other functions for compute/rendering, etc.
     
    Here's a retarded question.

    Can you somehow link the 1080 to the 2080 and have a super video card? lol

    No, unfortunately only same-generation GPUs can SLI, and only some applications will actually use SLI (they must be specifically designed to utilize it).

    A lot of games and other apps have abandoned SLI due to the extra complexity and latency, unfortunately.

    One "beefy GPU" is Nvidia's roadmap forward. The 2080 Ti is the mainstream king right now. Titan is a bit more workstation-focused, but would work equally well, if not better: all about the $$$$.
     
    This. I used to buy two of the mid-grade cards and outperform the top-of-the-line cards, but then SLI went to shit.

    Now you buy the king card and just run one card.
     
    For most games, if you need more power than an overclocked 2080Ti can give you, then you overclock a Titan RTX.

    You can look at the benchmarks, but most current games don't benefit as much from dual video cards. It does vary by game, though, so you may want to look specifically at reviews for that game on that hardware.

    But I think for most things, if you get a high end watercooled RTX 2080Ti and overclock it, you'll be pretty well set.
     
    I think the 2080 Super is the new alpha of the green pack.
     
    A 2080-class card will run everything maxed for 2 years. I know the 2080 in my lappy will chew through anything I can throw at it, and still hold fairly cool temps.
     
    Yeah, Titans are pretty much for those that want to pay out for the summit class. Not much more power per dollar for games, especially with everything now being geared toward PS/Xbox and the mediocre AMD Jaguar platform they run. However, I don't knock Titan buyers; it's one sweet piece of kit that funds R&D.
     
    That’s awesome! Totally agree. I meant if someone was looking to make a big purchase they might wait to see what was right around the corner so to speak.

    I’ve gone down the WC route mainly for the quiet builds, but the aesthetics are very nice I feel. Definitely something to pursue with quality parts I have learned over the years! The AIO (all-in-one) coolers have definitely come a long way for sure.

    Asus Rampage VI Extreme, i9 i7940X @ 4.5 GHz all-core, 32 GB Corsair Extreme @ 4 GHz, 2x 1080 Ti's with a slight OC. 2x Samsung 980 Pro nVME's. Same PS as yours. Corsair 900D case, modified. Can run the cores harder, but AVX instruction get the temps on up there if I do. I've got it rock-solid at full load using the least possible voltage I could - a lot of time invested in that discovery process!

    Primarily I run SolidWorks and a few other CAD/CAM apps on it these days.

    Samsung doesn't make a 980 Pro... the 970 is their latest and greatest line. I've been a Samsung SSD direct dealer for years. A 940XM is a mobile CPU and a 940X is a socket 1366 CPU, both i7's not i9's, which wouldn't work in that MB. What CPU are you running?
     
    Sorry, Samsung 960 Pro.

    Asus Rampage VI Extreme - https://www.asus.com/us/Motherboards/ROG-RAMPAGE-VI-EXTREME/

    Intel i9-7940X - https://ark.intel.com/content/www/u...es-processor-19-25m-cache-up-to-4-30-ghz.html

    It’s LGA2066.
     
    I think the 2080 Super is the new alpha of the green pack.

    The 2080 Super is LESS powerful than the 2080 Ti.
    It's an attempt to slip something in between the 2080 and the hugely expensive 2080 Ti to attract more sales.


    The above gives you a pretty good overview.
    You'll notice, however, that the 1080 Ti is conspicuously missing from the roundup, probably because it usually sits just behind the base 2080 in many games.
     
    Ah gotcha, a mid-tier filler.
     
    Hahaha! Funny you mention that, I did get one back when the i9’s came out! It overclocked like ass though. Played around with it for about a week before I had to give it back.

    I was talking about Samsung NVMe, haha. But retail samples have clocked better than ES samples the last few generations.
     
    Oh yeah, the 980's. I've heard those are somewhere out there. Most of the M.2 NVMe stuff is still limited by the PCIe lanes (4x) though, right? Unless you go with one of the true PCIe Intel cards (like a 660P)...
     
    So what would a 1080 + the 34" or so screen at the higher resolution hold up to as far as games, etc. for now?

    I don't generally replace parts until they die or have a performance issue. I'm not doing crazy design stuff or trying to benchmark or anything, but I was trying to figure out if I NEED something like a 2080 with a larger screen, or if I can get by with what I have on everything except the newest stuff at max settings.

    I know I'm going to need a better card eventually given my weird interest in Star Citizen lol, but I also don't want to buy shit just to buy it.