Upgrade from R9 290 Tri-X

Aug 13, 2018
Hello everyone,

I am thinking about upgrading my graphics card and cannot decide between a few options. I currently have an R9 290 Tri-X, a 144 Hz 1080p FreeSync monitor, and an i5-8400.

I am thinking about grabbing a newer graphics card to squeeze out some more frames and better graphics for games like The Witcher 3. The Vega 56 and the GTX 1070 Ti are very similar in the prices I have found (both $350), but both are at least slightly used (I fear the Vega may have been used for mining, but that's just speculation).

I am sort of stuck and don't know where to go with my build. I don't know what difference FreeSync makes, and I've seen other people say 1080p isn't even worth it nowadays.

Any insight on the topic is much appreciated.
 
Solution
Whether 1080p is "worth it" or not is totally up to you. I game at 1080p and have no real interest in gaming at anything higher, considering the monitor size I prefer and the distance it sits from me. It is a 27" monitor about three feet away. At that distance and resolution, I like it.

I have viewed 1440p and 4K monitors, at 27", from the same distance, and everything is smaller than I'd like. So for me, 1080p is fine. A lot of professional competitive gamers are actually using 720p and 900p monitors to gain competitive advantages. I think that is too small, because it looks horrible, but they don't really care about that; I do. Higher resolutions WILL obviously look much better, but for me 1080p looks fine. It only matters what YOU think. You might want to go to one of the electronics stores, look, and decide whether that's a change you actually want to make.
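To put rough numbers on the "everything is smaller" point, here's a quick back-of-the-envelope sketch (Python, purely illustrative arithmetic) of pixel density for the same 27" panel at each resolution. Higher PPI means text and UI elements render physically smaller at 100% scaling:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 27" panel at three common resolutions:
for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160))]:
    print(f"{name}: ~{ppi(w, h, 27):.0f} PPI")
# 1080p: ~82 PPI, 1440p: ~109 PPI, 4K: ~163 PPI -- at 100% scaling,
# everything on the 4K panel draws at roughly half the physical size.
```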

As far as the GPU is concerned, I would highly recommend not buying a used card right now. Many, if not most, of the ones flooding the market were used for 24/7 mining operations, and many have been modified with a mining-specific BIOS or just ridden hard. Since the warranties are almost universally NOT transferable, you're buying a card and hoping it will be fine. None of the major manufacturers are going to honor the warranty on a card somebody else bought.

It would make a lot more sense to buy a new card and get the full warranty than to shell out 60-75% of the cost of a new card and get nothing in return. It might take longer to afford, but at least you'll get value for your investment.

FreeSync and G-Sync are both almost requirements these days. It makes a huge difference being able to lock the frame rate to a number that doesn't create terrible tearing problems when you're way off from the monitor's refresh rate. I won't go into it much here, because there are already hundreds of pages via Google that explain exactly what those technologies do, why you'd want one of them on any monitor or card, and what can or can't be done with or without them.
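For a rough sense of the timing mismatch behind tearing, here's a tiny illustrative calculation (Python; the numbers are examples, not measurements):

```python
# A fixed-refresh 144 Hz panel scans out a new image every 1000/144 ms.
# If the GPU can't keep pace and delivers frames at an unrelated rate,
# frame completion drifts relative to scan-out, and without any sync a
# new frame can replace the old one mid-refresh -- a visible tear.
# Adaptive sync (FreeSync/G-Sync) instead holds each refresh until a
# frame is actually ready, within the panel's supported range.
refresh_hz = 144
gpu_fps = 90  # example: GPU running well below the panel's refresh rate

print(f"Refresh interval: {1000 / refresh_hz:.2f} ms")  # ~6.94 ms
print(f"Frame time:       {1000 / gpu_fps:.2f} ms")     # ~11.11 ms
```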

Plenty of people all over the world are using Nvidia cards on FreeSync or non-syncing monitors, and vice versa.
 

King_V

Agreed with darkbreeze - I thought a 27" monitor at 1920x1080 was perfection, but many people seem to think it's too big for that resolution.


I think the first thing, if at all possible, is to find a store that carries a lot of computer stuff, including numerous monitors on display. "Near" me (about 45 minutes away), Microcenter is an example.

Sit down in front of them, browse the web a little, poke around. See how your eyes feel, and what they like. After all, what I may like, you may hate, and vice versa.

Figure out both what size of monitor you want and what resolution at that size. Another example is my son's current monitor: 2560x1080 at 34 inches. Typically you'll see 3440x1440 at that size, or at least that seems to be what more people consider an appropriate resolution for it.

Keep in mind, the more pixels that have to be manipulated, the more powerful a GPU you'll need to maintain frame rates.
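As a rough comparison by pixel count alone (real performance scaling varies by game and engine), here's how much more work each common resolution asks of the GPU relative to 1080p:

```python
base = 1920 * 1080  # 1080p pixel count as the baseline

resolutions = [("1080p",               1920, 1080),
               ("2560x1080 ultrawide", 2560, 1080),
               ("1440p",               2560, 1440),
               ("3440x1440 ultrawide", 3440, 1440),
               ("4K",                  3840, 2160)]

for name, w, h in resolutions:
    px = w * h
    print(f"{name}: {px:,} px ({px / base:.2f}x 1080p)")
# 1440p pushes ~1.78x the pixels of 1080p; 4K pushes exactly 4x.
```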

Once you know what size, and particularly, what resolution - figure out what cards are available that can get you there.

THEN - *probably* you'll choose a G-sync monitor for an Nvidia video card, and a FreeSync for AMD.

Oddly, my current monitor is FreeSync, but I have a GTX 1080. Admittedly, I had an AMD card at first with this monitor, but at 3840x1600, nothing AMD had could drive it as well without consuming ridiculous power (and when I got my GTX 1080, I got it at MSRP, though everything was generally still elevated in price from the crypto craze). With mine, though, I just V-Sync it at 60 Hz, and I also don't play the latest and greatest games.


I also agree on the new-with-warranty issue for video cards. Unless there's a HUGE discount (and at that point, I'd be suspicious), or it's from a trusted friend who'll be willing to file warranty claims for you, I would go with a new card with a full warranty.
 
I agree, 1440p on a 32-34" monitor would work, IF you can place it 3-4' away. Any closer and a monitor that size makes it impossible to really "see" everything at once. I'd do that in an instant, but it won't fit beneath the row of cubby shelves on the overhead part of my battlestation, so I'd either have to rebuild the top half of the L desk or use a 27". For me, at 27", 1440p and 4K are too small. But then again, I have older, tired-er eyes. LOL.