Upgrade GPU on Dell XPS 8500 or build a new computer?

Initially I was thinking of getting an Xbox One X ($500) to get into 4K gaming now that I have a 4K TV, but it sounds like that console leaves a lot to be desired based on early impressions.

My current thinking is to get the cheapest GTX 1080 I can find, which should run at close to the same price. I have the impression that this will take me closer to 4K on modern games than the Xbox One X. However, I fear my 2012 Dell XPS 8500 may not be up to the task, so please comment on any potential issues.

My current rig, Dell XPS 8500
i7-3770
12 GB DDR3 RAM
Stock power supply (460w if I'm not mistaken)
Sapphire RX 480 8 GB reference

I picked up the RX 480 prior to my TV upgrade, so I wasn't even thinking of 4K at the time. Prior to the RX 480, I had a reference EVGA GTX 680 that ran with no issue. That has higher power requirements than the reference GTX 1080, so I think I'm OK with the stock power supply, despite it being below the suggested 500w. I don't OC or anything. The machine is purely for gaming. I use a laptop for all other home computing needs.
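For what it's worth, that reasoning can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch only: the TDP figures are nominal vendor ratings (not measured worst-case draw), and the "misc" allowance for the rest of the system is my own assumption.

```python
# Back-of-the-envelope power budget for the stock 460 W PSU.
# TDPs are nominal vendor ratings, not measured worst-case draw, and the
# "misc" figure for motherboard/RAM/drives/fans is a rough assumption.
PSU_WATTS = 460
CPU_TDP = 77           # i7-3770
MISC_WATTS = 75        # assumed: motherboard, RAM, drives, fans

def budget(gpu_tdp):
    """Return (estimated total draw, remaining PSU headroom) in watts."""
    total = CPU_TDP + MISC_WATTS + gpu_tdp
    return total, PSU_WATTS - total

for name, tdp in [("GTX 680", 195), ("RX 480", 150), ("GTX 1080", 180)]:
    total, headroom = budget(tdp)
    print(f"{name}: ~{total} W total, ~{headroom} W headroom")
```

Note this ignores PSU aging and per-rail limits, which is part of why vendors recommend 500 W+ for the 1080 despite its lower TDP.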

The only annoyance is that Dell mobos are a bit picky about compatible GPUs, so I'll hopefully have to find particular models that other people online have confirmed working on the stock Dell XPS 8500 mobo.
Reply to onoturtle
  1. The PSU is the issue: 'stock' PSUs are notoriously specced to the bare minimum for getting the out-of-the-box computer working, with no consideration of GPU upgrade needs. While this is a beefy rig, as you note it is now 5 years old, which is about the time to replace it. This is especially true as you can't buy 'new' memory or other components for it; current parts are fitted for the new DDR4 and the 7th-gen (and now-released Core i9) CPUs, etc.

    So this leaves you with a decision: replace BOTH the PSU (which has to fit that specific case) and the GPU (my 1060 works at 4K too; you don't HAVE to have a 1080, just FYI), OR make the LONG-term investment and start clean with a new current-gen rig.

    The thing to remember with PCs, as compared to normal TV electronics, is that best practice for 4K is to use a DisplayPort connection, NOT HDMI, since the PC side is still catching up to the TV standards for it.

    Lastly, just because you can make it output 4K doesn't mean everything WILL BE 4K. If a title's assets aren't specifically textured/coded for 4K, not only will it look bad, it will perform horribly versus running at normal 1080P. That goes for PC, console, whatever platform you want. Just to advise you, I am NOT aware of any normal title shipping 4K assets yet, but I do know some special editions (Doom using the Vulkan renderer) amped things up.
    Reply to Tom Tancredi
  2. As I mentioned, I'm not too concerned about the PSU, as I've used a GTX 680 in this machine for years and that card is more power-hungry than a reference GTX 1080.

    Indeed, my current rig with an RX 480 can run older games at 4K 60fps (e.g. Tomb Raider 2013), but not modern games (e.g. Rise of the Tomb Raider). My 4K TV only has HDMI inputs, so that's what I'm using. I do have DP to a monitor, but that's only 1440p.

    Will newer CPUs and memory really bring much to the 60Hz gaming table? Either way I'm going to spend around $600: on the GPU only, on everything but the GPU, or on an Xbox One X.

    Long term, I will completely replace the PC, but I'll do it in two steps: upgrade the GPU this year and then everything else next year or so, or vice versa. What's the better order?
    Reply to onoturtle
  3. Based on what you've said, I would suggest the Xbox One X.

    First, you need at LEAST a 600W PSU, and for stability (i.e. not getting random BSODs from random things) at least a Bronze-rated unit is highly recommended, with Gold or Platinum being best. Otherwise, as noted here:
    http://www.tomshardware.com/reviews/low-cost-psu-pc-power-supply,2862.html
    http://www.tomshardware.com/forum/id-2547993/psu-tier-list.html

    While your PSU has worked well with your current configuration, we are talking about a change, and not just ONE generational change but THREE generations between the 6xx and 10xx series. As such there are new connectors the PSU needs to have IN ADDITION to the connectors you use currently.
    http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,8.html
    "GeForce GTX 1070 / 1080 - On your average system the card requires you to have a 600 Watts power supply unit."


    Secondly, displaying games on a 4K screen is far different from the 4K textures and high-poly mesh models included in a game, which are what make it hard to tell a 'movie' model from a 'game' model. Sorry I wasn't clear about that. For example, being able to see individual beads of sweat on an AI character's skin as they roll down a rounded (not chunky) face.
    Here are clear examples of 4K in a game, namely GTA V:
    http://kotaku.com/gta-v-mod-adds-4k-textures-game-looks-utterly-ridiculo-1745334649

    That said, do note that even on an Xbox One X, the games being released are UPSCALED to 4K, not true native 4K. This happens on both consoles (per all the E3 previews I read from multiple authors) until the games actually use 4K textures and high-to-ULTRA-poly models. http://www.red.com/learn/red-101/upscaled-1080P-vs-4K

    Well, the newest cards seem to have HDMI 2.0 on them, so they SHOULD work, but it depends on the implementation by the maker, so do keep an eye out: they may subtly fit HDMI 1.4 instead of 2.0 to 'cut a corner' on cost. Pay close attention to the very small print, and/or look up that model's support page, grab the manual, and read how to do 4K gaming on it. If it does have 1.4, you couldn't use the HDMI for 4K at 60 Hz and would be stuck using the DP connection.
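    To put rough numbers on the 1.4-vs-2.0 point, here is a bandwidth sanity check. It assumes the standard CTA-861 4K60 timing (4400 x 2250 total pixels including blanking) at 8-bit RGB, with effective link rates reduced by the 8b/10b coding overhead:

```python
# Why HDMI 1.4 can't do 4K at 60 Hz: the link must carry the full video
# timing (active pixels plus blanking), not just the visible 3840 x 2160.
def link_gbps(total_w, total_h, refresh_hz, bits_per_px=24):
    """Raw video bandwidth in Gbps for a given total timing."""
    return total_w * total_h * refresh_hz * bits_per_px / 1e9

needed = link_gbps(4400, 2250, 60)     # standard CTA-861 4K60 timing
hdmi14_effective = 10.2 * 8 / 10       # 10.2 Gbps TMDS minus 8b/10b overhead
hdmi20_effective = 18.0 * 8 / 10       # 18 Gbps TMDS minus 8b/10b overhead
print(f"4K60 needs ~{needed:.2f} Gbps")
print(f"HDMI 1.4 carries ~{hdmi14_effective:.2f} Gbps -> not enough")
print(f"HDMI 2.0 carries ~{hdmi20_effective:.2f} Gbps -> enough")
```

    That's also why HDMI 1.4 cards top out at 4K30: halving the refresh rate halves the required bandwidth.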

    As for a newer CPU and memory (and let's not forget removing a big system lag by putting the OS on a simple SSD): they do change a lot, easily yielding more FPS. There is a LARGE difference between your old 3xxx-series with DDR3 and DDR4 with the new chipsets, etc., all part of the current 7th-gen and just-announced Core i9 CPUs (the latter will actually go up to 16 cores, Hyper-Threaded to 32 threads, for example). Again, a lot of remodelling of the chip itself, how it passes and takes in traffic (I/O), etc., all adds up to doing more, more, more - LOL.

    Now, your 1440p monitor is most likely 144Hz, where you will see a DRAMATIC difference over 60Hz, so it will do better performance-wise (more realism in movement) than your 4K TV. Here is an example: https://www.youtube.com/watch?v=a2IF9ZPwgDM . As I say to all the CS:GO players: even if you're getting 200FPS, your 60Hz screen will still only show 60 frames of data and just drop the rest, eventually causing ripping and tearing on the display (and then we're discussing hardware embedded in the monitor, like G-Sync, to resolve all those issues and more).

    Conversely, if you improve the visual details (as I pointed out above), using actual 4K textures and high-poly models on a 4K TV, even at just 60Hz you will still make a 1080 crawl, because 4K is EXTREMELY demanding, in my opinion more so than chasing higher FPS. You would need a 1080 Ti, or better yet SLI 1080 Tis (for which you're talking at least a 1200W PSU), given the demands of making NATIVE 4K work.

    http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/ "1440p ultra will need a GTX 1080 or higher for 60 fps, though the 1070 comes close. And if you demand 4K at 60 frames per second, you're going to want at least a GTX 1080, and probably a 1080 Ti—and then drop some of the settings to medium/high."

    http://www.pcgamer.com/geforce-gtx-1080-ti-review/ "But does that mean the 1080 Ti is capable of running any and every current game at 4K ultra settings and 60+ fps? No. Because there are still beastly games like Deus Ex, which even with the very high preset only manages 43 fps—without 4xMSAA. Another game in that same ballpark is Ghost Recon Wildlands, at least in its just-launched state, which averages just 37 fps on the 1080 Ti at 4K using the ultra preset. Update: and the new Mass Effect: Andromeda also routinely falls below 60 fps at 4K ultra."
    Reply to Tom Tancredi
  4. Thank you for the lengthy response. Let me clarify some points on what I'm looking for.

    I'm coming from a console perspective, as noted by the Xbox One, so the FPS I will target is 30 or 60, depending on the game. I am not aiming for, nor willing to spend toward, higher FPS. I don't have G-Sync/FreeSync hardware. My TV and monitor are 120Hz and 60Hz, respectively, IIRC. The closer to 4K resolution, the better, but this won't be achievable in all games (e.g. the reviews you cite) without heavy hardware investment, and that's not going to happen.

    I'm not sure what new PSU connectors for a modern GPU you are referring to. The base EVGA GTX 1080, for example, requires a single 8-pin power connector and comes with a dual-6-pin adapter. I have dual 6-pins that I used on the GTX 680. Dual-8-pin GTX 1080s do indeed require a PSU upgrade.
    Reply to onoturtle
  5. Hi there,

    The length was due to both showing source material and the intricacies of the PC side of things we're delving into. Console is very easy, as you know: just flip the switch and it plays as the programming is advertised to do. With the myriad of options on the PC side, you have A LOT of work to get into to achieve specific goals and directions, at a serious cost that consoles can't compare to.

    That said, yes, it depends on which GTX 1080 you get and then knowing the proper connectors for it. It is a very BAD option to use any 6-pin-to-8-pin adapters; as noted in many threads, "things happen", and it is best to have a 'pure' connection (a 6-pin for a 6-pin, an 8-pin for an 8-pin). I can attest from my own situation: I too found out the hard way, and yes, a new PSU with properly cabled connections was required to alleviate all my issues.

    So in summary: you can get a 1080, but you need at least 600W output (which you don't have) and the proper 'pure' connections to power the 1080. With that you will consistently max out the graphics (all games on Ultra) at 60FPS or better on a 1080P display. If you push to 4K, you need to get the 'mods' (if any) or the alternative 'graphics mode' version of the game that uses 4K textures and ULTRA-poly models, which will make it look like the GTA V mod, as compared to the 'normal' textures and poly models you see in the advertised gameplay video. Otherwise you won't "see" that 4K detail; you'll just be 'stretching' things to a 4K-SIZED display: http://blog.artbeats.com/blog/wp-content/uploads/2013/10/Size-Comparison-Overlay-630.jpg .


    =========================================================================

    ADDITIONAL DETAIL to assist and clarify some important parts.


    There is more to the issue than "the closer to 4K resolution, the better", and that is why I brought up FPS. We aren't talking "playing at 40FPS is unplayable" elitism; what we are talking about is single-digit to tens of FPS, or worse (as mentioned): tearing, skipping, 'teleporting', lag, etc. in the game, due to the difference between what is SENT to the display and what IS displayed.

    I am not sure how much you know (coming from a console perspective) about the difference between "FPS" and "Hz" on PC, which most people in the PC community don't really get either. The FPS counter shows what the CPU and GPU are rendering to SEND to the display. The Hz is the number of frames the screen actually displays each second, no matter what was SENT to it. So for your "30FPS" example, the monitor has to "add" duplicates of the 30 frames it got to fill in the 60 it actually displays (60Hz). Conversely, if you were sending 120FPS, it would drop every other frame (skipping across the screen) or drop HALF the frames at once (sudden teleport) to make them fit the 60Hz. Add in variable rates like 43 or 89FPS and so on, and you can see how messy this gets, which in turn affects your gameplay.
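    That duplicate/drop arithmetic can be sketched with a toy model. This is a simplification (real monitors, V-Sync, and frame pacing behave more subtly), assuming a fixed 60Hz display that simply shows the most recently completed frame at each refresh:

```python
# Toy model: a fixed-refresh display sampling frames rendered at a
# constant rate. Below the refresh rate you get duplicated refreshes;
# above it, rendered frames are never shown at all.
def shown_frames(render_fps, refresh_hz=60, seconds=1):
    """Index of the most recently completed frame at each refresh."""
    frames = []
    for tick in range(refresh_hz * seconds):
        t = tick / refresh_hz                 # time of this refresh
        frames.append(int(t * render_fps))    # latest finished frame
    return frames

for fps in (30, 60, 120):
    seq = shown_frames(fps)
    dupes = sum(1 for a, b in zip(seq, seq[1:]) if a == b)
    dropped = fps - len(set(seq))
    print(f"{fps} FPS -> {dupes} duplicated refreshes, {dropped} frames never shown")
```

    At 30 FPS every frame is shown twice; at 120 FPS half the rendered frames never reach the screen, which is the wasted work (and tearing risk) described above.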

    So the remedy has been to get the GPU pumping out 60FPS all the time (even with 20 helicopters each firing 20 rockets, all exploding on screen at the same time) so there wouldn't be much of an issue. Because GPUs got very good at pumping out that much power and could now push 100+ FPS to the display if wanted, displays got bumped up (100, 120, 144Hz) so that action (running, flying, moving in any way) seemed realistic and "fluid".

    Now, with the move to 2K, 4K and then 8K displays (a larger size AND much more detail, i.e. many more pixels to fill in), all those "gains" they just made are impacted. It isn't (as I mentioned) just about getting a "HIGH" FPS, but about matching the Hz consistently so you don't get bad performance in the game itself; and a larger screen with MUCH more detail to fill in is a ton more data to render.
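    The raw pixel arithmetic behind that:

```python
# Pixel counts behind "4K is EXTREMELY demanding": every frame at 4K
# carries 4x the pixels of 1080p, so per-pixel shading work scales the same way.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 1080p)")
```

    So a GPU that holds 60FPS at 1080p has roughly a quarter of the per-pixel budget left at 4K, before even counting the heavier textures and models.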

    4K does require higher-poly model rendering. For example, look closely at the ears and nose in the comparison below: achieving the smooth, realistic, non-jaggy parts requires a lot more rendering by the GPU and CPU for just that ONE head. Now multiply that by the number of 'people' on screen, and add in the eyes, the hands, the bodies, clothing, buildings, etc., all in one single second on the screen. https://tplinnovator.files.wordpress.com/2015/09/female_head_bases___low__mid_n_high_poly_by_lbg44-d4l3ecx.jpg

    4K also requires 4K-rendered textures, as shown in the GTA V 4K mod above; compared to the normal GTA V video game trailer, you see a TON of difference in how detailed the water is, the paint on the car, how the Sun looks, etc. What's been happening instead, though, is upscaling, as I mentioned: you get a 4K-SIZED screen, but lower-end poly models and textures are 'stretched' to fit it.
    This is what people 'want' when they say 4K:
    https://s3.amazonaws.com/red_3/uploads/asset_image/image/519a4b3aa48c373941000957/scaling3.jpg
    This is what they get when they can't afford the investment (as we both mentioned):
    https://s3.amazonaws.com/red_3/uploads/asset_image/image/5196af342f74a94eef001224/scaling2.jpg
    AND the game has lag, low FPS, "issues", etc. while they're playing.
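    That 'stretching' is easy to see in miniature. A nearest-neighbour upscale just repeats source pixels, adding size but no new detail (a toy sketch; real scalers interpolate, but they still cannot invent detail that was never rendered):

```python
# Toy nearest-neighbour upscale: a 2x2 "render" stretched to 4x4 "display".
# Each source pixel is simply repeated - bigger picture, same information.
def upscale(img, factor):
    out = []
    for row in img:
        stretched = [px for px in row for _ in range(factor)]
        out.extend([stretched] * factor)
    return out

render = [["A", "B"],
          ["C", "D"]]
for row in upscale(render, 2):
    print(" ".join(row))
```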

    So right now, if you wish to play at 1080P, 60Hz / 60FPS on a SINGLE screen, that is what the GAMES are set for and the hardware is tuned for. If you wish to push PAST that, then you need to lower other settings (accepting less FPS, worse-looking graphics, etc.) to compensate, if you're not willing to pay the hardware costs to reach the goal (4K, multiple screens, 240Hz screens, 200FPS, etc.).
    Reply to Tom Tancredi
  6. Thank you for the details. So my current plan is to get EVGA's cheapest GTX 1080 (or another manufacturer's equivalent) and probably a Corsair CX650M. That PSU costs $60 on Amazon, depending on wattage, so that's doable (the cost of a game), though I will need to double-check the dimensions. The review states it comes with 2x 6+2-pin PCIe cables. Is a 6+2 pin a strange thing compared to a straight 8-pin? http://www.tomshardware.com/reviews/corsair-cx650m-psu,4770.html

    I haven't yet bought a graphically intensive game on PC this year, so the latest game I have is last year's Rise of the Tomb Raider. Running in DX11 so that RivaTuner/HWiNFO can display stats, on my current box with the RX 480 I get high-20s FPS on Ultra at 4K resolution. Utilization is 100% GPU and 50-60% for one logical CPU core; the remaining cores are a mix of 40% down to little usage at 10%. Not a very multithreaded game, it seems. So it looks like my i7-3770 still has a lot more it could output if the GPU weren't topped out. I get similar results booting up older games I had installed, The Witcher 3 and Shadow of Mordor (except that they use less CPU than Rise of the TR). Maybe because I originally played TW3 on console, which was 30 FPS at best, the game feels so foreign and strange to me when playing it at 1080p 60 FPS.

    I also tried the free Forza Motorsport 6: Apex in the Windows 10 store. While that's DX12, and therefore I can't see how my GPU and CPU are doing, it runs a smooth 60 fps at 4K. Granted, the game is dynamic and is probably curbing some effects to hit that target. Forza 7 is one of the games I would get on the Xbox One X, so I should be set with the GTX 1080 for this game, even on my current system, since the game was designed for console hardware.

    I finally got around to checking out that GTA V mod, and yeah, I'm not doing SLI (or a Titan), so "4K mods" are totally not in the realm of what I want to do. In general I don't mess with mods (except for Skyrim, years ago).
    Reply to onoturtle
  7. onoturtle said:
    Thank you for the details. So my current plan is to get EVGA's cheapest GTX 1080 (or other manufacturer's equivalent) and probably a Corsair CX650M. That PSU cost $60 on Amazon, depending on wattage, so that's doable (cost of a game). Though I will need to double check on the dimensions.


    Great! Also double-check your AIRFLOW and make sure it will be well cooled. The thing generates A LOT of heat and can easily make the rest of the PC suffer as well ("my GPU runs at 70C" - yes, but how hot is it IN the case, the mobo temp, etc.?). So be wise and make sure to tie off cabling, etc. I had to invest in a better 'gamer's case' which maximized airflow, and I never had a nicer solution: cables all routed UNDER the mobo tray, the mobo actually mounted horizontally instead of vertically, drives way below, and easy-to-see airflow.

    onoturtle said:

    The review states it comes with 2x 6+2 PCIe cables. Is 6+2 pin a strange thing compared to straight 8 pin? http://www.tomshardware.com/reviews/corsair-cx650m-psu,4770.html


    No, they make them 'flexible' so they are backwards compatible for people with older GPUs that only take a 6-pin. The really odd part (keep an eye out) is having the proper and sufficient connections for your mobo. I have a 4th-gen board (Haswell Refresh) and found its connections were a bit more... odd... than I was used to, including the need for an extra 4-pin to power the board itself ON TOP of the usual power connector for Haswell-based boards.

    onoturtle said:

    I haven't yet bought a graphically intensive game on PC this year, so the latest game I have is last year's Rise of the Tomb Raider. Running in DX11 so that RivaTuner/HWinfo displays on my current box with a RX 480, I get high 20's FPS on Ultra at 4K resolution.


    Yep, typical. You have to either downgrade from ULTRA to Medium/High to increase the FPS, or drop from 4K to 1080P to get better FPS performance (no lagging, stutters, etc.).

    onoturtle said:

    Utilization is 100% GPU and 50-60% utilization for one logical CPU core. The remaining cores are a mix of 40% to little usage at 10%. Not a very multithreaded game it seems. So looks like my i7 3770 still has a lot more it can output if the GPU wasn't topped off. I get similar results booting up older games I had installed, The Witcher 3 and Shadow of Mordor (except that they use less CPU that Rise of the TR). Maybe because I originally played TW3 on console, which was 30 FPS at best, that the game feels so foreign and strange to me when playing it at 1080p 60 FPS.


    Well, your numbers show that the GPU is the bottleneck; while it is busy, the rest of the cores are twiddling their thumbs (unless you turned off Hyper-Threading, which is key to using the cores even for mostly single-threaded programs like games).

    Foreign and strange? How do you mean? *perks up, interested*

    onoturtle said:

    I also tried the free Forza Motorsport 6 APEX in the Windows 10 store. While that's DX12 and therefore I can't see how my GPU and CPU is doing, it runs a smooth 60 fps at 4K. Granted this game is dynamic and is probably curbing some effects to hit that target. Forza 7 is one of the games I would get on the Xbox One X, so I should be set with the GTX 1080 with this game. Even with my current system even since the game was designed for console hardware.


    You can monitor ALL games by using MSI Afterburner. Go into the settings; you have to manually select what is displayed in the OSD (On-Screen Display) overlaid on your games (well, any 3D app actually), and you can monitor (as I do) temps, usage, etc. I wouldn't use Task Manager as a measure, as it isn't reliable when you're actually IN the game: when you switch away to look at the numbers, it tends to drop the game to the background, and you get less reliable results. Just FYI.

    onoturtle said:

    I finally got around to checking out that GTA V mod and yeah, I'm not doing SLI (or a Titan) so "4K mods" are totally not in the realm of what I want to do. I in general don't mess with mods (except for Skyrim years ago).


    *Nods.* As you state you run "Ultra at 4K resolution", I was pointing out that this is a big misnomer for people: they see AWESOME 4K videos on a TV screen because those videos are filmed at 4K resolution, a 1:1 ratio on a 4K display. In gaming, though, the computer has to render both the object (person, car, sword, etc.) and the texture overlaid on that model. Skyrim, for example, used 512px textures (hence why it looks rough compared to modern games), and GTA V out of the box uses 1K-2K textures and models (which is the norm). So running GTA V at "Ultra at 4K resolution" really means the 'screen size' is 4K; you're not really seeing 4K gaming UNLESS the game natively includes 4K textures and poly modelling (which I believe RotTR actually does) or you mod it (as you can with Skyrim Special Edition, whose modding community has been pushing 4K textures).

    http://wccftech.com/rise-of-the-tomb-raider-4k-screenshot-comparison-native-maxed-pc-vs-ps4pro/
    https://www.youtube.com/watch?v=DZIKkDn6prs
    Reply to Tom Tancredi