All my computer questions

Eroge

Monitors

(1) I've spent some time looking at future monitors for myself. The one I'm considering is around 22" with a native 1080p resolution of 1920x1080 (16:9). I've learned that LCD monitors look best at their native resolution. On the 17-inch CRT I'm using now, if I change my resolution from 1024x768 to 1280x1024, everything, including my desktop and the text in Firefox etc., becomes tiny and unreadable. That worries me: at 1920x1080 on an LCD, would everything be even smaller and make the desktop and web browser impossible to use? Maybe a bigger monitor fixes that problem.

Some other problems: I'm not sure how many games, especially MMORPGs, even offer a 1920x1080 resolution. If I bought that monitor and ended up having to lower the resolution in every game I play, I've heard it would look fuzzy and bad. I'll probably have to research later what the maximum resolutions of my games are. Also, what happens if I play a game that can run at a high resolution, but only a 4:3 one rather than widescreen? Would that inevitably make the game look ugly? I'm just trying to decide which size of widescreen LCD, and with which native resolution, is best for playing my games at their maximum resolution. One more thing I've never tried: would watching a DVD on the computer look different with the monitor at a low resolution like 800x600 than at its maximum? Or does full-screen video automatically switch to the maximum?
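As a rough illustration of the text-size worry, here's a quick Python sketch. The geometry is standard, but the square-pixel assumption and quoted diagonals are approximations (a CRT's viewable area is really an inch or so smaller than its quoted size):

```python
import math

def pixel_pitch_mm(diagonal_in, res_w, res_h):
    """Approximate size of one pixel in millimetres.
    Assumes square pixels and the full quoted diagonal."""
    return diagonal_in * 25.4 / math.hypot(res_w, res_h)

# 17" CRT at two resolutions vs. a 22" 1080p LCD:
print(pixel_pitch_mm(17, 1024, 768))    # ~0.34 mm per pixel
print(pixel_pitch_mm(17, 1280, 1024))   # ~0.26 mm -- why text looked tiny
print(pixel_pitch_mm(22, 1920, 1080))   # ~0.25 mm -- text about the same size

# A 4:3 game shown unstretched on a 16:9 panel gets pillarboxed:
panel_w, panel_h = 1920, 1080
img_w = panel_h * 4 // 3                # 1440 px wide when height-matched
print((panel_w - img_w) // 2)           # 240 px black bar on each side
```

So a 22" 1080p panel ends up with pixels about the size of the 17" CRT at 1280x1024: small, but the larger screen area is what keeps it usable.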

(2) My friend told me that instead of buying a $250 monitor, he's going to buy a $500 HDTV, hook it up to his PC, and play his first-person shooters and WoW on that. I obviously don't know much, but it sounds like a bad idea to me. For one thing, when hooked up to an HDTV, doesn't your computer's desktop appear as a small box inside the TV, with most of the screen being a black border? And to make that full screen, wouldn't he have to stretch all those pixels? If you hooked a PC up to a 720p or 1080p HDTV, say 40 inches, what's the highest resolution you could even set it to?
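For a ballpark answer to the resolution part: "720p" TVs are usually 1366x768 panels and 1080p TVs are 1920x1080, and the panel's native resolution is the most you'd usefully send it. A hedged sketch of how coarse those pixels get at the 40-inch size from the question:

```python
import math

def ppi(diagonal_in, res_w, res_h):
    """Pixels per inch for a display; higher means finer detail."""
    return math.hypot(res_w, res_h) / diagonal_in

print(ppi(40, 1366, 768))    # ~39 PPI -- very coarse up close
print(ppi(40, 1920, 1080))   # ~55 PPI
print(ppi(22, 1920, 1080))   # ~100 PPI on a 22" monitor, for comparison
```

That's why a big TV works fine from the couch but looks blocky at desk distance.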

CPU and Graphics

(1) Someone with an AMD Athlon 64 X2 Dual Core 5600+ at 2.80 GHz was asking whether he should buy the GTX 295, and the first reply he got was "That CPU is going to bottleneck that card a fair bit." What knowledge does it take to look at these two pieces of hardware and know right away whether they would run well together? I would love to know how to judge which CPUs can handle which video cards.
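For what it's worth, the mental model most people use is that each frame costs the CPU some milliseconds and the GPU some milliseconds, largely in parallel, so whichever side needs more time caps the frame rate. A toy sketch with made-up per-frame costs:

```python
def fps(cpu_ms, gpu_ms):
    """Toy bottleneck model: CPU and GPU work mostly in parallel,
    so the slower side per frame sets the frame rate."""
    return 1000 / max(cpu_ms, gpu_ms)

# Made-up numbers: a 12 ms/frame CPU paired with a monster GPU that
# only needs 5 ms/frame gains nothing over an 11 ms/frame GPU.
print(fps(12, 5))    # ~83 fps -- CPU-bound; the big card sits half idle
print(fps(12, 11))   # ~83 fps -- a cheaper, balanced pairing
```

People who call a bottleneck at a glance are doing this estimate from experience and benchmarks: they know roughly where that CPU's frame times land and that the GTX 295 would spend much of its time waiting.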

(2) I was at my friend's house playing a pretty old MMO, Flyff, on his computer, and noticed something that has had me thinking for weeks. His system consists of an Nvidia 7900 GS on AGP 4x, some single-core 2.5 GHz Celeron, and 2 GB of RAM. Running Flyff at 1024x768 with the lowest graphics settings, it played perfectly. On my older computer with onboard Intel graphics I can play Flyff on max settings, so I figured a 7900 GS could handle it no problem. But with everything set to max (still at 1024x768), the game ran terribly laggy. How can I tell what the cause is? I really don't think I can blame the graphics card, but since the processor ran the game fine at low settings, can I really blame the CPU? Do higher graphics settings only put more work on the graphics card, or do they also put more work on the CPU? My instinct tells me a 2.5 GHz Celeron would bottleneck the 7900, but I lack the knowledge and experience to know why.
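One plausible suspect, given the AGP 4x slot, is texture overflow: once max settings need more texture data than the card's onboard memory holds, the excess has to stream across the slow AGP bus every time it's touched. The sizes below are assumed purely for illustration:

```python
# Rough AGP peak bandwidth: 66 MHz base clock x 4-byte bus x multiplier.
agp_4x = 66.6e6 * 4 * 4 / 1e6    # ~1066 MB/s
print(agp_4x, "MB/s over AGP 4x")

# Assumed figures: if max settings want ~400 MB of textures but the
# card only holds 256 MB, the overflow streams over AGP repeatedly.
overflow_mb = 400 - 256
print(overflow_mb / agp_4x * 1000, "ms to re-send the overflow once")
```

Higher settings also add CPU work (more objects drawn means more draw calls and more driver overhead), so the Celeron needn't be innocent either.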

(3) I do a lot of forum reading on many different hardware sites. I read way too many posts asking "Which CPU should I buy so I can play X game?" The person then gets replies suggesting different chips, saying "It's so fast" or "It's a much faster CPU than the XXX, so get this one." I never run into posts that get into the details I'm looking for. If I can play World of Warcraft (as an example) on a single-core 2.5 GHz processor, how much better performance can I expect from a 2.0/2.5/3.0 GHz dual-core or quad-core processor? Most people just post "a lot better," but I'm always looking for a more detailed answer so I can understand what roles a CPU plays in gaming. I also look at benchmarks, but I don't see how it helps to know that one chip scores 1200 while another scores 3500, although the list does at least show which chips perform better overall. When I shop for a processor, I'd like to know what kind of gaming frame rate I can expect from it. Then I find people who say your CPU doesn't even matter in gaming as long as you have lots of RAM and a good graphics card. -___-
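One hedged way to put numbers on "a lot better" is Amdahl's law: only the threadable share of a game's CPU work benefits from extra cores. The 30% parallel fraction below is a guess for a game of that era, not a measured figure:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work
    can be spread across extra cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Guess: maybe 30% of a 2009-era game's CPU work is threadable.
for cores in (1, 2, 4):
    print(cores, "core(s):", round(amdahl_speedup(0.3, cores), 2), "x")
# 1.0x, 1.18x, 1.29x -- if the guess holds, clock speed and per-clock
# efficiency buy more than extra cores for a game like WoW.
```

That's also why raw benchmark scores don't translate to frame rates: the score measures the chip in isolation, while your frame rate depends on how much of each frame actually waits on the CPU.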

(4) On one of my computers, the onboard graphics chip always reported 100-something MB of VRAM in DXDIAG. One day a game I was playing stopped working; I would get a BSOD when trying to log in. I then downloaded a graphics driver from MajorGeek.com, which fixed the BSOD problem, but now DXDIAG tells me I have 230 MB of VRAM. How could updating my driver have given me an extra 100 MB or so of VRAM? And since it's shared, does this 230 MB of VRAM automatically come out of my measly 1 GB of system memory?
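On the "does it deduct" part, the arithmetic under the worst-case assumption (the chipset reserves the full amount up front) looks like this; many integrated chipsets instead grab a small fixed slice and grow the allocation on demand, so the truth may sit in between:

```python
total_ram_mb = 1024
shared_vram_mb = 230      # what DXDIAG now reports

# Worst case: the whole shared block is reserved at boot.
print(total_ram_mb - shared_vram_mb, "MB left for Windows and programs")
```

The jump after the driver update most likely reflects the new driver raising the cap on how much system RAM it is allowed to borrow, not extra memory appearing from nowhere.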

(5) I'm not sure how to compare the FSB on Intel processors to AMD's confusing HyperTransport speeds. If I went with Intel, I'd have a 1333 MHz CPU and motherboard with 800 MHz DDR2 RAM, whereas with AMD I'd have a 2000 MHz CPU, a 2600 MHz motherboard, and still 800 MHz RAM. Do I even need to worry about the difference between FSB and HyperTransport? Sure, AMD's frequencies are higher, but does it make a noticeable difference in use, at least for someone who would mostly be gaming on the PC? It seems like the 800 MHz RAM would hold back the 2600 MHz+ speeds anyway.
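The two are easier to compare as bytes per second than as megahertz. A rough sketch of the peak numbers, assuming the usual bus widths; note that on AMD the memory controller sits on the CPU itself, so RAM traffic never crosses the HyperTransport link at all:

```python
# Peak theoretical bandwidths, assuming standard bus widths:
fsb_1333 = 1333e6 * 8 / 1e9       # 64-bit FSB: ~10.7 GB/s, shared by
                                  # memory AND chipset traffic on Intel
ht_2000 = 2000e6 * 2 * 2 / 1e9    # 16-bit HT link, both directions:
                                  # ~8 GB/s, carrying only I/O traffic
ddr2_800 = 800e6 * 8 * 2 / 1e9    # dual-channel DDR2-800: ~12.8 GB/s
print(fsb_1333, ht_2000, ddr2_800)
```

Both links comfortably exceed what games push through them, so for a gaming build this difference is not worth losing sleep over.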

Downloading and Streaming


(1) I've spent so many hours pondering the downloading process and what your hardware does. When you download anything, does the data go to your modem first, then through the router, into your computer's RAM, and then the CPU moves it from RAM to the hard drive? Or do your RAM and CPU sit back while it goes straight from the router to the hard drive?
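Short version: it does pass through RAM. The network card places incoming packets into buffers in RAM, a program copies them into its own buffer, and only then does a write send them on toward the disk. A minimal sketch of that loop (the URL is hypothetical):

```python
import urllib.request

url = "http://example.com/file.zip"   # hypothetical URL
with urllib.request.urlopen(url) as src, open("file.zip", "wb") as dst:
    while True:
        chunk = src.read(64 * 1024)   # NIC -> kernel buffers -> our RAM buffer
        if not chunk:
            break
        dst.write(chunk)              # our buffer -> OS cache -> hard drive
```

The CPU work per chunk is tiny, which is why downloads barely register in Task Manager.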

(2) Is there a difference in hardware usage between streaming video from YouTube and streaming live video from a webcam or a site that broadcasts live? Please explain the process and how the hardware is used. My mom is always watching live video talk shows online, and someone my dad knows is surprised she can do that with a Celeron D and 533 MHz DDR RAM. I figured you could stream YouTube video on any old computer, since I've done it with 100 MHz RAM and a 733 MHz Celeron, but I never thought about live streaming. Why is my dad's friend, who supposedly knows a lot about computers, so surprised that a single-core 2.8 GHz Compaq Presario desktop with 1.5 GB of 533 MHz RAM can stream live video? x.x
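As a rough sanity check on why an old CPU copes: buffered and live streams use the same download-and-decode path; they mostly differ in how far ahead the player is allowed to buffer. The bitrates below are assumed 2009-era figures:

```python
# Assumed bitrates in kilobits per second:
streams = {"YouTube clip (buffered)": 500, "live talk show": 400}

for name, kbps in streams.items():
    mb_per_min = kbps / 8 / 1024 * 60
    print(f"{name}: ~{mb_per_min:.1f} MB of data per minute")
# Decoding ~0.5 Mbit/s of video is light work for a 2.8 GHz chip;
# the real constraint for live video is the network, not the CPU.
```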

Hard Drive

(1) Would a hard drive with more space, like 500 GB, cause longer loading times in games than if the game were installed on an 80 GB drive? And if so, would the difference even be noticeable? I had planned on eventually filling a 500 GB drive with installed games, and I'm also wondering because 1.5 TB drives are getting affordable now.
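Capacity itself doesn't slow loading; if anything, a newer high-density drive reads faster because more data passes under the head per rotation. Rough load-time math with assumed level size and sustained read rates:

```python
level_mb = 300    # assumed size of one game level's data

# Assumed sustained read rates; newer, denser platters read faster:
for name, mb_s in [("older 80 GB drive", 50), ("newer 500 GB drive", 80)]:
    print(f"{name}: {level_mb / mb_s:.1f} s to read the level")
```

Fragmentation and seek times matter far more than how many gigabytes the drive holds.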

(2) When you play a game and see a loading screen, is that the hard drive loading all the map data into system memory, so that the CPU can then quickly receive data from memory whenever it's needed and never has to wait on the hard drive? Does the CPU then forward this data to the graphics card, or does the card do its own work by receiving it from memory as well? *Trying to understand how a computer operates when playing a game*
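That's essentially the right picture. A heavily simplified sketch of a load sequence; parse() and upload_to_gpu() are placeholders, not real engine APIs:

```python
def parse(raw):
    # placeholder: a real engine decompresses meshes/textures here
    return raw[: len(raw) // 2], raw[len(raw) // 2 :]

def upload_to_gpu(textures):
    # placeholder: a real engine hands this to the graphics driver,
    # which copies it across the bus into the card's own video RAM
    pass

def load_level(path):
    raw = open(path, "rb").read()    # hard drive -> system RAM (slow part)
    meshes, textures = parse(raw)    # CPU unpacks the data while in RAM
    upload_to_gpu(textures)          # CPU -> driver -> video RAM
    return meshes                    # gameplay data stays in system RAM
```

So the CPU orchestrates the copy into video RAM once at load time; during play, the card mostly works from its own memory.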

Drivers

(1) My dad is buying a new laptop next week. I always read "Go to the manufacturer's website and download the latest graphics, network, and sound drivers." Is it also necessary to download the latest motherboard chipset drivers for his laptop? I don't hear much talk about that. And would the chipset driver also cover his network card? (I don't know what the inside of a laptop is like.) No uninstalling of old drivers is necessary beforehand, I hope? I've never actually uninstalled a driver on any of my computers.
 
I will answer question 1. The 5600+ may not be the fastest, but it's no slouch. You could spend double or triple on the CPU and maybe get 10-20% higher frame rates, but do you really want to? The other consideration is your power supply. Be sure to save enough for a decent 750 W PSU if you get the 295. It will cost you $100-150 after rebate, but it's worth it.
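A rough wattage check behind that 750 W suggestion, with assumed peak draws (the GTX 295's is commonly quoted near 290 W; the rest is ballpark):

```python
# Assumed peak draws in watts:
gtx_295, athlon_5600, rest_of_system = 290, 89, 80

total = gtx_295 + athlon_5600 + rest_of_system
print(total, "W peak,", round(total / 750 * 100), "% of a 750 W unit")
```

Roughly 460 W at peak: the 750 W figure buys headroom so the PSU runs in its efficient midrange and tolerates aging.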
 

Eroge

But how do people know the 5600+ can't feed the 295 fast enough? What do I need to study to gain this knowledge of what bottlenecks what? What am I supposed to be comparing on the CPU and on the graphics card?

I'm looking to fully understand everything instead of just taking people's word for it. For example, I want to know how that more expensive CPU provides the 10-20% boost rather than just taking your word for it.
 

shrex

I'm using a 32-inch TV as a computer monitor. The TV's specs will state what resolutions it supports; mine has a native resolution of 1366x768, and the PC is outputting 1360x768 with 1:1 pixel mapping, so there are 3 black pixels on either side. I find it very usable, I like the size of the text (nice and easy to read), and games look great.



 

nosnitrousx

I'm going to answer some of your questions, based on personal experience.

- About the CPU and graphics: before my current rig, I had an Intel C2D E6600 (a very, very good processor, by the way; too bad I didn't have a good mobo that would let me overclock it), and I found a super cheap price on a GTX 280 SC edition, so I took it.
Playing Far Cry 2 at 1680x1050 on a 20" monitor, I was getting around 72-76 FPS (top), and it didn't make any difference whether I overclocked the GPU or not. Now I'm running a Phenom II 940: at stock I got 100+ FPS (top, not average of course), and overclocked I got 110+ FPS in some scenarios. Does the CPU make a difference in graphics performance? OF COURSE. Although I couldn't see how the game would have performed with the E6600 overclocked; maybe then the difference wouldn't have been that big, and it's not a fair comparison between them anyway.



- About drivers: when I installed the 280 and the drivers that came with the CD in my previous rig (I also had to buy a PSU, of course), the system didn't boot up, so I updated the BIOS and downloaded the latest drivers available at that moment, and it started perfectly.

On the other hand, in my current rig I updated to the latest audio drivers for the mobo from the manufacturer's website, and there was a kind of noise at low frequencies, like the one you get when you touch the (+) and (-) cables of a speaker together. I tried the CD drivers and the weird sound was gone.

So I'd say drivers are a matter of luck; there are many factors of compatibility, components, etc., and each system is different. If you're running without issues on the current ones, leave it that way; if something shows up, like bugs, blue screens, etc., then do the update.

Hope it helps....
Regards!
 
About the monitor question: I don't know what OS you're using, but in XP you can adjust the screen item size in Display Properties by clicking the Advanced button and changing the DPI (dots per inch) setting to Large or Custom from the normal 96 DPI. Or if it's just the font size, go to the Appearance tab and select a larger font size. Vista and W7 have similar functions.

As for using an HDTV as a monitor, as somebody else said, you have to make sure the TV is set to "dot for dot" (no internal scaling) mode and select the appropriate resolution (1366x768 for 720p, 1920x1080 for 1080p TVs), again using the Display Properties utility, either through Control Panel or by right-clicking an empty spot on the desktop and selecting Properties. Once the resolution is set correctly, if the picture doesn't fit the screen (either too small or too big), try adjusting the overscan setting on the TV if possible, or better yet read the manual and see if there's a PC mode.

I have a 50" Pioneer Kuro Elite plasma that I use as a second monitor for games on my XPS laptop, connected via a DVI-to-HDMI cable with a separate stereo cable, and the Pioneer recognizes the PC input mode, so the picture exactly fills the display, with no overlap or underlap at all. Blu-ray movies look stunning as well (the Pioneer was top-rated for cinema on CNET and elsewhere). If BD playback is also your intent, make sure you get a newer HDMI 1.3 cable so that it is HDCP compliant, and you may have to disable your primary display if you use the TV as a secondary one during playback.
 
It has a lot to do with the software, games in particular. You simply can't say in advance what a game will demand of the graphics card or the CPU, as each game is different and requires different things.
Graphics cards render your scenes, but a scene also involves AI calls for the CPU to handle. Some games are filled with textures, while others require a lot of AI, and some both. Clock speed is generally regarded as the common denominator of how fast a game runs, mostly on the CPU side, but gains obviously come from higher clocks on GPUs as well.
But here it gets tricky, because each architecture is different, whether it's an AMD/Intel CPU or an ATI/Nvidia GPU. Each architecture runs at different speeds with different effectiveness, which on CPUs is called IPC, or instructions per clock. There are many things to consider in all this, such as cache, latency, RAM, etc., but again, for gaming we usually look at clock speed per architecture to judge how a CPU will perform.
On the GPU side, it's less about the clock and more about the architecture. That's why we see such huge disparities between games on ATI versus Nvidia, even on comparable, competitively priced cards. Each architecture has its strengths and weaknesses, and the game's needs determine which show up. Since GPUs don't have the same structure as CPUs, they do things like hide latency through clever use of their architecture. GDDR RAM plays into it a lot as well, since varying RAM speeds determine output size and speed (availability).
Putting them both together, you want balance. You want a GPU that can handle your resolution, with enough GDDR on it and an architecture and core that can deliver all the info in a timely manner, while at the same time you want a CPU that can keep up with the data flow, perform its required duties such as AI or physics, and still run any other system apps in the background.

There's a lot more to it than what I've said here, as each type of core, be it CPU or GPU, has many things that make or break performance, such as cache, cache latencies, and beyond that system RAM, etc., even your HDD. Take the time to learn it; it's rewarding.
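As a tiny illustration of the IPC point above, per-core throughput is roughly instructions-per-clock times clock speed. The IPC values here are invented just to show the shape of the comparison:

```python
def relative_perf(ipc, ghz):
    """Per-core throughput ~ instructions per clock x clock speed.
    The IPC figures passed in below are made up for illustration."""
    return ipc * ghz

print(relative_perf(1.0, 2.5))   # older core at 2.5 GHz -> 2.5 units
print(relative_perf(1.6, 2.4))   # newer core at a LOWER clock -> 3.84 units
# A slower-clocked chip can outrun a faster-clocked one when its
# architecture does more work per cycle -- which is why raw GHz
# comparisons across architectures mislead.
```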