Is this good to run today's games?

godlyatheist

Distinguished
Nov 13, 2005
439
0
18,780
It's a good multimedia PC, but not for gaming, and it certainly won't do that great with the integrated graphics. Take that money and build yourself a budget gamer with a 7800 GT, which will rock the HP in gaming.
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
If you upgrade to an X800XL you can have pretty good gaming. The X800XL is still a pretty beefy card even with the newer generations coming out; it'll run games at medium/high settings. But if you have the money to buy a 7800 GT, that's a better deal. If you have the X800XL lying around, or are pawning it off a friend, then by all means go for it.
 

godlyatheist

Distinguished
Nov 13, 2005
439
0
18,780
Forgo the X800XL and get the new GTO or GTO2. They are all 16-pipe cards now and OC like mad. Plus they can all be had for less than two Franklins. If you want eye candy like HDR, then the 6800GS is another option for $200.
 

kais

Distinguished
Jul 6, 2004
256
0
18,780
I guess I'll go for an HP PC with an X2 4200+, maybe a 7800 GTX, a 500-watt power supply, and 2 GB of RAM, and I should be set, although I might get an X1900 XTX, which is two times better than the GTX. Thanks for the advice. Also, what if I get a 3700+ or an X2 3800+? Will those be good too?
 

kais

Distinguished
Jul 6, 2004
256
0
18,780
I would build myself a custom rig, but it costs more. If I put in a nice card, more RAM, and of course a better PSU, I should be able to run any game out today. Does anyone disagree?
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
Yeah, normally building is cheaper, but with some of the deals computer places have been having lately it can come close to even. Building has its own problems too, like picking compatible parts, having a separate warranty on each item, and having to assemble it yourself, which can be a pain if you haven't done it before.

The computer you have is very solid. The AMD 3700+ is a fast single core, but the X2 4200+ dual core has the same clock speed as the 3700+, I believe (2.2 GHz), so the 4200+ will run as fast as the 3700+. The X2 3800+ is a bit slower at 2.0 GHz but still pretty fast. Dual core is the way to go for future computers; right now it's not a huge plus strictly for gaming, but if you plan on keeping this computer for over two years, get the dual core.
 

kais

Distinguished
Jul 6, 2004
256
0
18,780
Thanks for the help, guys. I picked up an HP Pavilion with an AMD Athlon 64 X2 4200+, 2 GB of RAM, a 250 GB HDD, an ATI X800XL, and a 500 W PSU. It's very fast, and the graphics are amazing in HL2.
 

BitFiSH

Distinguished
Jan 28, 2006
12
0
18,510
Obviously you will have issues with max graphics settings on the newest games. Stick the X800XL in the box and you'll have a respectable gaming machine.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
You might want to double-check that an industry-standard PSU will fit in the HP case if you decide to upgrade it.

Sure, they might be using ATX 24/20-pin power, etc. on the mobo, but that doesn't mean the physical dimensions of their PSUs will match a 'typical' one.

Expect to pay a price premium to upgrade to an HP desktop PSU if you end up needing more juice than their 'stock configuration' provides, assuming an industry-standard PSU won't fit or work as expected for some reason.

Also check that their PCIe x16 slot is rated for 75 watts and not 25 watts, like some 'brand name' PCs are offering. This is off the top of my head, but I am pretty sure the industry spec is 75 W on the PCIe x16 slot, while some 'brand name' mainboards only supply up to 25 W. It makes the boards cheaper to mass-produce and increases profits.

It might all look the same, but when you scratch beneath the surface of the specs on some of these 'brand name' mainboards, they aren't quite up to industry expectations. Then you pay extra for support costs, etc., just to find out you can't get what you wanted in the end anyway... that is how they make their money.

PS: I am not kidding about the 75-watt vs. 25-watt PCIe x16 slots either.
This isn't from a bad experience either; I swear I've seen some mobos that only supply 25 W to the PCIe x16 slot.

I'm not saying it'll definitely be an issue, just that I would double-check these two things.
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
Never heard of such a thing... although the HP PSU may be smaller than the 500-watt one you want. But I'm sure 400 watts will be enough to power your computer, as long as you don't have very many hard drives...
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
"never heard of such a thing..."

When was the last time you checked the wattage output on a PCIe x16 slot anyway, Parlee? :p (Kidding)

True, 350 W is enough for most PCs running only one video card. The exceptions are GeForce 7800 and Radeon X1800 or higher cards, where you'd want 400 W just to play it safe.

Like I said, it might not be an issue, but double-check just to play it safe. Don't go off a salesperson's advice either, as they'll say anything to get a sale (usually their hardware knowledge is basic anyway). Go off a real tech who knows, or who can look it up in the whitepapers/documentation for the board they are offering in the system.

First hit I could find on Google that comes close:
http://www.amd.com/us-en/Processors/ComputingSolutions/0,,30_288_13265_13295%5E13334,00.html

Quote from the AMD website (not exactly what I was after, though):
* AGP, which was designed to take some of the load off PCI, is similar to PCI and shares some of the same problems – however, it does have some distinctive features. AGP was created specifically for graphics cards and designed to share a portion of main memory to store rendering data, rather than having to load the data into the onboard video memory. New generation video cards sometimes require two separate power connectors, because they outstrip AGP's energy supply capabilities. While the AGP bus can supply a maximum of 25 watts, PCIE can supply 60 watts now and up to 75 watts in the future, thus cutting the need for extra cables.

... as such, some 'brand name' mainboards only output 25 watts to the PCIe x16 slot, because (1) it is usually enough, and (2) it cuts costs, since power regulation circuitry isn't cheap.

Not all Radeon X800XL cards have PCIe 6-pin power connectors (my GeCube one doesn't), and those cards need a 'real' 60-75 watt PCIe x16 slot, not the cheap, low-wattage imitation. (Physically there is no difference between the two either, so denying they exist is easy.)

Nine out of ten techs can't tell the difference, that is, until they try a high-end video card and it doesn't work as expected on their mainboard, even with a 500-watt PSU, for a reason they can't explain... other than "it isn't my fault, the video card must be stuffed", when in actual fact it is the mainboard causing the issue.

Further digging on search engines would confirm the issue does exist. Just download the documentation for a few 'brand name' mainboards... hard to find, but not impossible.
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
Well, honestly, I never checked the wattage; I didn't really care much... I guess it makes sense though, and this could be behind a lot of people's problems with PCIe cards: they complain the card doesn't work, they RMA it, and the second one doesn't work either... I guess just make sure that the card has the auxiliary power connector on it... :p so if the mobo doesn't supply 75 watts, it'll still work with the connector.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
"350 W could be enough for 7800s and X1800s"

350 W would be enough, yes, if the absolute minimum efficiency of the PSU were 87%... even after 2-3 years, which will never happen.

Most PSUs range from 75% to 90%; even checking the specs of the best brand-name 350-watt PSUs will confirm this.

Efficiency also depends on load: a PSU hits maximum efficiency at exactly one load point, and every other load could be 10%, 15% or more below that. Idle might be 95%, but maximum load on the CPU and GPU might only be 80%. PSUs use some of their power to run and cool themselves, so they aren't 100% efficient... almost no electrical device is.
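To put some numbers behind the efficiency point, here is a minimal back-of-the-envelope sketch in C. Every figure in it (the 80% efficiency, the 15% ageing derate, the 300 W system load) is an illustrative assumption of mine, not a measurement from this thread: efficiency determines how much AC the unit pulls from the wall for a given DC load, and ageing is modelled as a simple derate on the label rating.

#include <stdio.h>

/* Back-of-the-envelope PSU headroom check.
   All figures are illustrative assumptions, not measurements. */
int main(void)
{
    double rated_dc_watts = 350.0;  /* label rating of the PSU                 */
    double efficiency     = 0.80;   /* assumed efficiency at load              */
    double aging_derate   = 0.85;   /* assume ~15% capacity loss after 2-3 yrs */
    double system_dc_load = 300.0;  /* rough CPU + GPU + drives estimate       */

    double wall_draw = system_dc_load / efficiency;    /* AC pulled from the wall    */
    double usable_dc = rated_dc_watts * aging_derate;  /* what an aged unit delivers */

    printf("AC draw at the wall:      ~%.0f W\n", wall_draw);
    printf("Usable DC after ageing:   ~%.0f W\n", usable_dc);
    printf("Headroom vs. %.0f W load:  %.0f W\n", system_dc_load,
           usable_dc - system_dc_load);
    return 0;
}

With those assumed numbers the headroom on a 350 W unit is essentially gone, which is the same conclusion as below: draw the line at 400 W for the hungrier cards.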

A Radeon X800XL would be OK on a 350 W PSU, all other things meeting spec (as above, e.g. the PCIe x16 slot on real OEM boards), but the GeForce 7800GT and Radeon X1800XL/XT are where most system builders (with their customers' best interests in mind) draw the line and require a 400 W minimum PSU.

Let people buy a new PC when they need to, not because their old one failed after approximately two years, I say.

Add to this that as PSUs get older (2-3 years down the track) they also drop in efficiency. This is one of the reasons why PC hardware starts to fail electrically after 2-3 years...

Think about it, check the numbers, or heck, e-mail Tom or any reputable tech, such as Scott Mueller, or the people who design PCs to fail six months after the warranty expires so people need to buy new gear (at some discounted price for being a previous customer).

You can almost set your watch by some of these 'failures' people encounter with their machines... almost.

Check this thread on HP:
http://forums1.itrc.hp.com/service/forums/bizsupport/questionanswer.do?threadId=948591

I'll try and find another one that shows it in a clearer light.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
Cool,

Have you downloaded HL2: Lost Coast yet? It has a benchmark in it; just curious what you get.

Settings: 1280 x 1024, High Detail, Reflect All On, Vsync off, Sound On, HDR: Disabled, Catalyst AI on Standard

Forced 4-way affinity (it defaults to 1-way for HL2: Lost Coast for some reason), with HL2.EXE on High priority, running in '64-bit mode'.

66.92 fps - no FSAA, 4x aniso

Your results should be similar to my system's (specs in my sig).
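PS: Since I mention forcing 4-way affinity and High priority, here is a minimal Win32 sketch of doing it programmatically rather than through Task Manager. It is only an illustration (the PID is a placeholder you would look up yourself), not necessarily the exact method used for the run above.

#include <windows.h>
#include <stdio.h>

/* Minimal sketch: pin an already-running process (e.g. HL2.EXE) to the
   first four logical CPUs and raise it to the High priority class.
   The PID below is a placeholder -- look it up in Task Manager first. */
int main(void)
{
    DWORD pid = 1234;  /* hypothetical PID of HL2.EXE */
    HANDLE h = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                           FALSE, pid);
    if (h == NULL) {
        printf("OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }
    SetProcessAffinityMask(h, 0xF);           /* mask 0xF = logical CPUs 0-3 */
    SetPriorityClass(h, HIGH_PRIORITY_CLASS); /* High priority class         */
    CloseHandle(h);
    return 0;
}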
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
Tabris, what's your score in 3DMark05 and 06? I thought it'd be higher than a single 4200+'s, with two dual-core Opterons :p and 4 GB of RAM. Nice system, btw.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
I'll run 3DMark05 in a sec under Windows XP (32-bit). It can only address 2.75 GB there, for technical reasons I won't go into. :p

3DMark05 doesn't like running under WOW64 (Windows on Windows 64) on x64 Edition. I've run it a few times without it failing, but it's "just one of those apps" that really only works under 'real' Win32, I guess.

Half-Life 2 has an x64 update (automatic via Steam) if you are running WinXP x64 Edition. It's only active in single-player and Lost Coast though, not in Deathmatch :( , so WinXP x64 Edition performance in HL2: Lost Coast is a shade higher than Win32.

Check here: http://service.futuremark.com/compare?3dm05=1656988

3DMark05:
3DMark Score: 5197
CPU Score: 7374

That is with the stock 'free' version of 3DMark05 which has locked settings.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
Overall 3DMark05 score was 5197, as above :p , when it ran under WinXP x64 a week or so ago. (Check the link on ORB for the date.)

Just ran it in Win32 and only got 4078, but I'm underclocking the card to 95% of stock (380/465) as it has issues at stock speeds (400/490); possibly it was damaged a little after an overclocking misadventure. The CPU score ('05) just then was 5895 under Windows XP Pro (x86, 32-bit). But 3DMark05 only used two cores anyway. :p

SiSoft SANDRA shows 4-way systems in their real light.

I'm a little surprised it scores so low; at only 95% of stock speeds I would expect it to score 4937, give or take.

CPU usage during the test: one core at 100% for the first half (0%-50%), then one and a half cores for 50%-75% of the test, then just shy of two cores for 75%-100% completion. This is reflected in the CPU score, no doubt.

Under 3DMark06:
3DMarks: 1271
SM2.0 Score: 593
CPU Score: 2649

Compare URL: http://service.futuremark.com/compare?3dm06=84438

Bear in mind my Radeon X800XL was clocked at 380/465 (not 400/490).
The test was performed in WinXP Pro.
The first 80% of the test used only one core; the last two tests used all four cores at about 97% load each. The CPU score above for 3DMark06 most likely reflects this (I would hope, anyway).

I think some of my earlier tests on ORB (back at 400/490) were with Quality texture filtering, instead of High Quality texture filtering forced in the driver. That would have inflated the score back then, perhaps. I could always run with High Performance filtering forced in the driver and see what it scores. :p

Update: I'm also only running Catalyst 5.8 in WinXP 32, so I'll update some things and check whether results improve.

I am no expert on 3DMark, btw; I haven't used it seriously since 3DMark 2003 SE. I'm more into SiSoft SANDRA myself, and into trying to develop software for 4-way to 32-way systems (i.e. the concept of future-proof software, with separate code paths for various systems, as well as separate EM64T vs. AMD64 paths, each compiler-optimized :) ).
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
ROFL, so you're saying that your card runs faster underclocked? That sounds wrong :)... But I did the same thing as you with my 9600: I just added 200 MHz to the mem and core and the system just restarted; it never went back to stock speeds again :p (it was a boring day).
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
No, I am saying it runs underclocked.

I used to get 5,000+ in 3DMark05; now I only get ~4,000.

At stock speeds it isn't 100% stable anymore (I think I overclocked it a little too far; it appeared stable initially for days). Now, after a few hours of play at stock speeds (400/490), I get noticeable rendering errors in Half-Life 2: Lost Coast and also Battlefield 2: Special Forces. Sometimes the PC would reboot itself if the card was left at 400/490 (after intensive 3D work), or 3D apps would just CTD. Underclocking it very slightly fixed the issue, but my 3D performance dropped 5% in the process... of course I have been 'abusing' the box recently, so I don't blame it.

Although the results above indicate it dropped 20%, which is wrong. I don't use 3DMark often... So now I am updating to Catalyst 6.1 on both OSes, and updating the nVidia Pro chipset drivers, to see if it helps performance. I think 3DMark shows exponential gains where performance only scales linearly; the method they use to calculate the final scores confirms this to some extent as well. :p

It could be something to do with NUMA though: I've got two dual-channel memory controllers, and 3D video does weird things with memory address space; NUMA may actually reduce performance in 3D apps for all I know... it helps in everything else though. ;) I'll have to test a few configurations to see what is best for gaming... I've never really given it much thought before. Thanks for (re)opening my eyes to 3DMark, I guess. ;)

EDIT: Back on 5,000+ in 3DMark05 under WinXP/32
http://service.futuremark.com/compare?3dm05=1755889

EDIT2: Got 1,850 in 3DMark06 just then under WinXP/32
http://service.futuremark.com/compare?3dm06=84767

Nothing SLI and some 7800 GTX 512s couldn't fix, I suppose, not that I can really justify the cost for the 3D performance - hehehe. :p
 

parlee

Distinguished
Dec 16, 2005
1,149
0
19,280
Lol, I get it now; it was confusing the way you said it :p I got up early too, that can't help... How's BF2: Special Forces? I was thinking about picking it up; BF2 was a good game... but if it's just as buggy, maybe I'll wait a while till they get it patched up, instead of having to download a new patch every time I play :p
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
1,378
0
19,280
BF2: SF is pretty good. The new gear makes it fun to play; it lacks some realism, but it's fun.

For ATI X800-series cards I think you need to run shadows/lighting on Medium or even Off. Unfortunately the game designers thought it would be a good idea to let people select settings that cause rendering errors (native to BF2) on ATI cards, instead of just greying out options that don't work like a sensible developer would... something to do with "Plays best on nVidia" would be my guess, even though BF2 runs faster on ATI cards at a given price point (funny, that).

The shadow/lighting thing only seems to affect ATI cards with certain options in BF2: SF though; it's easy to fix if you aren't daft. It only seems troublesome on night-time missions as well. ATI offered to help them fix it months ago, but I think they refused ("Plays best on nVidia" contract still in effect, no doubt).

I am/was more into realism, a la http://www.virtualbattlespace.com/screenshots.htm or http://www.virtualbattlefieldsystems.com/screenshots.html (depending on the country you're in), and Operation Flashpoint: Resistance ages ago, as it handles over 64 players and maps so large they take 2+ hours to walk across without transportation (and has since 2001, given the server hardware).

Now to see if I can get closer to 1,800 in 3DMark06 on my Radeon X800XL :) - BRB. (EDIT: Done, got 1,850 in '06; updated my above post and cleaned it up a little.)