The New PCI Express 2.0 Standard: Is Your PSU Ready?

In November we will see a new standard arrive: PCI Express 2.0, or DX10 as it's commonly dubbed.

PCI Express 2.0 was announced on January 15 by the PCI-SIG, an industry group of more than 850 member companies, including AMD, Broadcom, HP, IBM, Intel, LSI Logic, Microsoft and Nvidia. It defines the standard for the next generation of motherboards and graphics cards, commonly dubbed DirectX 10 capable. This new architecture boosts performance and utilization enormously compared to its DirectX 9 / PCI Express 1.0 predecessor, and it introduces a new 8-pin (8P) power connector.

For those building new systems: if you're going to spend your hard-earned money, make sure your power supply supports PCI Express 2.0 / DX10. Many companies are just now coming out with this new breed of PSUs. I just didn't want you guys and gals to make the same mistake I did by not buying a PSU that was ready for PCI Express 2.0. If you care to add a comment, please do, as being prepared is better than not being prepared at all. Here is a link for more info: http://www.pcisig.com/news_room/faqs/pcie2.0_faq/
  1. If you want a good power supply, the 1200 watt PSU from Thermaltake is compatible.
    It's a modular PSU and has the cables for 8-pin video cards.
  2. airblazer said:
    If you want a good power supply, the 1200 watt PSU from Thermaltake is compatible.
    It's a modular PSU and has the cables for 8-pin video cards.


    That's one that I was looking at, because 1200W isn't going to be overkill in a year. You have the PC case (TJ09) that I want; did you find that your PSU cables were long enough? Thanks.
  3. 1200W will always be overkill. More than 90% of the systems out there right now could get by with a solid 550W power supply; people have just over-hyped the power draw of all these graphics cards.
  4. 550W? What are you running? A single core with an AGP card and one hard drive???
  5. Right now I am running a 750W with 72A, but he is right: most systems could get by with a 550W. About 80%+ of systems. There are Core 2 builds running on 450W with cards like the 8600, 7800 and others. My friend has an 8800 GTX with a 550W and it works fine.
  6. 500W PSU, generic.

    Intel Pentium 4 HT 631, 3GHz stock speed.
    ECS PF5 Extreme ATX mobo (Crossfire supported, though not used).
    Samsung 512MB DDR2-533 x2 (1GB total, 3-3-3-8, 400MHz).
    XFX GeForce 8800GTS, w/6pin power connector connected.
    WD 80GB SATA2 HDD, nothing fancy.
    Samsung SATA DVD-RW.

    CPU is OCed to 3.6GHz (240MHz x15), RAM is at 480MHz DDR, GPU is at 770 core and 1100 mem.

    Stuffing HDDs in there hasn't made a bit of difference, and that system is rock-solid stable under maximum loads.

    However, if I want to use SLI or Crossfire, then I'll have to go up to 650~800W, no more.

    These really big (>1kW) units are completely over-specified. But then, if you've got a 1.2kW PSU and your components pull at most 450W, it won't draw more, and you're future-proof in terms of PSU - unless they change the standard again...
  7. javimars said:
    550W? What are you running? A single core with an AGP card and one hard drive???



    Yes, 550W. As has been mentioned before, a 550W PSU from a solid/quality manufacturer would be able to run a Core 2 Duo with an 8800 GTX. There is just a knee-jerk reaction among many people that we need these huge PSUs to power these systems, when in reality it's been proven time and time again that most systems don't consume over 400W.

    http://www.anandtech.com/systems/showdoc.aspx?i=2818&p=2

    This machine is run by a 620W PSU. It consists of two X1900 XTs, a Core 2 Duo clocked at 3.5GHz, a water-cooling pump, 4 fans, 2 optical drives, an X-Fi sound card, overclocked RAM and 2 Raptor hard drives. Basically this system has just about every peripheral you could have, and ABS thinks it's just fine to put a 620W power supply in there.

    So yes, a 550W could run most if not all single-card setups out there right now.
  8. It's all marketing I tell ya.

    My old Athlon64 3500+ (Newcastle) probably used more power than a Q6600, and people think these chips are so hot. And you probably wouldn't know, but I have a Celeron (Prescott) 2.6GHz which spits more fire than an Athlon 3500+ and a Q6600 together (at least it feels that way).

    Who wants a machine that consumes 1200W??? That's insane!!!
    The industry can't go this way. Intel has turned away from NetBurst; the graphics industry must follow.
    From what I have read on the Inq, the G92 won't be so hot.
    But the Inq gets them only 50% right!

    Those numbers are so inflated. It's absurd.

    I see 2kW PSUs on the horizon. Who's buying???
  9. systemlord said:
    That's one that I was looking at, because 1200W isn't going to be overkill in a year.

    Not if Al Gore can help it :p
  10. One thing to keep in mind: just because a PSU is rated for >1kW, it's not going to be putting that out constantly. Lots of people seem to have that misconception. But getting a 550W PSU for a system that pulls 525W isn't exactly ideal either. Most PSUs have their highest efficiency when they're at between roughly 25 and 60 percent load. So if you run a 525W system from a 1000W PSU you'll be at about 50% load and very close to peak efficiency. How much higher is peak efficiency? Not much; in any quality PSU there's a 5-10% difference between full-load efficiency and peak efficiency, so it probably won't be readily apparent on your next power bill. But there's another big advantage to running at a lower load: less stress, thermal and otherwise, on your PSU, and a very real possibility of longer component lifespans.

    So if you can afford a big honkin' PSU, go for it. If not, make sure you leave yourself at least a little wiggle room between your peak system load and your PSU's rated output. You'll actually pay less in kWh with the big honker than with a smaller one. Until you run triple CrossFire and two water-cooling loops and 5 TECs and a huge overclock, and a 15-disc RAID array and 4 Blu-ray drives and 20 fans and a sweet cold-cathode light. Then your power bill might go up... noticeably.
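
    To put rough numbers on that efficiency argument, here's a quick back-of-the-envelope sketch in Python. The efficiency figures are illustrative assumptions, not measured data for any particular unit:

        # Rough sketch: load percentage and wall draw for a fixed DC load.
        # Both efficiency values below are assumptions for illustration.

        def wall_draw(dc_load_w, efficiency):
            """AC watts pulled from the outlet to deliver a given DC load."""
            return dc_load_w / efficiency

        dc_load = 525.0  # watts the components actually demand

        # (PSU rating, assumed efficiency at that load point)
        for rating, eff in [(550, 0.78), (1000, 0.83)]:
            load_pct = 100.0 * dc_load / rating
            print(f"{rating}W PSU: {load_pct:.0f}% load, "
                  f"~{wall_draw(dc_load, eff):.0f}W at the wall")

    On those assumed numbers the 1000W unit runs at about 52% load and saves roughly 40W at the wall versus the nearly maxed-out 550W unit: real, but hardly dramatic, which is exactly the point above.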
  11. NaDa said:
    It's all marketing I tell ya.

    My old Athlon64 3500+ (Newcastle) probably used more power than a Q6600, and people think these chips are so hot. And you probably wouldn't know, but I have a Celeron (Prescott) 2.6GHz which spits more fire than an Athlon 3500+ and a Q6600 together (at least it feels that way).

    Who wants a machine that consumes 1200W??? That's insane!!!
    The industry can't go this way. Intel has turned away from NetBurst; the graphics industry must follow.
    From what I have read on the Inq, the G92 won't be so hot.
    But the Inq gets them only 50% right!

    Those numbers are so inflated. It's absurd.

    I see 2kW PSUs on the horizon. Who's buying???


    It's not entirely the fault of the GPU producers. They're struggling to keep up with software advances that place a huge load on the modern GPU. Massive power and high efficiency are extraordinarily difficult to achieve at the same time. Think of internal combustion engines: horsepower came long before MPG. This is the same deal. Because the advances are coming pretty quickly, the GPU makers must apply a brute-force approach. The deal was the same for CPUs not too terribly long ago, but if you look, CPU power requirements have hardly budged for the last 4 years or so. That's the primary reason that CPU producers have been able to increase the efficiency of their designs. Unfortunately, I don't see the situation resolving itself; there's simply a huge demand for gorgeous graphics. So every shrink will be clocked higher and volted to the max. There's certainly some room for them to increase their efficiency, but it would take time and effort that they would rather allocate to sheer power.
  12. What about these new PCI Express 2.0 graphics cards that will consume 225-300W per card? Can you imagine running two of them in SLI? That's up to 600W. Granted, I will never run SLI myself.
  13. Well, personally, if I'm going to throw down that amount of money on a new PCIe 2.0 motherboard and card(s), I'll more than likely spend a bit extra on a new power supply anyway.
  14. systemlord said:
    That's one that I was looking at, because 1200W isn't going to be overkill in a year. You have the PC case (TJ09) that I want; did you find that your PSU cables were long enough? Thanks.


    If anything, I find the cables too long in my case with that PSU. I can say it sturdily powers my system like a dream... *mind wanders to PSU slowly chugging along*. From what I have experienced and have read in reviews, the PSU is great for its price compared to those from PC P&C.
  15. My new OCZ ProXStream 1000W is ready! :D
  16. One card using 300 watts? It'll never happen.
  17. Hatman said:
    One card using 300 watts? It'll never happen.


    You need to reread the OP, as you have missed a very important link which shows proof of what I am saying. Then you'll see that your statement above was without thought. I really wish people would read the whole post (including links) before posting. Oh yeah, never say never.
  18. What did I miss?

    It states that it will be ABLE to give them that much power, but which card uses that? None use that much! And I doubt they will! The most power-hungry card around is the 2900 XT, which uses an 80nm process and draws 225 watts, nowhere near 300 watts, and the cards are getting a die shrink to 65nm and beyond for the next gen.

    Kinda funny, mate: type your system into the PSU calc and it recommends about 350 watts :D
  19. Hatman said:
    What did I miss?

    It states that it will be ABLE to give them that much power, but which card uses that? None use that much!


    In September we will see X38 with PCI-E 2.0; then in November you'll see something from Nvidia to back X38's launch and Crysis. There's a new standard upon us, PCI-E 2.0, and if you want to ignore it, fine, but its specs include 2x the bandwidth and 225-300W of power. Did you really read the whole link yet? Some people are afraid of change and will not accept it either.

    Maybe right away the cards won't use 300W, but to say "it will never happen" is like saying progress will never happen. No offence, but you seem to be the only one so far having trouble accepting the new standard, which is coming whether you agree or not. :) 9800, anyone?
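
    For reference, the "2x the bandwidth" part of the spec works out as follows. This is a quick sketch using the published per-lane rates (2.5 GT/s for PCI-E 1.x, 5 GT/s for 2.0) and the 8b/10b encoding both generations use:

        # PCIe x16 usable bandwidth per direction.
        # 8b/10b encoding means 80% of the raw bit rate carries data.
        for gen, gt_per_s in [("1.x", 2.5), ("2.0", 5.0)]:
            gb_per_s = gt_per_s * 16 * 0.8 / 8  # rate * lanes * encoding / bits per byte
            print(f"PCIe {gen} x16: {gb_per_s:.0f} GB/s each direction")
        # PCIe 1.x x16: 4 GB/s each direction
        # PCIe 2.0 x16: 8 GB/s each direction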
  20. Progress.

    300W per graphics card isn't progress, in my opinion.
  21. IcY18 said:
    1200W will always be overkill. More than 90% of the systems out there right now could get by with a solid 550W power supply; people have just over-hyped the power draw of all these graphics cards.

    I have a device that measures the power usage of anything plugged into it. I have a Core 2 Duo E6600 o/c'd to 3.38GHz, an nVidia 8800 GTX, 2GB RAM @ 2.3V, a Creative X-Fi, 3 HDDs in RAID 0, 2 optical drives, a water-cooling kit, various system fans, and only a 550W power supply. The power usage while playing games is only ~360 watts. I don't think you'll need a 1200-watt power supply for a long time.
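
    Worth spelling out: a wall meter reads AC draw, not what the PSU delivers to the components. A rough conversion, assuming ~80% efficiency (my assumption, not a spec for that unit):

        wall_watts = 360.0  # measured at the outlet while gaming
        efficiency = 0.80   # assumed; real units vary with load
        dc_load = wall_watts * efficiency  # watts delivered to components
        print(f"~{dc_load:.0f}W DC load -> {100 * dc_load / 550:.0f}% of a 550W PSU")
        # ~288W DC load -> 52% of a 550W PSU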
  22. IcY18 said:
    Yes, 550W. As has been mentioned before, a 550W PSU from a solid/quality manufacturer would be able to run a Core 2 Duo with an 8800 GTX. There is just a knee-jerk reaction among many people that we need these huge PSUs to power these systems, when in reality it's been proven time and time again that most systems don't consume over 400W.

    http://www.anandtech.com/systems/showdoc.aspx?i=2818&p=2

    This machine is run by a 620W PSU. It consists of two X1900 XTs, a Core 2 Duo clocked at 3.5GHz, a water-cooling pump, 4 fans, 2 optical drives, an X-Fi sound card, overclocked RAM and 2 Raptor hard drives. Basically this system has just about every peripheral you could have, and ABS thinks it's just fine to put a 620W power supply in there.

    So yes, a 550W could run most if not all single-card setups out there right now.

    Well put. I am an example of that, with facts to prove it. I agree 100%.
  23. It will be interesting to see what will happen when people start trying to plug their 6P PCI-E 1.0 connector into an 8P socket designed for a PCI-E 2.0 graphics card.
  24. LoL @ this thread

    Can hardly type.. :D :D :D :D :D

    Consider this:

    Upcoming CPUs from both AMD and Intel will consume less power than what we have today.

    Nvidia has promised that the 9800 GTX will consume less power than the 8800 GTX.

    Core 2 consumes less power than the Pentium D while making a joke of it performance-wise.

    Why do you assume more power = better performance? The P4 would destroy a Core quad if that were true.

    Remember the wires that go from the PSU to the GPU... those carry electricity. Could it be that the power PCI-E 2.0 can provide is meant to get rid of those pesky wires, rather than to triple the power that GPUs will use in the future?

    I am kind of shocked that no one has suggested this in so many posts :ouch:
  25. systemlord said:
    It will be interesting to see what will happen when people start trying to plug their 6P PCI-E 1.0 connector into an 8P socket designed for a PCI-E 2.0 graphics card.



    PCI-E 2.0 will be backwards compatible with PCI-E 1.0, so the cards will work fine.
  26. systemlord said:
    You need to reread the OP, as you have missed a very important link which shows proof of what I am saying. Then you'll see that your statement above was without thought. I really wish people would read the whole post (including links) before posting. Oh yeah, never say never.


    Did you read it yourself? Is his statement really without thought, or are you just being a jerk?

    Where does it say that new GPUs are going to use 300W? It just states that it CAN power a card up to 300W. That means it could power any card we have today without extra wires going from the PSU to the card.
    Good chance the 9800 GTX is going to use less than the 8800 GTX does.

    I am going to get some sleep, G'night :sleep:
  27. xela said:
    Did you read it yourself? Is his statement really without thought, or are you just being a jerk?

    Where does it say that new GPUs are going to use 300W? It just states that it CAN power a card up to 300W.


    I'm just stating what the specs are going to be for the new standard; don't shoot the messenger! Also, it says 225-300W, up to 300W. You're the one that's being a "JERK" by saying, "LoL @ this thread", "can hardly type... :D :D :D". You're the one that posted two out of three different posts insulting me and my thread. Wow, people really are afraid of change! I never said "using 300W", I said "consume 225 to 300W", as it's stated in the link. Get your facts straight.
  28. systemlord said:
    I'm just stating what the specs are going to be for the new standard; don't shoot the messenger! Also, it says 225-300W, up to 300W. You're the one that's being a "JERK" by saying, "LoL @ this thread", "can hardly type... :D :D :D". You're the one that posted two out of three different posts insulting me and my thread. Wow, people really are afraid of change! I never said "using 300W", I said "consume 225 to 300W", as it's stated in the link. Get your facts straight.


    PCI-E 2.0 can provide 225 to 300W. It is not going to if you stick in a GPU that only needs 100W. You make this change sound dramatic; it isn't. The only thing that might change is the lack of wires from the PSU to the GPU.

    - I said lol because people are talking about upcoming 300W cards just because a port that can provide that amount of juice came out, contradicting all the info we get from manufacturers and the progress in hardware to date.

    - I was really laughing and couldn't type for a while (weird sense of humor, don't hang me for that either).

    - He was right in saying that there is not likely going to be a 300W card, and you ripped on him. I found that an unwise statement from you in the first place. A power supply that provides 1200W in no way means that all PCs will start using 1200W, and neither does a port that can provide 300W to the GPU mean that the new generation of cards is going to use its full potential.
  29. xela said:
    He was right in saying that there is not likely going to be a 300W card, and you ripped on him. I found that an unwise statement from you in the first place. A power supply that provides 1200W in no way means that all PCs will start using 1200W, and neither does a port that can provide 300W to the GPU mean that the new generation of cards is going to use its full potential.


    No, what he said was "it'll never happen", which I thought was an ignorant statement. It wasn't my intention to rip anybody. Current tech is at a peak as far as PCI-E 1.0 goes... Neither I nor anybody else ever stated that PCI-E cards will actually use 300W, but between 225 and 300W. Why is it that almost everyone confuses 225-300W with 300W? We're talking "BETWEEN" 225 and 300W; for example, 238W to power a card six months to a year in the future is not all that unreasonable. And I'm excited, not dramatic, about the new standard; please quit with the assumptions.
  30. No, it is not at its "peak". There's plenty of bandwidth left for current-gen cards, and likely for the next-gen ones as well.

    And you DID say that they'll need 300 watts; don't just turn round and lie, you idiot. I say a card will NEVER use 300 watts of power, and I'll happily stand by that. 300 watts IS COMPLETELY UNREASONABLE!! It just won't happen, and I'm telling you that now.

    That's why they shrink the dies, DUH!

    You're just so strange, lol... saying people can't accept change... it's a vague description of the new PCI-E 2.0 slot, not even the cards, and you're just being a complete idiot about it.

    Not that I expect you to say you were wrong anyway; you just can't change your opinion for fear of embarrassment. Then again, some people just can't accept change at all.
  31. I have one of those devices too; it's on the end of my power board. My computer is at idle and it is reading 450W... let's open a copy of Supreme Commander at 2560x1600.
  32. OK, now it is reading 1040W. I'd say ~150W would be my monitor, and that power draw is all for my computer; there may be some idle draw from my speakers too...
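
    The arithmetic for separating the PC from everything else on the power board looks like this; the 150W monitor share is the poster's own estimate, and the 80% efficiency is an assumption of mine:

        total_wall = 1040.0             # meter reading under load
        monitor = 150.0                 # estimated monitor share
        pc_wall = total_wall - monitor  # wall draw attributable to the PC
        pc_dc = pc_wall * 0.80          # assumed efficiency -> actual DC load
        print(f"PC: ~{pc_wall:.0f}W at the wall, ~{pc_dc:.0f}W of DC load")
        # PC: ~890W at the wall, ~712W of DC load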
  33. Hatman said:
    No, it is not at its "peak". There's plenty of bandwidth left for current-gen cards, and likely for the next-gen ones as well.

    And you DID say that they'll need 300 watts; don't just turn round and lie, you idiot. I say a card will NEVER use 300 watts of power, and I'll happily stand by that. 300 watts IS COMPLETELY UNREASONABLE!! It just won't happen, and I'm telling you that now.

    That's why they shrink the dies, DUH!

    You're just so strange, lol... saying people can't accept change... it's a vague description of the new PCI-E 2.0 slot, not even the cards, and you're just being a complete idiot about it.

    Not that I expect you to say you were wrong anyway; you just can't change your opinion for fear of embarrassment. Then again, some people just can't accept change at all.


    OK, tell me where I said that they will use 300W? :heink: Also, my opinion will not change, and it's true that people have a hard time accepting change after spending their hard-earned money on a new system only to have it outdated a few months later; it's a fact of life. I said 238W in a year would be reasonable; can't you read? And what's with the childish name-calling? It's time to grow up. Also, what was I wrong about? Remember, don't shoot the messenger.
  34. rammedstein said:
    OK, now it is reading 1040W. I'd say ~150W would be my monitor, and that power draw is all for my computer; there may be some idle draw from my speakers too...



    Not surprised it's that much with 2x 2900's and a quad :D

    Systemlord, mate, you started the insults; don't cry about it. When someone actually invents something that uses 300 watts on its own, which no single card will do, well, then you've got some proof to back you. Until then, I'm saying it will never happen. And I don't think it ever will.
  35. Hatman said:
    Not surprised it's that much with 2x 2900's and a quad :D

    Systemlord, mate, you started the insults; don't cry about it. When someone actually invents something that uses 300 watts on its own, which no single card will do, well, then you've got some proof to back you. Until then, I'm saying it will never happen. And I don't think it ever will.


    You show me where I insulted you. I'll take your silence as an indication that I really didn't insult you.
  36. "I really wish people would read (including links) the hole post before posting."

    I took that as a bit of an insult, and the whole way thru you were implying that I was an idiot.

    Im not replying to you anymore.
  37. Hatman said:
    "I really wish people would read (including links) the hole post before posting."

    I took that as a bit of an insult, and the whole way thru you were implying that I was an idiot.

    Im not replying to you anymore.


    I "never" implyed or said you were an idiot, maybe you thought that but I didn't. I'm sorry you think that. :)
  38. systemlord said:
    You show me where I insulted you. I'll take your silence as an indication that I really didn't insult you.


    ...

    Right..

    You tell him:

    You need to reread the OP, as you have missed a very important link which shows proof of what I am saying. Then you'll see that your statement above was without thought. I really wish people would read the whole post (including links) before posting. Oh yeah, never say never.

    - His statement is without thought, while the link has nothing conclusive on the subject.
    - He needs to read the links twice.
    - You wish that people would read the whole post, while there is no indication that he hasn't done so.

    In my book... that qualifies as being a **** and I am sorry if you don't see that. I can also clearly see how someone could be offended by that.

    About the post in general.

    - I would like to say that no company in their right mind will release a GPU/mobo/CPU which cannot be powered by any good-brand PSU released this year or last. Cutting off such a large segment of customers is suicide, and I am 100% sure that there will be other ways to power PCI-E 2.0 GPUs than the new 2x4-pin connector.

    - PC parts in general will require less power in the future, as Intel and Nvidia have both already proven that they can do more with less. AMD will prove that as well with Barcelona and its upcoming GPUs.

    - There may or may not be a 300W card. Maybe ATI decides to stick 2GB of RAM and 2 or 4 R600 chips on a single card. But not many people want an oven inside their PC, so that will not stick. My guess would be never.

    - CPUs and GPUs in the future are reported to use less juice than what we have today.
  39. Allow me just one more reply, I would like to add:

    This post started with an acknowledgement of a new technology being released. It seems to be a good technology, but in my opinion you have presented it wrong. The link you provided doesn't state anywhere that a new kind of PSU will be required; you added that to the post. What do we really know about the new 2x4-pin connector? I assumed it was part of the PCI-E 2.0 port, but if that is the case, then why do we need any kind of modification to today's power supplies at all? How will the new power supplies that you mentioned be different from what we have today? Even if they meant that they are working on a new kind of PSU wire ending in 2x4 pins (other than the one that goes from the PSU to the X2900XT today), there will be other ways to power upcoming GPUs, as Nvidia and ATI cannot expect everyone to get a new PSU for a new GPU generation.

    How much power would one expect a 9800 GTX to use, and how much power will systems draw in the future based on Penryn and Barcelona previews? Do we really need all the new bandwidth of PCI-E 2.0, and how much of the bandwidth of PCI-E 1.0/1.1 x16 does a modern card like the 8800 GTX really use?

    * Those, in my opinion, are the questions that should have been asked in the first place, instead of falling on 300W GPUs and 1200W PSUs, and I would still love to hear the answers if someone can provide them.

    The thread was funny to me because of the direction it has taken, as I strongly believe that there will never be a 300W GPU worth buying, and the time when a 1200W PSU is a requirement for a good gaming rig will never come (the very thought is laughable), yet this post has taken such an awkward and inflammatory turn.

    Edit: I figured out that the 2x4-pin connector is the very same one that is used on the X2900XT. So they already have a card with a PCI-E 2.0 connector, and it works just fine without it (lower OCs are reported, on the other hand). Interesting, as the card is outperformed by an 8800 GTX which has/needs no 2x4 pins :)
  40. Even if you buy a graphics card that needs an "8-pin PCI-E connector from the power supply", I'm sure you can buy a 6-to-8-pin power supply adapter to make it fit.
  41. All I was doing was informing people of a new standard coming our way; that's it. Oh, and I'm somewhat sure that there will be adapters for those whose PSUs lack the 8P plug. The responses we got were "never going to happen" and "it's a marketing ploy", people saying all sorts of negative remarks. In the not-so-distant future this "PCI-E 2.0" will become the new standard whether any of you like it or not.

    The specs have been laid out for all of you, coming from 850 companies; unless, that is, you believe PCI-E 2.0 is a big fat LIE and anybody who has anything to do with the new standard must be lying their pants off. I expected way more positive posts about the new standard, because change in tech is exciting to me, and maybe I'm the only one who thinks that. There were a few, just a few, that had anything good to say about this post. But that's all I'm going to say about it.
  42. systemlord said:
    That's one that I was looking at, because 1200W isn't going to be overkill in a year. You have the PC case (TJ09) that I want; did you find that your PSU cables were long enough? Thanks.


    Only problem was with the 12V connection for the board... mine is in the farthest corner near the CPU on the Asus P5N-32SLI, but I managed it. Every other connection had loads of room.
  43. A 300W graphics card is really stupid.

    You need to consider the cooling. You can't really put that heavy and noisy a cooler on a graphics card. For 300W you would need water cooling.

    If cooling wasn't a problem they would be making 500W graphics cards already.
  44. ...I remember a guy who stuck a Zalman 9700 on his 7950 GT so he could OC it better; it worked a charm too, supposedly...
  45. Maybe in the future I see graphics cards being housed externally in their own case. Some water-cooling systems are external, so it's not as far-fetched as it sounds.
  46. NaDa said:
    A 300W graphics card is really stupid.

    You need to consider the cooling. You can't really put that heavy and noisy a cooler on a graphics card. For 300W you would need water cooling.

    If cooling wasn't a problem they would be making 500W graphics cards already.


    I don't seem to need water cooling for my 720W PSU. When you consider that graphics cards are getting more and more VRAM with almost every generation, with new fancy graphics effects added, 225W doesn't sound far-fetched. I agree with you that water cooling will become a must if companies are going to use that kind of power. Just look at how much is being used now; do you know how much two 2900 XTs are using? I don't like the sound of having two 8800s or 9800s in the same space that houses my CPU, mobo and RAM unless they are water-cooled.
  47. When can we tell......... I just bought a Corsair 620HX... ALRIGHT?