
PCI Express 2.0

Tags:
  • Graphics Cards
  • Build
  • PCI Express
  • Graphics
  • Product
Last response: in Graphics & Displays
July 21, 2007 6:59:48 AM

I am very worried about PCI Express 2.0. I want to build a new system soon, sooner than September when the X38 and G35 boards will be out, and I want this system to be upgradeable a year or two later. I'm afraid PCI Express 2.0 will bring about all the issues we faced in the 4x/8x AGP battle. Someone tell me I'm wrong.


July 21, 2007 8:58:25 AM

Hmmm, okay, I just did some reading up on X38 and PCI-E 2.0... I doubt PCI-E 2.0 can be fully utilized as of now, but the X38 chipset does look very promising. My opinion: if you're worried and you don't absolutely NEED the system now, then wait a bit...
July 21, 2007 8:59:32 AM

Late August into September is the release of the X38 and G35, which will both mark the dawn of PCI Express 2.0.

With the G90-series video cards expected around October/November, and said to be 200-300% more powerful than the G80 series, that's what worries me. The G90 series is going to be PCI-E 2.0.

Edit:

Say the bit-tech story is true. Wouldn't it still require physical improvements to the wiring and slots of PCI Express motherboards?
July 21, 2007 9:13:12 AM

Yeah, it looks like the choice is completely up to you, depending on whether you think it's worth it or not. Does anyone know if the X38 will support the new 45nm CPUs, plus DDR2/DDR3, just like the GIGABYTE GA-P35C-DS3R?

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

From what I've seen in the previews, companies like Abit and Gigabyte are making two different versions: one for DDR2 and one for DDR3, but not one compatible with both. It would be nice to have a board capable of both, to easily upgrade when DDR3 becomes more commonplace.

On a side note, the NVIDIA 9 series is worrying me. It supposedly has 1 teraflop of computing power, and I'm skeptical that any of the current PSUs on the market can handle such a beast. Something like that would generate a lot of noise and heat, and may require its own power supply.
July 21, 2007 9:17:16 AM

Right, here's THE board to get, IF in fact it really is PCI-E 2.0.

http://www.newegg.com/Product/Product.aspx?Item=N82E16813128048

Edit:

In order to enable PCI-E 2.0, they'd at least have to supply more power to the slot, right? So that's not something they could do with a BIOS update, correct? This is going to be really confusing to consumers.
July 21, 2007 11:06:40 AM

Intel isn't the only one with PCI-E Gen 2:

AMD/ATI is working on the RD790 (16x16).
NVIDIA is working on the C72 and C73 along with the MCP78; I'm pretty sure it's AMD-only, though.
And X38 will definitely support PCIe 2.0.

Benefits of PCIe 2.0 are:

increased bandwidth

doubled slot power output (150W compared to 75W)

and that's all I know up to here.

* Not 100% sure on those new NVIDIA chipsets though, but they'll come around when the G90/G92 come out Nov/December.
July 21, 2007 11:29:10 AM

So, we need the question answered:

Do any of the current P35 motherboards fully support PCI Express 2.0?
July 22, 2007 10:44:09 AM

Yes, it supports PCIe 2.0, because ALL Bearlake chipsets support it. If you need a link:

http://www.bit-tech.net/news/2007/06/06/p35_supports_pc...

If it was me, I'd wait for X38.

Note:

Just because the chipset supports it doesn't mean the slot will. They may need more circuitry on the board itself... I dunno.

Hope this answered your question.

*LOL* Sorry, but someone has already put up the link. You must've missed it too =]

Also, I'd suggest you call them to make sure.
July 22, 2007 6:53:07 PM

You should not worry.

As far as I understand, there are just a couple of practical benefits this first generation of PCIe 2.0 will provide:

1. It will catch up with the power requirements of higher-end cards; and

2. It will allow better x1 performance (fewer PCIe lanes required = reduced production costs).


Graphics cards won't be capable of fully utilizing the increased bandwidth of a PCIe 2.0 x16 slot, as they are not even capable of that with the current version of PCIe.
July 23, 2007 12:54:26 AM

Quote:
as nice as all that is, what does it have to do with the topic of pci-e express 2 in relation to gfx cards?

Well... ummm... bandwidth? :p Yeah, joking aside, that was pretty random, to put it mildly.
July 23, 2007 2:05:24 AM

Quote:
IMO, don't worry. People have been going on about bandwidth for ages now, and I have yet to see anything to suggest it really matters all that much.

Also, I have not heard anything about the G90 being that powerful except from NVIDIA, and even then it is very vague how you measure such things. I am 100% certain that it will not be that powerful in games.

As with PCIe vs. AGP, the best thing will be the extra power it can handle, which should mean not having to hook up extra power cables to the gfx cards, just moving that power to the MB right next to the slots.
July 23, 2007 6:25:59 AM

EV700 said:
I feel that PCI Express 2.0 will bring about all the issues faced in the 4x/8x AGP battle.


Why?

It's backward compatible in both directions.

The only 8x/4x issues were early poor mobo support, which led to the stickers proclaiming R9700/FX5800-compatible mobos, like the sticker on my old Gigabyte board. And the big mobo makers like Gigabyte fixed the problem when it arose. So be sure to get a quality board if you're concerned.

At/near launch, the R600 was shown running on a PCIe 2.0 PEG-slotted mobo with only one 6-pin power connector plugged in, and it also works in older mobos, obviously, proving some cross-platform interoperability.

It's really a question of providing power: if you design a PCIe 2.0 part and you want it to be backwards compatible, it had better be ready to run without the extra power and grab it from another source/connector, since the slots are physically the same and only supply an increase in frequency (not more pins).

Most cards will be backwards compatible into the middle of the next generation, simply because there's not enough of an install base yet. After that, you just have to worry when you NEED to change, not about future possibilities/probabilities.
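That power argument boils down to a simple budget: a card can draw at most the slot's limit (75W on a PCIe 1.x board, 150W on a PCIe 2.0 high-power slot) plus roughly 75W per 6-pin auxiliary connector. A minimal sketch of that arithmetic; the function name and the 200W example card are hypothetical, not from this thread:

```python
# Rough graphics-card power budget: slot limit plus 6-pin
# auxiliary connectors (each good for up to ~75 W).

def aux_connectors_needed(card_watts, slot_watts, aux_watts=75):
    """Minimum number of 6-pin connectors a card must use."""
    deficit = card_watts - slot_watts
    if deficit <= 0:
        return 0                      # slot alone covers the card
    return -(-deficit // aux_watts)   # ceiling division

# A hypothetical 200 W card:
print(aux_connectors_needed(200, 75))    # PCIe 1.x slot -> 2 connectors
print(aux_connectors_needed(200, 150))   # PCIe 2.0 slot -> 1 connector
```

Which is exactly the backwards-compatibility point: a 2.0 card dropped into an older board has to make up the missing slot power from extra connectors, or it can't run.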