PCI Express 4.0 to Support 16 Gigatransfers a Second
When you absolutely, positively need to transfer lots.
We may only be on the brink of PCI Express 3.0 right now, but the PCI-SIG, the organization responsible for the PCIe standard, today announced the approval of 16 gigatransfers per second (GT/s) as the bit rate for the next generation of PCIe architecture, PCIe 4.0.
The PCI-SIG said it determined "after technical analysis" that 16 GT/s on copper, which doubles the bandwidth of the PCIe 3.0 specification, is technically feasible at approximately PCIe 3.0 power levels. The group also said the new bandwidth can be easily integrated into existing manufacturing technologies, materials, and infrastructure while maintaining backwards compatibility, all of which are big factors in the adoption of a new technology.
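For a rough sense of what the doubling means in practice, here's a quick back-of-the-envelope calculation. It assumes PCIe 4.0 keeps the 128b/130b line coding that PCIe 3.0 introduced (128 of every 130 bits on the wire carry data), which the announcement doesn't spell out:

```python
# GT/s is raw transfers per second per lane; with 128b/130b
# encoding, usable data is 128/130 of the raw bit rate.
def pcie_bandwidth_gbps(gt_per_s, lanes=1, encoding=128 / 130):
    """Approximate usable bandwidth in gigabytes per second."""
    return gt_per_s * encoding / 8 * lanes  # 8 bits per byte

gen3 = pcie_bandwidth_gbps(8, lanes=16)   # PCIe 3.0 x16 slot
gen4 = pcie_bandwidth_gbps(16, lanes=16)  # PCIe 4.0 x16 slot
print(f"PCIe 3.0 x16: {gen3:.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: {gen4:.1f} GB/s")   # ~31.5 GB/s
```

So a full x16 slot would go from roughly 15.8 GB/s to roughly 31.5 GB/s of usable bandwidth, before protocol overhead.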
"Experts in the PCIe Electrical Workgroup carefully analyzed a number of target bit rates for the next generation of PCIe architecture, taking into consideration several key factors, including our ability to continue using low-cost materials. We have concluded that 16 GT/s is a feasible technical solution that satisfies our member companies’ requirements," said Al Yanes, PCI-SIG chairman. "While the preliminary analysis is encouraging, a lot more challenging work lies ahead in developing the specifications. The PCI-SIG looks forward to providing our members with a specification that not only satisfies their high performance requirements but also meets their power, cost and compatibility goals."
The PCIe 4.0 specification won't just be for next-gen graphics, however, as even tablets, embedded systems, and peripheral devices can benefit from increased bandwidth at low costs.
Yes, details are light, but that's because the final PCIe 4.0 specifications, including form factor specification updates, are expected to be available sometime in the 2014-2015 timeframe. For at least the next couple of years, PCIe 3.0 will be where it's at.
For more on PCI Express 4.0, check out our previous article.

I think I see your problem. You only have 3 gigabits of video memory and 8 gigabits of system memory. BF3 requires 512 megabytes video memory and 2 gigabytes system memory.
ONtopic: Shocking that this kind of thing is planned so far ahead
What issues? My two 470's played the game without problems. :\
Same here. No problems except for the occasional (not so much after patch) crash. I've got two 470's, an i5-750 @ 3.6 GHz, and 12 gigs of RAM.
/ontopic: More bandwidth is always good, but I am more interested in whether or not we will have video cards that can make use of the extra bandwidth. I would rather see more bandwidth being developed (and utilized) for plug-n-play interfaces such as USB, Thunderbolt, and whatever is next.
What issues? I have basically the same setup and I have no issues at all. I average 80-100fps with max settings and it is extremely smooth.
I'm not sure what issues you are having, but I'm able to play BF3 fine. You're missing out
Read about this not too long ago. A bit surprising they already have PCI-E 4.0 planned when PCI-E 3.0 has yet to be put into use.
I think I see your problem. You only have 3 gigabits of video memory and 8 gigabits of system memory. BF3 requires 512 megabytes video memory and 2 gigabytes system memory.
thumbs up for sarcasm.... i really hope that was your angle.......
But will it run Crysis?
That's assuming you aren't running an overheating single-core Celeron and relying on its integrated GPU...
No seriously, I have a friend who wondered why all of the newer games ran like a slideshow when run at high graphics/resolution settings on such hardware.
Then why didn't they make that the PCI-E 3.0 spec? I don't understand....
@applefairyboy: http://en.wikipedia.org/wiki/Transfer_%28computing%29 that should help you a bit.
PCI-E 4.0 has yet to be developed. PCI-E 3.0 already has. We are just waiting on GPUs to finally utilize it.
The 16 GT/s bit rate has only been approved anyway, not developed...
2010 called; even they don't want the joke back, they just want you to stop using it.
Why is it every article about a new piece of fast hardware has to have some idiot using the geek version of "Git'er done!"??