
NEXT GENERATION MOTHERBOARDS

January 20, 2010 10:16:47 AM

Dear Tom's,

When will the next-generation motherboards become standard?

What I mean by next-gen is motherboards with all the new connection standards, which are:

1 - USB 3.0
2 - SATA 6Gb/s
3 - PCI Express 3.0

Thanks.
January 20, 2010 11:20:36 AM

Hi,
USB 3.0 and SATA 6Gb/s are already here; I don't know when PCI Express 3.0 will arrive. For now these features only appear on motherboards for Intel's Core i5/i7 LGA 1156 socket. If you wait a bit, you can get them on other platforms as well. In my opinion PCI Express 3.0 is not worth waiting for, because current graphics cards built for PCI Express 2.0 don't show noticeable improvements over running them on PCI Express 1.0. Apart from PCI Express 3.0, I think the rest will be standard late this year or next year.
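
For context, a rough back-of-the-envelope sketch of what those interface specs work out to in usable throughput, using the published line rates and 8b/10b encoding overhead (real drives and devices come in lower):

def payload_mb_s(line_rate_gbps, data_bits=8, encoded_bits=10):
    # Convert a line rate in Gb/s to usable MB/s after line-code overhead.
    return line_rate_gbps * 1000.0 / 8.0 * data_bits / encoded_bits

links = {
    "SATA 3Gb/s": 3.0,           # ~300 MB/s
    "SATA 6Gb/s": 6.0,           # ~600 MB/s
    "USB 3.0": 5.0,              # ~500 MB/s
    "PCIe 2.0 (one lane)": 5.0,  # ~500 MB/s per direction
}

for name, rate in links.items():
    print(f"{name:20s} ~{payload_mb_s(rate):.0f} MB/s")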
January 20, 2010 11:59:23 AM

I'm more excited for USB 3.0 than anything... really fast plug-and-play devices, mmm.
January 20, 2010 12:13:48 PM

I'm thinking the PCI Express standard may be raised not for bandwidth but for power reasons. Time will tell, though.
January 20, 2010 12:27:29 PM

Power savings are a good thing as long as performance doesn't suffer. It took the auto industry forever to figure that out; maybe the computer industry will as well. A Corvette with 500 hp that gets 25-30 mpg highway... Maybe a video card that can run on one 6-pin connector and crunch modern games. Wishful thinking, perhaps.
January 20, 2010 12:36:14 PM

sportsfanboy said:
Maybe a video card that can run on one 6-pin connector and crunch modern games. Wishful thinking, perhaps.


Like a 5770? :p 
January 20, 2010 12:43:14 PM

Yeah, until DX11 gets moving. What settings do you play at, and which games?
January 20, 2010 1:06:15 PM

Yo, OP here.

USB 3.0 and SATA 6Gb/s may have started coming out, but on which boards? None that I know of...
except some Gigabyte P55 boards, and I just had a look at them: they have 2 SATA 6Gb/s ports alongside 6 SATA 3Gb/s ports, and 2 USB 3.0 connections alongside 6 or 8 USB 2.0 connections...
That's not exactly standard, is it? :s

But doesn't the ATI 5970 saturate the PCI-E 2.0 slot just a little? Its review on Tom's said so, I believe... two 5870s in CrossFire surpassed the 5970 slightly, as each 5870 had a full 16 lanes to itself, whereas the two GPUs on the 5970 had to share.
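
For what it's worth, here is the raw per-slot math behind that lane-sharing point (spec numbers only, a sketch; the actual in-game impact depends on the title):

PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500}  # per direction, after 8b/10b

for gen, per_lane in PER_LANE_MB_S.items():
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: {per_lane * lanes / 1000:.0f} GB/s per direction")

# Two 5870s in dedicated x16 PCIe 2.0 slots get 8 GB/s each, while the two
# GPUs on a 5970 share one slot's 8 GB/s, which fits the small gap the review saw.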
January 20, 2010 5:39:38 PM

sportsfanboy said:
Yeah, until DX11 gets moving. What settings do you play at, and which games?


I'm playing at 5040x1050, using two 5770s.

LotRO on 'Very High', DX9 though; High on DX10.
WoW runs pretty much perfectly on max.
DA: Origins looks great; no fps figures, but it's smooth anyway.

Going on that, not much won't run near max on a single 5770 at 1680 or 1920 resolution.
January 20, 2010 5:45:51 PM

Hmmm, you using 3 monitors or something? Never seen that res.
January 20, 2010 5:50:28 PM

Yep, three 1680x1050 screens in Eyefinity.

Oblivion also runs near 55-60 fps on max in first person, but falls a bit when zoomed out. That's with 8xAA etc.
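
For a sense of the pixel load at that tri-monitor resolution (raw pixel counts only; actual GPU load varies by game and settings):

resolutions = {
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "2560x1600": 2560 * 1600,
    "5040x1050 (3-wide)": 5040 * 1050,
}

base = resolutions["1920x1080"]
for name, px in resolutions.items():
    print(f"{name:19s} {px / 1e6:4.1f} MP ({px / base:.2f}x 1080p)")

# Three 1680x1050 panels push ~5.3 million pixels per frame, over 2.5x a
# single 1080p screen, hence the CrossFire pair.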
January 20, 2010 5:51:06 PM

jennyh said:
I'm playing at 5040x1050, using two 5770s.

LotRO on 'Very High', DX9 though; High on DX10.
WoW runs pretty much perfectly on max.
DA: Origins looks great; no fps figures, but it's smooth anyway.

Going on that, not much won't run near max on a single 5770 at 1680 or 1920 resolution.



It's a shame your AMD processor's holding you back... :)
January 20, 2010 6:23:06 PM

jennyh said:
I'm playing at 5040x1050, using two 5770s.

LotRO on 'Very High', DX9 though; High on DX10.
WoW runs pretty much perfectly on max.
DA: Origins looks great; no fps figures, but it's smooth anyway.

Going on that, not much won't run near max on a single 5770 at 1680 or 1920 resolution.


Why two 5770s instead of a single 5850? :??:
January 20, 2010 7:08:18 PM

lordszone said:
Hi,
USB 3.0 and SATA 6Gb/s are already here; I don't know when PCI Express 3.0 will arrive. For now these features only appear on motherboards for Intel's Core i5/i7 LGA 1156 socket. If you wait a bit, you can get them on other platforms as well. In my opinion PCI Express 3.0 is not worth waiting for, because current graphics cards built for PCI Express 2.0 don't show noticeable improvements over running them on PCI Express 1.0. Apart from PCI Express 3.0, I think the rest will be standard late this year or next year.


That would be incorrect.

The Gigabyte GA-790XTA-UD4 (AM3, 790X chipset) with SATA 6Gb/s and USB 3.0, as an example.

PCIe Gen3 has been delayed because PCI-SIG said so.


January 20, 2010 7:12:07 PM

Wisecracker said:
That would be incorrect.

The Gigabyte GA-790XTA-UD4 (AM3, 790X chipset) with SATA 6Gb/s and USB 3.0, as an example.

PCIe Gen3 has been delayed because PCI-SIG said so.




No one has come close to using the bandwidth of PCI Express 2.0, let alone 3.0; that is, if we ever get any more PC games.


3D displays with DisplayPort technology should be interesting.
January 20, 2010 7:12:51 PM

Wisecracker said:
That would be incorrect.

The Gigabyte GA-790XTA-UD4 (AM3, 790X chipset) with SATA 6Gb/s and USB 3.0, as an example.

PCIe Gen3 has been delayed because PCI-SIG said so.


Sorry, you're right, there's at least one Gigabyte motherboard with an AMD socket that has them.

However, that's still only 2 SATA 6Gb/s and only 2 USB 3.0 connections... that's not making them standard...



When will all the connections be SATA 6Gb/s and USB 3.0, with PCI-E 3.0? Q4 2010? Q1 2011? Q3 2011?...

January 20, 2010 7:25:12 PM

Hellboy said:
No one has come close to using the bandwidth of PCI Express 2.0, let alone 3.0; that is, if we ever get any more PC games.

3D displays with DisplayPort technology should be interesting.


What I think (which is pretty messed up anyway) is that the debate is raging over wattage for Gen3. We have now hit the 300 W spec limit per card. That works out to roughly a 19 A draw from the 12 V rail feeding your card, on top of the 75 W from the slot.

For some reason I'm thinking the 'powers that be' are pressing for more jigga-watts.


seerwan said:
Sorry, you're right, there's at least one Gigabyte motherboard with an AMD socket that has them.

However, that's still only 2 SATA 6Gb/s and only 2 USB 3.0 connections... that's not making them standard...



We need more components that meet the higher spec. Chicken or egg? :p
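
Spelling out the slot-power arithmetic from a couple of posts up (a sketch using the PCIe spec limits: 75 W from the slot, 150 W per 8-pin, 75 W per 6-pin, 300 W total per card):

TOTAL_SPEC_W = 300  # PCIe spec ceiling per add-in card
SLOT_W = 75         # delivered through the x16 slot itself
RAIL_V = 12.0       # aux connectors feed from the 12 V rail

aux_w = TOTAL_SPEC_W - SLOT_W  # 225 W via one 8-pin (150 W) plus one 6-pin (75 W)
amps = aux_w / RAIL_V          # ~18.75 A, i.e. the "19 A rail" mentioned above
print(f"{aux_w} W from connectors -> {amps:.2f} A on the 12 V rail")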

January 20, 2010 9:56:55 PM

Hellboy said:
It's a shame your AMD processor's holding you back... :)


.....how?
January 20, 2010 11:06:48 PM

JAYDEEJOHN said:
I'm thinking the PCI Express standard may be raised not for bandwidth but for power reasons. Time will tell, though.


I'm curious... are you saying that future GPUs will require more power than they do now, or that things are getting a little ridiculous with plugging multiple power cables into our video cards? Just for the sake of simplicity, I'd like to see more power going to the GPU slot so we don't have to worry about the 6- and 8-pin connections.
January 20, 2010 11:24:31 PM

rodney_ws said:
I'm curious... are you saying that future GPUs will require more power than they do now, or that things are getting a little ridiculous with plugging multiple power cables into our video cards? Just for the sake of simplicity, I'd like to see more power going to the GPU slot so we don't have to worry about the 6- and 8-pin connections.



Simplicity? Yes, but I don't want that much power going through my board, if it's even possible.
January 21, 2010 12:04:15 AM

It's far cheaper to deliver power through cables than through motherboard traces.
January 21, 2010 12:07:57 AM

yannifb said:
.....how?


Exactly! You might want to read this review, Hellboy:

CPU scaling with the Radeon HD 5970,
Round 2: Phenom II X4 scaling.

http://www.legionhardware.com/document.php?id=869&p=12

"The Phenom II X4 results were quite different to those recorded when testing with the Core i7 processors, though this was not necessarily a bad thing. When operating at lower clock speeds, the Phenom II X4 did not fair all that well, as we saw a sharp decline in performance. However when clocked at 3.0GHz and beyond, the Phenom II X4 really picked up the pace, and in many cases was able to outclass the Core i7.

In games such as Wolfenstein, Call of Duty: Modern Warfare 2, Tom Clancy’s H.A.W.X, BattleForge and Far Cry 2 the Phenom II X4 processors were actually faster when clocked up near 4GHz! This is quite amazing as out of the 9 games tested, the Phenom II X4 series was faster than the Core i7’s in 5 of them. Although the margins were very limited, the Phenom II X4 was found to be faster, and had it just managed to match the Core i7 series with the Radeon HD 5970, we would have been impressed."
January 21, 2010 12:15:02 AM

Seriously I don't see why people don't read these articles.
January 21, 2010 12:16:48 AM

My guess is, keep an eye on this gen's high-end gfx cards. Currently it's a downclocked 5970, at 288 watts.
If nVidia wants that halo product, there's nothing holding back a non-PCI-spec card, much like the MARS card sporting two full 285s. If this continues, it'll move the current metric for power.
January 21, 2010 12:30:37 AM

There are 1500 MB/s SSDs coming out soon. How long until 1500 MB/s is standard?

I can see a large change in computers in the next 10 years that will dwarf any historical jump in computer tech.

Some people see PCIe 2.0 and say "why do we need 3.0?". Well, in order to reach those radical tipping points, you need to take the smaller evolutionary steps.

As IO bandwidth increases and latency drops, locality starts to become less important. What sounds stupid now may be a major breakthrough in a few years.

Some day we may even see a complete merger of GPUs and CPUs. Computing on the GPU may become about as transparent as computing on the CPU.

There's a huge push to help simplify the basics and management of multi-threaded programming. I bet some of these new libraries may soon be able to detect GPU-friendly workloads and transparently schedule work on the GPU(s).

I can't wait to see what we have in 10 years.
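
To put that 1500 MB/s figure next to the link speeds discussed earlier in the thread (a sketch with round spec numbers; sustained real-world rates are lower):

SSD_MB_S = 1500
ceilings = {          # usable MB/s per direction, approximate
    "SATA 3Gb/s": 300,
    "SATA 6Gb/s": 600,
    "PCIe 2.0 x4": 2000,
    "PCIe 2.0 x8": 4000,
}

for link, cap in ceilings.items():
    verdict = "has headroom" if cap >= SSD_MB_S else "saturated"
    print(f"{link:11s} ({cap:4d} MB/s): {verdict}")

# Even the brand-new SATA 6Gb/s port tops out well below such a drive, which is
# why the fastest SSDs of this era ship as PCIe cards rather than SATA disks.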
January 21, 2010 6:49:13 AM

DarkMantle said:
Exactly! You might want to read this review, Hellboy:

CPU scaling with the Radeon HD 5970,
Round 2: Phenom II X4 scaling.

http://www.legionhardware.com/document.php?id=869&p=12

"The Phenom II X4 results were quite different to those recorded when testing with the Core i7 processors, though this was not necessarily a bad thing. When operating at lower clock speeds, the Phenom II X4 did not fair all that well, as we saw a sharp decline in performance. However when clocked at 3.0GHz and beyond, the Phenom II X4 really picked up the pace, and in many cases was able to outclass the Core i7.

In games such as Wolfenstein, Call of Duty: Modern Warfare 2, Tom Clancy’s H.A.W.X, BattleForge and Far Cry 2 the Phenom II X4 processors were actually faster when clocked up near 4GHz! This is quite amazing as out of the 9 games tested, the Phenom II X4 series was faster than the Core i7’s in 5 of them. Although the margins were very limited, the Phenom II X4 was found to be faster, and had it just managed to match the Core i7 series with the Radeon HD 5970, we would have been impressed."



Those were drivers optimized by AMD to work better on AMD processors with AMD chipsets.

If Intel did this, you lot would be up in arms.

The bandwidth on Core i7 is higher in two-card setups, but the speed goes down when three or more cards are installed on an Intel i7 system.

AMD, I feel, have left Intel optimization out on purpose; I don't know why???
January 21, 2010 9:05:19 AM

In a few short years the GPU will be on the CPU die, which may radically change high-end PCIe discrete cards as we know them today, making for a wider divide in what goes on PCIe and eliminating low-end discrete cards altogether.
How this affects things, and how newer cards affect it as well, only time will tell.
We've reached a point where a heavily powered GPU with 3+ billion transistors through one slot is the norm, and possibly much higher, and this has also brought with it the ability to use more power than ever before, even as the efficiency of GPUs goes up.
That's why I think we may see several cards exceeding the PCIe power spec this gen, which may shape its future.
January 21, 2010 11:00:30 PM

Hellboy said:
Those were drivers optimized by AMD to work better on AMD processors with AMD chipsets.

If Intel did this, you lot would be up in arms.

The bandwidth on Core i7 is higher in two-card setups, but the speed goes down when three or more cards are installed on an Intel i7 system.

AMD, I feel, have left Intel optimization out on purpose; I don't know why???


You've got to be kidding... And can I see proof of this optimization?
:pt1cable: 
January 25, 2010 9:28:20 PM

I don't see any mention of the AMD Fusion utility anywhere in that article; why would they use it without saying so somewhere? And even if they did, what would be the problem?
AMD processor + AMD chipset + AMD graphics card: why can't they optimize their own platform? Everyone else does.
January 25, 2010 9:35:00 PM

Hellboy said:
Those were drivers optimized by AMD to work better on AMD processors with AMD chipsets.

If Intel did this, you lot would be up in arms.

The bandwidth on Core i7 is higher in two-card setups, but the speed goes down when three or more cards are installed on an Intel i7 system.

AMD, I feel, have left Intel optimization out on purpose; I don't know why???


Well, I'll say that it doesn't matter if it was an optimization or not... the AMDs do better, and that is what counts! Quit whining about so-called cheating and get on with it already! Whatever a company can do to make the apps run better is a great thing as far as I am concerned! Let's say I'm running my 2002 Ford pickup truck with a V6 against my 2002 Camaro with a V8. Let me guess, you think the V8 in the Camaro would win, right? Wrong, because I have OPTIMIZED the V6 in my truck; it will kick the living sh## out of my Camaro. Am I cheating? No, absolutely not. Oh, and by the way, all I did was tweak the computer programming. OH, but that's cheating in the PC world, eh? If Intel did the optimizations, well then AMD would get skunked some more. BUT THEY DON'T!!
January 25, 2010 9:42:30 PM

DarkMantle said:
I don't see any mention of the AMD Fusion utility anywhere in that article; why would they use it without saying so somewhere? And even if they did, what would be the problem?
AMD processor + AMD chipset + AMD graphics card: why can't they optimize their own platform? Everyone else does.

^^WORD to your mother!!!!
January 26, 2010 12:30:33 AM

Intel doesn't have a whole platform, and can't do it. Intel has, according to rumors, denied nVidia an x86 license, so what's worse?
Intel refused to sell standalone Atoms at lower pricing than the CPU-plus-chipset bundle, which was cheaper, according to nVidia; so again, what's worse?
Intel is being sued and investigated for the two actions above, which points to what's worse.
January 26, 2010 12:36:56 PM

Spintel, mmmm mmmm mmmmm!! Bad puppy! Or, for our cat lovers, stinky sh*tty kitty!
January 26, 2010 8:26:14 PM

xtc28 said:
Well, I'll say that it doesn't matter if it was an optimization or not... the AMDs do better, and that is what counts! Quit whining about so-called cheating and get on with it already! Whatever a company can do to make the apps run better is a great thing as far as I am concerned! Let's say I'm running my 2002 Ford pickup truck with a V6 against my 2002 Camaro with a V8. Let me guess, you think the V8 in the Camaro would win, right? Wrong, because I have OPTIMIZED the V6 in my truck; it will kick the living sh## out of my Camaro. Am I cheating? No, absolutely not. Oh, and by the way, all I did was tweak the computer programming. OH, but that's cheating in the PC world, eh? If Intel did the optimizations, well then AMD would get skunked some more. BUT THEY DON'T!!



Don't do drugs!


All I'm saying is that AMD has a discrete video card, a chipset, and a processor system, which Intel does not.

I am saying the software which connects all these items is engineered to perform better than it would with an Intel processor, an Intel-based motherboard, and an ATI video card...

But as fanboys are fanboys, they still can't remove the crap from their eyes and take this for what it is, as opposed to what it does.


AMD optimizes their Fusion software all the way when you have a complete AMD system. It would be nice to see the benchmarks if AMD optimized it for an Intel system as much. That's all I'm saying, and this helps the benchmarks of the products bundled in the package.

And again someone jumps on their high horse.

And this goes for all fricken fanboys, whether you're Intel or AMD.

January 26, 2010 11:00:16 PM

Whoa, BUB! I'm no fanboy of either; I am a realist... PERIOD! If AMD were to optimize for Intel... well, let's just say that's bad business! If Intel went about things somewhat the way AMD has, there would be a difference. As for the drug comment... please grow up!!!
January 26, 2010 11:19:35 PM

Ummm, Hellboy, you realize the system doesn't optimize itself automatically?
January 27, 2010 12:55:29 AM

xtc28 said:
Whoa, BUB! I'm no fanboy of either; I am a realist... PERIOD! If AMD were to optimize for Intel... well, let's just say that's bad business! If Intel went about things somewhat the way AMD has, there would be a difference. As for the drug comment... please grow up!!!



You gotta excuse those Brits, ;) 

January 27, 2010 6:39:05 AM

BadTrip said:
You gotta excuse those Brits, ;) 




Yeah, grow up...


That's the whole point: AMD would not optimize for Intel.

Yes, I realise AMD doesn't optimize automatically, but are benchmarks optimized with this software? More than likely.

AMD and Nvidia did this with Futuremark, and both were found guilty of cheating.
January 27, 2010 8:20:19 AM

No, ATI did this a long time ago and hasn't since, unlike the Crysis shimmer etc. from nVidia, or many others.
AMD didn't do this.
January 27, 2010 12:11:08 PM

Again, can someone please explain why an optimization is cheating! If it actually does make things faster, HOW COULD it be cheating?!!! Fusion is designed to do just that... speed things up!!!!! Why in the hell would it matter how they accomplish it?

Now, a benchmark is a different story. If a benchmark gives you a score that suggests the hardware will perform better than it actually does, then shame on whoever is producing the crap-ass benchmark. Then we have actual performance: when the "Fusion" optimization is turned on, we see an increase in performance from shutting down background services, some minor tweaks, and, depending on the Fusion setting, you could actually get some OC in there too!!!! Now, if you want to do that with Intel and Nvidia, go right ahead; it is possible, it's just not the Fusion software. You could do it with a FEW other programs if you like. This is not about being a fan, just pointing out that the logic behind what is being said HAS NO GROUND TO STAND ON!

I say again:

If AMD were to optimize for Intel as well, then THAT would be bad business... PERIOD!!!
January 27, 2010 12:39:53 PM

Intel must be lazy, err, I mean crazy, not to do this.
January 27, 2010 12:53:21 PM

They all do it every chance they get. If you aren't optimizing in your own product's best interest, you deserve to be out of business.
January 27, 2010 1:38:50 PM

Intel does not optimize in the way that AMD does with Fusion; they simply cannot at the moment, since they need an entire platform to do so. There are in fact internal optimizations in Windows and other applications that take better advantage of Intel hardware over AMD. The sad thing is that Microsoft doesn't optimize equally for both Intel and AMD; other programs as well.
January 27, 2010 2:10:59 PM

Ahhh, not so fast, my friend.
What does Intel do with its IGPs? They CHEAT!!
January 27, 2010 2:30:25 PM

Explain please.
January 27, 2010 2:59:11 PM

They use their CPUs to boost performance on them.
January 27, 2010 3:42:35 PM

Yup, true. They also fiddled the numbers in 3DMark, Crysis, and other games; something I didn't see being tested for with the new CPUs, btw.
January 27, 2010 5:13:14 PM

Ahh!! OK then, well, I find that acceptable: hardware boosting hardware!!! I can't really say that's cheating if it does what they say it will, consistently, correct?! As long as the benchmarks aren't saying one thing while performance says another, I still don't find that to be a cheat. I think that was part of the point of having the GPU/CPU integrated, yes/no?