Money-No-Object D3D10 Video Card in Q2 07: Rumors, Guesses??

orsino

Distinguished
Aug 29, 2006
268
0
18,780
If I should be foolish enough (and I probably will be) to blow the wad in Q2 '07 on a video card for a single monitor (approx. 24" flat, likely no higher than 1600x1200), and I want the absolute killer, take-no-prisoners, fastest video card practicable, does anyone have any rumors or estimates of what that would be? G80? Something beyond it?
 

Morton

Distinguished
Aug 19, 2005
404
0
18,790
R600 will probably be better than G80, but it will appear 3-4 months later. GeForce 8800 GTX is going to be the best DX10 card in 2006.
 

fonzy

Distinguished
Dec 23, 2005
398
1
18,785
R600 will probably be better than G80, but it will appear 3-4 months later. GeForce 8800 GTX is going to be the best DX10 card in 2006.

It's going to be the only DX10 card in 2006... but yeah, I would wait for the R600 cards to come out.
 

mkaibear

Distinguished
Sep 5, 2006
678
0
18,990
nVidia's will be the first - it will be power hungry (up to 150W is the figure I've heard bandied around), with an external power brick. Should be here before the end of 2006, but I wouldn't hold my breath!

ATi's will be the fastest - though will be even more power hungry (up to 250W, allegedly!!!), with an external power brick. Should be here in Q1 '07, maybe Q2 '07.

If I was going to buy a 24" monitor in Q2 '07 (and oh how I wish I could ;)), I would wait for the first revision of the cards, which should reduce power consumption, etc, whilst keeping the decent speeds.
 

DaveUK

Distinguished
Apr 23, 2006
383
0
18,790
GeForce 8800 GTX is going to be the best DX10 card in 2006.
It's going to be the only DX10 card in 2006

I love how these guys can see into the future and apparently hold jobs at both ATI and nVidia.

Bottom line - if nVidia gets their card out soon enough and it performs a big margin ahead of ATI's, then ATI may be forced to lay their cards on the table earlier.

No manufacturer is going to want to be utterly dominated over the Christmas period.
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
Well, money may be no object to get a great card, but I'm not ready to pay to be a card tester! :D

R600, huh? Does the external power brick plug directly into the AC to keep the drain off the PSU? That might obviate my desire to blow $500 on that PCP&C 1KW PSU. What are the chances that it will be priced in the range currently occupied by the X1900 XTX (and the less buggy revisions) by the time Summer '07 comes around?
 

cleeve

Illustrious
Bottom line - if nVidia gets their card out soon enough and it performs a big margin ahead of ATI's, then ATI may be forced to lay their cards on the table earlier.

Isn't the bottom line that DirectX 10 won't be available for Windows XP, and it looks like Vista is going to be horribly late...? :p
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
Isn't the bottom line that DirectX 10 won't be available for Windows XP, and it looks like Vista is going to be horribly late...? :p

Personally I don't expect all this stuff to be on the market until it's beach time again, so I think I'm being realistic. I'm sure that I can keep the Prescooker 3.6 from melting until then. I think that the wishful thinkers who expect M$ to pull a rabbit out of its butt and come out with Vista in time for the Xmas PC sales are somewhat deluded.
 

zagor

Distinguished
Sep 27, 2006
3
0
18,510
I don't get it - why does everyone think R600 is going to be better than G80?! There's just speculation on both of the new cores, and nVidia is very quiet as always. Remember what it was like when NV40 was in preparation - nVidia was quiet and no one expected it to be the beast with 16 pipelines!
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
Well, if I'm gonna wait for Rev. 1 on these cards anyway, the benchmarks will be very clear by then between R600 and G80. I'll definitely go for the hottest one!
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
nVidia's will be the first - it will be power hungry (up to 150W is the figure I've heard bandied around), with an external power brick. Should be here before the end of 2006, but I wouldn't hold my breath!

ATi's will be the fastest - though will be even more power hungry (up to 250W, allegedly!!!), with an external power brick. Should be here in Q1 '07, maybe Q2 '07.

If I was going to buy a 24" monitor in Q2 '07 (and oh how I wish I could ;)), I would wait for the first revision of the cards, which should reduce power consumption, etc, whilst keeping the decent speeds.

Quit it with these rumors of an external brick - that just ain't gonna happen.
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
Well, money may be no object to get a great card, but I'm not ready to pay to be a card tester! :D

R600, huh? Does the external power brick plug directly into the AC to keep the drain off the PSU? That might obviate my desire to blow $500 on that PCP&C 1KW PSU. What are the chances that it will be priced in the range currently occupied by the X1900 XTX (and the less buggy revisions) by the time Summer '07 comes around?

If you get any brand-name PSU of 600W or above, you shouldn't have to worry about the power requirements of future DX10 cards.
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
Like I've said many times before, there should be a bot that auto-merges every thread with DX10 in the title...

On to the topic of discussion, if money was no object, you'd buy out both Nvidia and ATI for around 6-7 billion each, and then pocket the prototypes for yourself. Then, you have the biggest e-peen in the world and bragging rights.
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
@IcY18:

Think 600W will do it? Kinda worried when you're looking at a C2Q sucking up almost 200W, five big HDs, and these estimates being tossed around of some cards soaking up 250W! Maybe that new Intel PSU with 90% efficiency on one rail might do the job!

@Doughbuy:

If I had $15 billion, I'd buy a South Pacific island and stock it with hot and cold running blondes. Let's put a Money Is No Object As Long As It's Not Over A Grand addendum on this! :D
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
@IcY18:

Think 600W will do it? Kinda worried when you're looking at a C2Q sucking up almost 200W, five big HDs, and these estimates being tossed around of some cards soaking up 250W! Maybe that new Intel PSU with 90% efficiency on one rail might do the job!

@Doughbuy:

If I had $15 billion, I'd buy a South Pacific island and stock it with hot and cold running blondes. Let's put a Money Is No Object As Long As It's Not Over A Grand addendum on this! :D

CPUs are not sucking up almost 200W. Most people will never run their CPU at maximum load unless they're stress testing or benchmarking, and with the new Core 2 Quad rated at 125W I don't see any CPUs nearing the 200W mark, since Intel's new goal is all about performance per watt. The only reason the Core 2 Quad has that high a rating is because it's two - mind you, two - Core 2 Duos, and the fact that they kept the rating that low is great in itself. Don't expect CPU wattage ratings to go any higher than 150W in the near future.

Expect new native quad cores to draw even less power; the CPU's wattage requirement should never worry you unless you plan on overclocking.

Five big HDs... I don't think so. With Seagate and Western Digital pushing the limits of GB per platter, you'll be seeing 1.5TB hard drives soon enough - we already have a 750GB HD - and combining flash memory with a regular hard drive to make a hybrid drive would let the HD consume even less power.

What people fail to understand is that the CPU, GPU, and HDs will never all run at full maximum load at the same time, even briefly, and the ratings from nVidia and ATi are on the safe side, to keep the community from going up in arms because they said you needed a 27A PSU when you really needed 30A+. The PC is usually nowhere near full load, so even a 600W PSU will almost never actually be pushed to its rating.

Does anyone really know how much wattage their computer actually draws?

How much would a power-hungry FX-60 (just as power hungry as the C2Q) with two 7900 GTXs draw at full maximum load? 500W? 450W?
A measly 371W. Bump that up some for X1900 XTXs, to say about 410W (overestimating), then say DX10 takes an overestimated big leap of 150W (that's 75 extra watts per card), and that puts you at 560W. Even then, after overestimating, you're still under 600W - granted, this is with the graphics cards at full load.

So in the end I think a 600W PSU will be fine. If you want to be on the safe side, an OCZ GamerXtreme 750W would be overkill, but there you go if you want the extra margin. Anyone with a 1KW PSU has gone way over the top and then some...
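
For anyone who wants to follow the back-of-envelope math above, here is a minimal sketch in Python that just totals the same rough figures quoted in this thread. Every wattage number is one of the ballpark guesses from the posts, not a measured or official value:

```python
# Back-of-envelope power budget using the ballpark figures quoted in this thread.
# All numbers are the thread's own guesses, not measurements.

baseline_full_load = 371           # W: FX-60 + two 7900 GTXs at full load (figure quoted above)
x1900xtx_allowance = 410 - 371     # W: padding for swapping in X1900 XTXs instead
dx10_extra_per_card = 75           # W: guessed extra draw per hypothetical DX10 card
num_cards = 2

estimated_peak = baseline_full_load + x1900xtx_allowance + dx10_extra_per_card * num_cards
psu_rating = 600                   # W: the PSU size being debated

print(f"Estimated worst-case draw: {estimated_peak} W")                        # 560 W
print(f"Headroom on a {psu_rating} W unit: {psu_rating - estimated_peak} W")   # 40 W
```

Even with every figure padded, the estimate lands at about 560W, which is the point being made: a decent 600W unit still leaves some margin, though real-world headroom also depends on how the load is split across the PSU's 12V rails.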
 

SEALBoy

Distinguished
Aug 17, 2006
1,303
0
19,290
If memory serves me right, the 3dfx Voodoo5 card (the one with 4 chips on it) was the last video card to come out with an external power supply. Or was that the Voodoo6 prototype that never hit the market? Maybe both.
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
@ IcY18, thanks for the clarifications. I want to run RAID 1, so I'll have to have at least two monster HDs and a third for offsite backup, but I'll still be OK power-wise. And please let me know which PSU you have in mind that's better and cheaper than that 1.1KW killer-thingy - I'd love to check it out.

@ SEALBoy, I've just quickly run through some 3dfx Voodoo 5 sites and didn't see an external brick. The card seems to have a molex adaptor wire that could plug into one, but then again, my 6600 AGP has a molex on it too. It's plugged into the PSU but would there really be anything wrong with running a separate PSU for the videocard alone in that case, rather than an external brick?
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
That 700 looks great. And only $130. Not bad. Good thing also it only has one 120mm fan so it would be easy to disable and place in front of my 305mm. Thanks!
 

SEALBoy

Distinguished
Aug 17, 2006
1,303
0
19,290
Orsino, I did a bit of research myself... it turns out that the Voodoo5 5500, which had two VSA-100 chips on it, did not need an external power brick, but the never-released Voodoo5 6000 prototype did, since its four VSA-100 chips drew nearly 100W of power, and in those days of 250-300W power supplies, that was too much.

Check out the proof here
 

orsino

Distinguished
Aug 29, 2006
268
0
18,780
Very interesting! I wonder how long it will be until ATI or Nvidia are forced to come out with their own version of Voodoo Volts!!!