John Carmack's official comments on NV40 (GeForce 6800 family)

Guest
Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia,comp.sys.ibm.pc.games.action (More info?)

From the nVidia news release:-
----------------------------------------------------------------------------------------------------------------
NVIDIA Corporation (NASDAQ: NVDA), the worldwide leader in visual
processing solutions, introduced today the NVIDIA(R) GeForce(TM) 6800
models of graphics processing units (GPUs) for high-performance
desktop computers. The NVIDIA GeForce 6 Series, which includes the
flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:

-- Industry-leading 3D performance -- new superscalar 16-pipe
architecture delivers more than twice that of current industry leading
NVIDIA GPUs

-- New features, including Microsoft DirectX(R) 9.0 Shader Model 3.0
feature set -- for ultra-realistic cinematic effects

-- Unprecedented on-chip video processing engine -- enabling high-
definition video and DVD playback

"This is the biggest generation-to-generation performance leap that we
have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
of NVIDIA. "In addition to the raw performance increase, we had two
fundamental strategies with the 6800 models. First was to take
programmability to the next level with the industry's only GPU with
Shader Model 3.0. Second was to extend the reach of GPUs to the
consumer electronics market with a powerful and fully programmable
video processor capable of multiple video formats and 'prosumer' level
image processing."

"As DOOM 3 development winds to a close, my work has turned to
development of the next generation rendering technology. The NV40 is
my platform of choice due to its support of very long fragment
programs, generalized floating point blending and filtering, and the
extremely high performance," said John Carmack, president and
technical director of id Software.
-------------------------------------------------------------------------------------------------------

Still have to hear from Gabe@Valve. All quiet from him, so
far......... Must be busy tweaking the HL2 code for Shaders 3.0 ?
Shaders 2.0 must now be just a little passe..... Far Cry V1.1
implementation of Shader 3.0 is apparently a little rough at the
moment, but Crytek says that they are working on it. No doubt
it will be in a polished patch by the time the NV40 is retail
available.

For me personally the 6800 is as exciting a step forward in PC
peripherals as the Voodoo1 was when it first emerged. Not only for
the 6800's enormous graphical power, but also for its potential
contribution to PC-based video production and editing, which is an
active business for me. The very powerful integrated video processor
is as important to me as the graphics capability, particularly the
MPEG-2 encoding hardware elements. Adobe After Effects has
already declared support for the NV40 and no doubt other video
toolmakers like Pinnacle are looking hard at its capability. Now if
Intel would only reduce the price of the P4 EE to that of the retail
list of the 6800Ultra, or less, instead of fleecing potential
customers at $999 a pop, then I would be very happy indeed with my
video production/editing hardware after those two were installed.

John Lewis
 

skippy


$7 million, actually...


"K" <kayjaybee@clara.net> wrote in message
news:pan.2004.04.15.21.55.29.331455@clara.net...
> On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:
>
>
> Gabe's still counting his $5,000,000 in change he got for selling ATI
> worthless pieces of paper to bundle in with their cards :)
>
> K
 
Deleted member k

On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:


> Still have to hear from Gabe@Valve. All quiet from him, so
> far......... Must be busy tweaking the HL2 code for Shaders 3.0 ?
> Shaders 2.0 must now be just a little passe..... Far Cry V1.1
> implementation of Shader 3.0 is apparently a little rough at the
> moment, but Crytek says that they are working on it. No doubt
> it will be in a polished patch by the time the NV40 is retail
> available.
>


Gabe's still counting his $5,000,000 in change he got for selling ATI
worthless pieces of paper to bundle in with their cards :)

K
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message
news:407ed33b.8528778@news.verizon.net...
>
> "This is the biggest generation-to-generation performance leap that we
> have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
> of NVIDIA.
>


I hope so, since nVidia's reputation depends on it. That said, I have
read what Tech Report had to say about the 6800 Ultra and it looks as
though nVidia has really smartened up with their technology this time
around. I'm still left with a bad taste the way their PR department
handled the NV30 series though. I also look forward to seeing
comparative reviews once both nVidia and ATI have all of their new cards
out.
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message
news:407f418e.36776422@news.verizon.net...
>
> You are forgiven. Very easy to happen when you lose your reading
> glasses....
>


LOL, I know the feeling as well, except that my glasses are far
thicker than just ordinary reading glasses. :)


> So, better start saving now for the 6800U ??? And power-supply.
> Might as well just get a new case and power supply of the style
> with the fan (hopefully quiet) in the middle of the cover and
> close to the 6800U.........
>


Personally, I'll probably wait for the product refresh and grab a
first-gen R420 or NV40 (probably R420) when they drop in price sharply.
I too look like I'm in the position of needing a new power supply for it
though! I'm just glad that I already have a decent case with good
cooling (Lian-Li PC-60).
 
Guest

"Skippy" <what@email.com> wrote in message
news:6pDfc.19391$B%4.11013@fe2.columbus.rr.com...
> $7 million, actually...
>
>
> "K" <kayjaybee@clara.net> wrote in message
> news:pan.2004.04.15.21.55.29.331455@clara.net...
> > On Thu, 15 Apr 2004 18:59:37 +0000, John Lewis wrote:
> >
> >
> > Gabe's still counting his $5,000,000 in change he got for selling ATI
> > worthless pieces of paper to bundle in with their cards :)
> >
> > K

And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!

John
 
Guest

> And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
> 4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!

He also seems to be in bed with Intel. His engines always run better on
non-AMD systems.
 

NaDa


Carmack announced:
> "As DOOM 3 development winds to a close, my work has turned to
> development of the next generation rendering technology. The NV40 is
> my platform of choice due to its support of very long fragment
> programs, generalized floating point blending and filtering, and the
> extremely high performance," said John Carmack, president and
> technical director of id Software

When Carmack speaketh, the world stands still and the games will stutter.
 

Andrew


On Thu, 15 Apr 2004 23:16:01 GMT, Destroy <no@thanks.com> wrote:

>He also seems to be in bed with Intel. His engines always run better on
>non-AMD systems.

Because SSE is mysteriously disabled in Q3-engined games when run on
AMD CPUs.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
 

rms


> >He also seems to be in bed with Intel. His engines always run better on
> >non-AMD systems.
>
> Because SSE is mysteriously disabled in Q3-engined games when run on
> AMD CPUs.

Exactly. To be fair, the SSE-enabled Athlon XP did not exist when Q3 was
first programmed, and you can't blame Carmack for not wishing to revisit old
code.

There's a good chance he used a convenient Intel compiler check that
tests for the presence of an Intel CPU, rather than actually testing
for SSE support itself.

rms
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message >
> So, better start saving now for the 6800U ??? And power-supply.
> Might as well just get a new case and power supply of the style
> with the fan (hopefully quiet) in the middle of the cover and
> close to the 6800U.........
>
> John Lewis

Before the R420 previews are even out? Nope.

John
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message
news:407ed33b.8528778@news.verizon.net...
> From the nVidia news release:-

> "As DOOM 3 development winds to a close, my work has turned to
> development of the next generation rendering technology. The NV40 is
> my platform of choice due to its support of very long fragment
> programs, generalized floating point blending and filtering, and the
> extremely high performance," said John Carmack, president and
> technical director of id Software
> --------------------------------------------------------------------------

Here is another John Carmack quote:

"The G-Force 3 is going to be THE card to run our next game, Doom3!"
 

PapaSurf


ATI has been rather quiet so far but I will hold onto my $ till they have
spoken. May the best card win...


"John Lewis" <john.dsl@verizon.net> wrote in message
news:407ed33b.8528778@news.verizon.net...
> From the nVidia news release:-
> --------------------------------------------------------------------------
--------------------------------------
> NVIDIA Corporation ( NASDAQ: NVDA), the worldwide leader in visual
> processing solutions, introduced today the NVIDIA(R) GeForce(TM) 6800
> models of graphics processing units (GPUs) for high-performance
> desktop computers. The NVIDIA GeForce 6 Series, which includes the
> flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:
>
> -- Industry-leading 3D performance -- new superscalar 16-pipe
> architecture delivers more than twice that of current industry leading
> NVIDIA GPUs
>
> -- New features, including Microsoft DirectX(R) 9.0 Shader Model 3.0
> feature set -- for ultra-realistic cinematic effects
>
> -- Unprecedented on-chip video processing engine -- enabling high-
> definition video and DVD playback
>
> "This is the biggest generation-to-generation performance leap that we
> have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
> of NVIDIA. "In addition to the raw performance increase, we had two
> fundamental strategies with the 6800 models. First was to take
> programmability to the next level with the industry's only GPU with
> Shader Model 3.0. Second was to extend the reach of GPUs to the
> consumer electronics market with a powerful and fully programmable
> video processor capable of multiple video formats and 'prosumer' level
> image processing."
>
> "As DOOM 3 development winds to a close, my work has turned to
> development of the next generation rendering technology. The NV40 is
> my platform of choice due to its support of very long fragment
> programs, generalized floating point blending and filtering, and the
> extremely high performance," said John Carmack, president and
> technical director of id Software
> --------------------------------------------------------------------------
-----------------------------
>
> Still have to hear from Gabe@Valve. All quiet from him, so
> far......... Must be busy tweaking the HL2 code for Shaders 3.0 ?
> Shaders 2.0 must now be just a little passe..... Far Cry V1.1
> implementation of Shader 3.0 is apparently little rough at the
> moment, but Crytek says that they are working on it. No doubt
> it will be in a polished patch by the time the NV40 is retail
> available.
>
> For me personally the 6800 is as exciting a step forward in PC
> peripherals as the Voodoo1 was when it first emerged. Not only for
> the 6800s enormous graphical power, but also for its potential
> contribution to PC-based video production and editing, which is an
> active business for me. The very powerful integrated video processor
> is as important to me as the graphics capability, particularly the
> MPEG-2 encoding hardware elements. Adobe After Effects have
> already declared support for the NV40 and no doubt other video
> toolmakers like Pinnacle are looking hard at its capability. Now if
> Intel would only reduce the price of the P4 EE to that of the retail
> list of the 6800Ultra, or less, instead of fleecing potential
> customers at $999 a pop, then I would be very happy indeed with my
> video production/editing hardware after those two were installed.
>
> John Lewis


 
Deleted member k

On Fri, 16 Apr 2004 03:05:56 +0000, teqguy wrote:


> I'll take SSE2 over 3DNOW! any day....

SSE2 is a second-generation SIMD instruction set; 3DNow! was first
generation, and its instructions were among the first floating-point
SIMD operations on an x86 CPU. You're comparing old and new.

Besides, all AMD64 CPUs support SSE2 anyway.

K
 

Pluvious


On Fri, 16 Apr 2004 11:52:07 -0400, "Zimmy" <zimmy@msn.com> wrote:

||"John Lewis" <john.dsl@verizon.net> wrote in message
||news:407ed33b.8528778@news.verizon.net...
||> From the nVidia news release:-
||
||> "As DOOM 3 development winds to a close, my work has turned to
||> development of the next generation rendering technology. The NV40 is
||> my platform of choice due to its support of very long fragment
||> programs, generalized floating point blending and filtering, and the
||> extremely high performance," said John Carmack, president and
||> technical director of id Software
||> --------------------------------------------------------------------------
||-----------------------------
||
||Here is another John Carmack quote:
||
||"The G-Force 3 is going to be THE card to run our next game, Doom3!"
||

I don't care what Carmack or Sweeney say about video cards folks...
they are in bed together.. so it's invalid. I'll let the specs and head
to head reviews make my decisions for me thank you. Use your heads
guys..

Actually I was rather amused to see the preview at HardOCP
(http://www.hardocp.com/article.html?art=NjA2). The performance is
nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
is a joke. Of course I expect the 3rd party companies to juice up the
card a bit. We'll see.. but my money is on ATI for retaining the crown
this next generation.

Pluvious
 
Guest

>I don't care what Carmack or Sweeney say about video cards folks...
>they are in bed together.. so it's invalid. I'll let the specs and head
>to head reviews make my decisions for me thank you. Use your heads
>guys..
>
>Actually I was rather amused to see the preview at HardOCP
>(http://www.hardocp.com/article.html?art=NjA2). The performance is
>nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
>is a joke. Of course I expect the 3rd party companies to juice up the
>card a bit. We'll see.. but my money is on ATI for retaining the crown
>this next generation.
>
>Pluvious


If you noticed, the 5950 also beat the 9800 on most of the benchmarks,
and the 6800 killed the 9800 in a lot of the games.
 
Guest

"Pluvious" <Pluvious@nowhere.com> wrote in message
news:lb2080h4koekvadj6c7pbfik8kjs2kp2pt@4ax.com...
> On Fri, 16 Apr 2004 11:52:07 -0400, "Zimmy" <zimmy@msn.com> wrote:
>
> ||"John Lewis" <john.dsl@verizon.net> wrote in message
> ||news:407ed33b.8528778@news.verizon.net...
> ||> From the nVidia news release:-
> ||
> ||> "As DOOM 3 development winds to a close, my work has turned to
> ||> development of the next generation rendering technology. The NV40 is
> ||> my platform of choice due to its support of very long fragment
> ||> programs, generalized floating point blending and filtering, and the
> ||> extremely high performance," said John Carmack, president and
> ||> technical director of id Software
>
||> ------------------------------------------------------------------------
--
> ||-----------------------------
> ||
> ||Here is another John Carmack quote:
> ||
> ||"The G-Force 3 is going to be THE card to run our next game, Doom3!"
> ||
>
> I don't care what Carmack or Sweeney say about video cards folks...
> they are in bed together.. so it's invalid. I'll let the specs and head
> to head reviews make my decisions for me thank you. Use your heads
> guys..
>
> Actually I was rather amused to see the preview at HardOCP
> (http://www.hardocp.com/article.html?art=NjA2). The performance is
> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
> is a joke. Of course I expect the 3rd party companies to juice up the
> card a bit. We'll see.. but my money is on ATI for retaining the crown
> this next generation.
>
> Pluvious

Yes, but note how HardOCP tests - with different settings for different
cards for many of the tests.

Look here for a case of a doubling of performance:

http://www.anandtech.com/video/showdoc.html?i=2023&p=13

--
Derek
 

BP


On Fri, 16 Apr 2004 20:36:41 +0100, "Derek Baker"
<me@XYZderekbaker.eclipse.co.uk> wrote:

>"Pluvious" <Pluvious@nowhere.com> wrote in message
>news:lb2080h4koekvadj6c7pbfik8kjs2kp2pt@4ax.com...
>> On Fri, 16 Apr 2004 11:52:07 -0400, "Zimmy" <zimmy@msn.com> wrote:
>>
>> ||"John Lewis" <john.dsl@verizon.net> wrote in message
>> ||news:407ed33b.8528778@news.verizon.net...
>> ||> From the nVidia news release:-
>> ||
>> ||> "As DOOM 3 development winds to a close, my work has turned to
>> ||> development of the next generation rendering technology. The NV40 is
>> ||> my platform of choice due to its support of very long fragment
>> ||> programs, generalized floating point blending and filtering, and the
>> ||> extremely high performance," said John Carmack, president and
>> ||> technical director of id Software
>>
>||> ------------------------------------------------------------------------
>--
>> ||-----------------------------
>> ||
>> ||Here is another John Carmack quote:
>> ||
>> ||"The G-Force 3 is going to be THE card to run our next game, Doom3!"
>> ||
>>
>> I don't care what Carmack or Sweeny say about video cards folks...
>> they are in bed together.. so its invalid. I'll let the specs and head
>> to head reviews make my decisions for me thank you. Use your heads
>> guys..
>>
>> Actually I was rather amused to see the preview at HardOCP
>> (http://www.hardocp.com/article.html?art=NjA2). The performance is
>> now-where near where I expected. 10-15 fps better then the ATI 9800 XT
>> is a joke. Of course I expect the 3rd party companies to juice up the
>> card a bit. We'll see.. but my money is on ATI for retaining the crown
>> this next generation.
>>
>> Pluvious
>
>Yes, but note how HardOCP test - with different settings for different cards
>for many of the tests.
>
>Look here for a case of a doubling of performance:
>
>http://www.anandtech.com/video/showdoc.html?i=2023&p=13

So how much are these new super cards going to cost ?
Twice the cost of the PC I install it in ?
 

Tim


"Destroy" <no@thanks.com> wrote in message
news:RIEfc.98178$4B1.51451@twister.rdc-kc.rr.com...
> > And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
> > 4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!
>
> He also seems to be in bed with Intel. His engines always run better on
> non-AMD systems.
>

LOL I know you're kidding. I hear he's also cozy with Microsoft since his
games are first released on their platforms. Like Intel, their lion's share
of the market has *nothing* to do with it.
 
Guest

On Fri, 16 Apr 2004 11:06:11 GMT, "John Reynolds"
<JohnrReynolds@woh.rr.com> wrote:

>"John Lewis" <john.dsl@verizon.net> wrote in message >
>> So, better start saving now for the 6800U ??? And power-supply.
>> Might as well just get a new case and power supply of the style
>> with the fan (hopefully quiet) in the middle of the cover and
>> close to the 6800U.........
>>
>> John Lewis
>
>Before the R420 previews are even out? Nope.
>

Ah, and at your age, I was rather hoping that a residual spark of true
adventure still existed......

As for me, when I can afford it, the 6800U, regardless of the X800 ---
not only for the graphics power, but the very powerful VPU and the
strong likelihood that it will perform excellently with both current
and my classic-legacy software, just as my current FX5900/56.72 does.
Excluding Glide games, of course. My second machine has a V5 5500
with Win Me just for that purpose.

John Lewis


 
Guest

On Fri, 16 Apr 2004 16:35:08 GMT, Pluvious <Pluvious@nowhere.com>
wrote:

>On Fri, 16 Apr 2004 11:52:07 -0400, "Zimmy" <zimmy@msn.com> wrote:
>
>||"John Lewis" <john.dsl@verizon.net> wrote in message
>||news:407ed33b.8528778@news.verizon.net...
>||> From the nVidia news release:-
>||
>||> "As DOOM 3 development winds to a close, my work has turned to
>||> development of the next generation rendering technology. The NV40 is
>||> my platform of choice due to its support of very long fragment
>||> programs, generalized floating point blending and filtering, and the
>||> extremely high performance," said John Carmack, president and
>||> technical director of id Software
>||> --------------------------------------------------------------------------
>||-----------------------------
>||
>||Here is another John Carmack quote:
>||
>||"The G-Force 3 is going to be THE card to run our next game, Doom3!"
>||
>
>I don't care what Carmack or Sweeney say about video cards folks...
>they are in bed together.. so it's invalid. I'll let the specs and head
>to head reviews make my decisions for me thank you. Use your heads
>guys..
>
>Actually I was rather amused to see the preview at HardOCP
>(http://www.hardocp.com/article.html?art=NjA2). The performance is
>nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
>is a joke.

a) Different AA and Aniso settings.

b) 60.72 is the VERY FIRST PUBLISHED driver for this card.
It is probably going to take a year to fully optimise the
compiler, so what you are seeing is the WORST performance
for this card. Also, it is going to take a while for game
developers to "get their arms around" the new features.

>Of course I expect the 3rd party companies to juice up the
>card a bit.

Not much, if you are thinking hardware, like overclocking. Silicon design
tools have become more accurate and process variations smaller. I suspect
that a 450 MHz max overclock will be the norm on this giant part without
water-cooling or similar. However, this is a hugely-programmable
part. Any spectacular performance improvements will come from
software magic.

> We'll see.. but my money is on ATI for retaining the crown
>this next generation.

It will be a little tilted regardless. NVidia has executed a
brilliant design; let's hope for ATI's sake that their crown
doesn't fall off and get trampled by the herd queueing up
to buy variants of the 6800.

John Lewis

>
>Pluvious
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message
news:408055c9.24991940@news.verizon.net...
> Ah, and at your age, I was rather hoping that a residual spark of true
> adventure still existed......
>
> As for me, when I can afford it, the 6800U, regardless of the X800 ---
> not only for the graphics power, but the very powerful VPU and the
> strong likelihood that it will perform excellently with both current
> and my classic-legacy software, just as my current FX5900/56.72 does.
> Excluding Glide games, of course. My second machine has a V5 5500
> with Win Me just for that purpose.
>
> John Lewis

The wise man waits, while fools rush in. ;)

Besides, both parts are liable to reach the market relatively close to one
another. No sense making up my mind when neither are available yet.

John
 
Guest

"John Lewis" <john.dsl@verizon.net> wrote in message
> It will be a little tilted regardless. NVidia has executed a
> brilliant design; let's hope for ATI's sake that their crown
> doesn't fall off and get trampled by the herd queueing up
> to buy variants of the 6800.
>
> John Lewis

The herd is a good word to use to describe anyone who's already made up
their mind at this point in time.

John
 
Deleted member k

On Fri, 16 Apr 2004 22:46:42 +0000, John Reynolds wrote:
>
> The herd is a good word to use to describe anyone who's already made up
> their mind at this point in time.
>
>

I'd much rather have a herd of people who want the best for their money
than a herd of fanboys saying that they'll wait for their beloved gfx
company to come out with something better.

As far as I'm concerned the numbers speak for themselves and my next card
will be by Nvidia. I currently own a R9600 Pro, it was the best bang for
the buck at the time but ATI's shitty Linux support has left a bad taste
in my mouth.

K
 
Guest

"K" <kayjaybee@clara.net> wrote in message
news:pan.2004.04.17.00.59.54.601442@clara.net...
> On Fri, 16 Apr 2004 22:46:42 +0000, John Reynolds wrote:
> >
> > The herd is a good word to use to describe anyone who's already made up
> > their mind at this point in time.
> >
> >
>
> Much rather have a herd of people who want the best for their money rather
> than a herd of fanboys saying that they'll wait for their beloved gfx
> company to come out with something better.

And where have I written that the R420 will be better? I merely suggested
waiting until both parts are announced and previewed, and then making a
decision. If you can find fanboy-like behavior in that course of action,
more power to ya.

> As far as I'm concerned the numbers speak for themselves and my next card
> will be by Nvidia. I currently own a R9600 Pro, it was the best bang for
> the buck at the time but ATI's shitty Linux support has left a bad taste
> in my mouth.

The #s speak for themselves, but against what? Last generation's products?
Well, I would certainly expect a next gen. product to outperform last gen.
parts, but I'm just funny that way.

John